Logic is the study of reasoning. Logic is used in most intellectual activity, but is studied primarily in the disciplines of philosophy, mathematics, and computer science. Logic examines general forms which arguments may take, which forms are valid, and which are fallacies. It is one kind of critical thinking. In philosophy, the study of logic falls in the area of epistemology: how do we know what we know? In mathematics, it is the study of valid inferences within some formal language.

Symbolic logic is the study of symbolic abstractions that capture the formal features of logical inference. It is the area of mathematics which studies the purely formal properties of strings of symbols. The interest in this area springs from two sources. First, the symbols used in symbolic logic can be seen as representing the words used in philosophical logic. Second, the rules for manipulating symbols found in symbolic logic can be implemented on a computing machine.

Symbolic logic is often divided into two branches, propositional logic and predicate logic.
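
As a rough illustration of the split (the predicate and symbol names below are only illustrative, not taken from the tool): a propositional formula treats whole statements as atomic symbols, while a predicate formula exposes their inner structure with predicates and quantifiers.

    % Propositional logic: whole statements P, Q, R are atomic symbols
    (P \land Q) \rightarrow R
    % Predicate logic: quantifiers and predicates expose inner structure
    \forall x \, (\mathrm{Class}(x) \rightarrow \exists m \, \mathrm{HasMethod}(x, m))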

Using Symbolic Logic in Symbolic Analysis

We use propositional logic to express the clauses of the Symbolic language, which have been translated from Java and C++. We use predicate logic to implement the necessary traversal and simulation features of the symbolic atomistic model.
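
A minimal sketch of this division, using Python as a stand-in notation; the clause names and the fact representation are assumptions for illustration only, not the tool's actual format. Translated statements are stored as clause-like facts, queried propositionally as whole clauses, and traversed with predicate-style rules over their arguments.

    # Hypothetical sketch: statements translated from Java/C++ stored as
    # clause-like facts, queried and traversed in two different ways.
    facts = [
        ("defines", ("Account", "balance")),         # class Account defines field balance
        ("calls",   ("Account.deposit", "log")),     # method deposit calls log
        ("assigns", ("Account.deposit", "balance")), # method deposit assigns balance
    ]

    def holds(predicate, args):
        """Propositional view: is this whole clause present in the model?"""
        return (predicate, args) in facts

    def traverse(predicate, first):
        """Predicate view: enumerate every X such that predicate(first, X) holds."""
        return [args[1] for (p, args) in facts if p == predicate and args[0] == first]

    print(holds("calls", ("Account.deposit", "log")))  # True
    print(traverse("assigns", "Account.deposit"))      # ['balance']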

First-Order Logic

First-order logic is a formal logic used in mathematics, philosophy, linguistics, and computer science. It goes by many names, including: first-order predicate calculus, the lower predicate calculus, and predicate logic. First-order logic is distinguished from propositional logic by its use of quantifiers; each interpretation of first-order logic includes a domain of discourse over which the quantifiers range.

There are many deductive systems for first-order logic that are sound (only deriving correct results) and complete (able to derive any logically valid implication). Although the logical consequence relation is only semidecidable, much progress has been made in automated theorem proving in first-order logic. First-order logic also satisfies several metalogical theorems that make it amenable to analysis in proof theory, such as the Löwenheim–Skolem theorem and the compactness theorem.

Higher-order logic

In mathematics and logic, a higher-order logic is distinguished from first-order logic in a number of ways. One of these is the type of variables appearing in quantifications; in first-order logic, roughly speaking, it is forbidden to quantify over predicates. See second-order logic for systems in which this is permitted. Another way in which higher-order logic differs from first-order logic is in the constructions allowed in the underlying type theory. A higher-order predicate is a predicate that takes one or more other predicates as arguments. In general, a higher-order predicate of order n takes one or more predicates of order n − 1 as arguments, where n > 1. A similar remark holds for higher-order functions.
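
A small example of that idea, written in Python as a stand-in for logical notation; the function names are hypothetical. The second function plays the role of an order-2 predicate, because it takes an order-1 predicate as an argument and quantifies over a domain with it.

    def is_even(n):
        """Order-1 predicate over numbers."""
        return n % 2 == 0

    def holds_for_all(pred, domain):
        """Order-2 predicate: takes an order-1 predicate as an argument."""
        return all(pred(x) for x in domain)

    print(holds_for_all(is_even, [2, 4, 6]))  # True
    print(holds_for_all(is_even, [2, 3]))     # False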

Higher-order logic, abbreviated as HOL, is also commonly used to mean higher-order simple predicate logic; that is, the underlying type theory is simple, not polymorphic or dependent.[1]

Higher-order logics are more expressive, but their properties, in particular with respect to model theory, make them less well-behaved for many applications. By a result of Gödel, classical higher-order logic does not admit a (recursively axiomatized) sound and complete proof calculus; however, there is a proof calculus that is sound and complete with respect to Henkin models.

Using Different Orders of Logic in Symbolic Analysis

The original clauses captured from the code are first-order logic. The side effects created by those clauses are second-order logic. A side-effect element stacks the information created by the original clause together with its arguments. For example, the side-effect predicate created(A, B) means that symbol A created symbol B.
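
A hedged sketch of that mechanism in Python; the symbol names and the shape of the side-effect facts are assumptions for illustration. Simulating an original (first-order) clause records a second-order fact about what the clause did, together with its arguments.

    # Hypothetical sketch: simulation records side-effect facts such as
    # created(A, B), meaning symbol A created symbol B.
    side_effects = []

    def simulate_new(creator, class_name):
        """Simulate 'creator' executing a constructor call; record the side effect."""
        new_symbol = f"{class_name}_instance"
        side_effects.append(("created", creator, new_symbol))  # created(A, B)
        return new_symbol

    obj = simulate_new("Main.main", "Account")
    print(side_effects)  # [('created', 'Main.main', 'Account_instance')]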

However, some ambiguities arise when object-oriented code is simulated freely, from an arbitrary starting point. These ambiguities increase the order of the logic when they are not resolved by the user. Under certain conditions, however, the user can act as a Decider, selecting the most relevant information for the ambiguous values, types, and conditions.
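
A hypothetical sketch of that role in Python; the function names and the interactive hook are assumptions, not part of the tool. When simulation meets an ambiguous reference, a decision hook lets the user pick the most relevant candidate instead of letting the ambiguity raise the order of the logic.

    # Hypothetical sketch: the user resolves an ambiguous reference
    # by selecting one of the candidate interpretations.
    def resolve(symbol, candidates, ask_user):
        if len(candidates) == 1:
            return candidates[0]            # no ambiguity to resolve
        return ask_user(symbol, candidates) # user acts as the Decider

    choice = resolve("obj.run()", ["Thread.run", "Task.run"],
                     ask_user=lambda s, cs: cs[0])  # stand-in for interactive input
    print(choice)  # Thread.run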

Output is Program Comprehension Specific Information

The different orders of logic define graphs that connect the relevant symbols through propositional, predicate, and higher-order relations. Some of that information is architecture-specific (module dependencies), some is behavior-oriented (side effects), and some is static.
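
As a final hedged sketch in Python (the relation names and module names are invented for illustration): the collected relations form graphs over the symbols, and an architecture-specific view such as a module-dependency graph can be grouped out of them.

    # Hypothetical sketch: group 'calls' relations into a module-dependency graph.
    from collections import defaultdict

    relations = [
        ("calls", "ui.Button", "core.Event"),
        ("calls", "core.Event", "util.Log"),
    ]

    dependencies = defaultdict(set)
    for kind, source, target in relations:
        if kind == "calls":  # architecture-specific (module-level) view
            dependencies[source.split(".")[0]].add(target.split(".")[0])

    print(dict(dependencies))  # {'ui': {'core'}, 'core': {'util'}}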
