Boolean logic is a complete system for logical operations. It was named after George Boole, an English mathematician at University College Cork who first defined an algebraic system of logic in the mid 19th century. Boolean logic has many applications in electronics, computer hardware and software. In 1938, Claude Shannon showed how electric circuits with relays were a model for Boolean logic. This fact soon proved enormously consequential with the emergence of the electronic computer.
Boolean logic is used extensively in computer programs, particularly to control the flow of the program. 'If' statements are flow-control instructions found in nearly every programming language, and they rely on boolean logic: the condition of an 'if' statement evaluates to either true or false, and the program is redirected accordingly.
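For example, in Python (used here purely for illustration, the same idea applies in any language), the condition is a boolean expression and the branch taken depends on its value:

```python
temperature = 30

# The condition (temperature > 25) evaluates to either True or False,
# and the program is redirected accordingly.
if temperature > 25:
    label = "hot"
else:
    label = "cold"

print(label)  # hot
```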
Boolean expressions can only output two possible values:
- True: Abbreviated as the number 1 or the letter T.
- False: Abbreviated as the number 0 or the letter F.
AND, OR and NOT are the key operators in boolean expressions. AND and OR are binary operators, meaning they take in two values and output one value; NOT is a unary operator: it takes in one value and outputs one value.
AND (Conjunction)
Both inputs must be true for the output to be true. This operator is commutative and associative.
A conjunction of two boolean values is analogous to an intersection of two sets in set theory and to multiplication in arithmetic. This is evident when combining probabilities in statistics: for independent events, the probability of both A and B occurring is given by P(A ∩ B) = P(A) × P(B). Because of the analogy to multiplication, in electronics/electrical engineering a conjunction is written as A·B (or simply AB).
When constructing switching circuits, an AND gate is represented as two switches in series: both switches must be closed for the current to flow.
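A short Python sketch of the conjunction (the function name is illustrative): the built-in `and` operator behaves like two switches in series, since every input must be true (closed) for the result to be true.

```python
# Truth table for AND: the output is True only when both inputs are True.
for a in (False, True):
    for b in (False, True):
        print(a, b, a and b)

# Modelled as two switches in series: current flows only if both are closed.
def series_circuit(switch1, switch2):
    return switch1 and switch2
```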
OR (Disjunction)
In everyday language, "or" usually means one input or the other; however, in boolean algebra it means one input, the other, or both, i.e. at least one input must be true for the output to be true. See XOR below for the operator that corresponds to the "one or the other" definition of or. The disjunction operator is commutative and associative; it is also distributive over the conjunction operator.
Just as a conjunction is analogous to an intersection of two sets and the product of two numbers, a logical disjunction is analogous to a union (where being true is replaced by presence in one set, the other, or both) and to addition (for mutually exclusive events, the probability of event A or event B occurring is given by adding together the probabilities). In electronics, the disjunction of two boolean values is written as A+B.
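A Python sketch of the disjunction and its set-theory analogue (the sample sets are arbitrary):

```python
# Truth table for OR: the output is True when at least one input is True.
for a in (False, True):
    for b in (False, True):
        print(a, b, a or b)

# Analogy to set union: an element is in A | B if it is in A, in B, or in both.
A = {1, 2, 3}
B = {3, 4}
print(A | B)  # {1, 2, 3, 4}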
NOT (Negation)
Unlike the previous operators, a negation is a unary operator: it takes in one input and produces one output.
In electronics, the notation for negation is a bar over the statement: NOT A is instead written as Ā, and NOT (A·B) would be written as A·B under a single bar covering the whole expression.
These operators can be made from the elementary operators. You may have come across some of the symbols used to represent them in other areas of maths before; their relationship will be explained further in the sections below.
XOR (Exclusive OR)
As explained in the section dealing with the OR (Disjunction) operator, in boolean algebra "or" means one, the other, or both, in contrast to the common meaning of one or the other; the XOR operator captures this common meaning. It is like the disjunction except that if both inputs are true, it outputs false, i.e. the two inputs must be different from one another for the XOR operator to output true.
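Python has no `xor` keyword for booleans, but `!=` (or the bitwise `^` operator) gives the same truth table, since XOR is true exactly when the inputs differ (the function name below is illustrative):

```python
def xor(a, b):
    # True when the inputs differ; False when they match.
    return a != b

for a in (False, True):
    for b in (False, True):
        print(a, b, xor(a, b))
```

Only the "both true" row differs from the ordinary disjunction: `xor(True, True)` is false, whereas `True or True` is true.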
The implication outputs true in every case except when the first input is true and the second is false; in particular, the output is true whenever both inputs are the same, and the second input can still be true when the first input is false. Unlike the conjunction and disjunction operators, it is not commutative.
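The implication can be built from the elementary operators as (NOT A) OR B; a quick check in Python (the function name is illustrative):

```python
def implies(a, b):
    # A -> B is false only when A is True and B is False.
    return (not a) or b

for a in (False, True):
    for b in (False, True):
        print(a, "->", b, "=", implies(a, b))
```

Swapping the inputs demonstrates the lack of commutativity: `implies(True, False)` is false, but `implies(False, True)` is true.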
The sufficiency is similar to the implication, but with the roles of the inputs reversed: it outputs false only when the first input is false and the second input is true. Like the implication operator, it is not commutative; if you swap the order of the inputs, you must change the direction the arrow faces.
Relationship to other Disciplines
Each binary operator has an analogous operation in set theory. The laws mentioned above are also valid, especially De Morgan's laws.
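De Morgan's laws can be checked directly in Python, both for booleans and for their set-theory counterparts (the universe and sample sets below are arbitrary):

```python
# Boolean form: NOT (A AND B) == (NOT A) OR (NOT B), and its dual.
for a in (False, True):
    for b in (False, True):
        assert (not (a and b)) == ((not a) or (not b))
        assert (not (a or b)) == ((not a) and (not b))

# Set form: the complement of an intersection is the union of the complements.
U = {1, 2, 3, 4, 5}  # the universe; complement of X is U - X
A = {1, 2}
B = {2, 3}
assert U - (A & B) == (U - A) | (U - B)
assert U - (A | B) == (U - A) & (U - B)
```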
The relational operators used in arithmetic and algebra, such as equals, greater than, etc., relate two arithmetical expressions that each have a numerical value: the LHS and the RHS. Equals forms an equation. Equals does not output a numerical value, but it does output a boolean value, despite the inputs being numerical. With that in mind, this concept can be extended to the other relational operators, such as not equal to (analogous to the XOR operator).
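This is easy to see in Python: a relational operator takes numerical inputs but returns a boolean, and "not equal to" applied to two boolean values reproduces the XOR truth table.

```python
x = 3 + 4
y = 7

result = (x == y)  # numerical inputs, boolean output
print(result, type(result))  # True <class 'bool'>

# "not equal to" on booleans is exactly XOR:
for a in (False, True):
    for b in (False, True):
        assert (a != b) == (a ^ b)
```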
Writing the symbol ≡ between two boolean expressions means that the equivalence is always true, i.e. the two expressions output the same value for every combination of inputs.
Boolean expressions can be represented visually as a logic circuit with the symbols seen previously. Logic circuits are abstracted versions of transistor circuits, which are an advanced form of the switching circuits also seen previously. Every micro-processing chip in the computer you are using has many transistors, and each of them runs on the principles of boolean logic.
A switch may only take two states, ON (true) or OFF (false), hence the use of boolean logic. Being limited to two states required a new way to represent larger amounts of information: a decimal (base 10) system was inadequate, so binary numerals (base 2) were needed.
The following logic circuit is known as an "adder circuit". All three inputs, A, B and Cin, can only be 1 or 0 because they are boolean values. The circuit has two outputs, which must also be boolean values; together they represent a number in binary. The "S" output represents how many units (how many 2⁰, which is 1 in both binary and decimal); the "Cout" output represents how many 2¹ (which is 10 in binary). The highest number achievable with this circuit is 3, or 11 in binary; the lowest is 0.
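The adder circuit can be modelled directly with boolean operators in Python. The sketch below uses the standard full-adder equations (S = A XOR B XOR Cin; Cout carries whenever at least two inputs are 1), which match the behaviour described above:

```python
def full_adder(a, b, cin):
    # Standard full-adder equations (a sketch, not a transcription of the diagram):
    s = a ^ b ^ cin                   # sum bit: how many 2**0
    cout = (a & b) | (cin & (a ^ b))  # carry bit: how many 2**1
    return s, cout

# All three inputs are boolean (0 or 1); the two outputs together
# form a 2-bit binary number: cout*2 + s.
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s, cout = full_adder(a, b, cin)
            assert cout * 2 + s == a + b + cin

print(full_adder(1, 1, 1))  # (1, 1) -> binary 11, i.e. 3, the highest achievable
```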
To go higher, representing everyday quantities such as 8 or 16, or something far larger such as 4294967295, requires an enormous quantity of transistors (switches); the invention of the micro-processor made such massive circuits possible. The part of a computer's CPU (Central Processing Unit) called the ALU (Arithmetic Logic Unit) is, as the name suggests, what does the arithmetic for the computer, and it does this using circuits built from millions of transistors, allowing it to do arithmetic with numbers far beyond 3.