Introduction to Boolean Algebra

Robert P. Webber and Don Blaheta, Longwood University
Why are we working on this? As circuits become bigger and more complex, they become more unwieldy to reason about—to know when two circuits are equivalent, or when making a small change to simplify the circuit will change how it works. Boolean algebra gives us a framework to confidently reason about complex circuits, and to know when two circuits are exactly equivalent.

Skills in this section:
  Apply a Boolean law to an expression to produce an equivalent expression
  Demonstrate that two expressions are equivalent using Boolean laws
  Simplify complicated expressions using Boolean laws

Concepts: Algorithm representation, Interpretation of computed results

Boolean algebra is named for George Boole, an English logician and mathematician in the middle 1800s.  He was interested in developing rules of algebra for logical thinking, similar to the rules of algebra for numerical thinking.  Boole's genius was realizing that apparently dissimilar systems, such as symbolic logic and set theory, actually worked very much alike.  Not much was done with Boole's work until the 1930s, when a graduate student, Claude Shannon, realized that Boole's work also applied to electrical circuits—which hadn't even been invented when Boole was alive, of course!

The core idea of the Boolean algebra is this: what if we take the basic operations of arithmetic—addition and multiplication—and apply them to the numbers 0 and 1, but redefine them so the results are also always 0 and 1? Multiplication is easy: even in standard grade school arithmetic, if you multiply any combination of 0 and 1, you can only end up with 0 or 1. Adding 0 to anything is also not a problem. So the only "trick" to making the results always be 0 or 1 is to say that the Boolean version of addition will have to say that the operation 1+1 results in 1; in digital logic terms, we define addition as or. Every other fact about the Boolean algebra follows from this change.
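The redefined operations are easy to sketch in code. Here is a minimal Python version; the names bool_sum and bool_product are ours, for illustration, not from the text:

```python
# A sketch of the two Boolean operations on the values 0 and 1.
# The names bool_sum and bool_product are our own, for illustration.

def bool_sum(x, y):
    """Boolean +: defined as 'or', so 1 + 1 gives 1, not 2."""
    return 1 if (x == 1 or y == 1) else 0

def bool_product(x, y):
    """Boolean ·: defined as 'and'; agrees with ordinary 0/1 multiplication."""
    return 1 if (x == 1 and y == 1) else 0

print(bool_sum(1, 1))  # 1 -- the one case that differs from ordinary addition
```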

The Boolean operators

The basic Boolean operations correspond to the basic gates we have seen.  They also correspond to standard operations in set theory and in symbolic logic.

Boolean operator   as in   Gate-logic operator   Symbolic logic operator   Set theory operator
+ (sum)            x+y     or                    ∨ (disjunction)           ∪ (union)
· (product)        x·y     and                   ∧ (conjunction)           ∩ (intersection)
¯ (negation)       x̄       not                   ¬ (negation)              C (complement)

The Boolean operation · is often abbreviated by writing the operands side by side and leaving out the operation symbol: xy instead of x·y, for instance.  Also, you may see the negation operation written as either a line over the expression being negated (as in x̄) or as a prime mark or apostrophe after the expression (as in x'). The overline notation can visually apply to arbitrarily long expressions (a single bar can cover all of x + yz), but note that the prime mark, when it's used, applies to only the immediately adjacent term, and may require parentheses: (x + yz)'.

Like arithmetic operations, Boolean operations have a default order in which a series of them are performed.  This is called the precedence order or hierarchy of operations—or sometimes, just as with arithmetic, the order of operations.

Among the three Boolean operations:

Negation ¯ has the highest precedence; then
Product · has the next highest precedence; then
Sum + has the lowest precedence.

This means that in a complex expression to be evaluated, we would evaluate the negations first, then products, then sums, just like with arithmetic expressions. And just as in arithmetic order-of-operations, parentheses can be used to override the default order. You should be able to rewrite any Boolean expression as its corresponding gate-logic expression, just by rewriting each Boolean operator with its corresponding gate name, and possibly adding some parentheses.

For example,

(xy)' is rewritten as not (x and y)
xy' is rewritten as x and (not y)
x'y is rewritten as (not x) and y
x'y' is rewritten as (not x) and (not y)

Notice that (xy)' is not the same thing as x'y', and the two are not functionally equivalent!
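A quick exhaustive check makes the non-equivalence concrete. This sketch uses our own helper names NOT and AND for the gates:

```python
def NOT(x): return 1 - x          # negation on 0/1 values
def AND(x, y): return x & y       # Boolean product

for x in (0, 1):
    for y in (0, 1):
        v1 = NOT(AND(x, y))           # not (x and y)
        v2 = AND(NOT(x), NOT(y))      # (not x) and (not y)
        print(x, y, v1, v2)           # the columns differ whenever x != y
```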

Equivalent expressions

Boole realized that there are some arithmetic laws that apply to these operations. In this context, a "law" is simply an equation that formally states a pattern that guarantees the equivalence of two expressions: if you have some expression that matches one side of the equation, you can transform it in accordance with the equation and the resulting expression will be true or false in exactly the same situations as the original. The collection of all the laws is called Boolean algebra.

Here are two (groups of) laws in the Boolean algebra: one that works just like the analogous laws you're familiar with from grade school arithmetic, and one that is substantially different.

Identity laws:  Every Boolean algebra has an identity element for + (called 0) and one for · (called 1). That is,

1x = x
x1 = x
and
0 + x = x
x + 0 = x

This is true regardless of the value of x; and the law applies even when the expression to be substituted for x is itself a complex expression. That is, not only can you substitute (for example) z in these equations:

0 + z = z

You can also substitute something like (xy + yz):

0 + (xy + yz) = (xy + yz)

or any other Boolean expression.

The identity laws should feel pretty familiar—they work just like the identity laws in the real-number algebra you've been using for years. But "it feels familiar" is not sufficient justification for a mathematical law, of course. We can prove these laws using truth tables.

x   0 or x   0 + x
0   0 or 0   0
1   0 or 1   1

According to this table, no matter the value of x, the result of 0 + x is the same as that value of x. This is due (in this case) to the definition of or.
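The same proof-by-exhaustion can be carried out in code. Here is a sketch (the helper name OR is ours) that checks the law for every possible value of x:

```python
def OR(x, y): return x | y    # Boolean sum on 0/1 values

for x in (0, 1):
    assert OR(0, x) == x      # 0 + x = x, for both values of x
print("identity law for + verified")
```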

It is important to realize that these symbols don't necessarily literally mean the integers 1 and 0. In set theory, for example, 1 is the universal set and 0 is the empty set. In symbolic logic, 1 is true and 0 is false. The laws of the Boolean algebra apply to all of these systems.

DeMorgan's Laws: In contrast to the identity laws, this pair of laws does not apply in any sense to the real number algebra, and to many people, the underlying fact is quite counterintuitive.

(x + y)' = x'y'
x' + y' = (xy)'

Expressing these laws in terms of gate-logic expressions, they tell us that a circuit to compute not (x or y) is not equivalent to (not x) or (not y), as we might expect, but rather to (not x) and (not y).

We can prove these by writing truth tables. Here is the proof of the first of the laws:

x   y   x + y   (x + y)'   x'   y'   x'y'
0   0   0       1          1    1    1
0   1   1       0          1    0    0
1   0   1       0          0    1    0
1   1   1       0          0    0    0

Since the columns for (x + y)' and x'y' are the same, the expressions are equivalent, no matter what values might be assigned to x and y.
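Both DeMorgan's Laws can also be verified by exhaustive enumeration in code; the helper names below are our own:

```python
def NOT(x): return 1 - x
def AND(x, y): return x & y
def OR(x, y): return x | y

for x in (0, 1):
    for y in (0, 1):
        assert NOT(OR(x, y)) == AND(NOT(x), NOT(y))   # (x + y)' = x'y'
        assert OR(NOT(x), NOT(y)) == NOT(AND(x, y))   # x' + y' = (xy)'
print("both DeMorgan laws verified")
```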

As with the identity law, the equivalence applies regardless of what gets substituted. If we have an expression like ((ay + yz) + z)', it may look complicated, but it follows the pattern: if x is substituted with (ay + yz) and y is substituted with z, we have the left side of the first DeMorgan's Law. That means that the expression (ay + yz)'z' must be equivalent. (Whether that's useful in any particular context is quite another question!)

Here's an example of how a teacher might make use of these laws. Suppose that students were assigned as homework the problem of designing a circuit that is equivalent to a  nand  gate using only  and, or, and  not  gates.  One student designed the circuit

(not x) or (not y)

and another designed the circuit

not (x and y) .

Are the circuits equivalent?  The answer is yes, because the first circuit is x' + y' and the second circuit is (xy)'. By the second DeMorgan's Law, they are equivalent.

More Boolean Algebra Laws

Here are some more important laws of Boolean algebra. The first few work basically like real-number algebra; others are somewhat less familiar.

Commutative laws:

xy = yx
x + y = y + x

Knowing that the operations are commutative, we could have written the Identity Law a little more simply, not having to separately address the 1x and the x1 cases. But we hadn't covered the commutative law yet at that point. For subsequent laws, we'll only explicitly write each form once, and assume you can apply the commutative law as necessary.

Associative laws:

x(yz) = (xy)z
x + (y + z) = (x + y) + z

Because of associativity, it's generally ok to write the sum of multiple items in a row without putting parens around each pair, as in a + b + c + d, because it actually doesn't matter how it's parenthesised. (Although, for the record, the rule is still that we process left-to-right when nothing takes precedence.)

Double negation:

(x')' = x

That is, not (not x) = x.

Null laws:

0x = 0
1 + x = 1

The first one is familiar, but the second one is different! Remember, though, that + corresponds to or, and 1 corresponds to true.
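A two-line exhaustive check of both null laws, using Python's bitwise operators on 0/1 values:

```python
for x in (0, 1):
    assert (0 & x) == 0   # 0x = 0: and-ing with false is always false
    assert (1 | x) == 1   # 1 + x = 1: or-ing with true is always true
print("null laws verified")
```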

Distributive laws:

a(b + c) = ab + ac
a + bc = (a + b)(a + c)

Again, one of the laws works in a seemingly familiar way: · distributes over + just as multiplication distributes over addition. But in the Boolean algebra, + also distributes over ·.
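Checking both distributive laws means enumerating all eight combinations of three variables, which is a natural fit for itertools.product:

```python
from itertools import product

for a, b, c in product((0, 1), repeat=3):
    assert (a & (b | c)) == ((a & b) | (a & c))   # a(b + c) = ab + ac
    assert (a | (b & c)) == ((a | b) & (a | c))   # a + bc = (a + b)(a + c)
print("both distributive laws verified")
```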

Inverse laws:

xx' = 0
x + x' = 1

For these laws (and all the remaining ones), you need to completely discard your arithmetic intuitions, or they will lead you astray. While the additive inverse (i.e. the negative of a number) is what you add to something to get zero, in the Boolean algebra, the result of "adding" a value to its inverse (i.e. using +) is one. We can make sense of this by remembering the corresponding gate logic and truth table. Whether x is true or false, its negation x' is the other one, so x + x' is essentially asking: is it the case that x is either true or false? Well, yes, it's one of those (even if we're not sure which one). So the overall expression is true. Similarly, the expression xx' must be 0, or false, precisely because something cannot be both true and not true.
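That reasoning can be confirmed by checking both cases directly (the helper name NOT is ours):

```python
def NOT(x): return 1 - x

for x in (0, 1):
    assert (x & NOT(x)) == 0   # xx' = 0: x and its negation are never both true
    assert (x | NOT(x)) == 1   # x + x' = 1: one of x, x' is always true
print("inverse laws verified")
```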

These laws, together with the null and identity laws, are particularly useful for simplifying expressions. If we can rearrange an expression to bring together a value with its inverse—combined with either operation—we can replace that expression with a simple 1 or 0 by applying an inverse law, and then remove it from the expression entirely by applying a null or identity law.

Idempotent laws:

xx = x
x + x = x

Again, these laws seem supremely weird from the context of real-number arithmetic, but they are a direct consequence of the truth tables for and and or. (The more formal proofs of these laws are left as an exercise.)

Absorption laws:

a(a + b) = a
a + ab = a

The name refers to the way that one of the terms "absorbs" the other. In the first of these, if a has to be true (to satisfy the and), then a has to be true and it doesn't matter if b is also true. Similarly, in the second one, if a being true is sufficient (to satisfy the or), then it doesn't matter if b is also true. (The more formal proofs of these laws are left as an exercise.)
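An exhaustive check of both absorption laws, using bitwise operators on 0/1 values:

```python
for a in (0, 1):
    for b in (0, 1):
        assert (a & (a | b)) == a   # a(a + b) = a
        assert (a | (a & b)) == a   # a + ab = a
print("absorption laws verified")
```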

Using the laws of Boolean algebra

Using these rules, we can simplify Boolean expressions while being certain that we don't change their meaning. Here is an example. One way to simplify the expression (a + a')(bb'') is as follows:

(a + a')(bb'') = (a + a')(bb) (Double negation, because b'' = b)
= (a + a')b (Idempotent, because bb = b)
= 1b (Inverse, because a + a' = 1)
= b (Identity, because 1b = b)

So the circuits corresponding to (a + a')(bb'') and b are equivalent! This type of manipulation is used constantly in circuit design to simplify circuits or to show two circuits are equivalent. It is quicker and less tedious than constructing truth tables—but it requires more ingenuity!
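We can also double-check the algebra by brute force; this sketch (the helper name NOT is ours) evaluates the original expression for every input and compares it to b:

```python
def NOT(x): return 1 - x

for a in (0, 1):
    for b in (0, 1):
        original = (a | NOT(a)) & (b & NOT(NOT(b)))   # (a + a')(b b'')
        assert original == b                          # simplifies to just b
print("simplification verified")
```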

Here's another example of simplifying a circuit. Consider again the implication function, whose truth table is

A   B   A implies B
0   0   1
0   1   1
1   0   0
1   1   1

We saw before that it could be computed with a circuit corresponding to the gate-logic expression

(not A and not B) or (not A and B) or (A and B)

In Boolean algebra notation, we would write that same expression as

Version 1: A'B' + A'B + AB

We'll claim that this expression (and hence the corresponding circuit) is functionally equivalent to

Version 2: A'B' + B

and even to

Version 3: A' + B
Our proof that this is true involves starting with one version, and then applying boolean algebra laws—each of which is truth-preserving—to yield other, functionally equivalent, expressions.

Proof that versions 1 and 2 are equivalent:

A'B' + A'B + AB = A'B' + (A' + A)B (Distributive)
= A'B' + 1B (Inverse)
= A'B' + B (Identity)

Proof that versions 2 and 3 are equivalent:

A'B' + B = A'B' + A'B + B (Absorption)
= A'(B' + B) + B (Distributive)
= A'·1 + B (Inverse)
= A' + B (Identity)

Therefore, all three circuits are equivalent. Note that in the second proof, the use of absorption is a little tricky: rather than using it to directly remove terms to simplify the expression, we use it to (temporarily) add a term, one that doesn't change the truth value but can be combined with the first term so that the Distributive Law can apply.
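The three-way equivalence can be confirmed by evaluating all three versions on every input (the helper name NOT is ours):

```python
def NOT(x): return 1 - x

for A in (0, 1):
    for B in (0, 1):
        v1 = (NOT(A) & NOT(B)) | (NOT(A) & B) | (A & B)   # Version 1
        v2 = (NOT(A) & NOT(B)) | B                        # Version 2
        v3 = NOT(A) | B                                   # Version 3
        assert v1 == v2 == v3   # all three compute the implication function
print("all three versions agree")
```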

Exercises

In exercises 1 through 8, add parentheses to each expression to clearly indicate the order of operations in the expression, and add an explicit · (dot) where appropriate to indicate where a Boolean product is being computed.

  1. x' + y
  2. x + y'
  3. x' + y'
  4. (x + y)'
  5. x + yz
  6. x + (yz)'
  7. (x + y)z
  8. (x + yz)'

In exercises 9 through 14, prove the law using truth tables.

  9. (x')' = x
  10. (xy)' = x' + y'
  11. x + x = x
  12. xx = x
  13. x(x + y) = x
  14. x + xy = x

In exercises 15 through 20, use laws of Boolean algebra to show the expressions are equivalent. You can refer to the Laws of Boolean Algebra page for a quick reference to all the laws covered in this article.

  15. x + ww' + yz; and (x + y)(x + z)
  16. p + pq + pr + p's; and p + s
  17. (a + b + a' + b')'; and 0
  18. (a + b)(a' + b'); and a'b + ab'
  19. ab + a'b'; and ((a + b)(ab)')'
  20. abc + ab'c; and ac

Credits and licensing

This article is by Robert P. Webber and Don Blaheta, licensed under a Creative Commons BY-SA 3.0 license.

Version 2015-Oct-27 19:00