Historical Engineers: George Boole, Pioneer of Algebraic Logic
February 01, 2020 by Tyler Charboneau
Boole’s work redefined 19th-century mathematics and would revolutionize computer programming over a century later.
A mathematician and the founder of Boolean logic, George Boole advanced existing algebraic thought while refining longstanding Aristotelian methods. While Boole himself wasn't an engineer, his work redefined 19th-century mathematics and would revolutionize computer programming over a century later—achievements that integrally affect the work of EEs.
George Boole. Image used courtesy of Wikimedia Commons (public domain)
Boole’s methodology thrived in a world gripped by symbolic algebra’s popularity. His methods unlocked immense mathematical potential, spanning differential equations and operator calculus. Ahead of his time, George Boole helped unite a divided academic landscape during his career.
Today, Boole's work, particularly Boolean algebra, is considered a key pillar of electrical engineering.
Early Life and Education
Born to a working-class family in Lincolnshire, England, Boole inherited his zeal for learning from his father—a devotee of science and technology. From a young age, Boole taught himself advanced mathematics and languages.
Boole's early education and career included working in private schools, founding his own school in Lincoln, Lincolnshire, and accepting a professorship at Queen’s College, Cork, in Ireland.
Despite his daily responsibilities, he always made it a point to study differential equations, calculus, and related methods drawn from the works of Laplace and Lagrange.
Boole also invested himself in the texts of Isaac Newton and habitually read select writings at the Lincoln Mechanics’ Institute.
Early Academic and Professional Achievements
As a budding logician, George Boole contributed a host of papers to the academic community. Some of his most notable works penned between 1839 and 1847 included:
- Researches on the Theory of Analytical Transformations (1839)
- A paper on marrying algebra and calculus, as seen in the Philosophical Transactions of the Royal Society (1844)
- The pamphlet The Mathematical Analysis of Logic (1847)
Engineers are familiar with computer architectures built on ordered sequences of Boolean values ("bits"), typically 32 or 64 at a time. Image used courtesy of pmphoto
The Royal Society awarded Boole their first gold medal for mathematics for his work on differential equations that, according to the Stanford Encyclopedia of Philosophy, "combined exponential substitution and variation of parameters with the separation of symbols method."
This interconnectedness between disciplines legitimized his logical methods.
Boole published a major summary of his ideas in 1854, titled An Investigation of the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities.
In it, he detailed his theories on logical inference and summarized general methods of probability and how probabilities are logically linked.
From 1855 to 1864, Boole published 17 papers and two books, all centered on mathematics. His renown swelled further with his 1857 election as a Fellow of the Royal Society.
The twilight of Boole’s career was unequivocally impactful. He published his Treatise on Differential Equations in 1859, followed by the Treatise on the Calculus of Finite Differences in 1860.
Meanwhile, his works on logic—the 1847 pamphlet and the Laws of Thought—formed the bedrock of what is now called “Boolean logic.” These works introduced and expanded concepts like class symbols, elective symbols, the Index Law, categorical propositions, and syllogisms. Boole’s works included manipulations of Aristotelian arguments and encompassed his General Method.
His General Method included eight distinct steps involving algebraic manipulation. These focus on translating propositions into equations—converting names into algebraic terms:
A naming-conversion table from Boole’s General Method, step one. Image used courtesy of Stanford Encyclopedia of Philosophy
Enduring Legacies and Impact on Engineering
Perhaps the furthest-reaching result of Boole’s research is his “Rule of 0 and 1.” Under this rule, every logical variable takes exactly one of two values, 0 or 1, corresponding to false and true.
This binary logic also relies on three operators: AND (conjunction), OR (disjunction), and NOT (negation).
Venn diagrams for conjunction, disjunction, and complement. Image used courtesy of Tilman Piesk (public domain)
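The Rule of 0 and 1 and the three operators above can be sketched in a few lines of code. Below is a minimal illustration in Python (the function names and truth-table layout are our own, not Boole’s notation):

```python
# Boolean variables take only the values 0 (false) and 1 (true),
# combined with Boole's three operators.
def AND(a, b):
    return a & b      # conjunction: 1 only if both inputs are 1

def OR(a, b):
    return a | b      # disjunction: 1 if at least one input is 1

def NOT(a):
    return 1 - a      # negation: complement within {0, 1}

# Print the full truth table for the four input combinations.
print(" a  b | AND OR | NOT a")
for a in (0, 1):
    for b in (0, 1):
        print(f" {a}  {b} |  {AND(a, b)}  {OR(a, b)} |   {NOT(a)}")
```

Every compound logical statement in a digital system reduces to combinations of these three operations.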
Such rules have guided the development of computer logic and digital electronics. All modern programming languages rely on Boolean logic. They’re also crucial components of set theory and statistics.
When Boole developed these rules, he couldn’t have envisioned how crucial they would become to our modern devices.
George Boole died of pneumonia in 1864 at the age of 49. Fortunately, he had effectively passed the mathematical torch to a growing number of academics who have since refined his findings.
Most notably, when Claude Shannon, the "father of information theory," applied Boolean algebra to switching circuits, he founded switching algebra—an algebraic means to analyze and design circuits in terms of logic gates.
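Shannon’s insight was that Boolean expressions describe switching circuits directly, so arithmetic can be built from logic gates. As a hedged sketch (the half-adder shown is a standard textbook circuit, not drawn from Shannon’s original paper), here is a one-bit adder expressed in Python’s bitwise operators:

```python
# A half adder: the simplest arithmetic circuit built from logic gates.
# The sum bit is XOR (exclusive OR); the carry bit is AND.
def half_adder(a, b):
    """Return (sum, carry) for one-bit inputs a and b."""
    s = a ^ b        # XOR gate: 1 when exactly one input is 1
    carry = a & b    # AND gate: 1 only when both inputs are 1
    return s, carry

# Exercise all four input combinations.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> sum={s}, carry={c}")
```

Chaining such gate-level building blocks yields full adders, registers, and ultimately the arithmetic units inside every processor.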
In this way, Boole’s research has formed the foundation upon which other mathematicians and engineers have innovated modern circuit design.
Do you consider Boole's research foundational in your work? In what ways?
Brush Up on the Boolean Basics
Do you have suggestions for historical engineers who have impacted modern circuit design? Drop your suggestions in the comments below.