## (2,3): Archaic Laws, Forgotten Names

### Or: The History of Logic

---

> _When I was a student, even the topologists regarded mathematical logicians as living in outer space. Today the connections between logic and computers are a matter of engineering practice at every level of computer organization._
>
> -Martin Davis, Influences of Logic in Computer Science

---

0: Ok so naturally, the history of logic starts...

![[aristotle-organon.jpg|300]]

0: With Aristotle in the ORGA---

1: NO. No way we're starting with Aristotle.

0: Just teasing.

1: Promise?

0: Promise. We wouldn't benefit much from going back that far. But that's always the risk with any proper history. In academic histories -- unlike bibles -- it's extremely hard to know where one should start. Beginnings are tricky, and deciding when they are is a hard problem. But that problem is now our problem, seeing as we've chosen _not_ to do the hard work of compiling a proper creation story, and instead chosen to be lazy by doing the much easier work of covering the History of Logic in the standard non-bible'ed academic format.

1: The _second_ of those two things is the "lazy" choice in your mind?

0: Absolutely!

1: I'm gonna need some convincing.

0: Ok well, in what follows, we're gonna see a very unusual field.

1: Kindergarten?

0: Super advanced kindergarten! See, in this era of early prehistory, the study of foundations seemed a bit childish... at best... or mentally ill, at worst.

1: How do you figure that?

0: Are you kidding? Back then, nobody knew that formal theories of logic and arithmetic would eventually turn out to be all that's needed to build a machine that can emulate the logic of any other field. So what would you think if one of your colleagues, or even worse, one of your friends or family, suddenly showed up trying to study these things? Especially if you opened their notebooks and found them full of strange symbolic notations, instead of something more respectable, like words.

1: I don't believe it was that bad.

0: Ok then, put yourself in their shoes.

1: Whose shoes?

0: Imagine you're a serious adult mathematician, back in the 1600s, 1700s, 1800s, whenever. You're a highly esteemed professor in the mathematics or physics or philosophy department at the University of Serious Adult Academics who study serious adult things.

1: Such as...

0: Well, your colleagues have various areas of expertise. One of them is doing some work on Newton's method of fluxions and fluents.

1: What's a fluxio---

0: That's what Newton called the derivative back when he invented calculus.

1: Weird. Why did---

0: Others of your colleagues might be studying other serious adult things, like the insolubility of the quintic by radicals, or Riemann's new prime-counting function and how it relates to his theory of manifolds.

1: What era was this?

0: All of them. The point is that in most any era of the 1600s or 1700s or 1800s, when serious adult mathematicians at serious adult universities looked around at their colleagues, they saw other serious adults doing serious adult things.

1: Where are you going with this?

0: Well, one day, one of your colleagues -- who you've heard is very intelligent and capable -- submits a paper, or gives a talk on... oh dear. You haven't the faintest idea what it is. It looks like the ramblings of a mental patient, you whisper to your other colleagues, as the gossip spreads. As you probe your clearly unwell colleague for more details, you find nothing that looks familiar to you or any of your academic colleagues.
Instead, you find notebooks full of pages and pages of strange symbols, symbols that aren't standard in any established area of mathematics. And worst of all, the stated goal of all this nonsense research appears to be some kind of schizophrenic desire to both:

1. "Formalize thought" with a "symbolic calculus of all mental procedures," surely too ambitious for any sensible research program by several orders of magnitude.
2. Then, in the same breath, focus their technical work (so your colleague claims) mostly on "Elementary arithmetic," attempting to secure the foundations of rationality itself by studying not the esteemed field of number theory but the basic addition and multiplication of whole numbers, using a language they've invented themselves, in which they've proved a series of theorems that are of precisely no use to anyone, since they contain no new results and no one can read them anyway.

0: I mean, it's almost as bad as if you came to study with a teacher you'd long admired from a distance, and found that instead of studying modern computing, this once-great teacher had lost his mind and was now talking nonstop about bibles and asking you to help him write one!

1: I can't imagine what that's like.

0: But that's the backdrop against which the remainder of this file occurs. Logic has one of the strangest histories of any field. The people involved didn't build on each other's work for most of the field's history, and until somewhere between 1900 and 1930, "doing logic" meant constructing an entirely new system to capture "all mental activity" or "all reasoning" or at least a large part thereof -- from scratch -- using new symbols and new notations no one had ever seen before. In short, in these situations, it's important to be...

1: What?

0: _As Formal As Possible!_

1: Why?

0: Because otherwise we might appear silly.

1: Excu---

_(Narrator: 0 clears 0's throat.)_

0: _(ahem)_

---

ABSTRACT

In this file, we shall demonstrate, by way of example, the necessity of the creation story as a genre for properly telling a certain class of histories of a certain class of fields, including but not limited to the history of computing. The function of a creation myth is twofold: a creation myth should be designed to be as true as possible, while at the same time allowing an effectively infinite number of historical precursors to be treated as rounding errors or archaeology. To see the need for a creation story when it comes to the history of computing, we'll begin by demonstrating what would happen to the narrative of the current book if we tried to do without one. We'll see that this attempt leads to an overemphasis on a wacky grab bag of partially cooked ideas and bizarre notations, some of which were legitimately brilliant and ahead of their time, but none of which are relevant enough to the history of computing to be worth mentioning In The Beginning.

---

1: Is there a point to all thi---

0: Please, don't interrupt the abstract.

1: _(Rolls eyes)_

---

0: The Creation Story data type solves this problem. Properly executed, the Creation Story is a pragmatic truncation of history at a finite time in the past, before which all events are declared not to be relevant. This is, of course, in any literal sense, untrue, in no small part because the motivation for creating anything (whether a universe or a book) has, as its cause, a set of events that came before it. Nevertheless, the Creation Story is a pragmatic necessity.
If history is a Taylor series, the Creation Story is its truncation to finitely many terms. The only alternative is the sort of history that the author of a (very good) book like "Quantum Computing Since Democritus" pokes fun at in its title: the kind in which a field with a substantial but non-infinite history (e.g. quantum computing) is traced back to the first human in recorded history who said something roughly like "The world is made of things, not stuff."

---

1: Things not stuff? I thought this was you trying to act formal.

0: The distinction between things and stuff is formal at the linguistic level, as "thing" is a count noun and "stuff" is a mass noun, and as such we are confident that the attentive reader is capable of recognizing that the phrase "things, not stuff" is isomorphic to the phrase "discrete, not continuous."

1: _(Eyes roll harder)_

---

0: Therefore, in this file, we will sketch the prehistory of the field that became computing, to make the case that a proper Creation Story is the right solution to the problems at hand. What follows is a brief prehistory of the field, which will be rounded to zero and treated as nonexistent in the creation story that follows.

---

1: Are you done?

0: Yes. At least with the abstract.

1: What was the point of that?

0: My point is twofold...

1: Don't start talking like that again.

0: Both to outline the purpose of the historical review below, and to make a broader point about the history of logic in general.

1: Which is?

0: It's damn hard to get started.

1: Clearly. Are you going to start soon?

0: I mean logic faces an infinitely harder "premathematics" problem than any other part of mathematics.

1: What's premathematics again?

0: Everything that doesn't show up in the books. Everything you do _before_ you figure out the right definitions, the ones that lead to the theory you end up developing -- and how you ended up at the definitions you eventually chose rather than all the others. Logic has a harder premathematics problem than any other field. When you create a notation for geometry, number theory, or calculus, you only need it to be powerful enough to talk about those areas. When creating a notation for logic, however, you need to decide on a notation and a set of basic definitions for the formalization of potentially _all of thought._ Or, if you choose to focus only on a subset of thought, then you've got an equally hard problem of deciding up front which parts to include and which to exclude. You see this difficulty throughout the history of logic, especially in the history of attempts to develop a notation for it, where "it" is "the thing that wasn't yet, but eventually became, logic." That's why it shouldn't be surprising in retrospect to see how long it took for formal logic to get started as a field. Logic was an extremely latecomer of a field, notwithstanding Aristotle or semi-formal systems like Euclid's, which embedded some reasoning about a specific content domain. The history of mathematics is littered with the bones of all sorts of mathematicians who tried and failed to develop a general system for formalizing thought.

1: For example?

## The Real History of Logic

### Or: Painful Irrelevant Historical Trash
### Or: Symbolic Horse Giver Description Language
### Or: Don't describe a Gift Horse with the mouth
### Or: Reductio Ad Confederacism
### Or: Ad hoc laws for f\*cking everything
### Or: Leviticus (again)

1: Gotta admit, after that section heading I'm curious to see what's coming.
0: Well get ready, 1, it's time for _Serious Adult History!_

_(Narrator: Lights flash and music plays for like two seconds before 0 proceeds.)_

0: The following is from an 820-page book on the history of mathematical notations.

1: Oh no.

![[history-of-logic-notation-01.png|400]]

0: It starts with 400 pages about arithmetic.

1: WE'RE SKIPPING THAT.

0: Of course. We're here to cover the real-life, no-nonsense, non-narrative History of Logic, just like you asked for. We'll be using some other books too, not just that one. Lots of content!

1: Are you just gonna torture me until I beg for a creation story instead?

0: No! This'll actually be fun. Check it out.

## Pre-Pre-History

### Pre-Pre-Pre-History

![[history-of-logic-notation-02.png]]
![[history-of-logic-notation-03.png]]
![[history-of-logic-notation-03-2.png]]

### Leibniz

> _It is unworthy of excellent men to lose hours like slaves in the labour of calculation which could safely be relegated to anyone else if machines were used._
>
> - Leibniz, Machina Arithmetica, 1685.

0: Ok so Leibniz was an absolute badass. Invented calculus, but thought most math was kinda "meh." But he desperately wanted to build a universal system for reasoning and computing. Dude was literally the most "Foundational" a Foundational person could possibly be.

> _For if praise is given to the men who have determined the number of regular solids - which is of no use, except insofar as it is pleasant to contemplate - and if it is thought to be an exercise worthy of a mathematical genius to have brought to light the more elegant properties of a conchoid or cissoid, or some other figure which rarely has any use, how much better will it be to bring under mathematical laws human reasoning, which is the most excellent and useful thing we have._
>
> -Leibniz

0: He also invented binary. But we're getting ahead of ourselves. Here's him saying "Most math is meh, let's formalize thought, that's way more important than your lame puzzles guys, seriously." Take it away, Leibniz.

- Leibniz's "something like primes" idea, where he tries to represent atomic concepts by prime numbers, composite concepts by composite numbers, and universal quantification by divisibility (a sketch of the idea follows the figures below).
- Leibniz then goes on to use Cartesian-style notation to invent something vaguely like Boolean algebra.
- These attempts look like the behavior of a distinguished professor suggesting ideas to one of their grad students, hoping the grad student's youthful energy and need for publications will drive them to explore the ideas technically. The only problem was that he didn't have anyone like that to talk to. Include his comments about "They have given it no more attention than if I had related a dream."

![[history-of-logic-notation-04.png]]
![[history-of-logic-notation-05.png]]
![[history-of-logic-notation-06.png]]
![[history-of-logic-notation-06-2.png]]
![[history-of-logic-notation-06-3.png]]
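Since that first bullet describes an actual encoding scheme, here's a tiny sketch of it in Python. To be clear, the concept names, the particular primes, and the function names below are ours, invented for illustration; only the core trick (atomic concepts as primes, composite concepts as products of their parts, "every A is a B" as divisibility) is the one attributed to Leibniz above.

```python
# A toy sketch of Leibniz's "characteristic numbers" idea (our reconstruction,
# not his notation): atomic concepts get primes, composite concepts get the
# product of their parts, and "every A is a B" becomes a divisibility check.

ATOMS = {"animal": 2, "rational": 3, "winged": 5}   # primes for atomic concepts

def concept(*parts):
    """A composite concept is the product of its atomic concepts' primes."""
    n = 1
    for p in parts:
        n *= ATOMS[p]
    return n

def every(a, b):
    """'Every A is a B' holds when B's number divides A's number."""
    return a % b == 0

man  = concept("rational", "animal")   # 3 * 2 = 6
bird = concept("winged", "animal")     # 5 * 2 = 10

print(every(man, concept("animal")))     # True: every man is an animal
print(every(concept("animal"), man))     # False: not every animal is a man
print(every(bird, concept("rational")))  # False: birds aren't "rational" here
```

The scheme gets into trouble fast once negation and "some A is B" statements enter the picture, but it makes the bullet tangible: in this corner of prehistory, "formalizing thought" literally means doing arithmetic on whole numbers.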
## Lambert

In Lambert's notation we see a formal-ish language with the seeds of:

- constants
- variables
- types
- universal and (possibly?) existential quantification, called universality and particularity.

This very clearly mirrors the definition of modern formal languages as found in (for example) Mendelson or Kleene.

![[history-of-logic-notation-07-1.png]]

Lambert's notation also contains the seeds of ideas like set-theoretic complement. Lambert's > and < are a bit like universal and existential quantifiers, but he also introduces mA to mean "some A" and nA to mean (I think) "all A," and then _divides by the quantifiers!_ The richness of ideas you encounter when you see mathematicians doing proper premathematics is incredible, and it often has a flavor a lot like programming: definitions are the beginning of our implementations, and as developers we face the need to create our own definitions, or to weigh two competing definitions against each other, much more often than mathematicians do.

Lambert's "Fire is to Heat as Cause is to Effect" is an early example of embedding analogical reasoning in algebraic notation. Quadrangles lol.

Holland's objection to Lambert's notation makes a great point: a good notation should support the implicit affordances suggested by its structure. If you have a thing that looks like a fraction, you should expect users to assume that they can clear terms from the denominator by something like multiplication. If they can't, the fraction notation is failing to support its Implicit Affordances. (Flesh this out. This needs to be a principle.)

![[history-of-logic-notation-07-2.png]]

## Castillion, Gergonne, Bolyai, Bentham

In Castillion, we see "genus" and "species" concepts that show the seeds of the concepts of types and values.

In Gergonne we see the first use of the backwards C notation for "is contained in."

In Bolyai we see something like equality vs. isomorphism, and also subset and superset. How his A(=)B differs from his "equal with respect to content" is unclear, but having three ideas here instead of two almost suggests a distinction that's more like "is" vs `__eq__` in Python, plus an additional concept of isomorphism.

Bentham's notation covers equality of wholes or parts.

![[history-of-logic-notation-08-1.png]]
![[history-of-logic-notation-08-2.png]]
![[history-of-logic-notation-09-1.png]]

## DeMorgan

- DeMorgan uses parentheses, dot, colon, and juxtaposition to stand for various Boolean-algebra-style relations.
- DeMorgan uses CASE for negation, a behavior we also see mirrored in modern regular expressions (though obviously not by direct inheritance of the idea).
- DeMorgan also uses inverse notation for the converse of a statement.
- A great DeMorgan quote: "First, logic is the only science which has made no progress since the revival of letters; Second, logic is the only science that has produced no growth of symbols."
- At this point we're in the mid-1800s, and we arrive at Boole, when things start to look more modern, though still very different in notation.

![[history-of-logic-notation-10-1.png]]
![[history-of-logic-notation-10-2.png]]
![[history-of-logic-notation-10-3.png]]

## Boole

0: In programming, when we use booleans, do you ever wonder "Why the last name?"

1: Not following.

0: Like usually we use last names before we get to know someone. But once we're on familiar terms, we switch to the first name. So at this point it should be the first.

1: Still not following.

0: This section's about George, of the Boole family, father of Booleans.

1: IT'S A GUY?!

0: Of course! Now I agree it's a bit silly to name the two basic truth values after "some guy." But it's a choice we're probably stuck with. And because of that, I'm a firm believer that the type of `True` and `False` should be `George`.

1: Like `int x; george bool;`?

0: Right, but remember to initialize your `george` before you use him or else he'll have undefined behavior.

1: Is history always this absurd?

0: Yes.

- Boole is more systematic than anyone before him. He uses the + - \* and / of ordinary arithmetic but gives them new meanings.
- Boole's choice of words makes it very clear that he's attempting to formalize thought itself, not some particular content domain.
- He shows that if we make the analogy of "or" with plus and "and" with times, then the distributive laws of algebra hold for concepts. Just noticing this seemingly trivial thing is a big step in the history of logic! (A small sketch of these laws follows the figures below.)
- The simple example of z(x+y) = zx + zy (where x is men, y is women, and z is European) is HUGE in that it shows the analogy of plus and times with "or" and "and" extends in at least some cases across linguistic categories! The z here is an adjective while the x and y are nouns, and the equality goes through just fine. In a sense that's obvious, because "European" here is just the noun phrase "European person," but at first glance it's a huge step toward the goal of formalizing the laws of thought.
- He then says that double application of adjectives in his system is the same as single application, which seems to be the thing that leads to the idea that the variables in the system should only take the values 0 and 1, aka "Booleans."
- "The equation x(1-x)=0 represents the principle of contradiction."

![[history-of-logic-notation-11-1.png]]
![[history-of-logic-notation-11-2.png]]
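To make those laws concrete, here's a minimal sketch, in Python rather than Boole's own notation, treating a concept as a set of individuals over a small made-up universe: "and" is intersection (Boole's multiplication), "or" is union (reading Boole's + the way later logicians cleaned it up), and 1 - x is the complement. The universe and the names in it are ours, chosen to mirror the men/women/European example above.

```python
# A minimal sketch (ours, not Boole's notation) of the "classes as algebra" idea:
# a concept is a set of individuals, "and" is intersection (Boole's xy),
# "or" is union (Boole's x + y, loosely), and 1 - x is the complement
# relative to the universe of discourse.

UNIVERSE = {"ann", "bob", "chen", "dede"}

men      = {"bob", "chen"}
women    = {"ann", "dede"}
european = {"ann", "bob"}          # the adjective, read as "European person"

def AND(x, y): return x & y        # Boole's xy
def OR(x, y):  return x | y        # Boole's x + y
def NOT(x):    return UNIVERSE - x # Boole's 1 - x

# z(x + y) = zx + zy : "European (men or women)" = "European men or European women"
assert AND(european, OR(men, women)) == OR(AND(european, men), AND(european, women))

# x * x = x : applying an adjective twice adds nothing ("European European" = "European")
assert AND(european, european) == european

# x(1 - x) = 0 : nothing is both European and not European (principle of contradiction)
assert AND(european, NOT(european)) == set()

print("Boole's laws check out on this toy universe.")
```

The idempotence check is exactly where 0 and 1 enter: the only ordinary numbers satisfying x·x = x are 0 and 1, which is the observation behind the "double application of adjectives" bullet, and the reason the values ended up being called Booleans.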
## Why was this so hard?

- MacFarlane, 1879: "The reason why Formal Logic has for so long been unable to cope with the subtlety of nature is that too much attention has been given to _pictorial notations._ Arithmetic could never be developed by means of the Roman system of notations; and Formal Logic cannot be developed so long as Barbara is represented by _(holy shit, insert picture, wtf)_... We cannot manipulate data so crudely expressed; because the nature of the symbols has not been investigated, and laws of manipulation derived from their general properties."

![[history-of-logic-notation-12-1.png]]

## Peirce

0: Check out this quote. It's one of my favorites ever.

![[history-of-logic-peirce-01.jpg]]

1: I'm having trouble understanding what he's saying.

0: Let's read it slow. I'll translate.

> Much of my work will never be published.
>
> If I can, before I die, get so much of my stuff out into the world that folks will have trouble finding it all or reading it all, then I'll feel comfortable that I've written enough, and I won't blame myself for not doing more.

1: This just sounds like a guy wanting to be remembered.

0: Keep reading. It's coming up.

> I can't stand publishing stuff, but it isn't because people haven't been asking me to write more.
>
> It's because some ideas can only be properly conveyed in dialogue. I need someone to push back, interrupt me, ask questions. In dialogue, I can explain.
>
> But we aren't allowed to write like that in serious adult publications. So fuck em, I'd rather leave my stuff unpublished than write their way.

1: Aahahah.

0: Right?

1: That's a pretty loose translation, but I see it now.

0: Love that quote.

1: I always felt like I love ideas and hate textbooks.

0: Same.

1: Ooh, we should write a book with dialogues!

0: Let's not get distracted. We have work to do.

- Peirce has a totally wacky notation for e and pi, and ends up developing a "Logic of Relatives."
- Peirce: absolute terms are in Roman font, relative terms are in italic, and conjugate terms are in a typeface called Madisonian (seeing a spiritual ancestor of APL and K here).
- Bertrand Russell said of Peirce that "he was one of the most original minds of the later nineteenth century and certainly the greatest American thinker ever."

> "The contributions of C. S. Peirce to symbolic logic are more numerous and varied than those of any other writer—at least in the nineteenth century." For Peirce, logic also encompassed much of what is now called epistemology and the philosophy of science. He saw logic as the formal branch of semiotics or study of signs, of which he is a founder, which foreshadowed the debate among logical positivists and proponents of philosophy of language that dominated 20th-century Western philosophy. Peirce's study of signs also included a tripartite theory of predication.
>
> -The Dynamic Read-Writable Free Encyclopedic Repository of the Modern State of Human Knowledge

> Additionally, he defined the concept of abductive reasoning, as well as rigorously formulating mathematical induction and deductive reasoning. He was one of the founders of statistics. As early as 1886, he saw that logical operations could be carried out by electrical switching circuits. The same idea was used decades later to produce digital computers.
>
> -The Dynamic Read-Writable Free Encyclopedic Repository of the Modern State of Human Knowledge

- Peirce: Include the bit from his Wikipedia page about the "negroes syllogism" that shows the confederate context he came from.
- Need the "giver of a horse to a lover of a woman" example. It's too wacky and bizarre not to include.

![[history-of-logic-notation-12-2.png]]
![[history-of-logic-notation-13-1.png]]
![[history-of-logic-notation-13-2.png]]
![[history-of-logic-notation-13-3.png]]
![[history-of-logic-notation-13-4.png]]
![[history-of-logic-peirce-03.jpg]]
![[history-of-logic-peirce-02.jpg]]
![[history-of-logic-peirce-04.jpg]]
![[history-of-logic-peirce-05.jpg]]

## Grassmann

![[history-of-logic-notation-14-1.png]]
![[history-of-logic-notation-14-2.png]]

1: What's Op. cit.?

![[history-of-logic-notation-15-1.png]]
![[history-of-logic-notation-15-2.png]]

> _In symbolic logic we have elaborated an instrument and nothing for it to do._
>
> - Schröder, an early logician

goto: [[lost+found/2/4]]