# Which computer accepts data in binary form?

## Too stupid to count to three

### Thanks to electricity and Boolean algebra, computers nevertheless perform at the top of their form

Computers can literally only count to two, a feat that would be considered an embarrassment in a human. Their arithmetic skills are modest as well: they confine themselves to adding. All other arithmetic operations, such as subtraction, multiplication and division, a computer carries out on the basis of addition.
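How subtraction and multiplication can be reduced to nothing but addition is easy to sketch in a few lines of Python. The function names are illustrative, not from the article, and real arithmetic units are wired rather than programmed; the principle, however, is the same: multiplication becomes repeated addition, and subtraction becomes addition of a so-called two's complement within a fixed number of bits.

```python
def multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers by repeated addition."""
    result = 0
    for _ in range(b):
        result = result + a  # nothing but addition
    return result

def subtract(a: int, b: int, bits: int = 8) -> int:
    """Subtract within a fixed bit width by ADDING the two's complement of b."""
    mask = (1 << bits) - 1            # e.g. 0b11111111 for 8 bits
    complement = (~b + 1) & mask      # two's complement of b
    return (a + complement) & mask    # a - b, computed purely by addition

print(multiply(6, 7))   # 42
print(subtract(10, 3))  # 7
```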

That a computer can only count to two is because it knows only the two states "power on" and "power off". Everything the computer conjures up on the screen ultimately rests on converting these two switching states into the digits 1 and 0: arithmetic operations, texts, images, sounds and videos alike.

By means of very simply structured circuits, the computer turns these two possible states into the logical functions AND, OR and NOT. The "electronic brains", as the first computers were reverently called, therefore resemble complicated light switches more than a human brain.

The logical AND operation is realized by two switches connected in series, that is, one behind the other.

The logical OR operation is generated just as simply by connecting two switches in parallel.

The NOT circuit is also very simple: it consists of a switch that opens a contact when current flows, thereby interrupting a secondary circuit. The normal switching action is thus reversed, as with a relay that has a normally closed contact.
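The three switch circuits can be modelled in a few lines of Python. This is only a sketch of the idea, with illustrative names: `True` stands for "power on", `False` for "power off", and each function describes whether current flows through the circuit.

```python
def series(switch_a: bool, switch_b: bool) -> bool:
    """Two switches in series: current flows only if BOTH are closed (AND)."""
    return switch_a and switch_b

def parallel(switch_a: bool, switch_b: bool) -> bool:
    """Two switches in parallel: current flows if EITHER branch is closed (OR)."""
    return switch_a or switch_b

def normally_closed(switch: bool) -> bool:
    """Current through the coil opens the contact and cuts the secondary circuit (NOT)."""
    return not switch
```

Writing out all input combinations reproduces exactly the familiar truth tables of AND, OR and NOT.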

These three circuits correspond to the "conjunction", the "disjunction" and the "negation" as developed by the English mathematician George Boole more than a hundred years earlier in his algebra of logic. Boolean algebra is the theoretical foundation of all electronic computers. It knows only the values "true" and "false", for which "power on" and "power off" can equally stand. Using Boolean algebra, even the most complex logical operations can be translated into the language of the computer; they merely have to be broken down into a corresponding number of individual steps, which the computer then works through one after the other.
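The breaking-down of a complex operation into AND, OR and NOT steps can be sketched with a one-bit adder, the building block of all computer arithmetic. The construction below is a standard one, not taken from the article, and uses only the three primitives:

```python
def AND(a: bool, b: bool) -> bool: return a and b
def OR(a: bool, b: bool) -> bool:  return a or b
def NOT(a: bool) -> bool:          return not a

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """Add two bits, returning (sum, carry), built only from AND, OR, NOT."""
    carry = AND(a, b)
    # The sum bit is an exclusive OR, expressed via the three primitives:
    # (a OR b) AND NOT (a AND b)
    total = AND(OR(a, b), NOT(AND(a, b)))
    return total, carry
```

Chaining such adders bit by bit yields addition of arbitrarily long binary numbers, and, as described above, the rest of arithmetic follows from addition.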

### Transistors replaced electromechanical relays and tubes

In the very first computers, "power on" actually still meant that an electromagnetically actuated switch (a relay) closed or opened a contact, so the machines clicked and clacked audibly while calculating. Computers only became quieter with the introduction of electronic switching elements: at first these were tubes, in which the voltage applied to a "grid" blocked or released the flow of current between cathode and anode. But the tube computers were still huge machines with comparatively little computing power. When IBM presented its first computer in 1948, it contained 12,500 tubes and 21,400 relays. And every few minutes one of the countless switching elements gave up the ghost ...

The global triumph of the electronic computer could therefore only begin once the tubes had been replaced by the newly invented transistors: these electronic switching and amplifying elements consume almost no electricity and are extremely reliable. Above all, they can be packed in enormous numbers onto a semiconductor chip the size of a fingernail.

Of course, simply combining the three basic logic circuits appropriately and placing them on a chip in large numbers does not yet give you a computer. What you get is a hardwired logic circuit, referred to as transistor-transistor logic (TTL). TTL circuits are found, for example, in the safety-critical areas of German nuclear power plants. Since such circuits contain no date functions, they are immune to the Millennium Bug.

In a computer, the hardwired logic (the hardware) consists essentially of four subsystems: the control unit, the arithmetic unit, the main memory, and the input/output control. Together these make up the central processing unit (CPU). In addition there is a non-wired logic that controls the hardware in various ways and is referred to as software (the operating system and the user programs). This software makes the hardwired logic of the computer dance, so to speak. A quartz crystal sets the beat, oscillating many millions of times per second; with each cycle, the hardware performs one further step prescribed by the program.

### The binary notation of numbers has been known for centuries

As is well known, the trouble with the Year 2000 problem stemmed from the fact that programmers wanted to save two decimal places in the year. But that is only the superficial view. In reality the computer does not calculate with decimal numbers at all. This, too, is due to its circuit logic, which knows only the two states "power on" and "power off". A decimal number can only be read and processed by the computer if it is presented in binary, i.e. two-valued, form.

Fortunately, the philosopher and mathematician Leibniz had discovered a method as early as 1679 for representing any decimal number using only the digits 1 and 0. For the time being this notation had no practical use; for Leibniz it symbolized God and nothingness.

This binary notation of numbers is known as the dual system. Just as in the decimal system each digit further to the left stands for a higher power of ten (1, 10, 100, 1000, and so on), in the dual system each digit further to the left stands for a higher power of two (1, 2, 4, 8, 16, 32, and so on). To avoid confusion with decimal numbers, an L is often written instead of the digit 1.

In concrete terms, the decimal number two becomes 10 in dual notation (or L0, to avoid confusion with the decimal number ten). The decimal number ten becomes 1010 in dual notation, and one hundred becomes 1100100.
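The conversion into dual notation can be sketched in a few lines of Python (the function name is illustrative): divide repeatedly by two and collect the remainders, which are the dual digits read from right to left.

```python
def to_dual(n: int) -> str:
    """Convert a non-negative decimal number to dual (binary) notation
    by repeated division by 2."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # the remainder is the next dual digit
        n //= 2                    # continue with the halved number
    return "".join(reversed(digits))

print(to_dual(2))    # 10
print(to_dual(10))   # 1010
print(to_dual(100))  # 1100100
```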

As you can see, binary numbers are a good deal longer than decimal numbers. If you write the year 1999 as a binary number in a computer-friendly way, you need eleven digits (11111001111). If you shorten it to 99, seven digits (1100011) suffice. By cutting the year in the date by two decimal places, the programmers thus saved four digits in the dual system.

In practice the savings were even greater. In computer technology, eight bits (the smallest units of information, corresponding to the 1 or 0 of the dual system) are packed together into a "byte". The eight bits of a byte allow 2 to the power of 8 different combinations. That is 256 different bit patterns: enough to represent all the necessary numbers, letters and other characters with just the two values 1 and 0.
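A few lines of Python make the arithmetic of the byte tangible: eight bits yield 2 to the power of 8 patterns, and each character in a character table is stored as one such eight-bit pattern.

```python
print(2 ** 8)  # 256 possible bit patterns in one byte

for char in "A1!":
    code = ord(char)               # the character's number in the table
    pattern = format(code, "08b")  # the same number as an eight-bit pattern
    print(char, code, pattern)
```

The capital letter A, for example, is stored as the number 65, that is, as the bit pattern 01000001.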

By shortening the year in the date, the programmers saved a whole byte of eight bits: the eleven bits of the complete year would no longer have fit into a single byte, so a second byte would have been needed.

Today more than ever, however, it is doubtful whether this thrift made sense.

#### View inside the Z 11 computer, which still worked with electromechanical relays

##### My book recommendations on this topic:
• The intellectual and technical roots of the personal computer are treated in detail and competently by Michael Friedewald in his dissertation "The Computer as Tool and Medium" (GNT-Verlag 1999, 495 pp., 38.50 euros)
• How even the most complex computer operations can be traced back to a few simple functional principles is illustrated by Daniel Hillis in the book "Computerlogik" (Goldmann 2001, 192 pp., 8.90 euros)
• The fatal consequences of the de facto Microsoft monopoly in the PC sector are highlighted by various authors in the book "Microsoft - Media - Power - Monopoly" (edition suhrkamp 2002, 272 pp., 11 euros)
• If you are looking for nothing more than a generally understandable overview of everything to do with the PC and its accessories, the book "PC Basics" by Florence Maurice (Franzis-Verlag 2004, 9.95 euros) is ideal