ALUMinimum

Learn deep logic optimization by implementing the fastest ALU West of the Pecos River, using no specialized arithmetic chip

"Give a man a fish and you feed him for a day; teach a man to fish and you feed him for a lifetime"

What greater project is there than one that involves education? Only learning can drive people away from extremism, obscurantism, ignorance or just plain idiocy. This is my main ideal goal for this project.

And at a time when large global companies take advantage of increasing technological abstraction and complexity to push you to buy more of their now almost magical gadgets, learning how these computer-based devices work provides a refreshingly disinterested kind of entertainment.

In this regard, the Arithmetic and Logic Unit (ALU) plays an important role: at the heart of the Central Processing Unit (CPU), which in turn is the main part of a computer, it is often overlooked or treated as a mystical element that escapes human understanding.

Thus, the second, practical goal for this project will be to learn how to build the fastest ALU based on non-specialized chips.

"What is a computer, and what makes it different from all other man-made machines?"

Answering these questions leads to a surprisingly long cascade of concepts and practical decisions. But interestingly, even though computers are physical-world implementations, they keep a very close relationship to pure concepts. Mathematics can be used to demonstrate the suitability of an abstract digital computer to solve any kind of problem using algorithms, and Boolean algebra and its applied scientific counterpart, logic design, are the tools that helped transform these abstract machines into real ones.

Unlike abstract computers, real-world computers necessarily have limitations and rely on particular physical architectures. Nevertheless, despite their differences, all these computer architectures still rely on the concept of a processor that executes instructions performing logic and arithmetic operations.

Each project log will introduce another step towards the final goal of an optimized Arithmetic and Logic Unit (ALU) at the heart of the processor, made out of pure logic elements. This way, one can cherry-pick a particular subject, skip it if it is already mastered, or follow each step in turn to learn logic design with almost no prerequisite but common sense.

Beware! This project will go far beyond the classical ALU design covered in academic courses. If you ever followed one of these courses in logic design and felt proud of having acquired this knowledge, yet were left with a nagging feeling of underachievement, read on: this project is for you!

Starting from a historical and almost philosophical perspective on the computer, the goal of this project is to dive quickly into the subject of logic design, although a minimum of Boolean algebra theory will necessarily be covered. The idea is to get used to the basic rules required to assemble small Boolean logic elements into circuits that progressively realize more and more complex operations.

As one of the most complex operations, the basic arithmetic adder will be covered first, pointing out its inherent performance problem. Clever pure-logic optimizations will then be carried out to overcome this problem, further optimizations will enable us to integrate into the adder all the required logic operations at no performance cost, and eventually one last useful digital operation will be added to provide a complete, extremely optimized ALU design.
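To make that performance problem concrete, here is a minimal sketch (in Python, purely for illustration; the names `full_adder` and `ripple_carry_add` are mine, and this is not taken from the project's Logisim design) of a ripple-carry adder built by chaining one-bit full adders, using only the Boolean operations that discrete gates provide. The carry has to propagate through every stage, so the delay grows linearly with the word width, which is exactly the bottleneck the later logs set out to optimize away.

```python
def full_adder(a, b, cin):
    """One-bit full adder expressed only with AND, OR and XOR, as discrete gates would be."""
    s = a ^ b ^ cin                      # sum bit
    cout = (a & b) | (cin & (a ^ b))     # carry out
    return s, cout

def ripple_carry_add(a_bits, b_bits):
    """Add two equal-length bit lists (LSB first) by chaining full adders.

    The carry ripples through every stage, so the critical path grows
    linearly with the word width.
    """
    carry = 0
    sum_bits = []
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        sum_bits.append(s)
    return sum_bits, carry

# Example: 5 + 3 on 4 bits, LSB first: 0101 + 0011 = 1000
print(ripple_carry_add([1, 0, 1, 0], [1, 1, 0, 0]))  # -> ([0, 0, 0, 1], 0)
```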

These logs will use as little theory as possible to keep them readable by everybody, and it is hoped that they will be as interactive as possible, based on user comments.

Hopefully in the end, this project will result in a hardware proof of concept in the form of a real-world circuit made out of discrete logic chips on a breadboard.

ALUMinimum.circ

Logisim file for the ALUMinimum project

circ - 126.62 kB - 04/25/2017 at 22:19


  • Part 3: The Processor

    Squonk42 • 05/01/2017 at 16:22 • 0 comments

      The Universal Turing Machine is a great and simple abstract structure, but it is not a very efficient basis for building a physical stored-program computer. The theoretical infinite tape is impossible to implement in practice, and anyway, having a tape running back and forth is far from optimal for a physical storage medium.

      But the Turing machine was an abstract machine invented before the physical computer existed, and it was intended to represent a virtual person executing a well-known procedure, not to serve as a model for a physical implementation.

      Clearly, the biggest challenge for the early computers was memory capacity. Although Babbage's Analytical Engine was designed around 1835 to hold 1,000 numbers of 40 decimal digits each in a set of mechanical wheels, and flip-flops made of purely electronic vacuum tubes had been invented back in 1920, tapes and punched cards were at that time (the end of WWII) among the storage media with the best information density.

      Flip-flops, on the other hand, required 2 triodes to store a single binary digit, making them impractical except for small amounts of storage, such as intermediate results, in what we would today call registers.

      In the "First Draft of a Report on the EDVAC" and attributed to Von Neumann was the first description of possible solutions to get a larger storage capacity using acoustic mercury delay line memories that were able to store 1,000 binary digits, and video camera tubes called iconoscopes, able to store up to an estimated 250,000 binary digits. At the same time, this paper introduced the concept of Random-Access Memory (RAM) for the first time. The first stored-program computer, the Manchester Baby, used a variant of the iconoscope, the Williams tube, to store 2,048 binary digits in a 64 x 32 array.

      This increase in storage capacity led to a clearer subdivision of the computer system as a whole. In the "First Draft of a Report on the EDVAC", von Neumann distinguished:

      1. A Central Arithmetical part (CA)
      2. A Central Control (CC)
      3. Memory (M)
      4. An outside Recording medium (R)
      5. An Input (I)
      6. An Output (O)

      With (C) defined as being CA and CC (i.e. the Central Arithmetical and Central Control parts) put together.

      In "Proposal for Development in the Mathematics Division of an Automatic Computing Engine (ACE)", Turing details:

      1. A Dynamic Storage (DS)
      2. Temporary Storage (TS)
      3. An Input Organ (IO)
      4. An Output Organ (OO)
      5. A Logical Control (LC)
      6. A Central Arithmetic part (CA)
      7. Various 'trees' required in connection with LC and CA for the selection of the information required at any moment
      8. A Clock (CL)
      9. A Temperature control system for the delay lines
      10. Binary to decimal and decimal to binary converters
      11. A Starting device
      12. A power supply

      And although the Temporary Storage could be seen today as equivalent to RISC processor register banks, the explicit notion of a processor is not clearly stated in his paper. But Turing himself admitted in 1947 that "[digital computing] Machines such as the ACE may be regarded as practical versions of this same type of [universal] machines. There is at least a very close analogy. Digital computing machines all have a central mechanism or control and some very extensive form of memory".

      Since then, the vast majority of modern computers feature a Central Processing Unit or CPU, also simply called the processor, composed of a control unit and an arithmetic unit.

  • Part 2: The First Physical Digital Computer

    Squonk42 • 04/30/2017 at 15:29 • 8 comments

    The second and last post on computer history...

    For a long time, the ENIAC was considered to be the first physical digital computer, but this is no longer held to be true.

    Source: TexasDex at English Wikipedia, GFDL or CC-BY-SA-3.0, via Wikimedia Commons

    It was not until the 1970s that it was acknowledged that electronic computation had been used successfully during WWII by the Government Code and Cypher School (GC&CS) at Bletchley Park, where Turing played a major role by designing the electromechanical cryptologic bombe used to speed up the codebreaking of the German Enigma machine.

    Source: Karsten Sperling, http://spiff.de/photo

    And it was not until 1983 that details on the hardware of the Colossus I machine used later during the war (February 1944) were declassified. Other details were held secret until 1986, and even today, some documents remain classified.

    Source: See page for author [Public domain], via Wikimedia Commons

    This machine was used to decipher messages from the Lorenz cipher machine used by the German High Command, and Turing was involved in it at least through some of the methods used to perform wheel-breaking, and also by recommending Tommy Flowers (with whom he had worked on the bombe) to Max Newman, if not through other, still undocumented contributions.

    To add to this complexity, Turing had also been in contact as early as 1935-1938, during his Princeton years, with John von Neumann, who later worked on the ENIAC and the EDVAC alongside his research for the first atomic bomb.

    Source: See page for author [Public domain], via Wikimedia Commons

    Because of this secrecy around the Bletchley Park operation, and later because of the revelation of Turing's homosexuality and his death, it is very difficult to establish the truth regarding the first physical digital computer: for a long time, heroes and wise generals on the battlefield were considered a better image of victory in WWII than mathematicians in a dark room...

    F.H. Hinsley, official historian of GC&CS, has estimated that "the war in Europe was shortened by at least two years as a result of the signals intelligence operation carried out at Bletchley Park". But this may well be an underestimate: considering that the Lorenz cipher was routinely cracked from July 1942 (the first battle of El Alamein, 1-27 July 1942, was the first time the German troops were defeated) and that the Mark I Colossus started operation in February 1944 (just before D-Day), it may well be that it actually changed the course of the war.

    Even worse, Turing's personal situation led to strange rewritings of parts of history, amnesia, minimization of his role, if not outright misattribution of credit.

    But neither the Colossus nor the ENIAC can actually be considered the first physical digital computer: both stored programs differently from numbers and, in this regard, were just massive electronic calculating machines.

    The first specification of an electronic stored-program general-purpose digital computer is von Neumann's "First Draft of a Report on the EDVAC" (dated May 1945), which contained little engineering detail. It was followed shortly by Turing's "Proposal for Development in the Mathematics Division of an Automatic Computing Engine (ACE)" (dated October-December 1945), which provided a relatively more complete specification. ACE and EDVAC differed fundamentally from one another; for example, ACE employed distributed processing, while EDVAC had a centralized structure.

    However, the EDVAC was only completed 6 years later (and not by von Neumann), and Turing left the National Physical Laboratory (NPL) in London in 1948, before a small pilot model of the Automatic Computing Engine was completed in May 1950. He then joined Max Newman in September 1948 at the Royal Society Computing Machine Laboratory at Manchester University.

    This is where the Manchester Small-Scale Experimental Machine (SSEM), nicknamed Baby, ran its first program on 21 June 1948 and became the world's first stored-program computer. It was...


  • Part 1: The Digital Computer

    Squonk42 • 04/14/2017 at 21:31 • 0 comments

      Today, digital computers are everywhere: not only on our desktops, but also in the palm of our hand as a smartphone, or behind the dashboard of our car. However, this situation is not very old: the first personal computers appeared in the 80s, mobile phones at the turn of the century, and smartphones only 10 years ago with the first iPhone. Actually, this acceleration is rather unique, since computers only appeared with WWII.

      But as far back as history can be traced, humankind has used means to represent or count items: livestock, grain, etc., first using fingers, then stones, rods, the abacus and numerous other aids.

      Other specialized devices (such as the astrolabe, the Antikythera mechanism or clock towers) were invented to compute astronomical positions or for navigational use.

      Eventually, more general purpose devices were built in order to perform arbitrary sets of arithmetic or logical operations, like slide rules or (electro-)mechanical calculators.

      And despite the fact that some of these machines (like Jacquard's loom, which used punched cards to control the pattern being woven) introduced the notion of programmability, they cannot be considered computers, since this applied to machine tools and not to general arithmetic or logical operations. The term computer had been used since the early 17th century to designate human computers performing long and tedious mathematical calculations.

      What would have been the first general-purpose computer, the Analytical Engine, was proposed by Charles Babbage in 1834:

      Source: Bruno Barral (ByB) CC BY-SA 2.5, via Wikimedia Commons

      But unfortunately, this computer was never completed.

      Instead, at the turn of the 20th century, and because of their lower complexity, many computing machines were based on direct mapping of values to continuous physical phenomena such as electrical, mechanical or hydraulic quantities. However, because of limited measurement accuracy and the inherent random jitter added by various uncontrolled phenomena, processes could not be reliably repeated with exact equivalence on these analog computers.

      In contrast, digital computers (from the Latin digitus, finger) use sets of discrete symbols to represent values, and because of this indirect mapping, they are able to produce repeatable results with no accuracy or random noise problems.

      It was only in 1936, in his famous paper "On Computable Numbers, with an Application to the Entscheidungsproblem", that Alan Turing first described a class of digital computing machines now known as Turing machines. Although it seems rather odd by today's standards, a Turing machine can be considered as having 3 distinct elements:

      1. an infinite tape, divided into a sequence of squares, each carrying a symbol from a finite alphabet
      2. a read/write head that can read the symbol contained in one single square of the tape at a time, write a new symbol there and move one square to the left or to the right
      3. a control element as a device with a finite number of internal states

      At any given time, the next operation of the Turing machine is determined by:

      1. the control element's current state
      2. the symbol read by the read head

      The operation consists in 3 parts:

      1. print a new symbol in the current square on the tape (it may be the same as the read symbol)
      2. put the control element into a new state (which may be the same as the previous state)
      3. move the read/write head one square to the left or to the right

      For its operation, a finite part of the tape is filled with a starting sequence of symbols, the remainder of the tape being left blank (containing a special blank symbol), and the read/write head is placed at a particular starting square on the tape. The Turing machine then starts to compute following its rules of operation.

      Turing Machine

      Source: wvbailey (Own work) GFDL or CC BY-SA 3.0, via Wikimedia Commons
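
      As a purely illustrative aside (this sketch is mine, not something from Turing's paper), the whole read/print/move cycle described above fits in a few lines of Python: a rule table indexed by (state, symbol) plays the role of the control element, a dictionary stands in for the tape, and the hypothetical example rules implement a unary incrementer.

```python
def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """rules maps (state, symbol) -> (new_symbol, move, new_state), with move in {-1, +1}."""
    squares = dict(enumerate(tape))   # sparse tape: square index -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = squares.get(head, blank)           # 1. read the current square
        new_symbol, move, state = rules[(state, symbol)]
        squares[head] = new_symbol                  # 2. print a (possibly identical) symbol
        head += move                                # 3. move one square left or right
    return "".join(squares[i] for i in sorted(squares))

# Hypothetical example: append one '1' to a block of '1's (unary increment).
rules = {
    ("start", "1"): ("1", +1, "start"),   # skip over the existing 1s
    ("start", "_"): ("1", +1, "halt"),    # write a 1 on the first blank square, then halt
}
print(run_turing_machine(rules, "111"))   # -> "1111"
```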

      Despite its singular construction, the Turing machine model captures for the first time the notions of:

      1. algorithm as a self-contained sequence of actions to be performed in the control element's...