# How do computers work the way they do? When does electricity become executable logic and how?

This is a great question. I asked myself the same thing as an 11-year-old kid playing FIFA 99 on my PC, and it sent me on a six-year college journey trying to find an answer.

It's really complex, but let me dumb it down a bit (actually a lot). Let's start right from the bottom:

• Matter is composed of atoms (we can go smaller than this but this should be enough for this question)
• Atoms have electrons, and the flow of these electrons is what we call electricity
• Now, to make use of these electrons, we build transistors, which can hold or release charge as needed. Each one stores a bit: 1 (a high voltage, e.g. 5 V) or 0 (0 V)
• An 8-bit number is then represented with 8 transistors. So the 8-bit representation of the number 3 is: 0000 0011. How is that achieved in hardware? Keep 8 transistors side by side (grouped into registers and memory units). Make the first 6 transistors hold 0 V and the last 2 hold 5 V
• Now, an organization of such registers and memory units makes up a CPU and RAM
• To make it easy to compute using the CPU, we developed machine code. This language is what actually runs on the CPU. What do I mean by "run"? It means: keep flipping bits. If I want to perform 2+3, in machine code I would store 2 in one register and 3 in another register (registers explained above). Then I would feed these values to an Adder unit, which performs a mathematical add (not the same as adding voltages) and leaves the result in a third register. A sample of such machine code would look like this:

80 02 F3
80 03 F4
88 F3 F4 F5

Obviously, no one could understand anything written like this. So we came up with an ingenious system to make it human readable. This is called assembly language. The following piece of code represents the machine code above:

MOVI 2, REG A
MOVI 3, REG B
ADD REG A, REG B, REG C (add A and B and store in C)

where MOVI = 80
REG A = F3
REG B = F4
REG C = F5

Voila, our first coding language 🙂

• Now, assembly is too hard for humans to remember and write correctly. So we developed compilers that convert a high-level language like C into assembly (remember, it's this assembly, once turned into machine code, that does the actual flipping of bits)

So, a C representation of the above assembly would be:
{
    int a = 2, b = 3;
    int c = a + b;
}

• Just as people can write poems in English but not in hand signs, we realized that with an expressive language, people could write much better programs. Then compile them to assembly. That flips bits in registers, which switches transistors, which controls the flow of electrons
• With this new-found expressiveness, we wrote operating systems to maximize hardware usage, since the CPU would otherwise sit idle while data was fetched from disk
• Everything from your keyboard input to mouse to desktop to windows to sound is a program written in such expressive languages, running on top of the OS
• On top of the OS, we developed a network stack called TCP/IP. This stack provides a standardized way for computers to communicate with each other
• Once that was working and we had hooked computers together with cables, we went on to create the WWW and HTTP. This allowed people on different networks to communicate with each other. Note that HTTP is a protocol; servers and clients are programs that follow (at least) HTTP in addition to their internal protocols.

Let's now walk the other way, from software down to electrons:

• When you type Google into the browser and hit the Enter key, an HTTP request is sent from your browser (the client) to Google (the server)
• In your own computer, the browser is a program written in C/C++
• This was compiled to assembly long ago (the browser ships already compiled; you're just giving input to the compiled program)
• The operating system (windows/linux etc) and device drivers are all already compiled to assembly and are running on your machine
• When the browser gets its turn to run on the CPU, the CPU executes its machine code
• This code flips bits in registers and memory
• The registers and memory are composed of transistors
• Transistors control the flow of electrons and hence electricity.

I've oversimplified it. But this is what it is in essence. There are tons of other things happening, but mostly it's different software programs interacting with each other (remember the movie The Matrix?).

I am glad you asked this question. Computers are man-made miracles of the highest order. No single person could have thought all of this up. It has taken more than 50 years and millions of smart people to get to this point. Most computer programmers and professionals I've talked to have an incomplete picture of what a computer actually is and (as you put it) how electricity becomes executable logic.

Hope this helps.

Edit:

Judging by the popularity of this answer, it seems many people would use this as a reference somewhere. I would like to point out that I have dumbed it down to the level of trivia. I have not touched even 1% of the actual detail. Every bullet point I mention has thousands of engineers working on it every day, and the details are mind-numbing.

The answer has some inaccuracies. For example, it is not true that 1 transistor represents 1 bit. In fact, it takes a group of gates (e.g., six NAND gates forming a D flip-flop) to store one bit, and other flip-flop designs do the same job. However, I did not feel like going into the details of gates and logic circuits.

However, the overview is true. This is essentially how you would see a computer system from the outside.

Also note that real systems have come much further than this trivial description. I did not even talk about caches, coherence and consistency in multiprocessor systems, schedulers, microarchitectures, register files, bridges, GPUs, how a display works, how the BIOS works, what init is, what you even mean when you say something is a "program", etc.

Nor did I talk about state machines, ALUs, pipelines, power supplies, how current is measured, clocks, system ticks, HDLs, control logic, digital circuits like muxes/demuxes, decoders, crypto hardware, etc.

That is just too much info to sit and post online (I do have work and social life in the real world 😉 ). Also a lot of information on wikipedia about all this seems outdated/trivial.

Edit:

If *all* of the above seems outrageously trivial to you, let's grab a beer sometime and discuss the current state of architecture and OS design. You are obviously someone I can learn from.
