Let's Make Robots!

To understand the in-depth working of microcontrollers (and computers in general)...

This topic has been on my mind for a long time, and my recent reading (this book) has made it an even more pressing question for me to ask.

What puzzles me is this: I know how a computer language (C++ in my case) works, and I also know how basic electronic components like diodes, resistors and transistors work, but what is the link between a language and the electronics? How is an electronic device able to do jobs such as copy, paste and move?

What I know: languages are translated into binary, or machine language, by programs called compilers/interpreters (and assemblers for lower-level languages). This binary code is somehow able to do our job. Assemblers/compilers/interpreters are themselves (most probably) written in machine code.

Questions I need answered: How does this binary code do our job? What kind of circuit is able to do the job? What kinds of circuits do we need? What is an instruction set?

By understanding how the simplest computer works, I want to understand the working of all devices from microcontrollers up to the latest Intel processors. The problem is, I'm missing the basic knowledge. I know a bit about what VLSI is and a bit about what integrated circuits are (a compact package of thousands of electronic components like transistors). The gap in my knowledge is this: if I somehow acquire a thousand-odd transistors, how can I turn them into a working computer? And if I do manage to wire everything up perfectly, how will I be able to access it to perform the most basic jobs like reading a file, storing it, executing it and so on? For those who ask why I'd use a thousand transistors, the answer is simple: I want to build my way up (or down, because as technology advances, size tends to decrease).

The simplest computers (or microcomputers) I have are 12 idle ATmega328P chips I ordered from Atmel but never got around to using due to a lack of materials. I can easily spare 6 of them for hacking. The next steps up are my Intel Core 2 Duo processor and then my Intel Core i5 processor, which are too complex for me to hack and understand. So it'll be best if I understand the basics on the ATmega.

Thanks in advance for your effort!!


UPDATED- 3/11/2012

Thank you, everyone, for your replies!! My internet wasn't working for a few days, so I couldn't come around to check the replies. As there are a lot of them, it'll take me some time to read through everyone's reply (including the links posted), and if you'll allow me, I'll keep bugging you with questions.


Originally logic gates were constructed of transistors. A good explanation can be found here: http://www.cs.bu.edu/~best/courses/modules/Transistors2Gates/

The gates were then used to build "Flip-Flops" and Flip-Flops used to build memory, counters and registers. A good set of tutorials can be found here: http://www.eecs.tufts.edu/~dsculley/tutorial/
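To make the jump from gates to flip-flops concrete, here is a small Python sketch of an SR latch built from two cross-coupled NAND gates. The fixed settling loop is my own simplification of the real feedback behaviour, not something taken from the tutorials above:

```python
def nand(a, b):
    """A NAND gate modelled on 0/1 values."""
    return 0 if (a and b) else 1

def sr_latch(s, r, q, qbar):
    """One settling pass of a cross-coupled NAND latch.

    Inputs are active-low: s=0 means 'set', r=0 means 'reset',
    s=r=1 means 'hold the current state'.
    """
    for _ in range(4):  # iterate until the feedback loop stabilizes
        q, qbar = nand(s, qbar), nand(r, q)
    return q, qbar

q, qbar = 0, 1
q, qbar = sr_latch(s=0, r=1, q=q, qbar=qbar)  # set:   Q becomes 1
q, qbar = sr_latch(s=1, r=1, q=q, qbar=qbar)  # hold:  Q stays 1 (the "memory")
q, qbar = sr_latch(s=1, r=0, q=q, qbar=qbar)  # reset: Q becomes 0
```

The point of the exercise is the hold case: with both inputs inactive, the output depends only on what was stored before, which is exactly the one bit of memory a flip-flop provides.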

These are the basic building blocks of digital life. Just as all code can be broken down into binary 1s and 0s, all modern digital processors boil down to gates (usually NAND) made from four transistors. Admittedly, I am not sure if this is still true these days; it is what I was taught 20 years ago.
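A quick Python sketch of why NAND gets singled out: if you take NAND as the only primitive, the other basic gates can all be derived from it. The modelling here is my own illustration, not from the links above:

```python
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):      # NOT(a)  = NAND(a, a)
    return nand(a, a)

def and_(a, b):   # AND     = NOT of NAND
    return not_(nand(a, b))

def or_(a, b):    # OR(a,b) = NAND(NOT a, NOT b)
    return nand(not_(a), not_(b))

# Check the derived gates against their truth tables
for a in (0, 1):
    for b in (0, 1):
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
```

This property is called functional completeness, and it is why "everything boils down to NAND" is a reasonable mental model even for a modern chip.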

I'm not sure whether it is practical or not (I have not tried), but if you want to experiment with building your own processor, an FPGA might be your best bet. It is basically a large collection of gates in a chip, and you can reconfigure how they are connected to perform different functions.

True, and I would add that before ~1960 there were computers like ENIAC and Brainiac which took up nearly a city block, and whose logic gates and flip-flops were made with combinations of vacuum tubes.


The Atanasoff-Berry Computer, or ABC, created at my alma mater just before WWII, was ruled by a US federal court to be the first official computer for patent purposes, though it was never awarded a patent. It was the first machine to separate memory from processing and to do digital calculations in binary. It could only solve systems of linear equations, but it was programmable via punch cards. And it was only about the size of a large desk.

I was unaware of that one myself, Max. Amazing that they had a computer that small in that era (before WWII).

I know that by the late '50s/early '60s there were several contenders, some using relays for the digital switching instead of tubes, and others like the 1959 Heathkit "computer", which was all analog. Physically, it looked a lot like the Altair and IMSAI boxes of the late 1970s.

At my university we were taught using the Little Computer 3 (LC-3), which is a simulated computer. It can also be implemented in hardware, although we never did anything like that; well, we did implement it in Verilog. We were also shown how C was implemented on the LC-3, i.e. how a certain for loop would be executed.

 

We used this book: http://highered.mcgraw-hill.com/sites/0072467509/

Simpler than the ATmega CPU would be the DCPU-16. It is entirely fictional and was designed as a simple emulated CPU, but many of the concepts are the same as in a real CPU.


Anyway, let's take a simplified CPU. It has some RAM which contains our program. We'll assume a 16-bit computer, which means its addresses can be at most 16 bits long, so its RAM runs from 0x0000 to 0xFFFF. Each of these memory locations stores what we refer to as a word. The word length in this case will also be 16 bits to match the CPU, but strangely, this is not always the case.

As well as this RAM, the CPU has a selection of what we call registers. Again, these will not necessarily be the same number of bits as the CPU; in x86 there are a few 128-bit registers, for example. But for simplicity's sake we'll say they are also 16 bits. These registers can be used much like RAM for storing some data. Two of these registers are very important, though; so important that some people exclude them from being called registers. They are the program counter and the stack pointer.

The program counter basically tells us what RAM location we are looking at. If it says 0x3E04, then we are looking at the 0x3E04'th item in RAM.
When we power our CPU on, some sort of bootloader chucks our program into RAM somewhere and sets the program counter. We're going to assume the program is whacked in at the start of RAM, so 0x0 (not usually the case).

Our program counter says to look at 0x0, so the CPU pulls the data out of 0x0 (into some internal circuitry which we will ignore) for decoding. Usually this piece of data will be split into three sections: the opcode and two parameters. The opcode will be something like ADD, SUB, DIV or MUL, and each parameter will roughly be one of: pop or push (for the stack; these will alter our stack pointer), "next word" (the CPU takes the numeric value of the next word), "memory location in next word" (the CPU takes the numeric value of the next word, goes to that location in memory and takes a value from there), "word after next" and "memory location in word after next" (the CPU does the same as before but skips a word; if the PC is 0x0 we would read 0x2), or the parameter may simply be one of our registers.
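To illustrate the "split into three sections" step, here is a small Python sketch of decoding a 16-bit word into an opcode and two operand fields. The field widths (a 4-bit opcode and two 6-bit operands) are an assumption loosely modelled on the DCPU-16, not a real encoding:

```python
def decode(word):
    """Split a 16-bit word into (opcode, operand a, operand b).

    Assumed layout: bits 0-3 opcode, bits 4-9 first operand,
    bits 10-15 second operand.
    """
    opcode = word & 0x000F
    a = (word >> 4) & 0x003F
    b = (word >> 10) & 0x003F
    return opcode, a, b

# Pack a word the same way and check it round-trips:
# opcode 0x4, operand a = 0x01, operand b = 0x02
word = (0x02 << 10) | (0x01 << 4) | 0x4
assert decode(word) == (0x4, 0x01, 0x02)
```

In hardware this "decoding" is nothing more than routing groups of wires to different parts of the control circuitry, which is why the masks and shifts above map onto it so directly.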

Let's say our instruction consists of MUL, NextWord, WordAfterNext, the PC is 0x0, 0x1 contains the value 32, and 0x2 contains the value 4. The CPU sees "next word", so it takes the value 32 from RAM; then it sees that we need the word after that, so it takes 4 from RAM; then it multiplies them and stores the result in the location named by the first parameter, which is 0x1. So 0x2 still has 4 in it, while 0x1 now says 128. The program counter gets jumped ahead to 0x3 for the next instruction. Programs can also alter the program counter themselves if they wish (for looping and so on).
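The walkthrough above can be sketched as one fetch-decode-execute step in Python. The opcode number, operand codes and field layout are invented for illustration; this is not a real instruction set:

```python
# Hypothetical encodings (not a real ISA)
MUL, NEXT_WORD, WORD_AFTER_NEXT = 0x4, 0x1F, 0x20

ram = [0] * 0x10000  # 16-bit address space, one 16-bit word per cell
ram[0] = (WORD_AFTER_NEXT << 10) | (NEXT_WORD << 4) | MUL
ram[1] = 32          # first operand lives in the next word
ram[2] = 4           # second operand in the word after that

pc = 0
word = ram[pc]                  # fetch
opcode = word & 0xF             # decode: bits 0-3 opcode,
a = (word >> 4) & 0x3F          # bits 4-9 first operand,
b = (word >> 10) & 0x3F         # bits 10-15 second operand

if opcode == MUL and a == NEXT_WORD and b == WORD_AFTER_NEXT:
    result = ram[pc + 1] * ram[pc + 2]   # execute: 32 * 4
    ram[pc + 1] = result & 0xFFFF        # store into the first operand's cell
    pc += 3                              # skip instruction + both operand words

print(ram[1], ram[2], pc)  # -> 128 4 3
```

Matching the walkthrough: 0x1 now holds 128, 0x2 still holds 4, and the program counter has jumped ahead to 0x3 ready for the next instruction.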

That's a heavily simplified and somewhat wrong version of what happens, but it's the basic procedure.

How the opcodes, parameters and registers are built up is then the instruction set.

You might want to look at this project. They take you from the basic elements of logic gates to building and programming a computer.

...then nand2tetris is probably your best option.

I wish I had had such great resources when I started learning about computers. I had to do it the hard way and read many books and magazines that touched on the topics I was interested in.

nand2tetris is probably the fastest way to get a deep understanding of computers.