How Computers Work
Transcript of How Computers Work
int main(int argc, char **argv)
And how programming fits in...
This is a computer.
The most important part of a computer is the Central Processing Unit, or CPU.
A CPU is pretty similar to a calculator. When you use a calculator, you work through a problem in steps, and each step tells the calculator to do a specific thing: storing a number, telling it to multiply by the next one, and so on.
On a CPU, you also do things in steps. Each of these steps is called an instruction.
This is what instructions for a CPU might be like.
Load number 5 to a temporary location.
Load number 10 to a different temporary location.
Multiply the numbers in these two locations.
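As a rough sketch, those three steps can be mimicked in Python, with a dictionary standing in for the CPU's temporary storage locations. The register names r0 and r1 are made up for illustration; they aren't real CPU names.

```python
# A toy sketch of CPU-style steps, using a dict as "temporary locations".
# The names "r0" and "r1" are invented for illustration.
registers = {}

registers["r0"] = 5    # Load number 5 to a temporary location.
registers["r1"] = 10   # Load number 10 to a different temporary location.
result = registers["r0"] * registers["r1"]  # Multiply the two locations.

print(result)  # 50
```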
Your computer will go through millions of these steps each second.
The main difference with a pocket calculator is that a CPU doesn't have a person there telling it what to do at each step.
So let's think about what you would need to do to a calculator so that it too would be able to work on its own.
The CPU has such a place to store instructions. It's called memory.
CPUs have a very limited set of instructions, not all that much more than a scientific calculator's. They can load numbers, perform some math, and do a few extra computer-specific things.
When you turn on your computer, the circuitry inside reads a known part of your hard drive to figure out where instructions for your operating system are located.
It then copies those instructions into memory.
The CPU then kicks in and starts reading the instructions located at the start of memory.
(Instructions for the operating system.)
You might be wondering why we go through this process of copying instructions from the hard drive to memory. It seems kind of complicated.
Why not just put everything in memory and forget the hard disk?
Those instructions are translated into 1s and 0s for the computer to understand. Instructions in 1s and 0s like that are in machine language.
When those instructions are written out in words instead of 1s and 0s, they're in assembly language. A CPU wouldn't know what to do with assembly directly, but it makes the instructions readable by people.
For a calculator to work on its own, it would need to know what steps you want it to run without you being there.
So those instructions would need to be written down somewhere for the CPU to read.
The first instructions your computer reads when you turn it on are for your operating system. But if you install your operating system to a hard disk, how does it find its way into memory?
It's a good question.
Okay, so let's recap...
CPUs are the chips at the heart of computers. They're like calculators that can run on their own.
We want stuff in memory because it's fast! 100,000 times faster than a hard drive.
But memory also needs constant power. As soon as your computer turns off, everything gets wiped.
Hard drives don't need constant power to keep their data. They can also store lots and lots of data for really cheap.
Storing data on the hard drive and running it from memory leverages the strengths of both.
If a calculator was running on its own, you wouldn't be there pressing buttons. So the instructions for the calculator would have to be stored somewhere. For a CPU, that location is memory.
But you need one more thing when you're making a calculator that doesn't need you there for every step:
The ability to make decisions.
So a CPU has instructions to load numbers, do math, etc.
But in addition to that, it also has instructions like: "If the number you got was equal to zero, do this particular set of instructions instead of the ones you were normally going to do."
There are a few such instructions for decision making.
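That kind of branching can be sketched in Python. Real CPUs do this with comparison and jump instructions; this function is just an analogy, and its name and return strings are invented for illustration.

```python
def run_steps(number):
    # Decision-making: if the number is zero, run a different
    # set of steps instead of the normal ones, like a CPU branch.
    if number == 0:
        return "ran the special steps"
    return "ran the normal steps"

print(run_steps(0))   # ran the special steps
print(run_steps(7))   # ran the normal steps
```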
The CPU now has everything it needs to be fully autonomous. It has a list of steps that tell it what to do, and the ability to make decisions.
There are a lot of people out there who program computers by writing the few instructions the CPU understands directly.
However, programming this way gets tedious. These instructions are so limited that modern programs require millions of them to do even the simplest things.
Enter programming languages!
Programming languages are abstractions to make writing software less tedious.
For instance, take the following math...
In most programming languages it's this easy:
answer = 5 + 2 * 4 - 10;
But when written as instructions for the CPU:
Load the number 2.
Load the number 4.
Multiply the two numbers.
Load the number 5.
Add this number to the others.
Load the number 10.
Subtract this number from the others.
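Those steps can be imitated in Python with a single running value standing in for the CPU's working storage. The variable name acc (for "accumulator") is illustrative, not a real CPU instruction.

```python
# Simulating step-by-step evaluation of 5 + 2 * 4 - 10
# with a single running value. Names are illustrative.
acc = 2            # Load the number 2.
acc = acc * 4      # Multiply the two numbers (2 * 4 = 8).
acc = acc + 5      # Add this number to the others (8 + 5 = 13).
acc = acc - 10     # Subtract this number (13 - 10 = 3).

print(acc)  # 3, the same result as answer = 5 + 2 * 4 - 10
```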
Still, CPUs only understand instructions. So special software called compilers turns the code written in these programming languages into machine language.
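You can peek at a translation step like this from Python itself: the interpreter compiles source code into lower-level bytecode instructions, which you can inspect with the standard dis module. Bytecode isn't true machine language, but it shows the same idea of source code becoming simple step-by-step instructions.

```python
import dis

# List the low-level instructions Python will actually execute
# for one line of source code. Python bytecode is not machine
# language, but it illustrates the translation step.
instructions = [ins.opname for ins in dis.get_instructions("5 + 2 * 4 - 10")]
print(instructions)
```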
There's a lot of different programming languages out there. They all have different strengths.
Some are closer to the way computers actually work, so they produce fast code. They can also be more verbose, just like assembly.
Some offer more abstract ways to think about problems that make the code much simpler for the programmer. They can also be slower.
This is what programming in Python looks like...
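The original slide showed a screenshot of code that isn't preserved in this transcript. As a stand-in, here is a small made-up sample of what everyday Python code looks like:

```python
# A made-up sample of everyday Python code (not the slide's original).
def average(numbers):
    """Return the average of a list of numbers."""
    return sum(numbers) / len(numbers)

scores = [88, 92, 75, 100]
print(average(scores))  # 88.75
```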
While it may all seem like gibberish now, what you're looking at is code from someone with a bit of experience.
As a newbie, you can start off real easy with a language like Python.
So to recap...
Computers have a CPU inside.
The CPU reads a series of instructions to know what to do.
These instructions are stored in memory.
It gets really tedious to write everything as instructions like this. So we have programming languages to abstract things a little.
Good beginner programming languages, like Python, make it easy to get started.