
Quantum Computing
by Thai Nguyen, 5 September 2014


Transcript of Quantum Computing

Conventional Computing
Quantum Mechanics
the study of nanoscopic particles and their interactions with each other
Schrödinger's Cat
Quantum Computing
Conventional computers, the computers we use now, use bits to represent data.
Bit: represented as 1 or 0



Qubit (quantum bit): the replacement for the bit. A qubit uses superposition, so it is initially both 1 and 0 at the same time.
To understand quantum computing, you must first learn about quantum mechanics.
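The "both 1 and 0" idea can be simulated with ordinary arithmetic. A minimal sketch, assuming nothing about real quantum hardware: a qubit is modeled as two amplitudes whose squared magnitudes give the odds of measuring 0 or 1 (the function name `probabilities` is mine, not from the original):

```python
import math

# A qubit's state is a pair of amplitudes (a, b) for |0> and |1>,
# normalized so that a^2 + b^2 = 1. Squaring an amplitude gives the
# probability of seeing that outcome when the qubit is measured.
def probabilities(a, b):
    return abs(a) ** 2, abs(b) ** 2

bit_zero = (1, 0)              # a classical bit: definitely 0
h = 1 / math.sqrt(2)
superposition = (h, h)         # equal superposition: "both 1 and 0"

print(probabilities(*bit_zero))        # certain: (1, 0)
print(probabilities(*superposition))   # fifty-fifty: about (0.5, 0.5)
```

Measuring the superposed qubit collapses it to a plain 0 or 1, each with probability one half.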

"If you think you understand quantum mechanics, then you don't understand quantum mechanics"
-Richard Feynman, quantum theorist
Double Slit Experiment
the experiment that explained the nature of light
Is light a particle or a wave?
Particle! Wave!!!
What you'll need:
  • Light source
  • Board with two slits
  • Board to detect the light
Hypothesis: If I shine a light through the two slits, then it will result in two center points of light, proving that light is a particle.
Our bias tells us the result will look like this.
Independent variable
Dependent variable
Instead, a pattern of many bands occurred.
So what's happening? (analyze the data)
Scientific laws of gravity and motion
Light... is a wave
But how did waves create that pattern? Where crests from the two slits arrive together, they reinforce (a bright band); where a crest meets a trough, they cancel (a dark band).
Here's a better example:
light source
In the early 19th century, the English scientist Thomas Young conducted the Double Slit Experiment.
So let's change our hypothesis: If I shine a light through the two slits, then it will result in a pattern that correlates with light being a wave.
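The banded pattern can be reproduced from the wave hypothesis alone. A rough sketch of the idealized two-slit intensity formula (small-angle approximation, single-slit envelope ignored; the wavelength and distances below are illustrative numbers I chose, not from the original):

```python
import math

def intensity(y, wavelength, slit_sep, screen_dist):
    # Relative brightness at position y on the detector board.
    # Bright bands appear where the two paths differ by a whole
    # number of wavelengths; dark bands where the waves cancel.
    phase = math.pi * slit_sep * y / (wavelength * screen_dist)
    return math.cos(phase) ** 2

lam, d, L = 500e-9, 1e-4, 1.0   # green light, slits 0.1 mm apart, screen 1 m away
fringe = lam * L / d            # spacing between bright bands: 5 mm here

print(intensity(0.0, lam, d, L))         # center of the screen: bright
print(intensity(fringe / 2, lam, d, L))  # halfway to the next band: dark
```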
So let's look at what started quantum mechanics.
In the early 20th century, German physicist Max Planck proposed Planck's Law of Black-Body Radiation, which showed that light comes in packets called photons, essentially being a particle.
Einstein further proved this law, and light became both a wave and a particle.
And so the theory of Particle-Wave Duality was created.
Particle-Wave Duality gave birth to quantum mechanics and is also its central concept.
OK, back to the technology that quantum mechanics has given us, or: What Can the Qubits Do?
On the quantum level, you're able to program the qubits to represent all possible input combinations, and to do so simultaneously. When you run an algorithm, all possible input combinations are tested at once, while a regular computer has to go through every possible input combination one at a time.
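The contrast can be sketched on a classical machine by simulating the state vector directly (the function `f` below is a stand-in I invented; note that on a real quantum computer, reading useful answers out of the superposition still takes a cleverly designed algorithm, since a measurement yields only one outcome):

```python
import math

n = 3                        # number of qubits in this toy example

# Quantum view: one state vector holding 2**n amplitudes. Equal
# superposition gives every possible n-bit input the same weight at once.
amps = [1 / math.sqrt(2 ** n)] * (2 ** n)
assert abs(sum(a * a for a in amps) - 1.0) < 1e-9   # probabilities sum to 1

# Classical view: evaluating a function on every input means looping
# over the 2**n inputs one at a time.
def f(x):
    return x % 5 == 0        # placeholder function to study on all inputs

results = [f(x) for x in range(2 ** n)]

print(len(amps), len(results))   # 8 8 -- same 2**n inputs, held vs. iterated
```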
A normal computer requires 8 bits to store one number between 0 and 255.
8 qubits can store all 256 numbers at the same time.
10 qubits = 1024
11 qubits = 2048
12 qubits = 4096
As you can see, the amount of numbers qubits can store increases exponentially with every extra qubit.
100 qubits = 1,267,650,600,228,229,401,496,703,205,376
With 500 qubits, you can store more numbers than there are atoms in the observable universe.
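These counts follow from one rule: n qubits span 2**n basis states, and each extra qubit doubles the count. They can be checked directly (the 10**80 figure for atoms in the observable universe is a commonly cited order-of-magnitude estimate, not from the original):

```python
# Each extra qubit doubles the number of basis states: n qubits -> 2**n.
for n in (8, 10, 11, 12):
    print(n, "qubits ->", 2 ** n)      # 256, 1024, 2048, 4096

print(2 ** 100)    # 1267650600228229401496703205376

# 2**500 vs. a common ~10**80 estimate for atoms in the observable universe:
print(2 ** 500 > 10 ** 80)             # True
```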
This means that with these quantum computers, we can solve problems with many variables on a level that's beyond conventional computers:
  • weather prediction
  • forensics
  • air traffic control
  • finance
When Are We Getting These Uber-Computers?
We already have them. Introducing... The D-Wave System
The D-Wave System is still in its early stages, so don't expect much from it right now.
Back in May 2013, NASA and Google teamed up to create the Quantum Artificial Intelligence Lab, or QuAIL, whose goal is to pioneer research on how quantum computing might help with machine learning and other hard computer science problems.