Computer Science

Computer Science Presentation: Computer Information, History, Current Use of Computer Science, Future of Computer Science
by David Abalos on 22 June 2011


Transcript of Computer Science

History

Computer science began long before the modern computer we know today. It progressed from mechanical inventions and mathematical theories into modern concepts and machines.

Early Computing

The abacus is one of the earliest known tools used for calculation. It was not an automatic machine, but it did allow the user to keep track of past calculations. Abacuses were widely used in China.

Forefathers of Modern Computing

Blaise Pascal (1623-1662), Gottfried Wilhelm von Leibniz (1646-1716), and Charles Babbage (1791-1871).

Pascal built the first mechanical adding machine in 1642. In it, a one-tooth gear engages a single tooth of a ten-tooth gear once every revolution; this is the same principle used for counting kilometers, and it is still used in water meters today.
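As a rough illustration of that carry principle, here is a small Python sketch (my own, not a description of Pascal's actual gearing): each wheel counts 0-9, and a completed revolution advances the next wheel by one step, like an odometer.

```python
# Illustrative sketch of decimal carry between counting wheels (an assumed model,
# not Pascal's mechanism): each wheel holds a digit 0-9, and rolling past 9
# advances the next wheel by one -- the odometer / water-meter principle.

def add_to_register(wheels, amount):
    """wheels: decimal digits, least significant wheel first."""
    carry = amount
    for i in range(len(wheels)):
        carry, wheels[i] = divmod(wheels[i] + carry, 10)
    return wheels  # any carry left over simply falls off the last wheel

register = [0, 0, 0]          # a three-wheel machine reading 000
add_to_register(register, 7)  # reads 007
add_to_register(register, 5)  # reads 012: 7 + 5 carried into the tens wheel
print(register)               # [2, 1, 0]
```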

Charles Babbage realized that many long computations consisted of operations that were regularly repeated. He designed a calculating machine, the "difference engine," that could perform these operations automatically. He produced a prototype of the difference engine in 1822 and began work on the full machine in 1823. It was intended to be steam-powered, fully automatic, capable of printing its results, and commanded by an instruction program. The full machine was never built.
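To make those "regularly repeated operations" concrete, here is a small Python sketch (my own illustration, not Babbage's design) of the method of finite differences: once a short difference table is seeded, every further table entry for a polynomial needs nothing but additions, exactly the kind of step a machine can crank out.

```python
# Illustrative sketch: tabulating a polynomial with finite differences,
# so new values require only repeated addition (a simplified, assumed model).

def tabulate(coeffs, start, count):
    """Tabulate p(x) = coeffs[0] + coeffs[1]*x + ... for count integer steps."""
    degree = len(coeffs) - 1

    def p(x):
        return sum(c * x ** k for k, c in enumerate(coeffs))

    # Seed the difference columns from the first degree+1 directly computed values.
    row = [p(start + i) for i in range(degree + 1)]
    diffs = []
    for _ in range(degree + 1):
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]

    # Each "turn of the crank": every column adds the column after it.
    table = []
    for _ in range(count):
        table.append(diffs[0])
        for level in range(degree):
            diffs[level] += diffs[level + 1]
    return table

# p(x) = x^2 + x + 41 for x = 0..5, produced with additions only
print(tabulate([41, 1, 1], 0, 6))  # [41, 43, 47, 53, 61, 71]
```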

Babbage then began the Analytical Engine, a decimal computer that could operate on words of 50 decimal digits and store 1,000 such numbers. It would have had a number of built-in operations that the machine could carry out in a specified order.

Instructions were to be supplied on punched cards.

As for Leibniz, he was the first to use the term "function," denoting any of several geometric concepts derived from a curve, such as the abscissa, ordinate, tangent, chord, and perpendicular.

In the 18th century, "function" lost these geometrical associations.

Back in America...

Aiken's Harvard Mark I was the first major American development in the computing race. The next contribution was the giant ENIAC machine, built by John W. Mauchly and J. Presper Eckert.

ENIAC (Electronic Numerical Integrator and Computer)

ENIAC worked with ten-digit decimal numbers rather than binary. It was the first machine to use more than 2,000 vacuum tubes; in fact it used nearly 18,000 of them. It had punched-card input and output, and its arithmetic units consisted of 1 multiplier, 1 divider/square-rooter, and 20 adders built from decimal "ring counters."

Birth of Computer Science

Before the 1920s, "computers" were human clerks who performed computations.

Many thousands of "computers" were employed in commerce, government, and research establishments.

Most of these "computers" were women, and many of them held degrees in calculus.

After the 1920s, the expression "computing machine" referred to any machine that performed the work of a human computer.

The Church-Turing thesis says that a mathematical method is effective if it can be set out as a list of instructions able to be followed by a human clerk.
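As a toy illustration of what "instructions able to be followed by a clerk" means (my own sketch, not part of the original presentation), the rule table below can be executed completely mechanically, one symbol at a time, and it adds 1 to a binary number:

```python
# Minimal Turing-machine-style simulator (illustrative sketch): a finite rule
# table followed blindly, symbol by symbol, is the model behind the thesis.

def run_machine(rules, tape, state="start", blank="_", max_steps=10_000):
    """rules: {(state, symbol): (write_symbol, move, next_state)}, move in {-1, 0, +1}."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells) if cells[i] != blank)

# Rule table: add 1 to a binary number; the head starts on the leftmost digit.
increment = {
    ("start", "0"): ("0", +1, "start"),  # scan right to the end of the number
    ("start", "1"): ("1", +1, "start"),
    ("start", "_"): ("_", -1, "carry"),  # step back onto the last digit
    ("carry", "1"): ("0", -1, "carry"),  # 1 + 1 = 0, carry one place left
    ("carry", "0"): ("1", 0, "halt"),    # absorb the carry and stop
    ("carry", "_"): ("1", 0, "halt"),    # carry ran past the leftmost digit
}

print(run_machine(increment, "1011"))  # -> 1100
```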

Machines that computed with continuous values became known as analog machines.

Digital machinery, in contrast to analog, represented numeric values as discrete states and stored each individual digit.

Digital machinery used difference engines or relays before the invention of faster memory devices.

These new computers were able to perform the calculations that had previously been performed by human clerks.

Since the values stored by digital machines were not bound to physical properties the way analog values were, a logical computer built from digital equipment was able to do anything that could be described as "purely mechanical."

Today

In the 1950s, two devices were invented that advanced the computer field and set off the computer revolution.

The first was the transistor. Invented in 1947 by William Shockley, John Bardeen, and Walter Brattain of Bell Labs, the transistor was fated to oust the vacuum tube from computers, radios, and other electronics. Vacuum tubes were highly inefficient, required a great deal of space, and needed to be replaced often. Computers such as ENIAC had 18,000 tubes in them, and housing all those tubes and cooling the rooms against the heat they produced was not cheap.

The transistor promised to solve all of these problems, and it did, but it had problems of its own. The main problem was that transistors needed to be soldered together. As circuits became more complex, the connections between individual transistors became more complicated and numerous, and the likelihood of faulty wiring increased.

In 1958, this problem too was solved, by Jack St. Clair Kilby of Texas Instruments. He manufactured the first integrated circuit, or chip: a collection of tiny transistors connected together at the time of manufacture. The need for soldering together large numbers of transistors was practically eliminated; now connections were needed only to other electronic components. In addition to saving space, this increased the speed of the machine, since the electrons had a shorter distance to travel.

In 1975, Bill Gates and Paul Allen approached Ed Roberts of MITS, the company that developed the Altair, and promised to deliver a version of BASIC for it. BASIC was designed to give newcomers an interactive, easy way to program computers; it allowed statements such as PRINT "HELLO" or LET B=10. BASIC would be a great boost for the Altair, so Roberts agreed to pay for it if it worked. The two worked feverishly and finished just in time to present it to Roberts. It was a success, and from that sale Microsoft was born. Gates and Allen went on to produce BASIC and operating systems for various machines.
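As a hint of how approachable those PRINT and LET statements were, here is a toy Python sketch (my own, vastly simpler than the real Altair BASIC) that accepts exactly the two statement forms quoted above:

```python
# Toy BASIC-flavoured interpreter (illustrative only): handles LET and PRINT,
# which is enough to run the kind of statements quoted above.

def run_basic(program):
    variables = {}
    for line in program:
        keyword, _, rest = line.strip().partition(" ")
        keyword = keyword.upper()
        if keyword == "LET":                    # e.g.  LET B=10
            name, _, value = rest.partition("=")
            variables[name.strip()] = int(value)
        elif keyword == "PRINT":                # e.g.  PRINT "HELLO"  or  PRINT B
            rest = rest.strip()
            print(rest.strip('"') if rest.startswith('"') else variables[rest])

run_basic([
    'PRINT "HELLO"',
    'LET B=10',
    'PRINT B',
])
```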
Following the Altair, a veritable explosion of personal computers occurred, starting with Steve Jobs and Steve Wozniak exhibiting the first Apple II at the First West Coast Computer Faire in San Francisco. The Apple II boasted built-in BASIC, colour graphics, and a 4100-character memory, all for only $1298. Programs and data could be stored on an everyday audio-cassette recorder. Before the end of the faire, Wozniak and Jobs had secured 300 orders for the Apple II, and from there Apple just took off.

In 1984, Apple released the first-generation Macintosh, one of the first computers to come with a graphical user interface (GUI) and a mouse. The GUI made the machine much more attractive to home computer users because it was easy to use. Sales of the Macintosh soared like nothing ever seen before.

That brings us up to about ten years ago. Now people have their own personal graphics workstations and powerful home computers. The average computer a person might have at home is more powerful by several orders of magnitude than a machine like ENIAC. The computer revolution has been the fastest-growing technology in human history.

Computers run a variety of software applications. An application is any program that a computer runs to help you get things done. This includes word processors for creating text, graphics packages for drawing pictures, and communication packages for moving data around the globe. Operating systems are the interface between the user and the computer, letting the user type high-level commands such as "format a:" rather than issuing complex assembler or C commands. Windows is one of numerous graphical user interfaces (GUIs) that let the user manipulate the environment with a mouse and icons. Other examples of GUIs include X-Windows, which runs on UNIX® machines, and Mac OS X, the operating system of the Macintosh.

The Future...

Nanotechnology, quantum computing, and algorithms.

The construction and analysis of algorithms and data structures is a basic and very important part of modern computer science.
Its importance only grows with the rapid development of ever more powerful and faster computers.
All computer programs can be described as algorithms that operate on a structured set of data, or as a concatenation of such algorithms.
To build a large program with reasonable time and space consumption, it is essential to have efficient solutions to its component problems.
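A small, self-contained example (mine, not from the presentation) of what efficient solutions to the component problems buy you: the same thousand membership queries answered by a plain list, which must be scanned element by element, and by a hash-based set, which answers each query in roughly constant time.

```python
# Illustrative comparison of data-structure choice for membership tests.
import time

def time_lookups(container, queries):
    """Return (hits, seconds) for answering all membership queries."""
    start = time.perf_counter()
    hits = sum(q in container for q in queries)
    return hits, time.perf_counter() - start

n = 100_000
values = list(range(n))             # 'in' scans the list: O(n) per query
value_set = set(values)             # 'in' hashes the key: ~O(1) per query
queries = list(range(n, n + 1000))  # 1,000 queries that all miss

print("list:", time_lookups(values, queries))
print("set: ", time_lookups(value_set, queries))
```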

The Human Genome Project

"Genetic technology harbors the potential to change the human species forever. The soon to be completed Human Genome Project will empower genetic scientists with a human biological instruction book. The genes in all our cells contain the code for proteins that provide the structure and function to all our tissues and organs. Knowing this complete code will open new horizons for treating and perhaps curing diseases that have remained mysteries for millennia. But along with the commendable and compassionate use of genetic technology comes the specter of both shadowy purposes and malevolent aims."
(http://www.leaderu.com/orgs/probe/docs/humgeneng.html)

The End...?