Published on 9 December 2013

SixthSense Technology
We have evolved over millions of years to sense the world around us. When we encounter something, someone or some place, we use our five natural senses (sight, hearing, smell, taste and touch) to perceive information about it; that information helps us make decisions and choose the right actions to take. But arguably the most useful information for making the right decision is not naturally perceivable with our five senses: the data, information and knowledge that mankind has accumulated about everything, which is increasingly all available online.
Although the miniaturization of computing devices allows us to carry computers in our pockets, keeping us continually connected to the digital world, there is no link between our digital devices and our interactions with the physical world.
"Sixth Sense Technology" is the newest term to proclaim its presence in the technical arena. The technology takes its name from the idea of a sixth sense: with it, our ordinary computers will soon be able to sense information from the surroundings around us.
SixthSense is a wearable “gesture based” device that augments the physical world with
digital information and lets people use natural hand gestures to interact with that information.
It was developed by Pranav Mistry, a PhD student in the Fluid Interfaces Group at the MIT Media Lab, whose creation caused a storm. He says that the movies "RoboCop" and "Minority Report" inspired his view of a world not dominated by computers, digital information and human robots, but one where computers and other digital devices enhance people's enjoyment of the physical world.
Right now, we use our devices (computers, mobile phones, tablets, etc.) to go onto the internet and get the information we want. With SixthSense we will use a device no bigger than current cell phones, and probably eventually as small as a button on our shirts, to bring the internet to us so that we can interact with our world.
SixthSense will allow us to interact with our world like never before. We can get information on anything we want, from anywhere, within moments. We will be able to interact on a whole new level not only with things but also with people. One great part of the device is its ability to scan objects, or even people, and project information about whatever you are looking at.
Sixth sense, in scientific (or non-scientific) terms, is defined as extrasensory perception, or ESP for short. It involves the reception of information not gained through any of the five senses, nor taken from any past or known experience. SixthSense aims to integrate online information and technology into everyday life more seamlessly. By making information needed for decision-making available beyond what we can access with our five senses, it effectively gives users a sixth sense.
Maes' MIT group, which includes seven graduate students, was thinking about how a person could be more integrated into the world around them and access information without having to do something like take out a phone. They initially produced a wristband that would read a Radio Frequency Identification (RFID) tag to know, for example, which book a user is holding in a store.
They also had a ring that used infrared to communicate via beacon with supermarket smart shelves to give you information about products. If we grab a package of macaroni, the ring would glow red or green to tell us whether the product is organic or free of peanut traces, or whatever criteria we program into the system.
They wanted to make information more useful to people in real time with minimal effort, in a way that doesn't require any behaviour changes. The wristband was getting close, but users still had to take out a cell phone to look at the information.
That's when they hit on the idea of accessing information from the internet and projecting it. Someone wearing the wristband could pick up a paperback in the bookstore and immediately call up reviews about the book, projecting them onto a surface in the store, or do a keyword search through the book by accessing digitized pages on Amazon or Google Books.
They started with a larger projector that was mounted on a helmet. But that proved cumbersome: if someone projecting data onto a wall turned to speak to a friend, the data would be projected onto the friend's face.
Now they have switched to a smaller projector and created the pendant prototype to be worn around the neck.
The SixthSense prototype is composed of a pocket projector, a mirror and a camera. The hardware components are coupled in a pendant-like mobile wearable device. Both the
projector and the camera are connected to the mobile computing device in the user’s pocket.
We can very well consider Sixth Sense Technology a blend of the computer and the cell phone. The device is hung around the user's neck, and projection begins by means of the micro-projector attached to it. In due course, you turn into a moving computer yourself, with your fingers acting as a mouse and a keyboard.
The prototype was built from an ordinary webcam and a battery-powered 3M projector,
with an attached mirror — all connected to an internet-enabled mobile phone.
A webcam captures and recognises an object in view and tracks the user’s hand gestures using computer-vision based techniques.
It sends the data to the smartphone. The camera, in a sense, acts as a digital eye, seeing what the user sees. It also tracks the movements of the thumbs and index fingers of both of the user's hands. The camera recognizes objects around you instantly, with the micro-projector overlaying the information on any surface, including the object itself or your hand.
The projector also opens up interaction and sharing. The projector itself contains a battery with about three hours of life. It projects visual information, enabling surfaces, walls and physical objects around us to be used as interfaces. The aim is for this to merge with the physical world in a real physical sense: you are touching an object and projecting information onto it, so the information looks like part of the object. A tiny LED projector displays data sent from the smartphone on any surface in view: object, wall, or person.
The usage of the mirror is significant as the projector dangles pointing downwards from the neck.
Mobile devices like smartphones in our pockets transmit and receive voice and data anywhere, and to anyone, via the mobile internet.
An accompanying smartphone runs the SixthSense software and handles the connection to the internet. A web-enabled smartphone in the user's pocket processes the video data, while other software searches the Web and interprets the hand gestures.
Color markers sit at the tips of the user's fingers. Marking the fingers with red, yellow, green, and blue tape helps the webcam recognize gestures. The movements and arrangements of these markers are interpreted as gestures that act as interaction instructions for the projected application interfaces.
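The marker-tracking step above can be sketched in miniature. This is a hypothetical simplification, not the actual SixthSense software: it scans a tiny RGB frame for pixels inside an illustrative per-marker color range and returns their centroid, which is the essence of tracking colored fingertip tape.

```python
# Toy sketch of color-marker tracking (illustrative ranges, not real values).
# `frame` is a 2-D list of (r, g, b) tuples standing in for a camera image.

MARKER_RANGES = {
    "red":    ((150, 0, 0),   (255, 100, 100)),
    "yellow": ((150, 150, 0), (255, 255, 100)),
}

def find_marker(frame, marker):
    """Return the (row, col) centroid of pixels matching `marker`, or None."""
    lo, hi = MARKER_RANGES[marker]
    rows = cols = n = 0
    for r, row in enumerate(frame):
        for c, pixel in enumerate(row):
            # Pixel counts as a marker if every channel is inside the range.
            if all(l <= p <= h for l, p, h in zip(lo, pixel, hi)):
                rows += r
                cols += c
                n += 1
    if n == 0:
        return None
    return (rows // n, cols // n)

# A 3x3 black "frame" with one red marker pixel at row 1, column 2:
frame = [[(0, 0, 0)] * 3 for _ in range(3)]
frame[1][2] = (200, 20, 20)
print(find_marker(frame, "red"))   # -> (1, 2)
```

A real implementation would threshold in HSV space and smooth the track over frames, but the centroid-of-matching-pixels idea is the same.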
- The hardware that makes Sixth Sense work is a pendant-like mobile wearable interface
- It has a camera, a mirror and a projector, and is connected wirelessly (via Bluetooth, 3G or Wi-Fi) to a smartphone that can slip comfortably into one's pocket
- The camera recognizes individuals, images, pictures, and gestures one makes with their hands
- Information is sent to the smartphone for processing
- The downward-facing projector projects the output image onto the mirror
- The mirror reflects the image onto the desired surface
- Thus, digital information is freed from its confines and placed in the physical world
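The pipeline in the list above can be sketched as a loop: capture a frame, recognize a gesture, render an image, project it. The classes below are toy stand-ins invented for illustration, not the real device's software.

```python
# Hypothetical end-to-end sketch of the SixthSense processing loop:
# camera frame -> smartphone gesture recognition -> rendered image -> projector.

class Camera:
    def __init__(self, frames):
        self.frames = iter(frames)
    def capture(self):
        # Returns the next frame, or None when the stream ends.
        return next(self.frames, None)

class Phone:
    # Illustrative mapping from a recognized frame to a gesture action.
    GESTURES = {"namaste": "start_projection", "frame": "take_photo"}
    def recognize(self, frame):
        return self.GESTURES.get(frame, "none")
    def render(self, action):
        # Compose the image the projector should display.
        return f"<image:{action}>"

class Projector:
    def __init__(self):
        self.shown = []
    def project(self, image):
        self.shown.append(image)

def run(camera, phone, projector):
    while (frame := camera.capture()) is not None:
        action = phone.recognize(frame)
        projector.project(phone.render(action))

proj = Projector()
run(Camera(["namaste", "frame"]), Phone(), proj)
print(proj.shown)   # -> ['<image:start_projection>', '<image:take_photo>']
```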
The software recognizes three kinds of gestures:
- Multitouch gestures, like the ones seen on Microsoft Surface or the iPhone, where you touch the screen and make the map move by pinching and dragging.
- Freehand gestures, like the "framing" gesture used to take a picture, or the namaste gesture Mistry uses to start the projection on the wall.
- Iconic gestures, drawn as icons in the air: draw a star to show the weather, or a magnifying glass to show the map. Users can assign other gestures from everyday life; the system is very customizable.
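Once a gesture has been classified into one of these three families, it has to be routed to an application action. Below is a minimal sketch of such a dispatch table; the gesture names and action strings are purely illustrative, not taken from the SixthSense software.

```python
# Illustrative routing of classified gestures to application actions.

ICONIC_GESTURES = {
    "star": "show_weather",
    "magnifying_glass": "show_map",
    "@": "open_email",
}

def dispatch(kind, detail):
    """Map a (gesture family, recognized detail) pair to an action name."""
    if kind == "multitouch":
        # Pinch / drag on a projected surface, e.g. detail = "zoom_in".
        return f"map_{detail}"
    if kind == "freehand":
        # Framing gesture takes a photo; namaste starts the projection.
        return "take_photo" if detail == "frame" else "start_projection"
    if kind == "iconic":
        # An icon drawn in the air, looked up in the table above.
        return ICONIC_GESTURES.get(detail, "unknown")
    return "unknown"

print(dispatch("iconic", "star"))         # -> show_weather
print(dispatch("multitouch", "zoom_in"))  # -> map_zoom_in
```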
The technology is mainly based on hand gesture recognition, image capturing, processing,
and manipulation, etc. The map application lets the user navigate a map displayed on a
nearby surface using hand gestures, similar to gestures supported by multi-touch based
systems, letting the user zoom in, zoom out or pan using intuitive hand movements. The
drawing application lets the user draw on any surface by tracking the fingertip movements of
the user’s index finger.
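The pinch-to-zoom behaviour can be sketched as scaling the zoom level by the ratio of the current fingertip spread to the previous one. The scaling rule below is an assumption for illustration, not the documented SixthSense algorithm.

```python
# Sketch: drive map zoom from the distance between two tracked fingertips.
import math

def pinch_zoom(prev_pts, cur_pts, zoom):
    """Scale `zoom` by the ratio of current to previous fingertip spread.
    Each argument is a pair of (x, y) fingertip positions in pixels."""
    d_prev = math.dist(*prev_pts)
    d_cur = math.dist(*cur_pts)
    if d_prev == 0:
        return zoom   # degenerate frame: leave the zoom unchanged
    return zoom * (d_cur / d_prev)

# Fingers move apart, spread grows from 100 px to 200 px: zoom doubles.
z = pinch_zoom([(0, 0), (100, 0)], [(0, 0), (200, 0)], zoom=1.0)
print(z)   # -> 2.0
```

Panning works analogously, by applying the displacement of the midpoint between the two fingertips to the map position.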
Augmented reality (AR) is a term for a live direct or indirect view of a physical real-world environment whose elements are augmented by virtual computer-generated imagery. It is related to a more general concept called mediated reality, in which a view of reality is modified (possibly even diminished rather than augmented) by a computer. The augmentation is conventionally in real time and in semantic context with environmental elements.
Sixth Sense technology uses the augmented-reality concept to superimpose digital information on the physical world. With the help of advanced AR technology (e.g. adding computer vision and object recognition), information about the user's surrounding real world becomes interactive and digitally usable. Artificial information about the environment and the objects in it can be stored and retrieved as an information layer on top of the real-world view.
The main hardware components for augmented reality are: display, tracking, input devices, and computer. The combination of a powerful CPU, camera, accelerometers, GPS and solid-state compass often present in modern smartphones makes them prospective AR platforms. There are three major display techniques for augmented reality:
- Head-mounted displays
- Handheld displays
- Spatial displays
Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms.
Gestures can originate from any bodily motion or state but commonly originate from the face or hand. Current focuses in the field include emotion recognition from the face and hand gesture recognition. Many approaches have been made using cameras and computer vision algorithms to interpret sign language.
Gesture recognition can be seen as a way for computers to begin to understand human body language.
Computer vision is the science and technology of machines that can see. As a scientific discipline, computer vision is concerned with the theory behind artificial systems that extract information from images. The image data can take many forms, such as video sequences, views from multiple cameras, or multi-dimensional data from a medical scanner. Computer vision studies and describes the processes, implemented in software and hardware, behind artificial vision systems.
SixthSense technology takes a different approach to computing and tries to make the
digital aspect of our lives more intuitive, interactive and, above all, more natural. When you
bring in connectivity, you can get instant, relevant visual information projected on any object
you pick up or interact with. So, pick up a box of cereal and your device will project whether
it suits your preferences.
The SixthSense prototype implements several applications that demonstrate the usefulness, viability and flexibility of the system. The device has a huge number of applications; the following are a few of them:
- Drawing application
- Zooming features
- Get product information
- Get book information
- Get flight updates
- Feed information on people
- Take pictures
- Check email
Calling a number will no longer be a great task with Sixth Sense Technology: you can use the device to project a keypad onto your hand, then use that virtual keypad to make a call.
SixthSense also implements a map application that lets the user display a map on any physical surface, find a destination, and use thumbs and index fingers to navigate the map, for example to zoom in and out and perform other controls.
With Sixth Sense, all we have to do is draw a circle on our wrist with our index finger to get a virtual watch that gives us the correct time. The computer tracks the red marker cap or piece of tape, recognizes the gesture, and instructs the projector to flash the image of a watch onto the wrist.
The SixthSense system also augments physical objects the user is interacting with by projecting more information about those objects onto them. For example, a newspaper can show live video news, or dynamic information can be provided on a regular piece of paper.
The drawing application lets the user draw on any surface by tracking
the fingertip movements of the user’s index finger.
The user can zoom in or zoom out using intuitive hand movements.
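The drawing behaviour described above amounts to accumulating successive fingertip positions into strokes. Below is a toy sketch, assuming the tracker occasionally reports no fingertip (represented as None), which ends the current stroke.

```python
# Toy sketch of the drawing application: group tracked fingertip positions
# into strokes, starting a new stroke whenever tracking is lost.

def build_strokes(fingertip_positions):
    """`fingertip_positions` is a list of (x, y) points or None entries."""
    strokes, current = [], []
    for p in fingertip_positions:
        if p is None:
            # Tracking lost: close the current stroke, if any.
            if current:
                strokes.append(current)
                current = []
        else:
            current.append(p)
    if current:
        strokes.append(current)
    return strokes

pts = [(0, 0), (1, 1), None, (5, 5), (6, 5)]
print(build_strokes(pts))   # -> [[(0, 0), (1, 1)], [(5, 5), (6, 5)]]
```

Each stroke would then be rendered by the projector as a polyline on the surface being drawn on.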
Sixth Sense uses image recognition or marker technology to recognize products you pick up, then feeds you information on those products. For example, if you're trying to shop "green" and are looking for paper towels with the least amount of bleach in
them, the system will scan the product you pick up off the shelf and give you guidance on whether this product is a good choice for you.
Sixth Sense uses the same image recognition or marker technology to recognize books you pick up and feed you information on them. The system can project Amazon ratings on a book, as well as reviews and other relevant information.
If we fashion our index fingers and thumbs into a square (the typical "framing" gesture), the system will snap a photo. After taking the desired number of photos, we can project them onto a surface, and use gestures to sort through the photos, and organize and resize them.
The system will recognize your boarding pass and let you know whether your flight is on time and if the gate has changed.
Sixth Sense is also capable of "a more controversial use". When you go out and meet someone, it can project relevant information about them, such as what they do and where they work, and could even display tags about the person floating on their shirt. It could be handy if it displayed their Facebook relationship status, so that you knew not to waste your time.
To check email, all we have to do is gesture an "@" with our index finger, or select it from a menu, to log in to the email service.
Advantages of SixthSense include:
- SixthSense is a user-friendly interface that integrates digital information into the physical world and its objects, making the entire world your computer.
- SixthSense does not change human habits but makes computers and other machines adapt to human needs.
- It uses hand gestures to interact with digital information.
- It supports multi-touch and multi-user interaction.
- Data is accessed directly from the machine in real time.
- It is open source and cost-effective, and we can mind-map ideas anywhere.
- It is a gesture-controlled wearable computing device that feeds us relevant information and turns any surface into an interactive display.
- It is portable and easy to carry, as we can wear it around the neck.
- The device can be used by anyone, even without basic knowledge of a keyboard or mouse.
- There is no need to carry a camera anymore: on holiday, it is easy to capture photos using mere finger gestures.
Future enhancements include:
- Getting rid of the color markers
- Incorporating the camera and projector inside the mobile computing device
- Allowing the pendant-style wearable device, when placed on a table, to turn the table into a multi-touch user interface
- Applying this technology in various areas of interest, like gaming, education systems, etc.
- Adding 3D gesture tracking
- Making Sixth Sense work as a fifth sense for disabled persons