Navigation using Sensor Fusion of Optical Flow and Range Data
by Pranav Maheshwari, 11 December 2013
Transcript of Navigation using Sensor Fusion of Optical Flow and Range Data

INTRODUCTION
For any mobile device, the ability to navigate in its environment is important.
Robot navigation means the robot's ability to determine its own position in its frame of reference and then to plan a path towards some goal location.
Navigation can thus be defined as the combination of three fundamental competences:
1. Self-localization
2. Path planning
3. Map-building and map interpretation

Principles and Sensors for Navigation
Physical Scales
The physical scale of a device's navigation requirements can be measured by the accuracy to which the mobile robot needs to navigate; this is the resolution of navigation.
To categorize this scale of requirements, we use three terms:
• Global Navigation
• Local Navigation
• Personal Navigation
Algorithms and Methods for Navigation
Odometry provides good short-term accuracy, is inexpensive, and allows very high sampling rates. However, the fundamental idea of odometry is the integration of incremental motion information over time, which inevitably leads to the accumulation of errors (a short integration sketch follows the error list below). These errors are of two types:
1) Systematic Errors
• Unequal wheel diameters
• Actual wheelbase differs from nominal wheelbase
• Misalignment of wheels
• Finite encoder resolution
• Finite encoder sampling rate
2) Non-Systematic Errors
• Travel over uneven floors
• Travel over unexpected objects on the floor
• Wheel slippage due to:
o Slippery floors
o Over-acceleration
o Fast turning (skidding)
o External forces (interaction with external bodies)
o Internal forces (castor wheels)
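Below is a minimal Python sketch of how odometry integrates incremental motion into a pose estimate for an assumed differential-drive robot (the wheel-base value and function names are ours, for illustration only). Any error in the assumed wheel base or wheel diameters biases every update, which is exactly how systematic errors accumulate:

import math

WHEEL_BASE = 0.30  # assumed distance between wheels in metres (illustrative)

def update_pose(x, y, theta, d_left, d_right):
    """Integrate one encoder interval into the pose estimate.
    d_left / d_right: distance travelled by each wheel since the last update.
    Systematic errors (wrong WHEEL_BASE, unequal wheel diameters) bias
    every call, so the pose error grows without bound over time."""
    d_centre = (d_left + d_right) / 2.0          # forward motion
    d_theta = (d_right - d_left) / WHEEL_BASE    # rotation
    x += d_centre * math.cos(theta + d_theta / 2.0)
    y += d_centre * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta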

Vision-Based Positioning
Vision-based positioning or localization uses the same basic principles of landmark-based and map-based positioning but relies on optical sensors rather than ultrasound, dead reckoning, or inertial sensors. The advantage of such sensors lies in their ability to directly provide the distance information needed for collision avoidance.
The most common optical sensors include laser-based range finders and photometric cameras using CCD arrays.
Navigation using Sensor Fusion of Optical Flow and Range Data
Project By:
Bhavya Narain Gupta 2K10/EP/015
Pranav Maheshwari 2K10/EP/038
Under:
Dr. A. Srinivas Rao
Outdoor Navigation
GPS
The Global Positioning System (GPS) is a space-based satellite navigation system that provides location and time information in all weather conditions, anywhere on or near the Earth where there is an unobstructed line of sight to four or more GPS satellites. The system provides critical capabilities to military, civil and commercial users around the world.
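The requirement of four satellites comes from the four unknowns being solved for: three position coordinates plus the receiver clock bias. A minimal Gauss-Newton sketch of that solve in Python (satellite positions and pseudoranges would come from the receiver; all names here are illustrative):

import numpy as np

C = 299_792_458.0  # speed of light, m/s

def solve_position(sat_positions, pseudoranges, iters=10):
    """Solve for receiver position and clock bias from >= 4 pseudoranges.
    sat_positions: (N, 3) satellite ECEF coordinates, N >= 4.
    pseudoranges: (N,) measured ranges, each corrupted by the same
    receiver clock bias (expressed in metres here)."""
    est = np.zeros(4)  # x, y, z, clock_bias * C
    for _ in range(iters):
        diffs = est[:3] - sat_positions           # (N, 3)
        dists = np.linalg.norm(diffs, axis=1)     # geometric ranges
        residuals = pseudoranges - (dists + est[3])
        # Jacobian of the predicted pseudorange w.r.t. (x, y, z, bias)
        J = np.hstack([diffs / dists[:, None], np.ones((len(dists), 1))])
        est += np.linalg.lstsq(J, residuals, rcond=None)[0]
    return est[:3], est[3] / C  # position, and clock bias in seconds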

Indoor Navigation
There are many systems that can be used for indoor navigation, including the following:

Beacons
They help guide navigators to their destinations. Types of navigational beacons include radar reflectors, radio beacons, and sonic and visual signals. Visual beacons range from small, single-pile structures to large lighthouses or light stations and can be located on land or on water. Lighted beacons are called lights; unlighted beacons are called daybeacons.

Environment Ranging Sensors
Most sensors used for the purpose of map building involve some kind of distance measurement. Below are the three distinct approaches to measuring range:
• Sensors based on measuring the time of flight (TOF) of a pulse of emitted energy traveling to a reflecting object, then echoing back to a receiver.
• The phase-shift measurement (or phase-detection) ranging technique involves continuous-wave transmission, as opposed to the short pulsed outputs used in TOF systems (a worked example follows this list).
• Sensors based on frequency-modulated (FM) radar. This technique is somewhat related to the (amplitude-modulated) phase-shift measurement technique.
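As a worked example of the phase-shift technique: the round trip covers twice the target distance, so the distance follows from the measured phase difference and the modulation frequency. A small Python sketch (the frequency and phase values are illustrative):

import math

C = 299_792_458.0  # speed of light, m/s

def phase_shift_range(phase_rad, freq_hz):
    """Range from phase shift: the round trip covers 2*d, so
    2*d = (phase / 2*pi) * wavelength  ->  d = c * phase / (4*pi*f).
    The result is unambiguous only within half a modulation wavelength."""
    return C * phase_rad / (4.0 * math.pi * freq_hz)

# e.g. a 90-degree shift at a 10 MHz modulation frequency:
print(phase_shift_range(math.pi / 2.0, 10e6))  # ~3.75 m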

Time of Flight Range Sensors
The measured pulses used in TOF systems typically come from an ultrasonic, RF, or optical energy source. The only additional parameter required to calculate range is the propagation speed: the speed of sound in air for ultrasonic sensors, or the speed of light for RF and optical ones.
[Figure: (a) two-transmitter and (b) three-transmitter configurations]
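The range computation itself is straightforward: halve the measured round-trip time and multiply by the propagation speed. A small sketch (illustrative values):

SPEED_OF_SOUND = 343.0          # m/s in air at ~20 C
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_range(round_trip_s, speed):
    """Time-of-flight range: the pulse travels out and back,
    so the one-way distance is half the round-trip distance."""
    return speed * round_trip_s / 2.0

print(tof_range(0.01, SPEED_OF_SOUND))  # ultrasonic example: 1.715 m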
Inertial Navigation
This is an alternative method for enhancing dead reckoning. The principle of operation involves continuously sensing minute accelerations along each of the three directional axes and integrating over time to derive velocity and position. A gyroscopically stabilized sensor platform is used to maintain a consistent orientation of the three accelerometers throughout this process.
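A minimal single-axis sketch of that double integration (a constant timestep and bias-free readings are assumed; in practice a constant accelerometer bias b grows into a position error of 0.5*b*t^2, which is why pure inertial navigation drifts and is usually fused with other sensors):

def integrate_axis(accels, dt):
    """Integrate accelerometer samples once to velocity, twice to position.
    Any constant bias b produces a velocity error b*t and a position
    error 0.5*b*t^2, so the estimate drifts without external correction."""
    velocity, position = 0.0, 0.0
    for a in accels:
        velocity += a * dt
        position += velocity * dt
    return velocity, position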

Landmark-Based Navigation
Natural Landmarks
The main problem in natural landmark navigation is to detect and match characteristic features from sensory inputs. The sensor of choice for this task is computer vision. Most computer vision-based natural landmarks are long vertical edges, such as doors and wall junctions.

Artificial Landmarks
Detection is much easier with artificial landmarks, which are designed for optimal contrast. In addition, the exact size and shape of artificial landmarks are known in advance. Many artificial landmark positioning systems are based on computer vision; typical landmarks include black rectangles with white dots in the corners, or a sphere with horizontal and vertical calibration circles that allows three-dimensional localization from a single image.
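As a rough illustration of why detection is easier with such landmarks, a Python/OpenCV sketch (the threshold and area values are arbitrary assumptions; a real system would also verify the white corner dots against the known landmark geometry):

import cv2

def find_dark_rectangles(gray, min_area=500):
    """Find candidate black-rectangle landmarks in a grayscale image.
    High contrast and a known shape make a simple threshold-plus-contour
    pipeline sufficient, unlike natural-landmark detection."""
    _, binary = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4.x
    candidates = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) == 4:  # four corners -> rectangle candidate
            candidates.append(approx)
    return candidates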

Map-Based Navigation
Map-based positioning (also known as "map matching") is a technique in which the robot uses its sensors to create a map of its local environment. This local map is then compared with a global map previously stored in memory. If a match is found, the robot can compute its actual position and orientation in the environment.
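A minimal sketch of the matching step with occupancy grids (brute-force search over translations only; real systems search more cleverly and also recover orientation):

import numpy as np

def match_local_map(global_map, local_map):
    """Slide the local occupancy grid over the global one and return
    the (row, col) offset where the two grids agree best. That offset
    is the robot's estimated position in the global map frame."""
    gh, gw = global_map.shape
    lh, lw = local_map.shape
    best_score, best_offset = -np.inf, (0, 0)
    for r in range(gh - lh + 1):
        for c in range(gw - lw + 1):
            window = global_map[r:r + lh, c:c + lw]
            score = np.sum(window == local_map)  # count of agreeing cells
            if score > best_score:
                best_score, best_offset = score, (r, c)
    return best_offset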
Our Approach and Current Status
In order to control a MAV swarm, we use a three-pronged strategy that should allow autonomous navigation in GPS-denied environments:
1. Use inertial measurement for attitude stabilization at high update rates. - Accomplished (a filter sketch follows this list)
2. Use a fusion of range and optical flow data for simultaneous localization and mapping. - In progress
3. Use optical flow data for obstacle avoidance in dynamic environments. - Future work
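For step 1, one common lightweight choice at high update rates is a complementary filter; a minimal single-axis sketch follows (the blend factor is an assumption for illustration, not our tuned value):

import math

ALPHA = 0.98  # blend factor: trust the gyro short-term, the accel long-term

def complementary_filter(angle, gyro_rate, ax, az, dt):
    """One-axis attitude update fusing gyro and accelerometer.
    The gyro integral is smooth but drifts; the accelerometer tilt
    estimate is noisy but drift-free. Blending the two gives a stable
    attitude estimate cheap enough for high update rates."""
    gyro_angle = angle + gyro_rate * dt   # integrate angular rate
    accel_angle = math.atan2(ax, az)      # tilt from the gravity vector
    return ALPHA * gyro_angle + (1.0 - ALPHA) * accel_angle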

Ground Speed estimation using Optical Flow
The optical flow sensor provides a 2D measurement of the angular speed of the image moving through its field of view, which is perfect for a computer mouse but, by itself, not very useful for a free-flying robot. It cannot distinguish between a near object moving slowly, a far object moving quickly, and rotation of the camera itself. Optical flow must therefore be fused with other sensor data before it is meaningful in 3D space.
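A minimal sketch of that fusion for one axis (all names are ours: flow_rate is the sensor's measured angular rate, gyro_rate the vehicle's own rotation about the same axis, and height comes from a range sensor):

def ground_speed(flow_rate, gyro_rate, height):
    """Estimate translational ground speed along one axis.
    Subtracting the gyro rate removes apparent flow caused by the
    camera's own rotation; scaling by height from the range sensor
    resolves the near-slow vs. far-fast ambiguity."""
    return (flow_rate - gyro_rate) * height

# e.g. 0.5 rad/s of de-rotated flow at 2 m altitude -> 1 m/s
print(ground_speed(0.6, 0.1, 2.0))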

Future Work
• Multiple-robot communication and coordination
• Improving speed
• Improving accuracy

THANK YOU!