Quadcopter Project

Abdulla Hussein, 23 June 2013


Transcript of Quadcopter Project

Quadcopter Project
AR.Drone 2.0
Rescue and search
Real Interaction
Video Games
Obstacles Detection & Avoidance
Autonomous Take off & Landing
Swarm Motion
Air-Ground Cooperation
(Air - Air)
Brain Control
Using artificial markers (LEDs or passive markers) attached to the leader QC, and the frontal camera on the follower QC to detect them and react according to the direction of the leader's motion.
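A minimal sketch of this follower-side marker tracking, assuming the frontal camera frames arrive as OpenCV BGR images and the leader carries a bright red marker; the colour thresholds, gains, and function name are illustrative assumptions, not part of the original project.

import cv2
import numpy as np

def track_leader_marker(frame_bgr):
    """Detect a red marker on the leader QC and return a steering command.

    Returns (yaw_rate, climb_rate) in [-1, 1], or None if no marker is visible.
    Thresholds and gains are placeholder values for illustration.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two ranges.
    mask = cv2.inRange(hsv, (0, 120, 80), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 80), (180, 255, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # OpenCV 4.x returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    marker = max(contours, key=cv2.contourArea)
    if cv2.contourArea(marker) < 50:        # too small -> likely noise
        return None

    # Marker centroid relative to the image centre gives the steering error.
    m = cv2.moments(marker)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    h, w = frame_bgr.shape[:2]
    err_x = (cx - w / 2) / (w / 2)          # -1 .. 1, left/right
    err_y = (cy - h / 2) / (h / 2)          # -1 .. 1, up/down

    # Simple proportional commands: turn toward the marker, keep it centred.
    yaw_rate = 0.5 * err_x
    climb_rate = -0.5 * err_y
    return yaw_rate, climb_rate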
Using decentralized swarm-intelligence approaches (altruism algorithm, ant colony optimization, artificial bee colony algorithm, artificial immune systems).
Use the quadcopter to improve the visual navigation of mobile robots by providing a large and complete map, guiding them to avoid obstacles, and choosing the best path; this cooperation is suited to different tasks.
Integration between the autonomous navigation approaches and the swarm intelligence algorithms to locate and communicate with the rescue team, using the downward camera to detect human beings.
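As an illustration of the downward-camera detection step, a short sketch using OpenCV's stock HOG pedestrian detector as a stand-in for whatever human-detection algorithm is finally chosen; the video source and what is done with the detections are assumptions.

import cv2

# OpenCV ships a HOG + linear-SVM pedestrian detector; it is only a
# stand-in here for the project's eventual human-detection algorithm.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_humans(frame_bgr):
    """Return bounding boxes (x, y, w, h) of people seen by the downward camera."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    boxes, _weights = hog.detectMultiScale(gray, winStride=(8, 8), scale=1.05)
    return boxes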
Presenting autonomous take-off and landing phases for the quadcopter during critical situations (e.g. losing the wireless connection with the Ground Station).
The Concept:
Using the downward camera to detect the landing markers (fixed or moving), and the on-board IMU to determine the current altitude and velocity of the vehicle, so as to guarantee the robustness and stability of these phases.
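A rough sketch of how the two inputs could be blended during descent, assuming the marker offset comes from the downward camera and the altitude/vertical speed from the on-board IMU; the gains, thresholds, and command convention are hypothetical placeholders.

def landing_command(marker_offset, altitude_m, vertical_speed_ms):
    """Blend camera and IMU data into a descent command.

    marker_offset      -- (dx, dy) of the landing marker from the image centre,
                          normalised to [-1, 1]; None if the marker is not seen.
    altitude_m         -- current altitude from the on-board sensors.
    vertical_speed_ms  -- current vertical speed from the IMU.

    Returns (roll_cmd, pitch_cmd, climb_cmd) in normalised [-1, 1] units.
    All gains are illustrative placeholders.
    """
    if marker_offset is None:
        # Marker lost: hover and wait instead of descending blindly.
        return 0.0, 0.0, 0.0

    dx, dy = marker_offset
    roll_cmd = 0.3 * dx           # slide sideways over the marker
    pitch_cmd = 0.3 * dy          # move forward/backward over the marker

    # Descend faster when high, slow down near the ground; damp with the
    # measured vertical speed so the touchdown stays gentle.
    target_speed = -0.2 if altitude_m > 1.0 else -0.05
    climb_cmd = 0.8 * (target_speed - vertical_speed_ms)
    climb_cmd = max(-1.0, min(1.0, climb_cmd))
    return roll_cmd, pitch_cmd, climb_cmd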
While most recent research has focused on improving 3D games, a new concept of video games has appeared, called Real Interaction or Alive Games.
Several video games have been implemented for the AR.Drone quadcopter and are compatible with Android/Apple devices, such as AR.Pursuit and Target Hunter; these use the Wi-Fi connection and the forward camera and are based on detecting colorful markers.
In our project, we can implement human detection/recognition and/or moving-target detection algorithms without using special markers.
This will give a more real interaction between the players and the aerial robot.
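A minimal sketch of markerless moving-target detection, assuming consecutive forward-camera frames from a hovering drone; OpenCV's MOG2 background subtractor stands in for whatever detector the game logic would actually use.

import cv2

# Background subtraction flags anything that moves in front of the camera,
# so the "target" needs no coloured marker at all.
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

def moving_targets(frame_bgr, min_area=500):
    """Return bounding boxes of moving objects in the forward-camera frame."""
    fg = subtractor.apply(frame_bgr)
    # Drop shadow pixels (value 127) and smooth out speckle noise.
    _, fg = cv2.threshold(fg, 200, 255, cv2.THRESH_BINARY)
    fg = cv2.medianBlur(fg, 5)
    contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]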
Previous works proposed autonomous indoor and outdoor navigation systems using infrared cameras, stereo cameras, or RGB-D sensors.
Other research used ultrasonic and laser sensing to determine the distance from obstacles, and for outdoor environments many systems depend only on GPS.
The idea is to present a navigation system for both indoor/outdoor and GPS-denied environments using visual odometry approaches (optical flow motion, region-based obstacle detection, and a monocular SLAM system) based on the forward RGB camera to segment the regions, detect the borders, and determine the direction of motion,
together with the on-board IMU and dead-reckoning approaches for position and velocity estimation, without using additional sensors or guiding markers.
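Two small sketches of the ingredients named above, under assumed frame sources, sampling rates, and gains: dense optical flow on the forward camera to steer toward the side with less apparent motion (likely fewer near obstacles), and IMU dead reckoning that integrates acceleration into velocity and position.

import cv2
import numpy as np

def steer_from_optical_flow(prev_gray, curr_gray):
    """Compare flow magnitude on the left/right halves of the image and
    steer toward the side with less apparent motion (likely fewer near obstacles)."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2)
    half = mag.shape[1] // 2
    left, right = mag[:, :half].mean(), mag[:, half:].mean()
    return "left" if left < right else "right"

class DeadReckoning:
    """Integrate world-frame accelerations from the IMU into velocity and position.
    Drift grows over time; in practice this is fused with the visual odometry."""
    def __init__(self):
        self.velocity = np.zeros(3)
        self.position = np.zeros(3)

    def update(self, accel_world, dt):
        self.velocity += accel_world * dt
        self.position += self.velocity * dt
        return self.position.copy()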