
FYP pitch {AR.Drone Zone}

An introduction to what our FYP is all about...

Habeeb Mohamed

on 10 December 2012


Transcript of FYP pitch {AR.Drone Zone}

In this FYP, three students pilot the AR.Drone using various controllers for a mission.

Synopsis

In this FYP, a swarm of AR.Drones will perform a series of formations and take various flight paths, avoiding obstacles, to achieve its main mission: rescuing "Big John".

Big John has been injured and captured, and General Prahlad has issued a Level 8 emergency operation to rescue him.

The mission proceeds in sequence from Assembly Point A and ends at HQ Point C, where the anxious family and friends of Big John await.

Let the rescue begin...

At this point, Big John will be carried back to safety, where company awaits him.

Happiness and joy fill the air...

As a fitting sendoff, the drones perform a musical performance, orchestrated by Habeeb.

Assembly Point (Point A), Commander: MJ

All the drones assemble here to receive their flight instructions from the iPad. After gathering in formation, the drones will start the journey.

There will be a master drone controlled by the Commander; all the other drones will follow the master in formation.

Along the way from A to B, drones might encounter obstacles. This will not stop the drones from continuing the mission. They will change their formation to avoid the obstacles.

After avoiding all the obstacles, the drones will finally reach Point B.

KinectiZer (Point B) and Finale (Point C), Commanders: Habeeb and Jason

From this point, the Commander will pilot the master AR.Drone using body gestures, which are picked up by a Microsoft Kinect.

The slave drones will constantly be shadowing the master drone on their way to the rescue.

Upon reaching Big John, the drones will attach themselves to him and collectively lift him off to safety at Point C.

Goal of Individual Component

To create a natural and intuitive method of navigating and controlling an Unmanned Aerial Vehicle using a Microsoft Kinect.

Overall integration

Successful simulation of the SAR mission

Gesture Control

Movements of the hands within 3D space, with respect to the position of the user's torso, head and shoulders, are captured by the Kinect.

The captured data is processed by a PC to translate the movements into the Pitch, Roll, Yaw Speed and Vertical Speed of the AR.Drone.
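A minimal sketch of this translation step, assuming normalised [-1, 1] drone commands, a small dead zone, and illustrative axis and range constants (none of these values come from the project itself):

```java
// Sketch: hand displacement relative to the torso is turned into normalised
// [-1, 1] control values for the drone. The 0.10 m dead zone and 0.40 m
// control range are assumptions for illustration only.
public class GestureMapper {
    static final double DEAD_ZONE = 0.10;  // metres of hand travel ignored
    static final double RANGE = 0.40;      // metres mapped onto full deflection

    // Clamp a hand offset (metres) into a [-1, 1] command, with a dead zone.
    static double toCommand(double offsetMetres) {
        if (Math.abs(offsetMetres) < DEAD_ZONE) return 0.0;
        double v = offsetMetres / RANGE;
        return Math.max(-1.0, Math.min(1.0, v));
    }

    // Offsets are relative to the user's torso, following the mapping in the
    // transcript. Returns {roll, pitch, yawSpeed, verticalSpeed}.
    static double[] handsToControls(double rightX, double rightY, double rightZ,
                                    double leftX) {
        double roll = toCommand(rightX);          // right hand X -> roll
        double verticalSpeed = toCommand(rightY); // right hand Y -> vertical speed
        double pitch = toCommand(rightZ);         // right hand Z -> pitch
        double yawSpeed = toCommand(leftX);       // left hand X -> yaw speed
        return new double[] { roll, pitch, yawSpeed, verticalSpeed };
    }
}
```

The dead zone keeps small, unintentional hand jitter from moving the drone; the detailed mapping itself is listed later in the transcript.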

Most applications make use of full-body gestures; however, these are non-intuitive and cannot alter lower-level settings.

Current Developments

Image Generator
Depth Generator
Filtered Depth Generator
Skeleton Tracker
Pitch, Roll, Yaw and Vertical Speed Computation
Skeleton Smoothing
Fuzzy Inference System for Gesture Control
Successful Test Flights

Future Developments

Improvements on the Fuzzy Inference System
Multiple Drone Control: Simultaneous/Individual
Multiple users controlling multiple drones on the same Kinect
Vanishing Point Navigation

The AR.Drone and Communication Mechanism

In this project, the AR.Drones will communicate with the end users, a.k.a. clients, via a WAP (Wireless Access Point).

So why this method of communication? Firstly, the AR.Drones are manufactured by default to broadcast a network.

That means they are able to instantiate an ad-hoc Wi-Fi network and broadcast it.

The DHCP server on the AR.Drone issues IP addresses to the clients that connect to the drone. Once pairing is done, only one client is able to control the drone, although the rest are still issued IP addresses.

The current AR.Drone communication range of 50 metres may not be sufficient when deployed in fields where the area of coverage is large. With the help of a WAP, the range can be extended three- to four-fold, to 150-200 metres.

Using a WAP, multiple controllers can be connected through the same network in infrastructure mode.

Although ad-hoc mode serves specific purposes without requiring infrastructure, when multiple controllers are involved it is wiser to use a WAP.

Moreover, it provides a simpler implementation than using an ad-hoc routing protocol as the communication medium, which is required when more than two devices are connected in ad-hoc mode.

So how are "controllers" and AR.Drones any different from each other? The controllers have the intelligence, and they control the drones via a custom application (e.g. FreeFlight 2.0).
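As a sketch of what such a custom application does at the lowest level: per the AR.Drone SDK, controllers drive the drone by sending ASCII AT commands over UDP to port 5556, each carrying an increasing sequence number. The class below is illustrative; the command arguments are not taken from the project's own code.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

// Minimal controller-side sender: formats AT commands with a running
// sequence number and fires them at the drone over UDP.
public class AtCommandSender {
    private final DatagramSocket socket;
    private final InetAddress droneAddress;
    private final int port;
    private int seq = 1; // every AT command carries an increasing sequence number

    public AtCommandSender(String droneIp, int port) {
        try {
            this.socket = new DatagramSocket();
            this.droneAddress = InetAddress.getByName(droneIp);
            this.port = port;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    // Formats e.g. "AT*REF=1,290718208\r" and sends it; returns the string sent.
    public String send(String command, String args) {
        String at = String.format("%s=%d,%s\r", command, seq++, args);
        try {
            byte[] data = at.getBytes(StandardCharsets.US_ASCII);
            socket.send(new DatagramPacket(data, data.length, droneAddress, port));
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
        return at;
    }
}
```

On the drone's own network the drone itself is reachable at 192.168.1.1; behind a WAP, the controller would instead use whatever address the drone obtained.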

The drones cannot communicate among themselves to make decisions such as collision avoidance or path navigation without a controller, which monitors the drone status and responds to dynamic changes.

The server and its role

In this project, two aspects of the communication framework have been decided upon:

The wireless interface: using the 802.11b/g standard, the controllers and drones will intercommunicate with the help of the WAP.

The 3G interface: the controllers (i.e. iPad, Kinect-based PC, etc.) will use this interface to interact with the MySQL database, which stores critical flight information such as the commands issued and the current coordinates.

This interaction with the database is handled by the server, which receives update and fetch queries from the controllers and services them.

The server plays an important role in enabling and disabling controllers from sending information to the drone.

This is done via an application that runs on the server which responds to requests from individual controllers.
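A hypothetical sketch of this gating logic (the text protocol, command names and controller IDs are invented for illustration; the transcript does not show the draft application's actual message format):

```java
import java.util.HashMap;
import java.util.Map;

// The server keeps a table of controllers and answers ENABLE/DISABLE/STATUS
// requests; a controller consults maySend() before talking to the drone.
public class ControllerGate {
    private final Map<String, Boolean> enabled = new HashMap<>();

    public String handleRequest(String request) {
        String[] parts = request.trim().split("\\s+");
        if (parts.length < 2) return "ERR malformed request";
        String cmd = parts[0], id = parts[1];
        switch (cmd) {
            case "ENABLE":  enabled.put(id, true);  return "OK " + id + " enabled";
            case "DISABLE": enabled.put(id, false); return "OK " + id + " disabled";
            case "STATUS":  return id + (enabled.getOrDefault(id, false)
                                         ? " enabled" : " disabled");
            default:        return "ERR unknown command";
        }
    }

    // Controllers check this flag before sending commands to the drone.
    public boolean maySend(String id) {
        return enabled.getOrDefault(id, false);
    }
}
```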

A sample program run of the draft version is as follows.

How will the later version of the GUI look?

Point C: How will the drones play the music?

Big John will be transported to the safe zone at Point C.

After safely landing him on the ground, the drones will assemble outside the conservatory.

A Windows-based application, to be implemented in Visual Basic, will control the drones to play music.

QR codes have been placed on the vertical beam structures within the music school.

Once a drone reaches the scanning point with the front camera facing the QR code, the image captured from the video will be processed.

The result contains the code of the appropriate file located in the MySQL database.

The Windows-based application will send a request to the server via a 3G connection to retrieve that file.

The file will then be played in a media player on the PC, whose audio output line is connected to a stereo system.

Individual goal: to control the formation of AR.Drones in avoiding obstacles using an iPad.

Formation Control Approach

Leader-follower approach:
Followers can only see the leader in one direction
Blind in the other directions
If a follower loses track of the leader, it will stop and collide with the other drones, which are still following

Use the existing vertical camera to track the location of each slave:
Resolution too poor
At higher altitudes the camera loses track of the tag

Testing for Master-Slave Approach: Multiple Drone Control

Create two different sockets to connect to the drones.
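A sketch of the one-socket-per-drone idea, assuming the usual AR.Drone AT-command port 5556 and an illustrative command string; each drone gets its own socket and thread so the sends do not wait on each other:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

// Sends the same AT command to every drone concurrently rather than
// sequentially, so no drone receives its command late.
public class ConcurrentCommander {
    // Returns how many sends completed without error.
    static int broadcast(List<String> droneIps, String atCommand) {
        AtomicInteger ok = new AtomicInteger();
        Thread[] threads = new Thread[droneIps.size()];
        for (int i = 0; i < threads.length; i++) {
            String ip = droneIps.get(i);
            threads[i] = new Thread(() -> {
                try (DatagramSocket socket = new DatagramSocket()) {
                    byte[] data = atCommand.getBytes(StandardCharsets.US_ASCII);
                    socket.send(new DatagramPacket(data, data.length,
                            InetAddress.getByName(ip), 5556));
                    ok.incrementAndGet();
                } catch (Exception e) {
                    // a failed send for one drone must not block the others
                }
            });
            threads[i].start(); // all sends start together
        }
        for (Thread t : threads) {
            try { t.join(); } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
        return ok.get();
    }
}
```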
Initial method:
Send out the control commands sequentially
Delay before the second drone receives its command

Improved method:
Send out the control commands concurrently
No more delay in the system

Switching control was implemented for the formation control approach: only one drone is controlled at a time, while a hovering command is sent to the rest.

Vision Detect: Future Development

There is still overshoot in the tracking system.

This is due to the inclination of the drones as they move: the vertical camera tilts as the drone inclines.

Individual goal: Gesture Control

The method of controlling the AR.Drone is as follows:

Right hand movement in the X-axis translates to Roll angle of the AR.Drone

Right hand movement in the Y-axis translates to Vertical speed of the AR.Drone

Right hand movement in the Z-axis translates to Pitch angle of the AR.Drone

Left hand movement in the X-axis translates to Yaw speed of the AR.Drone

(Figure: region 1 is the dead zone; region 2 is the control zone.)

Default Method of Control

The distance the hand moves above the origin determines the vertical speed of the drone; it is not a direct translation into the distance the drone should move.

When the user tries to switch the drone's movement from upwards to downwards, he would intuitively move the hand downwards. As seen from the figure, however, the drone will still be moving upwards, only at a slower rate. This causes confusion and a lack of proper control and intuitiveness for the user.

Control Using Fuzzy Inference System

Judging from the distance of the hand from the origin in all three axes, the magnitude of the hand's speed and its direction of movement, the fuzzy inference system can decide the best course of movement for the drone.

Input linguistic variables: 'Distance', 'Speed'. Output linguistic variable: 'Drone Movement'.

New Individual Goal

Encountered difficulties in changing the code to control multiple drones.

Switched to the Java programming language.

Reasons for choosing Java:
Cross-platform
Can be ported over to Android OS
Existing Javadrone API

New Formation Control Approach: Master-Slave

Modify the existing camera:
Higher resolution
Will not affect the stability
Loses the front view from the horizontal camera

Proposed solution, buying an extra IP camera:
Higher resolution
Cannot use the existing detection algorithm
Adds weight to the drones and affects the stability

Vision Detection

Tag used for detection.

Use the tag-detection API provided by the drone.

The returned result is the location of the center of the tag.

Vision Detection: hovering at a specific location

Writing the code for moving the drone to hover at a specific location:
Assume the desired location is for the tag to be at the center of the image view frame.
When the detected tag is not at the center, the application moves the drone towards the center and hovers once it reaches there.
This simulates the concept of using the master drone to detect the location of each slave drone and move that slave to its desired location.

Separate the image view frame into 9 regions:

If the tag is detected inside the safe region, it is considered to be at the center of the image, and the application will not move the drone.

If the tag falls in any of the other 8 regions, the application moves the drone at constant speed in the direction specified in the table below.

Vision Detect

The previous approach is not practical:
The drone should move more slowly as it gets closer to the center.

Convert the tag location from the Cartesian coordinate system to the polar coordinate system, and compute the next movement from:

Next Movement (Y Direction/Pitch) = sin(theta) * radial distance
Next Movement (X Direction/Roll) = cos(theta) * radial distance

Vision Detect

The speed will differ depending on the radial distance from the desired position.
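The Cartesian-to-polar computation above can be sketched as follows (the frame size and units are illustrative; the returned values would still be scaled into actual drone speeds):

```java
// Converts the tag's offset from the image center into (theta, r), then
// computes the next pitch/roll movement along that angle, proportional to
// the radial distance, exactly as in the two formulas above.
public class PolarTracker {
    // Returns {rollMove, pitchMove} for a tag detected at (tagX, tagY)
    // in an image of the given width and height.
    static double[] nextMovement(double tagX, double tagY,
                                 double width, double height) {
        double dx = tagX - width / 2.0;   // offset from image center, X
        double dy = tagY - height / 2.0;  // offset from image center, Y
        double r = Math.hypot(dx, dy);        // radial distance
        double theta = Math.atan2(dy, dx);    // angle of the offset
        double rollMove  = Math.cos(theta) * r;  // X direction / roll
        double pitchMove = Math.sin(theta) * r;  // Y direction / pitch
        return new double[] { rollMove, pitchMove };
    }
}
```

Note that cos(theta) * r and sin(theta) * r simply recover the dx/dy offsets, so the commanded movement naturally shrinks as the tag approaches the center, which is the property the next slide relies on.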

If the drone is close to the desired location, it should slow down to avoid too much overshoot.

Vision Detect: further improving movement to the desired location
Error parameter: radial distance
Apply a PD control loop to the tracking system to drive the error parameter to zero.
P gain: accelerates the return to zero radial distance
D gain: reduces the magnitude of the overshoot
Code (PD update, run once per tracking frame):

double radiusError = radialDistance;  // calculated from the Cartesian coordinates
double radiusDerivative = radiusError - previousError;
double totalRadiusMove = Kp * radiusError + Kd * radiusDerivative;
previousError = radiusError;

Using the on-board detection algorithm:

Little control over the type of tag to detect.

Not stable when detecting more than one tag.

More than one type of tag is definitely needed, since different tags are required for controlling multiple slave drones.

Import OpenCV into our application for image processing.