Transcript of EMYS head

EMYS Mk II
Open project
Robot software is distributed under the GNU GPL license
All mechanical drawings (2D and 3D) as well as a full list of commercially available parts are publicly accessible

Open source
Open hardware
electronics and wiring
aluminum support frame
commercially available parts
3D printed parts
Materials

Sensors
touch sensors made out of copper tape
up to 5 touch points (upper/lower disc, sides, nose)

modified Microsoft LifeCam Studio camera
30fps @ 720p, 15fps @ 1080p
viewing angle up to 75°
built-in microphone
optional Kinect sensor (RGB-D)
built-in 4-microphone array
allows detection of a person, silhouette, gestures, expressions, emotions, speech, sound source, etc.
Where can you find EMYS?
3x USA (Chicago, Los Angeles)
3x Poland (Wrocław)
2x Portugal (Lisbon)
3x Scotland (Edinburgh, Glasgow)
EMYS at the Museum of Science and Industry, Chicago
EMYS at INESC-ID, Lisbon
On-board controller
Master PC controller
3rd/4th Generation Intel Quad Core i7 Platform
8GB Memory (DDR3 1333/1600MHz)
min. 120GB SSD disk
embedded graphics (Intel® HD Graphics)
Gigabit LAN controller (Intel 82579LM)
Audio Line-Out and Mic-In
Wireless LAN (optional)
Specification
Advantech DS-060
Mac Mini
Gigabyte BRIX Pro
Currently, there are 11 EMYS heads operating in different places:
project created in cooperation with ASP designers
more DoF than Roman (mouth, eyelids/brows)
two cameras (Logitech Sphere) inside eyes
all utterances prerecorded by a professional actor
all parts printed using 3D prototyping technology (SLS, FDM)
neck joints driven by Dynamixel motors

Samuel (2010)
Roman (2009)
History
Motivation
Introduction
Robot documentation is available at:
http://www.doc.flash-robotics.com
NXP Kinetis ARM Cortex-M4 CPU
control of 4 RC servos (with current measurement)
control of 2 DC motors (with current and position measurement)
5 touch sensor inputs
unified (Dynamixel) communication protocol for motors and touch sensors
debug port with shell command line
MQX RTOS
Mechanics
Software
Application programming interface
The designed control system provides access to the robot hardware and competencies in a unified manner, using a tree structure called robot.
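A minimal urbiscript sketch of this tree idea, using only slot names that also appear in the API examples later in this transcript:

robot.identity.name;                              // identity branch of the tree
robot.body.neck.head.Smile(3s);                   // body branch: facial expression for 3 sec.
robot.network.weather.condition.temperature;      // network branch: weather competency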
11 DoF
vision system (Logitech Pro 9000 webcam)
can detect a person, play prerecorded utterances, track a face, express emotions
all joints driven by RC servomotors
Kinematics
Drive system
Neck pitch
Upper disc drive (1 DoF)
Head pitch and lower disc drive (2 DoF)
Neck drive (2 DoF)
Eye drive (3 DoF)
Linear drive SLN-27
EMYS Mk II Robot
mgr inż. Michał Dziergwa
dr inż. Jan Kędzierski
Wrocław, 18 April 2016
FLASH Robotics Sp. z o.o.
www.flash-robotics.com
History
Software modules
UAria
UKNearest
UKinect
UColorDetector
UDynamixel
UCamera
UPlayer
UMoveDetector
UObjectDetector
UFacetDetector
UImageTool
USpeech
URecog
URecord
UEigenfaces
UImageDisplay
USerial
UJoystick
OpenNI
UKinect
Kinect SDK
UFacebook
UMail
UBrowser
UGCalendar
robot.identity.name;                                   // get robot name
robot.body.arm.hand.Open(1s);                          // open hand for 1 sec.
robot.body.neck.head.Smile(3s);                        // smile for 3 sec.
robot.body.x.speed = 0.5;                              // set longitudinal platform speed to 0.5 m/s
robot.body.laser.getClosest(-20, 20);                  // get the angle to the nearest object

robot.audio.speech.recognition.result;                 // get speech recognition result phrase
robot.audio.musicPlayer.Play("song.mp3");              // play audio file

robot.video.camera.image;                              // get image from camera
robot.video.humanDetector.position;                    // get human xyz position
robot.video.rightHandColorDetector.value;              // get the color of the object held in the hand

robot.ml.colorLearning.LearnFromRightHand("red");      // teach the robot colors

robot.network.weather.condition.temperature;           // get the temperature from the weather forecast
robot.network.mail.Check();                            // check if there are new emails
robot.network.facebook.Post("me", "Hello world!");     // post a message on Facebook
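A short usage sketch combining the calls above into one behavior; it assumes the same robot tree is loaded and uses urbiscript's '&' operator to run two branches in parallel:

robot.body.neck.head.Smile(3s) &                       // keep smiling while the network branch works
{
  robot.network.mail.Check();                          // check for new emails
  robot.network.facebook.Post("me", "Hello world!");   // post a status update
};
echo(robot.network.weather.condition.temperature);     // then report the outside temperature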
Robot competencies
video
ml (machine learning)
User recognition uses the face detector implemented in the UObjectDetector module; the UEigenfaces module is responsible for recognizing the detected user.
head

Movement competencies (look around, move up/down/left/right, emotion expression, ...) use the Facial Action Coding System (FACS). FACS is based on Action Units (AUs).
Speech imitation is realized by visualizing spoken phonemes. Every phoneme has its own viseme representation.
Phonemes
Description
fear
sad
angry
joy
disgust
surprise
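A hedged urbiscript sketch of the viseme mechanism described above; the phoneme-to-viseme indexes and the SetViseme slot are illustrative assumptions, not the documented API:

var visemes = ["a" => 1, "e" => 2, "o" => 3, "m" => 4];   // illustrative phoneme -> viseme table
function sayWithLips(phonemeList)
{
  for (var p : phonemeList)
  {
    robot.body.neck.head.SetViseme(visemes[p]);           // hypothetical slot showing one viseme
    sleep(100ms);                                          // hold it briefly before the next phoneme
  };
};
sayWithLips(["m", "a", "m", "o"]);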
Color detection utilizes UKinect and UImageTool modules.
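A small sketch chaining color teaching and detection with the slots shown in the API examples; the string compared in the event condition is an assumption about what the detector publishes:

robot.ml.colorLearning.LearnFromRightHand("red");           // teach phase (documented call)
at (robot.video.rightHandColorDetector.value == "red")      // assumed string value
  echo("The object in my hand is red.");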
Experiments
An experiment was conducted to examine both children's engagement in the interaction with EMYS and whether the children were able to correctly decode the emotions the robot intended to express.
The experiment was conducted in a primary school in a small village near Wrocław and involved 48 schoolchildren aged 8 to 12 years.
The robotic head was programmed to operate autonomously and to provide two game scenarios; each subject went through both scenarios. With its vision system, EMYS was able to recognize the color of the toy and to react accordingly, i.e. by praising or reprimanding the child.
Experiment with EMYS
The experiment was conducted at the Wrocław Main Railway Station and involved 113 people aged 15 to 82 years.
The experiment was divided into two parts. The first was a session with eye-tracking glasses. During about 4 minutes of interaction, FLASH greeted the respondent, introduced himself, and then invited them to a short game. Afterwards, test subjects were invited for an interview.
The aim of the experiment was to determine how the robot's appearance and the intensity of his emotional reactions influence human perception. How people maintain eye contact was also studied.
Experiments
Experiment with FLASH
The next step was to determine the set of Areas of Interest (AOIs), such as head, torso, arms, etc.
Experiments
Experiment with FLASH
The head of the robot draws attention the most. A similar effect was observed by researchers studying how we perceive human faces.
Most people glanced at the hands, which was to be expected, as the robot gesticulated in a lively way with his fingers in the early parts of the experiment.
Results
The robot was definitely perceived as emotional, calm and collected, and the majority of respondents, children and adults alike, declared their wish to meet the robot again.

Despite the robot often being schematic and not showing advanced skills, respondents stated that he is intelligent and smart.

Even the youngest participants did not have any difficulty in suggesting examples of robot applications. Most of them mentioned activities that kids do not like, such as homework, cleaning the room and walking the dog. However, adults often saw the robot in the role of domestic help. Women especially often saw him as a chef.

Results of the experiments with the two robots definitely confirm that the subjects were convinced that FLASH/EMYS possesses more skills than he has in reality.

In general, FLASH/EMYS aroused positive emotions, and participants of the experiment felt safe around the robot.

The quality of recognition of the emotions expressed by the robot was quite high. Participants assigned many human features (appearance and character) to the robot.
Experiments
Another study involved long-term cohabitation with a social robot.
Experiments
Experiment with EMYS
During the experiment, three people with different attachment styles spent ten days each with an EMYS robot, which was operating fully autonomously.
EMYS became an affective home assistant. Every day he served such functions as waking up the user, checking the weather forecast, playing the radio, checking the news, posting on and reading from Facebook, checking the personal calendar, sending and receiving e-mails, etc.
Experiments
Experiment with EMYS
The conducted experiment confirmed that endowing robots with functions more often associated with mobile devices adds a new, "human" quality to human-machine communication.
Three-layer control architecture
URBI-based system
Lowest layer
provides the necessary hardware abstraction and integrates low-level motion controllers, sensor systems and algorithms implemented as external software.
Middle layer
is responsible for the functions of the robot and the implementation of his competencies. It defines a set of tasks the robot will be able to perform.
Highest layer
may incorporate a dedicated decision system.
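A hedged urbiscript illustration of the layering; greetVisitor and the event condition are illustrative only, while the robot.* slots come from the API examples in this transcript:

// middle layer: a competency composed from lowest-layer slots
function greetVisitor()
{
  robot.body.neck.head.Smile(3s) &                          // expression and audio in parallel
  robot.audio.musicPlayer.Play("song.mp3");
};
// highest layer: a trivial decision rule that triggers the competency
at (robot.video.humanDetector.position != nil)              // assumed: position is nil when nobody is seen
  greetVisitor();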
URBI is a platform that provides a robot programming language, urbiscript. It is parallel-oriented and manages events and components (acting as an orchestrator).
Urbi - a compact operating system for robots
Urbiscript:
dynamic script language
embedded events and parallelism
expandable with C++/Java modules
intuitive syntax with similarities to C++
URBI-based system
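A self-contained urbiscript sketch of the two headline features, events and parallelism; it runs without any robot modules loaded:

var level = 0;
at (level > 3)                          // event handler: fires whenever the condition becomes true
  echo("level went high");
{ sleep(1s); level = 5; } &             // '&' launches both sides in parallel
echo("printed immediately from the other branch");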
Appraisal, emotion emulation and more.
Robot mind
EMYS' decision system may incorporate a custom finite state machine (FSM) or a comprehensive program simulating various functionalities of a human mind.
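A minimal sketch of the FSM option, reusing the humanDetector slot from the API examples; the state names and polling period are illustrative:

var state = "idle";
every (500ms)                                               // illustrative FSM step, run in the background (note the trailing comma)
{
  if (state == "idle" && robot.video.humanDetector.position != nil)
  { robot.body.neck.head.Smile(3s); state = "engaged"; }    // a person appeared: greet and switch state
  else if (state == "engaged" && robot.video.humanDetector.position == nil)
  { state = "idle"; };                                      // person left: go back to idle
},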
Action Unit / Movement / Joint mapping table
Phoneme / Viseme / Visualization mapping table