Transcript of EMYS head
Robot software is distributed under the GNU GPL license
All mechanical drawings (2D and 3D) as well as a full list of commercially available parts are publicly accessible
electronics and wiring
aluminum support frame
commercially available parts
3D printed parts
touch sensors made out of copper tape
up to 5 touch points (upper/lower disc, sides, nose)
modified camera Microsoft LifeCam Studio
30fps @ 720p, 15fps @ 1080p
viewing angle up to 75°
optional Kinect sensor (RGB-D)
built-in 4-microphone array
detects people, silhouettes, gestures, expressions, emotions, speech, sound sources, etc.
Where can you find EMYS?
3x USA (Chicago, Los Angeles)
3x Poland (Wrocław)
2x Portugal (Lisboa)
3x Scotland (Edinburgh, Glasgow)
EMYS at the Museum of Science and Industry, Chicago
EMYS in the INESC-ID, Lisboa
Master PC controller
3rd/4th Generation Intel Quad Core i7 Platform
8GB Memory (DDR3 1333/1600MHz)
min. 120GB SSD disk
embedded graphics (Intel® HD Graphics)
Gigabit LAN controller (Intel 82579LM)
Audio Line-Out and Mic-In
Wireless LAN (optional)
Gigabyte BRIX Pro
Currently, there are 11 EMYS heads operating in different places.
project created in cooperation with ASP designers
more DoF than Roman (mouth, eyelids/brows)
two cameras (Logitech Sphere) inside eyes
all utterances prerecorded with a professional actor
all parts printed using 3D prototyping technology (SLS, FDM)
neck joints driven by Dynamixel
Robot documentation is available at:
NXP Kinetis Arm Cortex M4 CPU
control of 4 RC servos (with current measurement)
control of 2 DC motors (with current and position measurement)
5 touch sensor inputs
unified (Dynamixel) communication protocol for motors and touch sensors
debug port with shell command line
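The unified Dynamixel-style protocol mentioned above frames every command the same way for motors and touch sensors. As a rough illustration (not the robot's actual firmware), the sketch below builds a Dynamixel Protocol 1.0 instruction packet and its checksum; the servo ID and register address in the example are arbitrary.

```python
def dynamixel_packet(servo_id: int, instruction: int, params: list[int]) -> bytes:
    """Build a Dynamixel Protocol 1.0 instruction packet.

    Layout: 0xFF 0xFF ID LENGTH INSTRUCTION PARAM... CHECKSUM
    LENGTH   = number of parameters + 2
    CHECKSUM = ~(ID + LENGTH + INSTRUCTION + sum(params)) & 0xFF
    """
    length = len(params) + 2
    body = [servo_id, length, instruction] + params
    checksum = (~sum(body)) & 0xFF
    return bytes([0xFF, 0xFF] + body + [checksum])

# Example: WRITE_DATA (0x03) of goal position 0x0200 to register 0x1E of servo 1
pkt = dynamixel_packet(1, 0x03, [0x1E, 0x00, 0x02])
```

Because touch sensors on the bus speak the same framing, one bus driver can poll motors and sensors alike.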
Application programming interface
The designed control system enables accessing the robot hardware and competencies in a unified manner, using a tree structure called robot.
vision system (Logitech Pro 9000 webcam)
can detect a person, play prerecorded utterances, track a face, express emotions
all joints driven by RC servomotors
Upper disc drive (1 DoF)
Head pitch and lower disc drive (2 DoF)
Neck drive (2 DoF)
Eye drive (3 DoF)
Linear drive SLN-27
EMYS Mk II Robot
mgr inż. Michał Dziergwa
dr inż. Jan Kędzierski
Wrocław, 18 April 2016
FLASH Robotics Sp. z o.o.
// get robot name
// open hand for 1 sec.
// smile for 3 sec.
// set longitudinal platform speed to 0.5 m/s
// get the angle with the nearest object
// get speech recognition result phrase
// play audio file
// get image from camera
// get human xyz position
// get color of the object held in the hand
// teach robot colors
// get the temperature from weather forecast
// check if there are new emails
// post a message on Facebook
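The comments above come from one-line examples of the robot API, written in urbiscript against the unified robot tree. A hypothetical sketch of what such calls could look like; all module and function names below are illustrative, not the documented API:

```
// Illustrative only -- names are not the documented robot API.
echo(robot.name);                            // get robot name
robot.body.arm.hand.Open(1s),                // open hand for 1 sec. (runs in background)
robot.head.ExpressEmotion("joy", 3s) &       // smile for 3 sec., in parallel with...
robot.body.Move(0.5);                        // ...setting longitudinal speed to 0.5 m/s
at (robot.audio.recognized?(var phrase))     // react to a speech recognition result
  echo("heard: " + phrase);
```

The trailing comma, `&` operator, and `at` construct are urbiscript's mechanisms for background execution, parallel composition, and event handling.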
User recognition relies on the face detector implemented in the UObjectDetector module, while the UEigenfaces module performs the recognition itself.
Movement competencies (look around, move up/down/left/right, emotion expression, ...) use the Facial Action Coding System (FACS), which describes expressions in terms of Action Units (AUs).
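Since FACS encodes an expression as a set of Action Units, driving an emotion reduces to looking up its AU combination. A minimal sketch following common EMFACS-style conventions; the exact AU sets EMYS uses are an assumption:

```python
# Typical Action Unit combinations for basic emotions (EMFACS-style).
# The robot's own mapping may differ; these sets are illustrative.
EMOTION_TO_AUS = {
    "happiness": [6, 12],            # cheek raiser + lip corner puller
    "sadness":   [1, 4, 15],         # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  [1, 2, 5, 26],      # brow raisers + upper lid raiser + jaw drop
    "anger":     [4, 5, 7, 23],      # brow lowerer + lid raiser/tightener + lip tightener
    "fear":      [1, 2, 4, 5, 20, 26],
    "disgust":   [9, 15],            # nose wrinkler + lip corner depressor
}

def action_units(emotion: str) -> list[int]:
    """Return the Action Units to actuate for a named emotion."""
    return EMOTION_TO_AUS[emotion]
```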
Speech imitation is realized by visualizing spoken phonemes; every phoneme has its own viseme representation.
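Because each phoneme has a viseme (visual mouth-shape) counterpart, lip animation reduces to a table lookup over the phoneme stream. A sketch with a deliberately simplified, illustrative table; real mappings (and EMYS' own) are larger:

```python
# Simplified many-to-one phoneme -> viseme table; purely illustrative.
PHONEME_TO_VISEME = {
    "p": "bilabial", "b": "bilabial", "m": "bilabial",   # lips closed
    "f": "labiodental", "v": "labiodental",              # lower lip on teeth
    "a": "open", "e": "mid", "i": "spread",
    "o": "round", "u": "round",
}

def visemes(phonemes: list[str]) -> list[str]:
    """Map a phoneme sequence to the viseme sequence to display."""
    return [PHONEME_TO_VISEME.get(p, "neutral") for p in phonemes]
```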
Color detection utilizes UKinect and UImageTool modules.
In order to examine both children’s engagement in the interaction with EMYS and whether the children are able to decode the intended expressed emotions correctly, an experiment was conducted.
The experiment was conducted in a primary school in a small village near Wrocław and involved 48 schoolchildren aged 8 to 12 years.
The robotic head was programmed to operate autonomously and to provide two game scenarios; each subject went through both scenarios. With its implemented vision system, EMYS was able to recognize the color of the toy and to react accordingly, i.e. with praise or reproach.
Experiment with EMYS
The experiment was conducted at the Wrocław Main Railway Station and involved 113 people aged 15 to 82 years.
The experiment was divided into two parts, the first being a session with eye-tracking glasses. During about 4 minutes of interaction, FLASH greeted the respondent, introduced itself, and then invited them to a short game. Afterwards, the test subjects were invited for an interview.
The aim of the experiment was to determine how the robot's appearance and the intensity of its emotional reactions influence human perception. How people keep eye contact with the robot was also studied.
Experiment with FLASH
The next step was to determine the set of Areas of Interest (AOIs), such as head, torso, arms, etc.
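With the AOIs defined, each eye-tracker fixation is assigned to the AOI containing it, and gaze is summarized as dwell-time share per AOI. A minimal sketch of that analysis step; the rectangles and numbers are invented for illustration:

```python
# AOIs as axis-aligned rectangles: name -> (x_min, y_min, x_max, y_max).
# Coordinates are illustrative, not taken from the actual experiment.
AOIS = {
    "head":  (40, 0, 60, 20),
    "torso": (35, 20, 65, 60),
    "arms":  (20, 20, 80, 60),
}

def dwell_share(fixations):
    """fixations: list of (x, y, duration); returns fraction of gaze time per AOI."""
    totals = {name: 0.0 for name in AOIS}
    grand = 0.0
    for x, y, dur in fixations:
        grand += dur
        for name, (x0, y0, x1, y1) in AOIS.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += dur
                break  # assign each fixation to the first matching AOI only
    return {name: t / grand for name, t in totals.items()} if grand else totals
```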
It turned out that the head of the robot draws the most attention. A similar effect was observed by researchers studying how we perceive human faces. Test subjects also glanced at the hands, which was to be expected, as the robot gesticulated lively with its fingers in the early parts of the experiment.
The robot was definitely perceived as emotional, calm, and collected, and the majority of respondents, children and adults alike, declared their wish to meet the robot. Despite the robot often not showing advanced skills, respondents stated that it is intelligent and smart.
Even the youngest participants had no difficulty suggesting examples of robot applications. Most of them mentioned activities that kids do not like, such as homework, cleaning the room, and walking the dog. However, adults often saw the robot in the role of… Women, especially often, saw it as a…
The results of the experiments with the two robots confirm that the subjects were convinced that FLASH/EMYS possesses more skills than it does in reality. In general, FLASH/EMYS aroused…, and participants of the experiment felt safe around the robot. Recognition of the emotions expressed by the robot was… Participants assigned many human features (appearance and character) to the robot.
Another study involved long-term cohabitation with a social robot.
Experiment with EMYS
During the experiment, people with different attachment styles each spent several days with an EMYS robot, which was operating autonomously.
EMYS became an affective home assistant. Every day it served functions such as waking up the user, checking the weather forecast, playing the radio, checking the news, posting on/reading from Facebook, checking the personal calendar, sending/receiving e-mails, etc.
The conducted experiment confirmed that such functionality, more often associated with mobile devices, adds a new, "human" quality to human-machine interaction.
Three-layer control architecture
URBI-based system
The lowest layer provides the necessary hardware abstraction and integrates low-level motion controllers, sensor systems, and algorithms implemented as external software.
The middle layer is responsible for the functions of the robot and the implementation of its competencies; it defines the set of tasks the robot is able to perform.
The highest layer may incorporate a dedicated decision system.
URBI is a platform that provides a robot programming language, urbiscript. The language is parallel- and event-oriented and acts as an orchestrator, managing events and components.
Urbi - a compact operating system for robots
dynamic scripting language,
embedded events and parallelism,
expandable with C++/Java modules,
intuitive syntax with similarities to C++
Appraisal, emotion emulation and more.
EMYS' decision system may incorporate a custom finite state machine (FSM) or a comprehensive program simulating various functionalities of a human mind.
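A decision system built as a finite state machine can be quite small: a transition table keyed by (state, event) pairs. A minimal sketch with invented states and events for a greeting-and-game flow; this is not the actual EMYS decision system:

```python
# Hypothetical states and events for illustration only.
TRANSITIONS = {
    ("idle", "person_detected"):   "greeting",
    ("greeting", "greeting_done"): "game",
    ("game", "game_over"):         "farewell",
    ("farewell", "person_left"):   "idle",
}

class DecisionFSM:
    def __init__(self):
        self.state = "idle"

    def on_event(self, event: str) -> str:
        """Advance the machine; unknown events leave the state unchanged."""
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state
```

Each state would drive the corresponding robot competencies (greeting, game logic, farewell) while the table keeps the high-level behavior inspectable.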