ME 597: Rock Band Project

9-6-2012
by Kelsey Rodgers on 9 July 2014


Transcript of ME 597: Rock Band Project

Person
Active Learning
Autonomous Robot Perspective
Active Learning through an Artifact
Human Perspective
Opportunity Gap: Educational Perspective
Person's Reaction to Human-Robot Interaction
Human-Robot Interactions

Task learning through imitation and Human-Robot Interaction (Nicolescu and Mataric, 2004)
How the Robot Learns and Interacts with Humans
Cooperative Multi-Robot Box-Pushing (Mataric, Nilsson, and Simsarian, 1995)
Bilateral Teleoperation of Multiple Cooperative Robots over Delayed Communication Networks: Application (Lee, 2005)
Programmed Logic
How the Human Learns and Interacts with Robots
Wireless Sensors Based Feedback System for Human Body Movement Practices (Aravind and Manickam, 2012)
Palro
Force Sensor - Piano Lessons
KASPAR - Robot that helps autistic
children build relationships
Learning by Design: Games as Learning Machines (Gee, 2004)
Game-Based Learning: What it is, Why it Works, and Where it's Going (Trybus, 2012)
MIT uses reality TV, online games to show science's appeal (ChemLab Boot Camp)
Game-Based Learning in Engineering Education (Darling, Drew, Joiner, Iacovides, and Gavin)
How do people react to various types of feedback from a robot or gaming system?
Robots and Gaming Systems give Instantaneous Feedback
(Feedback is a complex issue in education, especially open-ended learning environments)
Research Interests:
* Feedback
* Motivation
Plan for Experiment
Research Setting:
Rock Band to learn the Guitar or Drums
Analysis Method:
* Quantitative: Surveys - Person's Thoughts
* Qualitative: Video Recording - Reactions
* Jessica Trybus
CEO of Etcetera Edutainment
Expertise: Game-Based Training Technology
* Jos Darling
Engineering Professor at the University of Bath
Research with a car simulator for engineering education
* Demetra Evangelou
Engr. Ed. Professor at Purdue University
Expertise: Learning through Artifacts
Interviews to Complete
Kelsey Rodgers
Mariana Tafur
Professor: Justin Seipel
"Tell me, and I'll forget. Show me, and I may remember. Involve me, and I'll understand.“ – Chinese Proverb
Remote Controlled
* Fabian Winkler
Visual and Performing Arts at Purdue
Expertise: Working at the intersection of art and robotics

* George Lee
Electrical Engineering Professor at Purdue
Expertise: Humanoid Robotics Researcher
* Robin Adams
Engr. Ed. Professor at Purdue University
Expertise: Design and Cognition
Interviews to Complete
Characterizing the motivation of people who learn with robots and video games.
Ritter, M., & Low, K. G. (1996). Effects of dance/movement therapy : A meta-analysis (Vol. 23). Amsterdam, PAYS-BAS: Elsevier.
Ritter, M., & Low, K. G. (1996). Effects of dance/movement therapy : A meta-analysis (Vol. 23). Amsterdam, PAYS-BAS: Elsevier.
Raptis, M., Kirovski, D., & Hoppe, H. (2011). Real-time classification of dance gestures from skeleton animation. In Proceedings of the ACM SIGGRAPH/Eurographics Symposium on Computer Animation. New York, NY: ACM.
Shaw-Garlock, G. (2009). Looking forward to sociable robots. International Journal of Social Robotics, 1(3), 249-260. doi: 10.1007/s12369-009-0021-7
Essid, S., et al. (2011). A multi-modal dance corpus for research into real-time interaction between humans in online virtual environments.
Face Movements
Future Work
Approach to our Study
Relation
Engagement
Feedback
Overall score
Sounds
Visuals
Eye & eyebrows
Mouth & head
Torso & hands
Feedback and Response
Visual
Sound
IRB approval
Volunteers?
Goal: Conference Papers for FIE
Final Goal: Collaboration Paper

Plan of Action
Body Movement
Performance
Rhythm
Notes
Mariana
Kelsey
[Posture-coding sketch: states labeled Starting, Happy, Concentrated, Struggling, Disengaged; postures labeled Straight, Inclined, and Forth; annotated Showing Engagement and Showing Anxiety.]
Thank you!
Hamido Fujita, Jun Hakura, Masaki Kurematsu
Intelligent human interface based on mental cloning-based software
Knowledge-Based Systems, Volume 22, Issue 3, April 2009, Pages 216–234
http://dx.doi.org/10.1016/j.knosys.2008.11.005
What has been done
J. Sanghvi, G. Castellano, I. Leite, A. Pereira, P. W. McOwan, A. Paiva
Automatic analysis of affective postures and body motion to detect engagement with a game companion
Proceedings of the 6th international conference on Human-robot interaction
March 06-09, 2011, Lausanne, Switzerland
What has been done
(cc) image by anemoneprojectors on Flickr
Examples of Feedback and Responses

Rock Band Feedback:
* Constructive: Missing Keys
* Praise: SuperStar
* Progress: Written
* Negative: Booing
* Positive: Cheering

Person Response:
* Facial Expressions
* Body Movements
* Talking
* Cursing
* Exclamations
* Sound Effects
Theoretical Framework: Theory
Experimental: Rock Band Studies
Application: Inform Education or Research
Research
Literature Reviews
Collect and Analyze Data
Dissemination
Recruit Participants
IRB
Design Detailed Plan of Experiment
Collect Data:
# of participants
# of songs
# of difficulty levels (song/instrument)
Analyze Data based on developed Theory-Based Coding Scheme:
* Feedback
* Engagement
* Interaction of Concepts
(a toy sketch of such a coding scheme follows this plan)
Write Papers & Disseminate Findings
Recruit Participants
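As a toy illustration (not the project's actual instrument), a theory-based coding scheme like the one above could be represented as a small data structure that tags each coded video segment with a feedback type and an engagement code and then tallies how the two concepts co-occur. All names, categories, and values below are assumptions for illustration only:

from collections import Counter
from dataclasses import dataclass

# Hypothetical codes; the real scheme would come from the developed framework.
FEEDBACK_CODES = {"constructive", "praise", "progress", "negative", "positive"}
ENGAGEMENT_CODES = {"engaged", "concentrated", "struggling", "disengaged"}

@dataclass
class CodedSegment:
    start_s: float    # segment start time in the video, seconds
    end_s: float      # segment end time, seconds
    feedback: str     # one of FEEDBACK_CODES
    engagement: str   # one of ENGAGEMENT_CODES

def concept_interactions(segments):
    """Tally how often each feedback code co-occurs with each engagement code."""
    return Counter((s.feedback, s.engagement) for s in segments)

# Example usage with made-up segments:
segments = [
    CodedSegment(12.0, 18.5, "negative", "struggling"),
    CodedSegment(40.2, 44.0, "praise", "engaged"),
]
print(concept_interactions(segments))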

IRB
Design Detailed Plan of Experiment
Collect Data:
# of participants
# of songs
# of difficulty levels (song/instrument)

Analyze Data
FIRST:
Video Data - Find Different Points of Feedback (as shown)
NEXT:
Vicon System Collected Data and Think Aloud Data
* Look for Any Changes:
1. Frequency of Head Movement
2. Body Movements
3. Aloud Thought Process
(a rough sketch of this comparison follows at the end of this plan)

Write Papers & Disseminate Findings

Experiment:
Location: ME Lab
Data Collection Method:
Vicon Cameras and Video (of screen)
Difficulty and Song Selection:
* Eye of the Tiger
1. Easy 2. Medium 3. Hard
* Varying Level of Song Difficulty
1. Hungry Like the Wolf (0 stars)
2. The Middle (3 stars)
3. Battery (6 stars)
IRB is POSTPONED:
* The collected data is for educational purposes in a class setting, so it is acceptable without IRB approval.
* An IRB application will be submitted in January (before any publications) to ensure ethical protocols are followed.
Participants: (n = 2)
Mariana and Kelsey
Due to the high data collection and analysis demands, we have decided to narrow our scope to a case study of two participants.
Data Collection
Vicon Camera Sensors
Head: Temples and Chin
(Head movement, no longer face movement)
Body:
* Spine (Posture)
* Hand, Forearm, Upper Arm (Drumming Motion - Velocity; see the sketch below)
* Left Foot (Drumming)
Guitar vs. Drums
(only collecting data from drums now)
Drummer's Body Movement
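As a rough illustration of how exported marker trajectories could be turned into the drumming-motion velocity mentioned above, here is a minimal sketch. The array shape, millimetre units, and 120 Hz sample rate are assumptions for illustration, not specifications of the actual Vicon setup:

import numpy as np

def marker_speed(positions, sample_rate_hz=120.0):
    """Estimate the instantaneous speed of a single marker (e.g. the hand)
    from its trajectory.

    positions: array of shape (n_frames, 3) with x/y/z in millimetres,
               as might be exported from the motion-capture system.
    Returns n_frames - 1 speeds in mm/s.
    """
    dt = 1.0 / sample_rate_hz
    displacements = np.diff(positions, axis=0)        # per-frame movement
    return np.linalg.norm(displacements, axis=1) / dt

# Example usage with a synthetic trajectory standing in for a real export:
hand = np.cumsum(np.random.randn(1_000, 3), axis=0)
speeds = marker_speed(hand)
print(f"mean hand speed: {speeds.mean():.1f} mm/s, peak: {speeds.max():.1f} mm/s")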
FIE: IEEE Frontiers in Education
Abstract Due: 1-1-13
A major international conference on educational innovation and research in engineering and computing
Head Movements
Foot Movements
Head Movement (not collecting data from face now)
Head Movements (Turning Head)
Drumming (Wrist Bending)
Final Data Collection