Title: Curriculum Vitae Pramila Rani
1Curriculum VitaePramila Rani
Robotics and Autonomous Systems Laboratory
Vanderbilt University
2. About Me
- Education
  - Vanderbilt University, Nashville, TN (June 2003 - Present)
    - Ph.D. in Electrical Engineering and Computer Science
    - Dissertation Topic: Affect-based implicit human-robot interaction
  - Vanderbilt University, Nashville, TN (August 2001 - May 2003)
    - M.S. in Electrical Engineering and Computer Science
    - Major: Robotics and Control
  - Birla Institute of Technology and Science, Pilani, India (August 1997 - June 2001)
    - B.E. (Hons.) in Electrical and Electronics Engineering
3. Experience
- Research Assistant at the Robotics and Autonomous Systems Laboratory (under Prof. Nilanjan Sarkar), Vanderbilt University (January 2002 - Present)
  - Implementing pattern recognition techniques (fuzzy logic, regression trees, Bayesian networks, KNN classifiers, and neural networks)
  - Design and implementation of an affect-sensitive robot architecture
  - Real-time acquisition and analysis of physiological signals using advanced signal acquisition and processing techniques
  - Design and development of a closed-loop feedback system for robot-based and computer-based games
- Teaching Assistant at the Department of Mechanical Engineering, Vanderbilt University, for ME 234 System Dynamics (August 2001 - December 2001)
- Intern at Motorola India Electronics Limited, Bangalore, India (January - June 2001)
4. Professional Achievements
- Selected for the 2004-2005 Chancellor's List published by National Academic Affairs (only 1% of students from the 3,000 national colleges and universities are selected each year)
- Research featured on Tech TV and various news outlets, including BBC, ABC News, Science Daily, and ACM TechNews (http://robotics.vuse.vanderbilt.edu/affect.htm)
- Among the 35 students selected worldwide to attend the RAS/IFRR Summer School on "Human-Robot Interaction" held at Volterra, Italy, in July 2004
- Professional memberships: IEEE Robotics and Automation Society (RAS), American Association for Artificial Intelligence (AAAI)
5. Publications
- Dissertation-Related Publications
  - Rani, P., Sims, J., Brackin, R., and Sarkar, N., "Online Stress Detection Using Psychophysiological Signal for Implicit Human-Robot Cooperation," Robotica, Vol. 20, No. 6, pp. 673-686, 2002.
  - Rani, P., Sarkar, N., Smith, C., and Kirby, L., "Anxiety Detecting Robotic Systems - Towards Implicit Human-Robot Collaboration," Robotica, Vol. 22, No. 1, pp. 85-95, 2004.
  - (Under Review) Rani, P., Sarkar, N., Smith, C. A., and Adams, J. A., "Affective Communication for Implicit Human-Machine Interaction," IEEE Transactions on Systems, Man, and Cybernetics.
  - (Under Review) Rani, P., and Sarkar, N., "An Approach to Human-Robot Interaction Using Affective Cues," IEEE Transactions on Robotics.
  - Rani, P., and Sarkar, N., "Operator Engagement Detection and Robot Behavior Adaptation in Human-Robot Interaction," IEEE International Conference on Robotics and Automation, Barcelona, Spain, April 2005.
  - Rani, P., Sarkar, N., and Smith, C., "Affect-Sensitive Human-Robot Cooperation - Theory and Experiments," IEEE International Conference on Robotics and Automation, pp. 2382-2387, Taiwan, September 2003.
  - Rani, P., and Sarkar, N., "Maintaining Optimal Challenge in Computer Games Through Real-Time Physiological Feedback," HCI International, Las Vegas, USA, July 2005.
  - Rani, P., Sarkar, N., and Smith, C., "Anxiety Detection for Implicit Human-Robot Collaboration," IEEE International Conference on Systems, Man and Cybernetics, Washington, D.C., pp. 4896-4903, October 2003.
  - Rani, P., and Sarkar, N., "Emotion-Sensitive Robots - A New Paradigm for Human-Robot Interaction," IEEE-RAS/RSJ International Conference on Humanoid Robots (Humanoids 2004), Los Angeles, USA, November 2004.
  - Adams, J., Rani, P., and Sarkar, N., "Mixed Initiative Interaction and Robotic Systems," Workshop on Supervisory Control of Learning and Adaptive Systems, Nineteenth National Conference on Artificial Intelligence (AAAI-04), San Jose, CA, July 2004.
  - (Submitted) Liu, C., Rani, P., and Sarkar, N., "Comparison of Machine Learning Techniques for Affect Detection in Human-Robot Interaction," IEEE/RSJ International Conference on Intelligent Robots and Systems, Canada, August 2005.
  - (Submitted) Rani, P., and Sarkar, N., "Making Robots Emotion-Sensitive - Preliminary Experiments and Results," ROMAN 2005.
6. Psychophysiology-Based Affective Communication for Implicit Human-Robot Interaction
7. Some Definitions
- Human-Robot Interaction (HRI)
  - The study of humans, robots, and the ways in which they influence each other
- Psychophysiology
  - The science of understanding the link between psychology and physiology
- Affective Communication
  - Communication relating to, arising from, or influencing feelings or emotions
8. Goal
- The goal is to develop an intuitive affect-sensitive human-robot interaction framework:
  - The robot will interact with a human based on his/her probable affective state
  - The affective state will be inferred from the human's physiological signals
  - The robot will adapt its behavior in response to the human's affective state (emotion)
9. Outline
- Motivation
- HRI Framework and Main Components
- Signal Processing for Affect Recognition
- Simulink Design for Real-Time Affective-Feedback Robot Control
10. Motivation
- The Robot Invasion
  - A projected increase of 1,145 in the number of personal service robots in use within a year
  - According to the World Robotics 2004 report, about 610,000 autonomous vacuum cleaners and lawn-mowing robots were in operation at the end of 2003
  - In 2004-2007, more than 4 million new units are forecast to be added!!!
- Need for Natural and Intuitive Human-Robot Communication
  - Unlike industrial robots, personal and professional service robots will need to communicate more naturally and spontaneously with the people around them
  - Robots will be expected to be understanding, empathetic, and intelligent
11. Motivation
- Attempt to Mimic Human-Human Interaction
  - More than 70% of communication is non-verbal or implicit
  - Emotions are a significant part of communication
  - Only 7% of the emotional meaning of a message is communicated verbally; about 38% is communicated by paralanguage and 55% via nonverbal channels [1]
- Most Significant Channels of Implicit Communication in Humans
  - Facial Expressions
  - Vocal Intonation
  - Gestures and Postures
  - Physiology
[1] Mehrabian, A. (1971). Silent Messages. Wadsworth, Belmont, California.
12. Motivation
- Giving Robots Emotional Intelligence
  - Robots should be capable of implicit communication with humans
  - They should detect human emotions
  - They should modify their behavior to adapt to human emotions
13. Application Areas
Some Potential Application Areas of Affect-Sensitive Robots
14. Human-Robot Interaction Framework
(Diagram: the basic HRI framework, extended with additional architecture capabilities and communication capabilities.)
15. Main Components
- Theoretical
- Computational
  - Signal conditioning and processing
  - Machine learning for affect recognition
- System Development
  - Task design for training (Phase I) and validation (Phase II) phases
- Experimental
16. System Development
- System Set-up for Interactive Pong Game
  - Single-player Pong: the human plays against the computer
  - Continuous physiological monitoring
  - Anxiety detection from physiology
  - Dynamic game adaptation based on anxiety and performance (see the sketch below)
17. Computational
- Signal conditioning and processing
  - Algorithms for artifact rejection, adaptive thresholding, signal conditioning, and feature extraction for various signals
  - Fourier transform, wavelet transform, and statistical analysis were used extensively to perform the signal processing
- Machine learning for affect recognition
  - A regression-tree methodology was employed to build an affect-recognition system
  - A systematic comparison of the strengths and weaknesses of four machine learning methods (k-nearest neighbor, regression tree, Bayesian network, and support vector machine) was performed; the SVM analysis was done by Mr. Changchun Liu (an illustrative comparison sketch follows)
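As an illustration only, a k-nearest-neighbor classifier of affective state from feature vectors can be written in a few lines of plain Matlab; the feature values below are synthetic stand-ins, whereas the real features come from the ECG, PPG, ICG, EMG, and GSR processing described later:

```matlab
% Illustrative sketch (synthetic data): k-NN classification of affective state
% (0 = low anxiety, 1 = high anxiety) from 4-dimensional feature vectors.
cluster_at = @(n, d, c) randn(n, d) + c;        % Gaussian cluster of n points around offset c

train_X = [cluster_at(20, 4, 0); cluster_at(20, 4, 1.5)];   % 40 samples x 4 features
train_y = [zeros(20, 1); ones(20, 1)];                      % class labels
test_x  = randn(1, 4) + 1.5;                                % one unseen feature vector
k = 5;

% k-NN: Euclidean distance to every training sample, then a majority vote among the k nearest.
dist = sqrt(sum((train_X - test_x).^2, 2));
[~, idx]  = sort(dist);
predicted = mode(train_y(idx(1:k)));
fprintf('Predicted affective state: %d (1 = high anxiety)\n', predicted);
```

The other methods in the comparison could be swapped in at the prediction step (for example via fitctree, fitcnb, or fitcsvm in current Statistics and Machine Learning Toolbox releases), so all four learners see identical feature sets.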
18Real-Time Affect Recognition
C2
Biomedical Signal Processing
C1
Affect-Recognition via Regression Tree
ECG
PPG
ICG
Physiological Features
Affective Trigger Generation
Real-Time Signal Acquisition
C Library
SC
Medical Acquisition Device
EMG
Performance Measure
PCG
19Signal Processing
R Waves
Peak Amp
Signals
PPG
PPG (Photoplethysmogram)
ECG
ECG (Electrocardiogram)
PTT
IBI
Peak Amplitude
IBI (Interbeat Interval)
Mean
Variability
Pulse Transit Time
Mean
Variability
Sympathetic Power
Parasympathetic Power
20. ECG and PPG Signals
- Electrocardiogram (ECG) and Photoplethysmogram (PPG) Signals
- Inputs: ECG waveform, PPG waveform
- Outputs:
  - Mean Pulse Transit Time
  - Var. Pulse Transit Time
  - Mean Interbeat Interval
  - Var. Interbeat Interval
  - Peak Time Array
  - Mean Peak Amplitude
  - Max Peak Amplitude
  - Sympathetic Activity
  - Parasympathetic Activity
(A plain-Matlab feature-extraction sketch follows.)
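A minimal sketch of how the interbeat-interval features might be computed, using core Matlab and a synthetic ECG-like trace; the sampling rate, threshold, and 1.2 Hz spike-train surrogate are assumptions for illustration:

```matlab
% Minimal sketch (core Matlab, synthetic data): IBI features from an ECG-like trace
% via simple threshold-based R-peak detection.
fs  = 250;                                % assumed sampling rate (Hz)
t   = (0:fs*30-1) / fs;                   % 30 s of samples
ecg = double(mod(t, 1/1.2) < 0.02) + 0.05*randn(size(t));   % toy ECG surrogate

thr     = 0.6 * max(ecg);                 % crude R-wave amplitude threshold
onsets  = find(diff([0, ecg > thr]) == 1);   % rising edges mark candidate R waves
r_times = t(onsets);

ibi = diff(r_times);                      % interbeat intervals (s)
fprintf('Mean IBI = %.3f s, IBI variance = %.5f\n', mean(ibi), var(ibi));

% Pulse transit time would additionally need the PPG peak following each R wave,
% and the sympathetic/parasympathetic power come from the spectrum of the IBI series.
```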
21. Electrocardiogram
- Electrocardiogram (ECG) Signal (waveform figure)
22. Photoplethysmogram
- Photoplethysmogram (PPG) Signal (input: PPG waveform figure)
23. Other Biomedical Signals
- Physiological Signals and Feature Vectors
  - EMG (waveform): Mean EMG activity, Var. EMG activity, Slope of EMG activity, Mean Frequency, Median Frequency
  - ICG (impedance waveform): Mean IBI, Var. IBI, Mean PEP, Var. PEP
  - GSR (Galvanic Skin Response): Mean Tonic, Slope Tonic, Mean Amp Phasic, Max Amp Phasic, Rate Phasic (a tonic/phasic sketch follows after this list)
- Tools
  - Wavelet transform
  - Fourier transform
  - Statistical signal processing
- Challenges
  - High speed
  - High accuracy
  - Handling artifacts
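As a hedged illustration of the GSR features, the sketch below splits a synthetic skin-conductance trace into tonic and phasic components with a moving-average baseline; the sampling rate, window length, and response threshold are assumptions, not the dissertation's actual parameters:

```matlab
% Minimal sketch (synthetic data): GSR tonic/phasic decomposition and a few features.
fs  = 32;                                    % assumed GSR sampling rate (Hz)
t   = (0:fs*60-1) / fs;                      % 60 s
gsr = 5 + 0.01*t + 0.3*exp(-((t-20).^2)/2) + 0.02*randn(size(t));   % toy trace (uS)

win    = 8 * fs;                             % 8 s moving-average window (illustrative)
tonic  = movmean(gsr, win);                  % slow baseline (tonic level)
phasic = gsr - tonic;                        % fast skin-conductance responses

mean_tonic  = mean(tonic);
p           = polyfit(t, tonic, 1);          % linear fit to the tonic level
slope_tonic = p(1);
resp        = phasic > 0.05;                 % responses above an assumed 0.05 uS threshold
rate_phasic = sum(diff([0, resp]) == 1) / (t(end) / 60);   % responses per minute
fprintf('Mean tonic %.2f uS, tonic slope %.4f uS/s, phasic rate %.1f/min\n', ...
        mean_tonic, slope_tonic, rate_phasic);
```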
24. System Development
- System Set-up for Robot Basketball Game
  - Basketball hoop mounted on a 5-DOF robotic arm
  - The robot can vary game difficulty
  - Continuous physiological monitoring
  - Anxiety detection from physiology
  - Dynamic game adaptation based on anxiety and performance
25. Robot Control
(Block diagram: computer C1, running Matlab for signal acquisition from the medical acquisition device, passes affective triggers over serial communication to computer C2, the robot controller; C2 combines configuration selection from a configurations database, inverse kinematics, and trajectory generation, and drives the arm through a PD controller via the MultiQ data acquisition card, whose Simulink data acquisition functionality is provided by Quanser.) A plain-Matlab sketch of the inverse-kinematics and trajectory-generation steps follows.
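The actual system uses a 5-DOF arm driven through Quanser/Simulink blocks; the sketch below uses a planar 2-link arm purely to illustrate the inverse-kinematics and trajectory-generation steps, with link lengths, target position, and motion time chosen arbitrarily:

```matlab
% Illustrative sketch: closed-form IK for a 2-link planar arm plus a cubic joint trajectory.
L1 = 0.4;  L2 = 0.3;                      % link lengths (m), assumed
target = [0.5, 0.2];                      % desired hoop position (x, y) in metres, assumed

% Inverse kinematics (elbow-down solution).
x = target(1);  y = target(2);
c2 = (x^2 + y^2 - L1^2 - L2^2) / (2*L1*L2);
q2 = acos(max(min(c2, 1), -1));           % elbow angle, clamped for numerical safety
q1 = atan2(y, x) - atan2(L2*sin(q2), L1 + L2*cos(q2));   % shoulder angle

% Cubic (smooth start/stop) joint trajectory from the current pose to the target pose.
q0 = [0; 0];  qf = [q1; q2];  T = 2;      % 2 s motion time, assumed
t  = linspace(0, T, 200);
s  = 3*(t/T).^2 - 2*(t/T).^3;             % cubic time-scaling, s(0) = 0, s(T) = 1
q  = q0 + (qf - q0) * s;                  % 2 x 200 array of joint setpoints
plot(t, q); xlabel('time (s)'); ylabel('joint angle (rad)');
legend('q_1 (shoulder)', 'q_2 (elbow)');
```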
26. Simulink Implementation of C2
Serial acquisition of affective triggers (Simulink model screenshot)
27. Serial Acquisition of Affective Triggers
The S-Function block processes the affective triggers and sends the appropriate handshake signals to the computer at the other end of the serial link. The Simulink blocks for serial communication are provided by Quanser. (A minimal host-side sketch follows.)
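On the signal-acquisition side (C1), sending a trigger byte over the serial link can be sketched with Matlab's legacy serial interface; the port name, baud rate, and trigger encoding here are assumptions, and the real deployment uses the Quanser serial blocks rather than this code:

```matlab
% Minimal sketch: send a one-byte affective trigger from C1 to C2 over a serial port.
s = serial('COM1', 'BaudRate', 9600);   % legacy Matlab serial interface
fopen(s);

anxiety_level = 2;                      % e.g., 0 = low, 1 = medium, 2 = high (assumed encoding)
fwrite(s, uint8(anxiety_level));        % send the trigger byte to C2

ack = fread(s, 1);                      % wait for the handshake byte from C2
fprintf('Handshake received: %d\n', ack);

fclose(s);
delete(s);
```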
28. Simulink Implementation of C2
Trajectory generation and robot control (Simulink model screenshot)
29. Trajectory Generation and Robot Control
Robot X-motion subsystem (Simulink model screenshot)
30. Robot X-Motion
Trajectory selection subsystem (Simulink model screenshot)
31. Trajectory and Speed Selection
Speed selection and trajectory selection subsystems (Simulink model screenshots)
32. Trajectory Generation and Robot Control
Robot joint control (Simulink model screenshot)
33. Robot Joint Control
PD controller; the Simulink blocks for data acquisition are provided by Quanser (Simulink model screenshot). A minimal PD-control sketch in plain Matlab follows.
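A minimal sketch of the PD joint-control idea in plain Matlab rather than the Quanser/Simulink blocks; the gains, control period, and toy joint model are assumptions for illustration only:

```matlab
% Minimal sketch: discrete PD control of one joint tracking a step setpoint.
Kp = 20;  Kd = 1.5;                     % illustrative PD gains
dt = 0.002;  N  = 1000;                 % 2 ms control period, 2 s run
q_des = 0.8 * ones(1, N);               % step setpoint for one joint (rad)

q = 0; qd = 0; prev_err = 0; hist = zeros(1, N);
for k = 1:N
    err  = q_des(k) - q;
    derr = (err - prev_err) / dt;       % derivative of the tracking error
    u    = Kp * err + Kd * derr;        % PD control torque (arbitrary units)
    prev_err = err;

    % Toy double-integrator joint with damping, integrated with explicit Euler.
    qdd = u - 0.5 * qd;
    qd  = qd + qdd * dt;
    q   = q  + qd  * dt;
    hist(k) = q;
end
plot((0:N-1)*dt, hist); xlabel('time (s)'); ylabel('joint angle (rad)');
```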
34. Conclusion
- Feasibility of real-time, physiology-based affect recognition demonstrated
- Affect-detection capability integrated into a robot-control architecture to allow implicit communication
- Robot behavior dynamically adapted as a function of the perceived affective state
- Computer-based and robot-based experiments designed to investigate the impact of affective communication in human-machine interactions
35. HRI and MathWorks
- Human-Robot Interaction is an emerging focus area requiring synergistic integration of robotics, control systems, AI, and psychology
- Matlab and Simulink provide an ideal platform for combining knowledge from these domains and for rapid prototyping of intelligent HRI frameworks
- Potential for new Matlab toolboxes:
  - Biomedical signal processing
  - Robotics (forward/inverse kinematics, controller design, etc.)
- Ultimate goal: seamless integration of diverse science and engineering domains, and MathWorks is well placed to achieve this
- It is an exciting time for MathWorks and I would love to be a part of it!!
36. Acknowledgements
- Advisor: Dr. Nilanjan Sarkar, Mechanical Engineering, Vanderbilt University
- Team Members:
  - Dr. Eric Vanman, Psychology, Georgia State University
  - Mr. Changchun Liu, Graduate Student, Vanderbilt University
37. Questions?