I am involved in a project whose goal is to make a computer more sensitive to the physical and emotional state of the user, without any physical contact, in order to improve man-machine communication. The project comprises several dedicated hardware interfaces and software modules. The following input signals are used: the respiratory signal, the hand position and tremor signal, and the body torso movements.
None of these devices requires any physical contact with the body. At this moment all of the devices work separately; I am now working to integrate them into a single system.
Introduction
Building high performance interactive computer systems and other "intelligent" systems requires an understanding of both the physical, psychological, and emotional state of the user and the way in which a user interacts with such systems. Moreover, human efficiency in activities involving human-computer interaction (or, more generally, human-machine interaction) depends directly on both the subject's state and the capability of the system to recognize the specific needs of the user and to change its response accordingly. Unfortunately, acquiring and interpreting this kind of information is very difficult and, in consequence, all current systems have only a limited ability to communicate. Current strategies for acquiring the user's state are either obtrusive (only a small part of them are not), or the data captured by the system carry little useful information (keystrokes, mouse and joystick movements).
Physiological variables have traditionally been used to measure changes in physical, psychological and emotional state. Typically, biological signals such as heart rate, blood pressure, galvanic skin response (GSR), temperature, respiration and somatic movement, as well as electromyography (EMG), electrocardiography (ECG) and electroencephalography (EEG), are used for this purpose.
At this moment, researchers differentiate between two classes of monitored signals. The first class consists of basic physiological signals related to the subject's biological state (among them pulse, respiration, gesture, etc.). The second class consists of signals related to the specific activity during task execution: (i) processing parameters such as specific and general EEG waves, and (ii) response/action related parameters such as voice changes, changes of voluntary and involuntary movements, signals related to typing on the keyboard, signals related to mouse movements, eye movement, etc.
Facial expression is also an important channel for communicating the affective and physical state. Tracking and identifying the face is a complex task, but extracting features and correlating them with the physical and psychological state is even harder. In an attempt to solve the face tracking problem in a robust, real-time manner, pupil identification based on the red-eye effect was used [6], [7]. In [7], the system is completed with a new method that extracts the facial features based on a template technique used to parameterize them; the template parameters are obtained through principal component analysis (PCA). The resulting system is able to run at 30 frames per second, needs no manual alignment or calibration, and works well on image sequences with head movements and occlusions.
An alternative to the above method, or to general-purpose methods for facial feature recognition, is the "Expression Glasses" device [8], which senses changes in the facial muscle movements and uses pattern recognition to identify expressions such as confusion or interest.
Brain-computer interfaces are mainly used to communicate commands to different systems (such as an up/down command [9]), to differentiate between mental tasks and, also, in neurofeedback applications. However, the emotional state of a person can also be determined from the EEG signal; one of the methods used for this purpose was correlation analysis [10], [24]. In the first of these studies, features of EEG signals acquired with ten electrodes were used to characterize four elementary states: anger, sadness, joy, and relaxation. The feature vectors associated with these elementary emotional states are obtained from sets of 135 cross-correlation coefficients computed for the 45 pairs of EEG channels.
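As an illustration of this feature-extraction step, the sketch below computes the 45 pairwise cross-correlation coefficients for ten EEG channels. The source does not state how the 135 coefficients arise from 45 pairs; the split into three frequency bands (45 × 3 = 135), the band limits and the sampling rate are assumptions made for the example.

```python
import numpy as np
from itertools import combinations
from scipy.signal import butter, filtfilt

FS = 256  # sampling rate in Hz (assumed)
# Assumed band split: 3 bands x 45 pairs = 135 coefficients.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def bandpass(x, lo, hi, fs=FS):
    # 4th-order Butterworth band-pass filter, applied forward-backward.
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def emotion_features(eeg):
    """eeg: array of shape (10, n_samples), one row per electrode.
    Returns a 135-element vector of zero-lag cross-correlation
    coefficients over all 45 electrode pairs in each band."""
    feats = []
    for lo, hi in BANDS.values():
        filtered = np.array([bandpass(ch, lo, hi) for ch in eeg])
        for i, j in combinations(range(10), 2):  # the 45 channel pairs
            feats.append(np.corrcoef(filtered[i], filtered[j])[0, 1])
    return np.array(feats)  # shape (135,)
```

A feature vector of this kind would then be compared against the reference vectors obtained for anger, sadness, joy and relaxation.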
As technology progresses and computing power increases, human state analysis has started to use not only one but a group of physiological variables. For example, in the patent [11], the system uses a plurality of physiological variables such as heart rate, electromyography (EMG) and electrodermal activity (EDA). In the next step, the value of each physiological variable is translated, through a statistical calculation, into a unitless statistical measure named the z-score. Based on the z-scores, the associated emotion or state of feeling is determined.
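A minimal sketch of this translation step is shown below, assuming a per-user resting baseline supplies the mean and standard deviation (the patent only specifies a statistical calculation):

```python
import numpy as np

def to_z_scores(samples, baseline):
    """Translate raw physiological readings into unitless z-scores
    relative to a baseline recording (hypothetical calibration data)."""
    mu, sigma = np.mean(baseline), np.std(baseline)
    return (np.asarray(samples) - mu) / sigma

# Illustrative values only: heart-rate samples vs. a resting baseline.
baseline_hr = [68, 70, 69, 71, 70]
z = to_z_scores([85, 88, 90], baseline_hr)  # positive z => elevated heart rate
```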
The patent [12] describes a device implementing a method that determines an individual's stress level while performing a cognitive task. Physiological variables of the subject (galvanic skin response, temperature, respiration, blood pressure, pulse rate) are monitored while the subject plays a computer game. When the user's stress rises beyond a given level, biofeedback stimuli are presented visually on the video screen in order to help the subject reduce the stress level. The patent literature contains more examples of this kind. It thus becomes obvious that such practical solutions have gained credibility for commercial applications and have obtained support from companies potentially interested in manufacturing devices based on the patented solutions.
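The control logic of [12] reduces to a simple threshold rule. The sketch below is an illustration only: the stress index and the stimulus routine are hypothetical stand-ins for the patented acquisition and display hardware, and the threshold value is assumed.

```python
import random

def read_stress_index():
    """Hypothetical stand-in for combining GSR, temperature, respiration,
    blood pressure and pulse rate into a single stress measure."""
    return random.gauss(0.0, 1.0)

def show_relaxation_stimulus():
    # The patent presents visual biofeedback stimuli on the video screen.
    print("Breathe slowly and relax...")

STRESS_THRESHOLD = 1.5  # assumed limit on the stress measure

def biofeedback_step():
    # Present the stimulus only when stress rises beyond the given level.
    if read_stress_index() > STRESS_THRESHOLD:
        show_relaxation_stimulus()
```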
Using work-related gestures - such as keyboard typing forces, hand movements over the mouse and the displacement of the mouse cursor on the screen - coupled with physiological signals such as the breath signal (acquired with a thermistor placed in front of the nostril), skin temperature (with the sensor placed on the middle finger) and body movement (using accelerometric and impedance sensors), the system presented in [13] identifies the state of the user and, simultaneously, gives a confidence level for concluding that there are significant differences between persons and/or their states.
Based on the observation that a user spends approximately one third of the total computer-interacting time touching the computer input devices, the BlueEyes research team at IBM built an "Emotion Mouse" [14]. This system uses the heart rate, the temperature, the GSR and the somatic movement, all measures being obtained from the hand. The relation between the physiological measurements and the emotions is extracted with a correlation model. The correlation model results from a calibration process in which stimuli are presented to evoke the desired emotions while, at the same time, the physiological signals are recorded; the model is then built through statistical analysis.
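Reference [14] does not detail the statistical analysis; one plausible reading of such a correlation model is sketched below: a mean feature profile is stored per evoked emotion during calibration, and a new sample is assigned to the emotion whose profile it correlates with best. The function names and the nearest-profile rule are assumptions.

```python
import numpy as np

def build_profiles(calib_data):
    """calib_data: dict mapping an emotion name to an array of shape
    (n_trials, n_features) recorded while that emotion was evoked.
    Returns one mean feature profile per emotion."""
    return {emotion: trials.mean(axis=0)
            for emotion, trials in calib_data.items()}

def classify(sample, profiles):
    """Assign the emotion whose profile correlates best with the sample
    (features: heart rate, temperature, GSR, somatic movement)."""
    return max(profiles,
               key=lambda e: np.corrcoef(sample, profiles[e])[0, 1])
```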
Other sensor systems used to sense physiological signals include: a blood volume pressure sensor embedded in earring jewelry [16], a skin conductivity sensor placed in a shoe [16] or in a glove [17], a respiration sensor built into clothing [16], etc. The main aim of all these sensor systems is to make the sensors more wearable, without psychological effects that could influence the user's state. However, almost all of these sensors, with only a few exceptions, must be attached to the user in one way or another (obtrusive methods), or the user must be asked to wear special clothes. In my project, which is based on sensor systems that are already developed and that all require no physical contact with the body, these drawbacks are overcome.
The sensor problems mentioned above, the large amount of data, the complexity of the classification analysis and the variability among the population are only a few of the factors that make it difficult to use the user's states in a human-computer interface.
The Sensor Systems
As shown above, a major difficulty in the development of a bio-instrumentation complex used to understand and adapt to the user's state is creating efficient, high quality interfaces that allow natural interaction with a computer system. It is widely recognized that a better evaluation of the user's state, and a better discrimination between states of the same user, are obtained by taking into account the correlation between two or more parameters provided by different biological signals. For example, heart rate data directly reflect various physiological states such as biological workload, stress at work, drowsiness [18] or affective states such as anger; but they also change with physical exercise or with sneezing [16]. Inferring the source of a change is easier if the change can be detected independently, based on context sensors such as one that senses movements. For these reasons, in my project I used three different types of sensors that acquire three different biological signals. Two of the sensor systems are based on the same inductive transducer; the other works on a different principle.
Part of the sensor systems development was supported by the Romanian National Council for Research in Higher Education under contract number 33479/2002, theme 104, CNCSIS code 67; this research project was conducted by myself.
Respiratory Sensor System
One of the biological signals that will be used by the non-contact bio-instrumentation complex is the respiratory signal. The respiratory signal is acquired using the rheography (impedance measurement) method [4], [19]. The part of the system used to acquire the respiratory signal consists of a chair with one sensor [15], [20] incorporated in the back support, with no direct contact with the body; the second sensor is a piezoelectric one, placed as in Figure 1.
Mainly, the sensor consists of a coil. It is known that an element generating an external electromagnetic field changes its impedance according to the properties of the objects in its close vicinity. The change is due to the variation of the equivalent impedance (either reactive or resistive) seen at the port of the measuring device. The idea is to determine the respiratory movements based on the change of impedance that these movements produce in the sensor.

Inherently, the respiratory signal (presented in the upper part of the window in Figure 1) also includes other components, such as those derived from tremor movements, blood-flow-induced movements and small involuntary movements. These components have small values and are removed with suitable digital filters. Moreover, the user's hand and body movements (Figure 1, central part of the respiratory signal) generate other artifacts that disturb the respiratory signal. Classical filtering techniques cannot eliminate these artifacts: first, because the amplitude of the artifacts is several times larger than that of the respiratory signal and, second, because the artifacts overlap the respiratory signal in both the time and the frequency domain. Using the information supplied by the piezoelectric sensor, two neural networks, trained with anti-Hebbian rules and connected in a special topology, remove the artifacts induced by the movements mentioned above [1].
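The full two-network topology is described in [1]. The sketch below shows only the underlying idea, under stated assumptions: a single adaptive linear unit whose weights are updated with an anti-Hebbian (decorrelating) rule, so that the cleaned output becomes uncorrelated with the piezoelectric movement reference.

```python
import numpy as np

def cancel_artifacts(primary, reference, lr=0.01, n_taps=8):
    """primary: chair-sensor signal (respiration + movement artifacts).
    reference: piezoelectric movement signal.
    Returns the primary signal with the artifact estimate subtracted.
    This reduced single-unit rule is an assumption; [1] uses two
    networks connected in a special topology."""
    w = np.zeros(n_taps)                     # adaptive filter weights
    out = np.zeros(len(primary))
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps:n]          # recent reference samples
        out[n] = primary[n] - w @ x          # subtract estimated artifact
        w += lr * out[n] * x                 # anti-Hebbian update: drives the
                                             # output-reference correlation to zero
    return out
```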
Tremor Sensor System (Virtual Joystick)
The proposed "virtual joystick" interface [2], [5], [21], [23] is able to sense proximity and movements and, when used as a joystick, to track the position of the hand and to yield commands similarly to a classic joystick; more than that, it can simultaneously acquire the hand tremor signal. In the joystick application, the sensor system includes three sensors operating together in order to derive the position of the hand. The hand position commands the position of a controlled virtual object, without any physical contact with the virtual joystick. The sensor circuit can interface with a PC like any normal joystick and, in addition, it sends the tremor signal through a serial line. Because the sensor is similar to the one used in the respiratory sensor system, the working principle is the same. In the proposed non-contact bio-instrumentation complex, the user will directly interact with and command the computer through this device.
When the hand is above one of the sensors, the output of the corresponding circuit has a high value. The sensors that sense the left-right balance are placed symmetrically on the board (Figure 2) and are paired, such that the signal from a couple of opposite sensors evidences the balance movements of the hand. The principle is similar when detecting forward and backward movement. The distance between the proximity sensor and the hand is another factor that influences the magnitude of the output signal on the corresponding channel; it is used for a supplementary control. A software package was developed to interrogate the joystick port, to control - in the virtual space - the position of the point corresponding to the hand position (Figure 3) and to request, receive, present and manage the tremor signal. A fuzzy system is included in the virtual joystick control unit [5] and is used to control and improve the characteristics of the virtual device (the fuzzy system models the way the hand position is computed and compensates for the variability of hand dimensions).
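The balance principle can be illustrated with the toy computation below; the four-reading pairing (left/right, forward/backward) and the normalization are assumptions that stand in for the fuzzy controller of [5].

```python
def hand_position(left, right, fore, back):
    """Derive a normalized 2D position from paired proximity readings:
    the difference between opposite sensors encodes the balance of the
    hand, and the sum normalizes away the overall hand-to-board distance."""
    eps = 1e-9                                  # guard against division by zero
    x = (right - left) / (right + left + eps)   # left-right balance in [-1, 1]
    y = (fore - back) / (fore + back + eps)     # forward-backward balance
    return x, y

# Example: a hand closer to the right and forward sensors.
print(hand_position(left=0.2, right=0.8, fore=0.7, back=0.3))  # (~0.6, ~0.4)
```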
Body Torso Movement Sensor System
Body movement is related to various aspects of health or disease and, in real life, beyond speech, what gives face-to-face interaction and communication its real substance is the bodily activity of the people involved [22]. The body is one of the ways through which humans express their feelings, thoughts and states. A "sensitive computer" can use the body movements and the position of the body, linked to the artifacts in the environment, in order to assess states of the person such as: nervousness, lack of attention, motor fatigue and agitation, confusion, etc.
The study of the relationship between body movements and human state is difficult and cumbersome, mostly because of the complex nature of physical activity and the resulting difficulty in measuring it. The proposed sensor system [3] that will be used in the project to assess the body torso movements is composed of a laser scanner, a video camera and a software program that controls the scanner and extracts the position information of the person's torso.
The principle of the whole system relies on a laser scanner that generates a laser plane at a constant angle from the horizontal plane (consider this plane the floor). When the laser plane hits a target in the imaged area, a line of laser light appears on the target. The system uses a single conventional video camera that is able to acquire images of the area where the laser plane can hit the target (in our case the person's torso). With this camera the software acquires two consecutive images: the first with the laser diode on, showing the line of laser light on the target, and the second with the laser diode off. By computing the difference between these two images we obtain only the laser line projected on the person's torso. At this point we know the angle between the laser scanner and the horizontal plane, the position in space of the video camera and the extracted shape of the laser light line on the trunk. The depth information is then calculated using a genetic algorithm and basic geometric formulas. Thus, we extract the real 3D body position relative to the CCD camera's point of view.
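Under stated assumptions, the two image-processing steps can be sketched as follows: the laser line is isolated by frame differencing, and the depth of each lit pixel follows from a simplified 2D triangulation (vertical baseline between camera and laser, pinhole camera model). The parameter names are illustrative; the real system refines the depth estimate with a genetic algorithm [3].

```python
import numpy as np

def laser_line_pixels(frame_on, frame_off, threshold=30):
    """Subtract the laser-off frame from the laser-on frame (grayscale
    arrays of equal shape) and keep the pixels that brightened, i.e.
    the laser line projected on the torso."""
    diff = frame_on.astype(np.int16) - frame_off.astype(np.int16)
    rows, cols = np.nonzero(diff > threshold)
    return rows, cols                           # image coordinates of the line

def triangulate_depth(row, baseline, laser_angle, focal_px, cy):
    """Depth of a laser-lit pixel row for a simplified geometry: laser
    emitter mounted `baseline` metres above the camera, laser plane
    tilted `laser_angle` radians below horizontal, pinhole camera with
    focal length `focal_px` pixels and principal-point row `cy`."""
    ray_angle = np.arctan2(cy - row, focal_px)  # elevation of the camera ray
    return baseline / (np.tan(laser_angle) + np.tan(ray_angle))
```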
Bibliography
[1] Dan-Marius Dobrea, Horia-Nicolai Teodorescu, Monica-Claudia Serban, Method to remove respiratory artefacts from a system used to assess bio-psychic state of a person, Third European Symposium in Biomedical Engineering and Medical Physics, August 30 - September 30, 2002, Patras, Greece
[2] Dan-Marius Dobrea, H. N. Teodorescu, Daniel Mlynek, An Interface for Virtual
Reality Applications, Romanian Journal of Information Science and Technology,
Publishing House of the Romanian Academy, Vol. 5, No. 3, 2002, pp. 269 - 282,
ISSN: 1453 - 8245
[3] Dan-Marius Dobrea, A New Type of Sensor to Monitor the Body Torso Movements
Without Physical Contact, EMBEC'2002, Proceedings of Second European Medical
and Biological Engineering Conference, Vol. 3, Part 1, December 4-8, 2002, Vienna,
Austria, pp. 810-811, ISBN 3-901351-62-0
[4] Horia-Nicolai Teodorescu, Dan-Marius Dobrea, A Non-contact Respiration Monitoring System Based on a New Type of Sensor, Journal of the International Federation for Medical & Biological Engineering (submitted)
[5] Dan-Marius Dobrea, Horia-Nicolai Teodorescu, A Fuzzy System Used to Derive
Hand Movements from a New Virtual Joystick Interface Device, Scientific Bulletin
of The "POLITEHNICA" University of Timisoara, Vol. 1, No. 47(61),
2002, pp. 27 - 31, ISSN 1224 - 6034
[6] Carlos H. Morimoto, Myron Flickner, Real-Time Multiple Face Detection Using
Active Illumination, Proceedings of 4th IEEE International Conference on Automatic
Face and Gesture Recognition, March 26 - 30, 2000, Grenoble, France, pp. 8-13
[7] Ashish Kapoor, Rosalind W. Picard, Real-Time, Fully Automatic Upper Facial Feature Tracking, Proceedings of 5th IEEE International Conference on Automatic Face and Gesture Recognition, May 20 - 21, 2002, Washington D.C., USA, pp. 10-15
[8] Jocelyn Scheirer, Raul Fernandez, Rosalind W. Picard, Expression Glasses: A Wearable Device for Facial Expression Recognition, CHI '99 Conference on Human Factors in Computing Systems, May 15-20, 1999, Pittsburgh, USA
[9] S. J. Roberts, W. D. Penny, Real-time brain-computer interfacing: a preliminary study using Bayesian learning, Medical & Biological Engineering & Computing, Vol. 38, 2000, pp. 56-61
[10] T. Musha, Y. Terasaki, T. Takahashi, H. A. Haque, Numerical estimation of the state of mind, Proceedings of IIZUKA 1996, Japan, pp. 25-29
[11] Kenneth Michael Zawilinski, Emotional response analyzer system with multimedia display, Patent No. 5,676,139, United States, Publication date: October 14, 1997
[12] Vincent Cornellier, Thomas K. Ziegler, Biomonitoring stress management method and device, Patent No. 4,683,891, United States, Publication date: August 4, 1987
[13] Frederic de Coulon, Eddy Forte, Daniel Mlynek, Horia-Nicolai L. Teodorescu,
Stefan Suceveanu, Subject-State Analysis by Computer and Applications in CAE,
Proceedings of the International Conference on Intelligent Technologies in Human-Related
Sciences, ITHURS 1996, July 5-7, 1996, Leon, Spain, pp. 243-250
[14] Ark, W., Dryer, D., Lu, D., The Emotion Mouse, Proceedings of 8th International Conference on Human-Computer Interaction, Vol. 1, 1999, New Jersey, USA, pp. 818-823
[15] Horia-Nicolai Teodorescu, Position and movement resonant sensor, Patent No. 5,986,549, United States, Publication date: November 16, 1999
[16] Rosalind W. Picard, Affective Medicine: Technology with Emotional Intelligence,
MIT Media Lab Tech Report No 537, 2001
[17] Rosalind W. Picard, Jocelyn Scheirer, The Galvactivator: A glove that senses
and communicates skin conductivity, Proceedings of 9th International Conference
on Human-Computer Interaction, August 5-10, 2001, New Orleans, USA
[18] Task Force of the European Society of Cardiology and the North American Society of Pacing and Electrophysiology, Heart rate variability: standards of measurement, physiological interpretation, and clinical use, Circulation, Vol. 93, No. 5, March 1, 1996, pp. 1043-1065
[19] Horia-Nicolai Teodorescu, Dan-Marius Dobrea, A System to Monitor the Respiration and Movements Without Contact with the Body, Proceedings of the European Conference on Intelligent Technologies, ECIT'2000, Iasi, Romania, 2000, ISBN 973-95156-1-4
[20] Horia-Nicolai Teodorescu, Dan-Marius Dobrea, E. Forte, M. Wentland-Forte, A High Sensitivity Sensor for Proximity Measurements and Its Use in Virtual Reality Applications, Proceedings of the European Conference on Intelligent Technologies, ECIT'2000, Iasi, Romania, 2000, ISBN 973-95156-1-4
[21] Dan-Marius Dobrea, Horia-Nicolai Teodorescu, A New Type of Non-Contact 2D Multimodal Interface to Track and Acquire Hand Position and Tremor Signal, BEC'2002, Proceedings of the Baltic Electronics Conference, October 6-9, 2002, Tallinn, Estonia, pp. 359-362, ISBN 9985-59-292-1
[22] A. Pease, Body Language: How to Read Others' Thoughts by Their Gestures, Sheldon Press, 18th edition, London, 1992, ISBN 0-85969-782-7
[23] Dan-Marius Dobrea, A New Type of Non-Contact 3D Multimodal Interface to
Track and Acquire Hand Position and Tremor Signal, ECIT'2002, Proceedings of
European Conference on Intelligent Technologies 2002, July 20-22, Iasi, Romania,
ISBN 973- 8075-20-3
[24] T. Musha, Y. Terasaki, H. A. Haque, G. A. Ivanitsky, Feature extraction from EEG associated with emotions, Artificial Life and Robotics, Vol. 1, 1997, pp. 15-19