Multimodal human-machine interactions for bedridden people
Thesis subject for a doctorate in computer science, BCI team, CRIStAL laboratory, University of Lille, 2024 campaign
- PhD thesis 2024-2027 in computer science
- Location: Lille (CRIStAL, CNRS, University of Lille, in Villeneuve d’Ascq), BCI team
- Thesis supervisor: José Rouillard (Associate Professor, HDR, CRIStAL, BCI team, University of Lille)
- Funding: Scholarship from the University of Lille (application in progress)
Summary: This thesis subject proposes to study human-machine interactions for bedridden people whose abilities have been reduced by an accident or illness. Patients who have suffered an accident or who live with a degenerative disease are often bedridden and dependent on other people, or on robots, to carry out tasks and to communicate with those around them. Under these conditions, traditional means of interaction (keyboard/mouse) can no longer be used easily, and new ways must be implemented to offer adaptive and personalized communication between the patient and the machine. Speech interfaces, visual interfaces (gaze tracking, detection of postures, gestures or micro-gestures) and even brain-computer interfaces can allow a bedridden person to communicate (see locked-in syndrome for the most extreme cases, in which the patient can no longer move their muscles or communicate orally at all). Each case requires lengthy preparation and adjustment of the computer system. This thesis therefore seeks automatic adaptation of the system, to allow personalization not only to the patient but also to their entourage and to the evolution of their pathology.
Keywords: multimodality, natural user interface, health, disability, bedridden
Scientific and economic context
Despite the progress made in the field of human-machine interfaces (HMI), implementing effective solutions that can be used on a daily basis remains a real scientific, technological and societal challenge. Dependent people (at home, in hospital, in nursing homes, etc.) and/or disabled people have a crucial need to communicate, but no simple, effective solution that is easily adaptable to their physical and cognitive abilities is yet truly available. When these users are bedridden, certain tools and technologies are only partially usable. For example, the occipital part of the skull often rests on a pillow, so placing and using electrodes over this region of the scalp is very difficult, if not impossible. It is therefore necessary to study other, complementary means of establishing effective communication with the patient, and to propose original ideas to explore, such as the design of an adaptable pillow incorporating electrodes.
In this doctoral thesis in computer science, we propose to study different interaction modalities that can be used successively or jointly by a patient lying in bed. A multimodal approach (Coutaz et al. 1995) could probably improve communication performance by coupling endogenous and exogenous solutions that are already known but rarely used simultaneously: gaze tracking, speech when possible, detection of gestures and micro-gestures, detection of movement intention by electroencephalography (EEG) or magnetoencephalography (MEG), etc. (Corsi et al. 2019).
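As a purely illustrative sketch of what such a coupling could look like at the software level, the example below combines the outputs of several per-modality classifiers through a simple weighted, decision-level fusion. The modality names, weights, command set and probabilities are assumptions introduced here for illustration, not elements of the proposed system.

```python
# Minimal sketch of decision-level ("late") fusion of several interaction
# modalities, assuming each modality already provides per-class probabilities.
import numpy as np

# Hypothetical per-modality posterior probabilities for three commands,
# e.g. produced by independent classifiers.
commands = ["yes", "no", "call for help"]
modality_scores = {
    "gaze":   np.array([0.60, 0.25, 0.15]),
    "speech": np.array([0.10, 0.70, 0.20]),
    "eeg":    np.array([0.40, 0.35, 0.25]),
}

# Per-modality reliability weights, which could be adapted to the patient,
# the session or the evolution of the pathology (fixed here for the example).
weights = {"gaze": 0.5, "speech": 0.2, "eeg": 0.3}

def fuse(scores, weights):
    """Weighted sum of per-modality probabilities, renormalized."""
    fused = sum(weights[m] * p for m, p in scores.items())
    return fused / fused.sum()

fused = fuse(modality_scores, weights)
print("fused probabilities:", dict(zip(commands, fused.round(3))))
print("selected command:", commands[int(np.argmax(fused))])
```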
The state of the subject in the host laboratory
The BCI (Brain-Computer Interface) team of the CRIStAL laboratory is particularly interested in brain-computer interfaces for patients suffering from severe disabilities and/or illnesses that prevent them from using traditional HMIs to communicate and act on the world. We have been collaborating for many years with various organizations in the health field (Lille University Hospital, INSERM, the SCALAB laboratory, etc.) to study solutions based on non-invasive BCIs. The theses of Alban Duprès (2013-2016) (Duprès 2016) and Jimmy Petit (2019-2022) (Petit 2022) have advanced the study of hybrid multimodal BCIs and the exploitation of somesthetic evoked potentials, in particular by studying cerebral responses to vibrations applied to the wrists of patients. This thesis subject aims to explore other avenues that are still little exploited, such as the synergistic use of different interaction modalities (visual, auditory, kinesthetic) in order to allow the patient to regain autonomy by dialoguing with their entourage or with an assistant robot. This could be done with partners from the European Brain and Technology Alliance, in which our research team and the University of Lille are involved.
Objectives and expected results
This involves modeling and designing a truly usable human-machine dialogue solution for people who can no longer rely on a single mode of interaction. BCI will be one of the potential solutions, to be coupled with other means of interacting and adapted according to the user, the session, the context, etc. A demonstrator will be developed, initially within the interaction rooms of the CRIStAL laboratory, then tested outside the laboratory (depending on the availability of our partners, such as the Lille University Hospital or the Hopale Foundation in Berck, for example).
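As a hypothetical sketch of what "adapted according to the user, the session, the context" could mean in software, the example below keeps a per-patient interaction profile and derives the set of modalities to enable for a session. All field names and adaptation rules are illustrative assumptions, not a specification of the demonstrator.

```python
# Minimal sketch of a per-patient interaction profile driving which
# modalities the dialogue system enables for a given session.
from dataclasses import dataclass, field

@dataclass
class InteractionProfile:
    patient_id: str
    # Modalities currently usable by the patient, updated as the pathology evolves.
    available: set = field(default_factory=set)
    # Session-level context flags (e.g. fatigue, noisy room).
    context: dict = field(default_factory=dict)

    def active_modalities(self):
        """Return the modalities to enable for this session."""
        active = set(self.available)
        if self.context.get("fatigued"):
            active.discard("gaze")      # gaze tracking is tiring over long sessions
        if self.context.get("noisy_room"):
            active.discard("speech")    # speech recognition unreliable in noise
        return active or {"eeg"}        # fall back to BCI if nothing else remains

profile = InteractionProfile(
    patient_id="P01",
    available={"gaze", "speech", "eeg"},
    context={"noisy_room": True},
)
print(profile.active_modalities())      # prints the enabled modalities: gaze and EEG
```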
Provisional work program
First year:
- State of the art: A bibliographic study will be conducted by the candidate in order to identify the state of the art in the field for these bedridden users, both in terms of signal processing and of the ergonomics of the solutions currently proposed around natural user interfaces (Han et al. 2023; Spandana et al. 2021). Typically, it has been shown that a steady-state visually evoked potential (SSVEP) is relatively easy to detect over the occipital part of the skull, but that for bedridden patients this solution is difficult or even impossible to implement (see the sketch after this list).
- Usage study: The candidate will then consult our partners (Hub Santé, health foundations and organizations, medical homes, etc.) about the uses and technologies currently employed to help patients with reduced mobility and/or so-called "impeded" users (following a stroke, an accident, a progressive illness, etc.) to perform various tasks, with and without the help of a computerized system (communicating, asking for occasional help, calling for emergency help, etc.).
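For reference, the sketch below shows the kind of CCA-based SSVEP detection commonly reported in the literature, applied here to simulated data; it relies on signals recorded over the occipital region, which is precisely what is hard to obtain when the head rests on a pillow. The sampling rate, channel count and candidate flicker frequencies are illustrative assumptions.

```python
# Minimal sketch of CCA-based SSVEP frequency detection on simulated data.
import numpy as np
from sklearn.cross_decomposition import CCA

fs, duration, n_channels = 250, 2.0, 4        # assumed recording setup
t = np.arange(0, duration, 1 / fs)
rng = np.random.default_rng(0)

# Simulated occipital EEG epoch: a 12 Hz SSVEP buried in noise.
target_freq = 12.0
eeg = (np.sin(2 * np.pi * target_freq * t)[:, None]
       + 2.0 * rng.standard_normal((t.size, n_channels)))

def reference_signals(freq, t, n_harmonics=2):
    """Sine/cosine references at the stimulation frequency and its harmonics."""
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)

def cca_score(eeg, refs):
    """First canonical correlation between the EEG epoch and the references."""
    cca = CCA(n_components=1)
    cca.fit(eeg, refs)
    u, v = cca.transform(eeg, refs)
    return abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1])

candidate_freqs = [10.0, 12.0, 15.0]          # hypothetical flicker frequencies
scores = {f: cca_score(eeg, reference_signals(f, t)) for f in candidate_freqs}
detected = max(scores, key=scores.get)
print("CCA scores:", {f: round(s, 3) for f, s in scores.items()})
print("detected stimulation frequency:", detected, "Hz")
```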
Second year:
- System modeling; Study protocol; Preliminary laboratory study and development
Third year:
A laboratory experiment will be conducted to collect data and to determine whether the scientific hypotheses put forward are validated or rejected. Ideally, an experimental campaign outside the laboratory will also be carried out (medical homes, patients' homes, etc.) in order to test the usability of the proposed solutions in situ. The year will conclude with the publication of the scientific results obtained, the writing of the thesis and the preparation of the student's professional project.
Application and skills sought
The successful candidate must hold a Master (M2) or equivalent in computer science and must show a strong interest in carrying out high-quality research. The candidate must have experience in, or a strong interest in, software development (Python, C#, JS, Firebase, MQTT, Node-RED, Unity, etc.) and health-oriented human-machine interaction. Skills in signal processing (EEG, EMG, etc.), data fusion/fission and multimodality will be a plus.
Creativity, autonomy, team spirit and good communication skills are valuable assets. A good level of technical and scientific English will also be appreciated.
Finally, concerning the question of secularism, the University of Lille's legal counsel points out that a contractual doctoral student, whether or not they hold a teaching position, is considered a public servant and therefore may not display their religious affiliation, in particular by wearing a sign or an outfit intended to mark that affiliation.
If this thesis subject interests you, please send an application email as soon as possible to jose.rouillard@univ-lille.fr with a CV, a cover letter, transcripts, and any other element allowing us to assess your application.
References
- Corsi MC, Chavez M, Schwartz D, Hugueville L, Khambhati AN, Bassett DS, De Vico Fallani F. Integrating EEG and MEG Signals to Improve Motor Imagery Classification in Brain-Computer Interface. Int J Neural Syst. 2019 Feb;29(1):1850014. Epub 2018 Apr 2. PMID: 29768971. https://doi.org/10.1142/S0129065718500144.
- Coutaz J, Nigay L, Salber D, Blandford A, May J, Young RM. Four easy pieces for assessing the usability of multimodal interaction: the CARE properties. INTERACT, 1995.
- Duprès A. Interface cerveau-machine hybride pour pallier le handicap causé par la myopathie de Duchenne. Thèse de doctorat, Traitement du signal et de l'image, Université Lille 1 – Sciences et Technologies, 2016. https://hal.science/tel-01411217/file/these.pdf.
- Duprès A, Cabestaing F, Rouillard J, Tiffreau V, Pradeau C. Toward a hybrid brain-machine interface for palliating motor handicap with Duchenne muscular dystrophy: A case report. Annals of Physical and Rehabilitation Medicine, Elsevier Masson, 2019. https://doi.org/10.1016/j.rehab.2019.07.005.
- Han Y, Zhang X, Zhang N, Meng S, Liu T, Wang S, Pan M, Zhang X, Yi J. Hybrid Target Selections by "Hand Gestures + Facial Expression" for a Rehabilitation Robot. Sensors, 2023, 23(1):237. https://doi.org/10.3390/s23010237.
- Petit J. Filtrage somesthésique pour des interfaces cerveau-ordinateur utilisant des stimulations vibro-tactiles. Thèse de doctorat, Université de Lille, 2022 (in English). https://theses.hal.science/tel-04207062/file/These_PETIT_Jimmy.pdf.
- Petit J, Rouillard J, Cabestaing F. EEG-based Brain-Computer Interfaces exploiting Steady-State Somatosensory-Evoked Potentials: A Literature Review. Journal of Neural Engineering, IOP Publishing, 2021, 18(5):051003. https://doi.org/10.1088/1741-2552/ac2fc4.
- Spandana E, et al. Care-giver alerting for bedridden patients using hand gesture recognition system. J. Phys.: Conf. Ser., 2021, 1921:012077. https://doi.org/10.1088/1742-6596/1921/1/012077.