Robotic Grasp Initiation by Gaze Independent Brain-controlled Selection of Virtual Reality Objects

Christoph Reichert, Matthias Kennel, Rudolf Kruse, Hans-Jochen Heinze, Ulrich Schmucker, Hermann Hinrichs, Jochem W. Rieger

2013

Abstract

Assistive devices controlled by human brain activity could help severely paralyzed patients to perform everyday tasks such as reaching and grasping objects. However, the continuous control of anthropomorphic prostheses requires control of a large number of degrees of freedom, which is challenging with the information transfer rates currently achievable with noninvasive brain-computer interfaces (BCIs). In this work we present an autonomous grasping system that allows grasping of natural objects even with the very low information transfer rates obtained with noninvasive BCIs. The grasp of one out of several objects is initiated by decoded voluntary brain wave modulations. A universal online grasp planning algorithm was developed that grasps the object selected by the user in a virtual reality environment. Our results with human subjects demonstrate that the training effort required to control the system is very low (<10 min) and that decoding accuracy increases over time. We also found that the system works most reliably when subjects freely select objects and receive virtual grasp feedback.
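The abstract describes a two-stage architecture: a noninvasive BCI decodes which of several virtual objects the user intends to grasp, and an autonomous online grasp planner then executes the grasp for that object without further brain control. The Python sketch below is a minimal illustration of that pipeline, not the authors' implementation: the LDA classifier, the downsampled-epoch features, the object and repetition counts, and the plan_and_execute planner interface are assumptions introduced for illustration only.

# Hedged sketch of a selection-then-grasp pipeline (assumed design, not the paper's method).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

N_OBJECTS = 4        # assumed number of selectable virtual objects
N_REPETITIONS = 10   # assumed highlighting repetitions per selection

def extract_features(epoch):
    """Reduce one EEG/MEG epoch (channels x samples) to a feature vector.
    Simple temporal downsampling per channel (an assumption)."""
    step = max(1, epoch.shape[1] // 12)
    return epoch[:, ::step].ravel()

def train_decoder(epochs, labels):
    """Fit a binary target/non-target classifier on labelled calibration epochs.
    The abstract reports <10 min of training; the LDA choice here is an assumption."""
    X = np.vstack([extract_features(e) for e in epochs])
    clf = LinearDiscriminantAnalysis()
    clf.fit(X, labels)
    return clf

def select_object(clf, epochs_per_object):
    """Accumulate classifier evidence over repetitions and pick the object
    with the highest summed target score."""
    scores = np.zeros(N_OBJECTS)
    for obj, epochs in enumerate(epochs_per_object):
        feats = np.vstack([extract_features(e) for e in epochs])
        # decision_function: signed distance to the target/non-target boundary
        scores[obj] = clf.decision_function(feats).sum()
    return int(np.argmax(scores))

def initiate_grasp(object_id, grasp_planner):
    """Once a single object is decoded, the autonomous planner takes over and
    computes/executes the grasp without further BCI input."""
    grasp_planner.plan_and_execute(object_id)  # hypothetical planner interface

In such a design, the low BCI information transfer rate only has to convey one discrete choice per trial, while all continuous degrees of freedom of the reach and grasp are handled by the planner.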



Paper Citation


in Harvard Style

Reichert C., Kennel M., Kruse R., Heinze H., Schmucker U., Hinrichs H. and Rieger J. W. (2013). Robotic Grasp Initiation by Gaze Independent Brain-controlled Selection of Virtual Reality Objects. In Proceedings of the International Congress on Neurotechnology, Electronics and Informatics - Volume 1: NEUROTECHNIX, ISBN 978-989-8565-80-8, pages 5-12. DOI: 10.5220/0004608800050012


in Bibtex Style

@conference{neurotechnix13,
author={Christoph Reichert and Matthias Kennel and Rudolf Kruse and Hans-Jochen Heinze and Ulrich Schmucker and Hermann Hinrichs and Jochem W. Rieger},
title={Robotic Grasp Initiation by Gaze Independent Brain-controlled Selection of Virtual Reality Objects},
booktitle={Proceedings of the International Congress on Neurotechnology, Electronics and Informatics - Volume 1: NEUROTECHNIX},
year={2013},
pages={5-12},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004608800050012},
isbn={978-989-8565-80-8},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Congress on Neurotechnology, Electronics and Informatics - Volume 1: NEUROTECHNIX
TI - Robotic Grasp Initiation by Gaze Independent Brain-controlled Selection of Virtual Reality Objects
SN - 978-989-8565-80-8
AU - Reichert C.
AU - Kennel M.
AU - Kruse R.
AU - Heinze H.
AU - Schmucker U.
AU - Hinrichs H.
AU - Rieger J. W.
PY - 2013
SP - 5
EP - 12
DO - 10.5220/0004608800050012