Research


Haptic Perception
My research explores haptic perception. Haptic perception can be defined as active tactile perception, i.e. perception resulting from tactile information gathered through active exploration of objects with the hands. The aim of this research project is twofold: to study human haptic perception with the aid of computational and robot models, and to design and implement an artificial system with a capacity for haptic perception. So far, several bio-inspired robot systems using different approaches to recognize shape, size and individual objects have been implemented. All my systems so far are self-organizing, i.e. they learn from examples. I design and implement all parts of the systems: I design and build the tactile grippers as well as design the bio-inspired computational models and implement them in software. The systems employ various kinds of AI algorithms, especially artificial neural networks. My haptic systems make heavy use of IKAROS, an infrastructure for brain modelling and the control of robots.
The first step in the project was to develop a limited system able to learn to categorize explored objects according to size. To this end, a simple robot hand, the LUCS Haptic Hand I, equipped with nine touch-sensitive piezoelectric sensors, was built. In addition, a number of computational models of the human haptic system were designed and implemented. The LUCS Haptic Hand I was successful in generating tactile signal patterns containing enough information for size categorization, and the computational models have demonstrated an ability to learn to categorize objects according to size.
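Since the systems are self-organizing, their core is typically a self-organizing map (SOM) trained on raw sensor patterns. As a rough illustration only (this is not one of the actual LUCS models; the nine-sensor input dimension is taken from the LUCS Haptic Hand I, while the map size, parameters and the fake "small"/"large" patterns are assumptions), a minimal SOM can be sketched like this:

```python
import numpy as np

# Minimal self-organizing map (SOM) sketch for tactile size categorization.
# Nine inputs mirror the nine touch sensors; everything else is illustrative.
rng = np.random.default_rng(0)

GRID = 8            # 8x8 map of units
DIM = 9             # one input value per touch sensor
weights = rng.random((GRID, GRID, DIM))

def train(patterns, epochs=20, lr0=0.5, sigma0=3.0):
    coords = np.stack(
        np.meshgrid(np.arange(GRID), np.arange(GRID), indexing="ij"), axis=-1)
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)           # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)     # shrinking neighbourhood
        for x in patterns:
            # find the best-matching unit (BMU)
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # move the BMU and its Gaussian neighbourhood toward the input
            h = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1)
                       / (2 * sigma ** 2))
            weights[...] += lr * h[..., None] * (x - weights)

def bmu_of(x):
    d = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(d), d.shape)

# Fake tactile patterns: "small" objects press few sensors, "large" press many.
small = rng.random((30, DIM)) * np.array([1, 1, 1, 0, 0, 0, 0, 0, 0])
large = rng.random((30, DIM))
train(np.vstack([small, large]))
```

After training, inputs of different size classes tend to activate different regions of the map, which is the sense in which the system "learns to categorize by examples" without any labels.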
The LUCS Haptic Hand I was only capable of passive touch. Therefore, its successor, the LUCS Haptic Hand II, was developed. The LUCS Haptic Hand II is an 8 d.o.f. robot hand equipped with 45 piezoelectric touch sensors. The robot hand consists of three fingers, each with two phalanges, mounted on a triangular plastic plate. It is equipped with a wrist and a mechanism for vertical repositioning. The piezoelectric touch sensors are distributed over the palmar side of the finger phalanges. The LUCS Haptic Hand II is used in a sequence of artificial systems for haptic shape perception. These systems actively explore objects through several grasps with the robot hand to gather cutaneous and proprioceptive information. The systems make heavy use of tensor product operations, or of a novel neural network called the Tensor Multiple Peak SOM (T-MPSOM), to code the tactile information in a useful way. The systems are able to learn to map different shapes to different parts of a self-organizing map (SOM). A picture of the LUCS Haptic Hand II is available here, and a movie that shows the robot hand (before it was equipped with tactile sensors) grasping a ball is available here.
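The tensor product step can be illustrated in isolation: the outer product of a cutaneous vector and a proprioceptive vector yields a joint code in which every sensor/joint combination gets its own component, so the same pressure pattern at different hand postures produces different codes. The vector sizes and values below are made up for illustration; the actual T-MPSOM algorithm is described in the publications listed below.

```python
import numpy as np

# Sketch of tensor-product coding: combining a cutaneous vector (touch
# sensor readings) with a proprioceptive vector (normalized joint angles)
# into one joint code. Sizes and values are illustrative assumptions.
cutaneous = np.array([0.0, 0.8, 0.3, 0.0, 0.9])   # per-sensor pressure
proprio   = np.array([0.2, 0.7, 0.5])             # normalized joint angles

joint_code = np.outer(cutaneous, proprio)         # 5x3 tensor product
flat = joint_code.ravel()                          # input vector for a SOM
```

Each element of `joint_code` is the product of one sensor reading and one joint angle, so the flattened vector can feed a downstream SOM that separates shapes by both where the object touches and how the fingers are flexed.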
The LUCS Haptic Hand III is a five-fingered 12 d.o.f. robot hand equipped with 11 proprioceptive sensors. It was developed to provide an anthropomorphic robot hand. A successful haptic system based solely on proprioception, with the capacity to recognize shape, size and individual objects, has been implemented with this robot hand. This system has also shown great generalization ability. A picture of the robot hand is available here, and a couple of pictures of the robot hand equipped with 18 very sensitive experimental binary tactile sensors of my own design are available here and here. There are also some movies of the robot hand. One movie, showing the robot hand shaking hands with its creator, is available here. Another movie, showing the robot hand grasping a beer can, is available here. A movie that shows finger movements from one side is available here, and another that shows the finger movements from the other side is available here. Some people have reported problems viewing the movies of the LUCS Haptic Hand III; if you experience such problems, the handshake movie is also available in another format here.
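Why proprioception alone can suffice: when a grasp settles around an object, the joint-angle vector at rest is itself a signature of the object's shape and size. The published system is self-organizing; the nearest-neighbour sketch below (with made-up angle values and labels) only illustrates this underlying idea, not the actual implementation.

```python
import numpy as np

# Minimal sketch of object recognition from proprioception alone: compare a
# new grasp's joint-angle vector to stored example grasps and return the
# label of the closest one. All angle values here are invented.
def recognize(grasp, known):
    """Return the label of the stored grasp vector closest to the new one."""
    return min(known, key=lambda item: np.linalg.norm(item[1] - grasp))[0]

known = [
    ("ball",   np.array([0.6, 0.6, 0.6, 0.6, 0.6])),  # fingers curl evenly
    ("bottle", np.array([0.9, 0.3, 0.3, 0.3, 0.9])),  # thumb and pinky wrap
]
label = recognize(np.array([0.58, 0.62, 0.60, 0.59, 0.61]), known)  # "ball"
```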
My current efforts aim at developing an anthropomorphic robot arm and at studying visual perception and computer vision, in order to develop a visual system that can guide the robot arm and interact with the haptic system. The robot arm has 14 d.o.f. in total and will be equipped with at least 10 proprioceptive sensors. The robot hand is designed in a similar fashion to the LUCS Haptic Hand III but is much smaller. The arm is equipped with a flexible elbow and a shoulder that allows both horizontal and vertical rotation. The robot arm will be used for further experimentation with artificial haptic systems as well as in models of the interaction between the haptic and visual systems.
Others
I am a member of the AI group at the Department of Computer Science at Lund University.
I'm involved in teaching at the Department of Computer Science at Lund University.
I use IKAROS, an infrastructure for brain modelling and the control of robots. It is heavily employed in my modelling of the human haptic system.
I have done simulations of the reorganization of the somatosensory cortex after injuries to the nerves between the hand and the brain. The aim was to determine what kind of tactile stimulation should be provided after a surgical replantation of a lost hand in order to optimize recovery.
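The basic mechanism behind such reorganization can be illustrated with a toy competitive-learning model: when part of the input is silenced, units that used to respond to the lost skin sites gradually retune to the remaining ones. This is purely illustrative (sizes, learning rule and parameters are assumptions), not the model from the thesis.

```python
import numpy as np

# Toy illustration of cortical map reorganization after nerve injury:
# "cortical" units tuned to skin sites; after the injury silences some
# sites, winning units retune toward the remaining inputs.
rng = np.random.default_rng(1)
sites = 10                      # skin sites on the hand
units = 20                      # cortical units
w = rng.random((units, sites))  # tuning weights, in [0, 1)

def step(active_sites, lr=0.1):
    x = np.zeros(sites)
    x[active_sites] = 1.0
    winner = np.argmax(w @ x)            # most responsive unit wins
    w[winner] += lr * (x - w[winner])    # winner moves toward the input

# Before injury: all sites receive stimulation.
for _ in range(200):
    step([rng.integers(0, sites)])
# After injury: sites 0-4 are denervated, only sites 5-9 give input.
for _ in range(200):
    step([rng.integers(5, sites)])
```

Over the second phase, the winners' weights for the denervated sites decay toward zero while weights for the intact sites grow, a crude analogue of the cortical territory of the lost input being taken over by the remaining skin.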
I have been involved in experiments, together with IBIS at the University of Alicante, on using different kinds of neural networks as an aid in diagnosing dysfunctions of the lower urinary tract.

Publications

2008

Gil Mendez, D., Johnsson, M., Garcia Chamizo, J.M., Soriano Paya, A., & Ruiz Fernandez, D. (Submitted). Application of artificial neural networks in the diagnosis of urological dysfunctions. Journal of Expert Systems With Applications.

Johnsson, M., & Balkenius, C. (Submitted). A Self-Organizing Object Recognition System Based on Proprioception. Journal of Robotics and Autonomous Systems.

Gil Mendez, D., Johnsson, M., Soriano Paya, A., & Ruiz Fernandez, D. (2008). Artificial Neural Networks for Diagnoses of Dysfunctions in Urology. Proceedings of Healthinf 2008, Madeira, Portugal.

2007

Johnsson, M., & Balkenius, C. (2007). Experiments with Proprioception in a Self-Organizing System for Haptic Perception. In Wilson, M., S., Labrosse, F., Nehmzow, U., Melhuish, C., Witkowski, M. (Eds.) Towards Autonomous Robotic Systems 2007, University of Wales, Aberystwyth, UK, 239-245.

Balkenius, C., Johnsson, M., & Johansson, B. (2007). Explorations in Cognitive Robotics. In Löfström, T., Johansson, U., Sönströd, C., König, R. and Niklasson, L. (Eds.) Proceedings of SAIS 2007, University College of Borås, Borås, Sweden, 189-192.

Johnsson, M., & Balkenius, C. (2007). LUCS Haptic Hand III - An Anthropomorphic Robot Hand with Proprioception. LUCS Minor 13.

Johnsson, M., & Balkenius, C. (2007). Neural Network Models of Haptic Shape Perception. Journal of Robotics and Autonomous Systems, 55, 720-727. Abstract

2006

Johnsson, M., & Balkenius, C. (2006). Haptic Perception with a Robotic Hand. In Honkela, T., Raiko, T., Kortela, J. and Valpola, H. (Eds.) Proceedings of the Ninth Scandinavian Conference on Artificial Intelligence (SCAI 2006), Helsinki University of Technology, Espoo, Finland, 127-134.

Johnsson, M., & Balkenius, C. (2006). A Robot Hand with T-MPSOM Neural Networks in a Model of the Human Haptic System. In Witkowski, M., Nehmzow, U., Melhuish, C., Moxey, E. and Ellery, A. (Eds.) Towards Autonomous Robotic Systems 2006, Surrey University, Guildford, UK, 80-87.

Johnsson, M., & Balkenius, C. (2006). LUCS Haptic Hand II. LUCS Minor, 9.

Johnsson, M., & Balkenius, C. (2006). Experiments with Artificial Haptic Perception in a Robotic Hand. Journal of Intelligent and Fuzzy Systems, 17, 4, 377-385. Abstract

2005

Johnsson, M., Pallbo, R., & Balkenius, C. (2005). A Haptic System for the LUCS Haptic Hand I. In Mira, J., and Alvarez, J.R. (Eds.), Mechanisms, Symbols, and Models Underlying Cognition: First International Work-Conference on the Interplay Between Natural and Artificial Computation, IWINAC 2005, Las Palmas, Canary Islands, Spain, June 15-18, 2005, Proceedings, Part I, 386-395. Springer-Verlag. Abstract

Johnsson, M., Pallbo, R., & Balkenius, C. (2005). Experiments with Haptic Perception in a Robotic Hand. In Funk, P., Rognvaldsson, T., and Xiong, N. (Eds.) Advances in Artificial Intelligence in Sweden, Västerås, Sweden: Mälardalen University, 81-86.

Johnsson, M. (2005). LUCS Haptic Hand I – Technical Report. LUCS Minor, 8.

Theses

2004

Johnsson, M. (2004). Cortical Plasticity – A Model of Somatosensory Cortex. A study of cortical reorganization after nerve injuries between the hand and the brain, with the aid of computational models. MSc thesis from the Dept. of Cognitive Science, Lund University.

2003

Johnsson, M. (2003). Force feedback and cyber sickness in virtual reality. Examinations of force feedback usability in virtual reality, and of cyber sickness in virtual environments. MSc thesis from the Dept. of Computer Science, Lund University. (In Swedish with an English summary).