Human-Computer Interaction

Implementation of Augmented Reality Interface for Elderly and Patients Undergoing Rehabilitation

We design and implement augmented reality interfaces to reduce the various physical and mental problems that elderly people and rehabilitation patients experience. We are collaborating with National Cheng Kung University (Taiwan) on a finger rehabilitation device that uses gamification with music. We are also collaborating with the University of Trento (Italy) on a project investigating the effect of visualizing a patient's physical information during sit-to-stand rehabilitation.

  • Yanjiao Ao, Masayuki Kanbara, Yuichiro Fujimoto, Hirokazu Kato, “MR System to Promote Social Participation of People Who Have Difficulty Going Out,” International Conference on Human-Computer Interaction (HCII2021), Springer, Vol.12787, pp.383-402, 2021.
  • Isidro III Butaslac, Alessandro Luchetti, Edoardo Parolin, Yuichiro Fujimoto, Masayuki Kanbara, Mariolino De Cecco, Hirokazu Kato, “The Feasibility of Augmented Reality as a Support Tool for Motor Rehabilitation,” International Conference on Augmented Reality, Virtual Reality and Computer Graphics (Salento AVR), Springer, Online, 7 Sep. 2020.
  • Naoki Inoue, Yuichiro Fujimoto, Alexander Plopski, Sayaka Okahashi, Masayuki Kanbara, Hsiu-Yun Hsu, Li-Chieh Kuo, Fong-Chin Su, Hirokazu Kato, “Effect of Display Location on Finger Motor Skill Training with Music-based Gamification,” International Conference on Human-Computer Interaction (HCII2020), Springer, pp.78-79, Online, 22 Jul. 2020.
  • Michele Stocco, Alessandro Luchetti, Paolo Tomasin, Alberto Fornaser, Patrizia Ianes, Giovanni Guandalini, J. Flores Ty, Sayaka Okahashi, Alexander Plopski, Hirokazu Kato, Mariolino De Cecco, “Augmented Reality to Enhance the Clinical Eye: The Improvement of ADL Evaluation by Mean of a Sensors Based Observation,” 16th EuroVR International Conference (EuroVR 2019), Springer, Vol.LNCS 11883, pp.291-296, Tallinn, Estonia, 23 Oct. 2019.
  • Alexander Plopski, Ada Virginia Taylor, Elizabeth Jeanne Carter, Henny Admoni, “InvisibleRobot: Facilitating Robot Manipulation Through Diminished Reality,” Poster in Proceedings of the International Symposium on Mixed and Augmented Reality, Beijing, China, 14 Oct. 2019
  • Silviya Hasana, Yuichiro Fujimoto, Alexander Plopski, Masayuki Kanbara, Hirokazu Kato, “Improving Color Discrimination for Color Vision Deficiency (CVD) with Temporal-domain Modulation,” International Symposium on Mixed and Augmented Reality, Beijing, China, 14 Oct. 2019

Guideline on Designing and Implementing Augmented Reality Systems for Supporting Assembly Tasks

We proposed a guideline to support developers who aim to design assistive systems using augmented reality, and we have published it online. With this guideline, even people who are not familiar with augmented reality can design applications with high-quality specifications.

  • Keishi Tainaka, Yuichiro Fujimoto, Masayuki Kanbara, Hirokazu Kato, Atsunori Moteki, Kensuke Kuraki, Kazuki Osamura, Toshiyuki Yoshitake, Toshiyuki Fukuoka, “Guideline and Tool for Designing an Assembly Task Support System Using Augmented Reality,” IEEE International Symposium on Mixed and Augmented Reality (ISMAR), IEEE, Online, 9 Nov. 2020.

Support and training system using virtual and augmented reality for public speaking (social skills)

We are researching VR/AR methods to train and support social skills such as giving presentations. In this process, we verified the effectiveness of a VR-based presentation recording and review method for alleviating the fear of public speaking. We are currently developing a real-time feedback method.

  • Hangyu Zhou, Yuichiro Fujimoto, Masayuki Kanbara, Hirokazu Kato, “Virtual Reality as a Reflection Technique for Public Speaking Training,” Applied Sciences, MDPI, Vol.11, No.9, p.3988, Apr. 2021.
  • Hangyu Zhou, Yuichiro Fujimoto, Masayuki Kanbara, Hirokazu Kato, “Effectiveness of Virtual Reality Playback in Public Speaking Training,” Workshop on Social Affective Multimodal Interaction for Health (SAMIH) in 22nd ACM International Conference on Multimodal Interaction (ICMI2020), ACM, Online, 29 Oct. 2020.

Human-Robot Interactions (HRI) for Boosting Daily-Use Motivation

Our research focuses on implementing multimodal conversation systems that use visual, haptic, and gesture information to keep daily life as safe, secure, and comfortable as possible. We study various human-robot interfaces, including not only physical robots but also robots across the virtuality-reality spectrum (e.g., augmented reality [AR] robots and virtual robots).

  • Shogo Nishimura, Masayuki Kanbara, Norihiro Hagita, “Atmosphere Sharing with TV Chat Agents for Increase of User’s Motivation for Conversation,” International Conference on Human-Computer Interaction 2019 (HCI International), Springer, pp.482-492, 29 Jul. 2019.

XR Mobility Platform for Passenger Comforts (HRI for Self-driving car)

Our research focuses on realizing Comfort Intelligence (CI) for autonomous vehicles. CI is an intelligent system that accounts for passengers' comfort inside a moving vehicle. Autonomous vehicle stress and autonomous vehicle motion sickness are the two main factors that make passengers uncomfortable. We propose methods to reduce stress and motion sickness using an XR mobility platform, which consists of immersive displays and a motion platform with tilting seats mounted inside an autonomous vehicle.

  • Taishi Sawabe, Masayuki Kanbara, Yuichiro Fujimoto, Hirokazu Kato, “XR Mobility Platform: Multi-Modal XR System Mounted on Autonomous Vehicle for Passenger’s Comfort Improvement,” IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2021.

Touch Care Robot (HRI)

Our research focuses on realizing comfortable touch-care robots. A gentle touch is an essential part of human interaction and produces a positive care effect. Previous robotics studies have shown that robots can reproduce a gentle touch that elicits similarly positive emotional responses in humans. However, whether combining touch with speech by a robot could further enhance these positive emotional responses had not been established. This study aims to support the hypothesis that a multimodal interaction combining gentle touch and gentle speech by a robot enhances positive emotional responses.

  • Taishi Sawabe, Suguru Honda, Yuichiro Fujimoto, Masayuki Kanbara and Hirokazu Kato, 3rd International Workshop on Virtual, Augmented, and Mixed Reality for Human-Robot Interaction, 2020.
  • Suguru Honda, Taishi Sawabe, Shogo Nishimura, Sato Wataru, Yuichiro Fujimoto, Alexander Plopski, Masayuki Kanbara and Hirokazu Kato, Proceedings of 7th annual International Conference on Human-Agent Interaction, 2019.

Computer Vision

Body Pose Estimation for Athlete

Movement analysis and visualization for sprinters can be improved by estimating the pose of each of the sprinter's body parts from a series of RGB images.

Projector-based Augmented Reality

Projection-based augmented reality is a technique that visualizes digital information on real scenes by light projection. Recently, we have been researching projection mapping that can be used in places with strong ambient light, such as outdoors, and technology for projecting onto moving objects using a special camera called an event camera.

  • Ryo Akiyama, Goshiro Yamamoto, Toshiyuki Amano, Takafumi Taketomi, Alexander Plopski, Yuichiro Fujimoto, Masayuki Kanbara, Christian Sandor, Hirokazu Kato, “Illusory Light: Perceptual Appearance Control Using a Projection-Induced Illusion,” Computers & Graphics, Elsevier, Vol.91, pp.129-140, Jul. 2020.
  • Ryo Akiyama, Goshiro Yamamoto, Toshiyuki Amano, Takafumi Taketomi, Alexander Plopski, Christian Sandor, Hirokazu Kato, “Robust Reflectance Estimation for Projection-Based Appearance Control in a Dynamic Light Environment,” Transactions on Visualization And Computer Graphics, IEEE, Sep. 2019
  • Ryo Akiyama, Goshiro Yamamoto, Toshiyuki Amano, Takafumi Taketomi, Alexander Plopski, Christian Sandor, Hirokazu Kato, “Perceptual Appearance Control by Projection-Induced Illusion,” Demo at the IEEE Conference on Virtual Reality and 3D User Interfaces, pp.1-2, Osaka, Japan, Mar. 2019

Computer Graphics

Developing HMDs with Light Field Display Capabilities and Corresponding Research on Rendering Methods

We are working on the development of an HMD model that features a micro-lens array and an eye-tracking camera. We are also studying a rendering method that provides a clear image based on pupil conditions.

  • Kohei Oshima, Kenneth R Moser, Damien Constantine Rompapas, J Edward Swan, Sei Ikeda, Goshiro Yamamoto, Takafumi Taketomi, Christian Sandor, Hirokazu Kato, “Improved Clarity of Defocussed Content on Optical See-Through Head-Mounted Displays,” IEEE Symposium on 3D User Interfaces 2016, Greenville, South Carolina, Mar. 2016.
  • Damien Constantine Rompapas, Kohei Oshima, Sei Ikeda, Takafumi Taketomi, Goshiro Yamamoto, Christian Sandor, Hirokazu Kato, “[DEMO] EyeAR: Physically-Based Depth of Field through Eye Measurements,” Demo at ISMAR’15: the 14th IEEE International Symposium on Mixed and Augmented Reality, Fukuoka, Japan, Oct. 2015.