
Paper

A Realistic Game System Using Multi-Modal User Interfaces

  • Author: 이의철 (Eui Chul Lee)
  • Journal: IEEE T CONSUM ELECTR 56/3
  • Indexing type:
  • Publication year: 2010
This study proposes a realistic game system using a multi-modal interface, including gaze tracking, hand gesture
recognition, and bio-signal analysis based on the photoplethysmogram (PPG), the galvanic skin response (GSR), and
the skin temperature (SKT). Our research is novel in the following four ways compared to previous game systems.
First, a highly immersive and realistic game is implemented on a head-mounted display (HMD) with a gaze tracker, a
gesture recognizer, and a bio-signal analyzer. Second, since the camera module for eye tracking is attached below
the HMD, the user's gaze position on the HMD display can be calculated without the user wearing any additional
eye-tracking device. Third, an aiming cursor in the game system is controlled by gaze tracking, while grabbing and
throwing behaviors toward a target are performed through the user's hand gestures using a data glove. Finally, the
level of difficulty in the game system is adaptively controlled according to the measurement and analysis of the
user's bio-signals.
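
As a rough illustration of the third point, the sketch below shows one way a gaze-driven aiming cursor and data-glove
gestures might be combined in a per-frame interaction loop. The Gesture and State names, the Target class, and the
coordinate handling are illustrative assumptions for this sketch; they are not code published with the paper.

from dataclasses import dataclass
from enum import Enum, auto

class Gesture(Enum):
    OPEN = auto()    # hand open, nothing held
    GRAB = auto()    # fist closed over the target (pick up)
    THROW = auto()   # quick release after a grab

class State(Enum):
    AIMING = auto()
    HOLDING = auto()

@dataclass
class Target:
    x: float
    y: float
    radius: float
    def contains(self, px: float, py: float) -> bool:
        # True when the gaze cursor lies inside the target's circle
        return (px - self.x) ** 2 + (py - self.y) ** 2 <= self.radius ** 2

def step(state: State, gaze_xy, gesture: Gesture, target: Target):
    """One frame of interaction: the aiming cursor follows the gaze point;
    a GRAB over the target picks it up; a THROW releases it toward the cursor."""
    gx, gy = gaze_xy  # cursor position on the HMD display, from the gaze tracker
    if state is State.AIMING:
        if gesture is Gesture.GRAB and target.contains(gx, gy):
            return State.HOLDING, "picked up"
        return State.AIMING, None
    if gesture is Gesture.THROW:
        return State.AIMING, f"thrown toward ({gx:.0f}, {gy:.0f})"
    return State.HOLDING, None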
The experimental results show that the proposed method provides a stronger experience of immersion and interest
than conventional interaction methods such as a keyboard or a mouse.
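
The abstract's final point, bio-signal-driven difficulty control, could work roughly as in the sketch below:
PPG-derived heart rate, GSR, and SKT are compared against a resting baseline, and the game level is nudged up or
down. The feature normalization, thresholds, and level bounds here are illustrative assumptions, not values reported
in the paper.

from dataclasses import dataclass

@dataclass
class BioFeatures:
    heart_rate_bpm: float    # derived from the PPG waveform
    gsr_level: float         # galvanic skin response (sensor units)
    skt_celsius: float       # skin temperature

def arousal_score(current: BioFeatures, baseline: BioFeatures) -> float:
    """Crude arousal estimate: rises when heart rate and GSR increase and skin
    temperature drops relative to a resting baseline (a common stress pattern)."""
    hr = (current.heart_rate_bpm - baseline.heart_rate_bpm) / baseline.heart_rate_bpm
    gsr = (current.gsr_level - baseline.gsr_level) / baseline.gsr_level
    skt = (baseline.skt_celsius - current.skt_celsius) / baseline.skt_celsius
    return (hr + gsr + skt) / 3.0

def adjust_difficulty(level: int, score: float, low: float = 0.05, high: float = 0.25) -> int:
    """Raise difficulty when arousal is low (player looks bored), lower it when
    arousal is high (player looks stressed), otherwise keep the current level."""
    if score < low:
        return min(level + 1, 10)
    if score > high:
        return max(level - 1, 1)
    return level

# Example: an excited player (higher HR/GSR, cooler skin) gets an easier level.
baseline = BioFeatures(70.0, 2.0, 33.0)
current = BioFeatures(95.0, 3.5, 31.5)
print(adjust_difficulty(5, arousal_score(current, baseline)))  # -> 4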
