Submit a Manuscript to the Journal

International Journal of Human-Computer Interaction

For a Special Issue on

State-of-the-Art Human-Computer-Interaction in Metaverse

Manuscript deadline
30 September 2023


Special Issue Editor(s)

Zhihan Lv, Uppsala University
[email protected]

James J. Park, Seoul National University of Science and Technology (SeoulTech)
[email protected]

Jun Shen, University of Wollongong
[email protected]

Houbing Song, Embry-Riddle Aeronautical University
[email protected]


State-of-the-Art Human-Computer-Interaction in Metaverse

The Metaverse aims to make all things informatized and intelligent. It creates a space fusing the virtual and the real, in which information fully surrounds people, thus evolving into a new social form unbounded by space and time. The Metaverse is an integrated application environment for the vigorous development of Information Technology (IT). The well-known seven-layer value chain of the Metaverse encompasses many advanced IT technologies, such as 5G/6G, wearable devices, Mixed Reality (MR) glasses, AI, Blockchain, and 3D engines. The Metaverse is oriented toward end users, and Human-Computer Interaction (HCI) is its core technology. HCI studies the principles and technologies of highly usable information exchange between humans and computers. It establishes multi-modal input and output interfaces, in both software and hardware, as well as user terminal interfaces, forming specific interaction modes.

HCI is also a fundamental capability of any Metaverse system. In a sense, HCI performance directly determines the boundary of what humans can do in the Metaverse, and hence the value of the Metaverse to human beings. Metaverse HCI is expected to move beyond the two current paradigms, the Command User Interface (CUI) and the Graphical User Interface (GUI), and enter a third stage: the Natural User Interface (NUI). Both CUI and GUI require users to learn operations preset by the software developer, whereas NUI allows users to communicate with the computer in the most natural ways, such as through voice, facial expression, gesture, body movement, or head rotation. The Artificial Intelligence (AI)-driven Metaverse enables users to undertake more interesting, engaging, and influential tasks. In a virtual space accessible to all users anytime and anywhere, interaction behavior will be ubiquitous and continuous, which greatly challenges the accuracy with which AI can infer users' interaction intentions.

Topics applicable to this special issue include but are not limited to:

  • Changes in Human-Computer Interaction Modes from the Perspective of the Metaverse
  • Human-Computer Interaction Technology in the Metaverse for Active Machine Perception
  • Natural Human-Computer Interaction as the Main Mode of Interaction in the Metaverse
  • Breakthroughs in Human-Computer Interaction Technology Driven by Metaverse Requirements
  • Breaking Through Solidified Interaction-Path Modes in the Metaverse
  • Multimodal Situational Sensing and Reasoning for Human-Computer Interaction in the Metaverse
  • Optimization of Computational Frameworks for Understanding Natural Interaction Intention
  • Upgrading Human-Computer Interaction Devices and Algorithms in the Metaverse
  • Analysis of Human-Computer Interaction Communication Protocols in the Metaverse
  • Tactile Perception in Human-Computer Interaction to Support the Realization of the Metaverse

Submission Instructions

Interested researchers are welcome to submit work that falls within the scope of the special issue.

Decisions on submitted papers will be based on reviewers' recommendations following double-blind review.

Submission deadline: 30 September 2023
Author notification: by 30 December 2023
Revised manuscript due: by 28 February 2024
Notification of acceptance: by 30 April 2024

