This page briefly introduces completed and ongoing research projects. Most of them explore and seek to understand the concept of believability in spatial user interactions within virtual environments.
Thanks to the rapid development of Virtual Reality (VR), current lightweight head-mounted displays commonly provide a larger field of view, binocular vision, higher resolution, and lower-latency tracking to evoke the sense of presence, the feeling of “being there”. Extensive prior work has studied the formation of presence, but little has examined the concept of believability, which refers to a suspension of disbelief or an illusion of reality. In short, users can believe in the generated experience even though they know it is not completely real, and they can engage with it despite inconsistencies in the virtual world.
Image Browsing Interface in Virtual Reality (VR)
Image Browsing Landmark Interface
Current VR browsing interfaces provide few landmarks. In 2D spatial memory interfaces, such artificial landmarks have been well studied and shown to facilitate learning and performance. However, less is known about the effects of 3D landmarks on spatial learning and recall in VR browsing interfaces.
In this project, we present a new artificial landmark (a 3D visual pin) alongside three previously studied landmarks (grid, image, and anchor) on a curved visual wall layout, and examine their effects on learning and recalling item locations in small and large item sets. The goal is to explore the potential of artificial landmarks for future VR browsing interfaces.
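As a minimal sketch of what a curved visual wall layout might look like, the snippet below places a grid of items on a cylindrical arc centered on the viewer. The radius, arc span, and grid dimensions are illustrative assumptions, not the project's actual parameters.

```python
import math

def curved_wall_positions(n_cols, n_rows, radius=2.0, arc_deg=120.0, row_height=0.4):
    """Place an n_cols x n_rows grid of items on a cylindrical arc
    centered on the viewer at the origin (y is up, -z is forward).

    Every item ends up at the same egocentric distance (the radius),
    which is one property that distinguishes a curved wall from a flat one.
    """
    positions = []
    for row in range(n_rows):
        # Center the rows vertically around eye height (y = 0 here).
        y = (row - (n_rows - 1) / 2) * row_height
        for col in range(n_cols):
            # Spread columns evenly across the arc, centered straight ahead.
            t = col / (n_cols - 1) if n_cols > 1 else 0.5
            angle = math.radians((t - 0.5) * arc_deg)
            x = radius * math.sin(angle)
            z = -radius * math.cos(angle)
            positions.append((x, y, z))
    return positions
```

For example, `curved_wall_positions(8, 4)` yields 32 positions, all exactly `radius` meters from the viewer, so no item is disadvantaged by extra viewing distance.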
Amphitheater Layout for Browsing in VR
Prior work has shown that egocentric distance cues can provide additional benefits for spatial memory. This project aims to explore the use of egocentric distance cues for browsing performance in VR.
The main goal is to provide a guideline for designing browsing interfaces in virtual environments, e.g., an egocentric distance-based item-sizing cue, so that users can easily memorize target locations and visual search time can be significantly reduced. Additional retrieval and recall tasks will be conducted to examine browsing performance in VR.
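One way such a distance-based sizing cue could be realized is to keep each item's visual angle constant, so that an item's physical size grows linearly with its egocentric distance. A minimal sketch, assuming a 5° visual angle as an illustrative value (not the project's actual parameter):

```python
import math

def item_size_for_distance(distance, visual_angle_deg=5.0):
    """Physical size an item must have to subtend a fixed visual angle
    at the given egocentric distance (result in the same units as
    distance, from the standard visual-angle relation
    size = 2 * d * tan(theta / 2))."""
    return 2.0 * distance * math.tan(math.radians(visual_angle_deg) / 2.0)
```

Under this rule, doubling an item's distance doubles its physical size, so its retinal size stays constant while its distance cue (and hence its memorability, per the hypothesis above) still varies.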
Auditory Feedback in Touch User Interfaces
This project focuses on the “fat finger” problem from the perspective of sensory perception in touch user interfaces. Additional auditory feedback is proposed and investigated to study its effects on trajectory-based drawing performance.
The sketch-based interface, also dubbed a pen-based interface, is in increasing demand in the field of HCI (Human-Computer Interaction). It provides a more intuitive interface than WIMP (Window, Icon, Menu, Pointer), especially for sketch-based modeling: the creation of 3D freeform objects in design, production, and science. However, traditional 3D modeling tools such as Autodesk 3ds Max require users to learn an interface wholly different from drawing skills. Although these tools enable precise modeling, they fall short of being intuitive, efficient, and easy to use.
Especially in the conceptual design stage, studies show that most designers prefer sketches to express or convey their ideas. 2D drawing remains much easier than 3D modeling, even for professionals and amateurs who drew in 2D before learning 3D modeling. In general, however, a 2D sketch lacks interactivity, forcing novices to imagine the 3D model behind the 2D sketch. Therefore, developing an intuitive and efficient interface that converts 2D sketches into 3D models is essential for novices in the conceptual design stage. “Intuitive” means the user can draw as if on paper, without frequently changing the viewpoint. “Efficient” means the user needs only a few strokes to represent a shape, with modeling happening in real time.
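As a toy illustration of the sketch-to-model idea (not this project's actual algorithm), a closed 2D stroke can be lifted into a simple 3D solid by extruding it along the depth axis; the fixed extrusion depth here is an assumption for demonstration:

```python
def extrude_stroke(stroke_2d, depth=0.5):
    """Turn a closed 2D stroke (list of (x, y) points) into a 3D mesh
    by duplicating the stroke at z=0 and z=depth; each side face is a
    quad connecting an edge of the front loop to the matching edge of
    the back loop."""
    front = [(x, y, 0.0) for x, y in stroke_2d]
    back = [(x, y, depth) for x, y in stroke_2d]
    vertices = front + back
    n = len(stroke_2d)
    # Quads as vertex-index tuples: (front_i, front_next, back_next, back_i).
    faces = [(i, (i + 1) % n, n + (i + 1) % n, n + i) for i in range(n)]
    return vertices, faces
```

Even this crude extrusion shows the appeal of the approach: a single closed stroke, drawn from one fixed viewpoint as if on paper, immediately yields a 3D shape, with no WIMP-style modeling steps in between.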