Research at NUILab

The Natural User Interaction (NUI) Lab’s primary research focuses on multi-modal interaction (e.g., gestures, gestures + speech) for 3D user interfaces, spanning elicitation, the development of interaction techniques, and recognition (when appropriate).

The lab uses Augmented Reality and Virtual Reality technology to accomplish this primary objective. In addition, NUILab has been expanding its work to understand collaboration in augmented reality. A secondary focus is understanding how virtual reality can be used to promote greater diversity in computer science.

In addition to our core research, the lab is highly multi-disciplinary: we collaborate with researchers in Vision, Software Engineering, Psychology, Human Factors, and Networking, among others.

DARPA

Ego-Centric Emotion Recognition using Augmented Reality Headsets

Current Status of Project: Active
Awarded: $299,957 

Postdoctoral Fellowship – Ego-Centric Emotion Recognition

Embodied Intelligent Avatars: Diana

Current Status of Project: Completed
Communicating with Computers (PI: Ross Beveridge) is an award from DARPA whose emphasis is how computers and humans communicate with each other. The project goes beyond a Siri-style assistant: Diana, our embodied agent, can see and understand both the real world and the virtual world. The success of this project is leading to newer projects, such as Faelyn Fox.

Innovative Embodied Agent Approaches to Assessment with Developmental and Intervention Science Applications – Faelyn Fox (World)

Current Status of Project: Completed
Awarded: $271,777 
Child user interaction is a very difficult problem given the differences in age and experience among children. We are concentrating on the usability of this avatar and its gestures.

National Science Foundation (NSF)

CRII: CHS: Understanding Gesture User Behavior in Augmented Reality Headsets

Current Status of Project: Active 

Awarded: $175,000

Description Coming Soon

CCRI: Planning, Collaborative Research: Low-Latency for Augmented Reality Interactive Systems (LLARIS)

Current Status of Project: Active 

Awarded: $100,000

One of the critical problems we are examining right now is how augmented reality, once it becomes pervasive, will operate in multi-user environments, in particular where interaction is key and latency plays a detrimental role. (Planning grant)
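
As a rough illustration of the kind of measurement this work depends on, the sketch below times round trips for small state updates between two hosts, assuming a plain UDP echo server is listening on the other end. The function name, packet format, and parameters are illustrative assumptions, not part of the LLARIS system.

    # Minimal latency probe (Python): time round trips for small state
    # updates, the kind of delay that degrades multi-user AR interaction.
    # Hypothetical sketch; assumes a UDP echo server at (host, port).
    import socket, struct, time

    def probe_latency(host: str, port: int, samples: int = 100) -> list[float]:
        """Return round-trip times in milliseconds for `samples` packets."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.settimeout(1.0)
        rtts = []
        for seq in range(samples):
            sent = time.perf_counter()
            sock.sendto(struct.pack("!Id", seq, sent), (host, port))
            sock.recvfrom(64)  # echo server returns the packet unchanged
            rtts.append((time.perf_counter() - sent) * 1000.0)
        return rtts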

FW-HTF-P: Optimizing Long-Term Human Performance in Future Work

Current Status of Project: Completed

Awarded: $150,000

Understanding how humans learn using virtual and augmented reality is critical for the development of adaptive systems. This NSF award provided the initial seed money to plan future development in this area.

NSF SBIR Phase IIA: 2.5D Extensions to Braille-based User Interaction

Current Status of Project: Completed
Awarded: $105,000
Description Coming Soon

Office of Naval Research (ONR)

Assessing Cognitive Load and Managing Extraneous Load to Optimize Training

Current Status of Project: Active 

Awarded: $750,000

Description Coming Soon

Perceptual/Cognitive Aspects of Augmented Reality: Experimental Research and a Computational Model

Current Status of Project: Active 

Awarded: $900,000

Description Coming Soon

Fused Augmented Realities with Synthetic Vision (FAR/SV) Systems for Ground Forces, VR Rehab Inc.

Current Status of Project: Completed

Awarded: $198,914

Description Coming Soon

N202-090 – Single Amphibious Integrated Precision Augmented Reality Navigation (SAIPAN) System via Fused Augmented Realities – User Interface (FAR-UI)

Current Status of Project: Completed

Awarded: $44,000

Description Coming Soon

War Fighting Performance: Augmented Reality Multi-Modal Interaction Techniques for JTAC and Battlefield Readiness – Defense University Research Instrumentation Program

Current Status of Project: Completed 

Awarded: $201,420

Description Coming Soon

Additional Projects

Microgestures, Gestures, and Multimodal Interaction Techniques

Current Status of Project: Active 
The project includes gesture + speech, gesture + speech + gaze, and gesture + gaze, among others. The objective is to improve user interaction, primarily in augmented reality applications that use optical see-through displays such as the HoloLens 2 and Magic Leap.
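
As a simplified illustration of how such modalities can be combined, the sketch below pairs a recognized gesture with speech and gaze events that arrive within a short time window. The event labels and the 0.8-second window are assumptions made for the example, not the lab's implementation.

    # Late-fusion sketch (Python): combine gesture, speech, and gaze events
    # that co-occur in time into a single multimodal command.
    from dataclasses import dataclass

    @dataclass
    class Event:
        modality: str   # "gesture", "speech", or "gaze"
        label: str      # e.g., "delete", "that one", "object_3"
        time: float     # seconds

    WINDOW = 0.8        # assumed fusion window in seconds

    def fuse(events: list[Event]) -> list[tuple[str, str, str]]:
        """Return (gesture, speech, gaze target) triples that co-occur."""
        commands = []
        for g in (e for e in events if e.modality == "gesture"):
            near = [e for e in events if abs(e.time - g.time) <= WINDOW]
            speech = next((e.label for e in near if e.modality == "speech"), None)
            gaze = next((e.label for e in near if e.modality == "gaze"), None)
            if speech and gaze:
                commands.append((g.label, speech, gaze))
        return commands

    # Example: a "delete" gesture, the utterance "that one", and gaze resting
    # on object_3 all fall within the window and fuse into one command.
    print(fuse([Event("gesture", "delete", 1.00),
                Event("speech", "that one", 1.20),
                Event("gaze", "object_3", 1.05)]))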

Multi-Modal Gesture Recognition: Gesture Centric

Current Status of Project: Active 
Multi-modal recognition (from a gesture-centric point of view) seeks to understand how multiple modalities can be recognized for interactive applications.
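
For flavor, one simple gesture-centric baseline is nearest-neighbor matching of 3D hand trajectories under dynamic time warping; the sketch below is an illustration of that baseline, not the lab's recognizer.

    # Nearest-neighbor gesture recognition sketch (Python) over 3D hand
    # trajectories, using dynamic time warping as the distance measure.
    import math

    Point = tuple[float, float, float]

    def dtw(a: list[Point], b: list[Point]) -> float:
        """Dynamic-time-warping distance between two 3D trajectories."""
        inf = float("inf")
        cost = [[inf] * (len(b) + 1) for _ in range(len(a) + 1)]
        cost[0][0] = 0.0
        for i in range(1, len(a) + 1):
            for j in range(1, len(b) + 1):
                cost[i][j] = math.dist(a[i - 1], b[j - 1]) + min(
                    cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
        return cost[len(a)][len(b)]

    def classify(sample: list[Point], templates: dict[str, list[Point]]) -> str:
        """Label the sample with the name of its nearest template."""
        return min(templates, key=lambda name: dtw(sample, templates[name]))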

Gesture and Multi-Modal Elicitation Studies

Current Status of Project: Active 
Elicitation studies are key to understanding how users interact with a system: participants propose the interactions (e.g., gestures) they would use to trigger a given effect, and the degree of consensus among participants guides the final design.
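
A common way to quantify that consensus is the agreement rate from the elicitation literature (Vatavu and Wobbrock, CHI 2015): the fraction of participant pairs that proposed the same sign for a referent. The sketch below computes it; the example data are invented.

    # Agreement-rate sketch (Python): AR = sum over proposal groups of
    # C(|group|, 2), divided by C(|participants|, 2).
    from collections import Counter

    def agreement_rate(proposals: list[str]) -> float:
        """proposals: one elicited proposal per participant for one referent."""
        n = len(proposals)
        if n < 2:
            return 1.0
        pairs = lambda k: k * (k - 1) // 2
        agreeing = sum(pairs(c) for c in Counter(proposals).values())
        return agreeing / pairs(n)

    # Invented example: 4 of 6 participants proposed "pinch" for "zoom in".
    print(agreement_rate(["pinch", "pinch", "pinch", "pinch", "spread", "tap"]))
    # 6 agreeing pairs / 15 possible pairs = 0.4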

Improving Computer Science Education using Virtual Reality and Human-Computer Interaction

Current Status of Project: Active
We have developed avatars that use the immersive properties of Virtual Reality to explore whether playing through different scenarios can improve computer science education for underrepresented minorities.

Virtual Reality Soccer for Multiple Concussions

Current Status of Project: Active 
Athletes may suffer multiple concussions over the course of a career. We have developed a system that provides metrics related to multiple concussions.

Improving User Interfaces for Young Adults with Autism, Gift from Dan Marino Foundation

Current Status of Project: Active 
Gift: $40,500

Description Coming Soon

AR Notifications

Current Status of Project: Active 
Description Coming Soon

AR Clutter Tradeoffs

Current Status of Project: Active 

Description Coming Soon

AR Annotations

Current Status of Project: Active 
Description Coming Soon

AR 3D Audio

Current Status of Project: Active

Description Coming Soon

Legacy Projects

CircGR: Multi-touch Gesture Recognition

Current Status of Project: Completed 
We developed a fast recognizer that can learn almost any gesture designed by the user. http://circgr.com/
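
A recognizer in this space can lean on circular statistics to handle the wrap-around of movement directions. As a toy illustration of that idea (not the published CircGR recognizer), the sketch below summarizes a multi-touch trace by the circular mean of its movement directions, which can then be compared across gestures by angular distance.

    # Circular-statistics flavor (Python): describe a touch trace by the
    # circular mean of its movement directions.
    import math

    def directions(trace: list[tuple[float, float]]) -> list[float]:
        """Angles (radians) of successive movement vectors along one trace."""
        return [math.atan2(y2 - y1, x2 - x1)
                for (x1, y1), (x2, y2) in zip(trace, trace[1:])]

    def circular_mean(angles: list[float]) -> float:
        """Mean direction, averaged on the unit circle to handle wrap-around."""
        s = sum(math.sin(a) for a in angles)
        c = sum(math.cos(a) for a in angles)
        return math.atan2(s, c)

    def angular_distance(a: float, b: float) -> float:
        """Smallest absolute difference between two angles."""
        return abs(math.atan2(math.sin(a - b), math.cos(a - b)))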

3D Travel (Multi-Touch)

Current Status of Project: Active 
NUILab continues to study how users can improve travel using natural user interfaces (e.g., gestures).

Procedural Skybox Generation

Current Status of Project: Completed
A procedural sky generation system that proved extremely helpful in our 3D travel elicitation studies.
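
As a toy version of the idea (not the lab's generator), the sketch below blends an assumed horizon color into an assumed zenith color by elevation, producing a gradient strip that could be mapped onto a sky dome.

    # Toy procedural sky (Python): vertical gradient from horizon to zenith.
    def sky_gradient(height: int = 256,
                     horizon=(255, 196, 128),   # warm horizon (assumed palette)
                     zenith=(40, 80, 200)):     # deep-blue zenith (assumed)
        rows = []
        for y in range(height):
            t = y / (height - 1)                # 0 at horizon, 1 at zenith
            rows.append(tuple(round(h + (z - h) * t)
                              for h, z in zip(horizon, zenith)))
        return rows                             # one RGB color per row

    print(sky_gradient(4))                      # tiny 4-row example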

NIH-NIDA SUD Challenge – Biobrace VR: Bio-Interactive Device with Personalized Avatar Therapy for SUD

Current Status of Project: Completed 

Awarded: $10,000

Description Coming Soon