Welcome to the NUILab
Colorado State University's
Natural User Interaction Laboratory
The Natural User Interaction Lab (NUILab) works primarily in the area of 3D user interfaces, which include augmented reality (AR) and virtual reality (VR).
One focus of the lab is multimodal and unimodal (gesture-centric) interaction, which includes gesture elicitation (i.e., participatory design) and the development of unimodal and multimodal interaction techniques; a recent interest is microgestures. Research domains for these methods include immersive analytics and VR sketching, among others.
On the other side of the equation, the NUILab is developing a model of information access effort for AR head-mounted displays (HMDs) that describes the tradeoffs between physical and digital clutter, tasks, cues, and other artifacts in visual search. The NUILab also assesses cognitive load and manages extraneous load to optimize training in XR.
Social impact and education are important to the NUILab. We work together to improve mental health through forest bathing in VR and other environments. The NUILab is also exploring ways to demonstrate invisible forces, such as wind, to civil engineering students. Much of what the NUILab does is carried out by our great students and collaborators worldwide.
Current News
Grant and Funding Updates
- PI: "CAREER: HCC: Microgesture and Multimodal Interaction Techniques for Augmented Reality". 06/2023-05/2028, NSF CAREER, $600,000.
- PI: Assessing Cognitive Load and Managing Extraneous Load to Optimize Training. 03/01/2023 – 08/15/2025. Office of Naval Research. $750,000.
- PI: “Ego-Centric Emotion Recognition using Augmented Reality Headsets (CSU - I2O Postdoctoral Fellowship - Ego-Centric Emotion Recognition)”. 01/01/2022 – 12/31/2023. DARPA-RA-21-02. $299,957
- PI: “WAR Fighting Performance: Augmented Reality Multi-Modal Interaction Techniques for JTAC and Battlefield Readiness.” 08/30/2021 – 07/30/2023. Defense University Research Instrumentation Program (DURIP). DOD-NAVY-ONR-Office of Naval Research. $201,420
- PI: Perceptual/Cognitive Aspects of Augmented Reality: Experimental Research and a Computational Model. 08/16/2021 – 08/15/2024. Office of Naval Research. $900,000.
- PI: “CRII: CHS: Understanding Gesture User Behavior in Augmented Reality Headsets”. 8/1/2020 – 7/31/2023. NSF CRII NSF 19-579. $175,000
- PI: Improving User Interfaces for Young Adults with Autism. August 2022. Gift from the Dan Marino Foundation. $40,500
- PI: “CCRI: Planning: Collaborative Research: Low-Latency for Augmented Reality Interactive Systems (LLARIS).” $100,000. NSF 19-512 CCRI. Collaborative proposal between CSU, Tennessee Tech, and University of Nebraska Omaha. 01/09/2020.
The Latest in NUILab Publications
- 2022 - IEEE VR - Williams, A. S. and Ortega, F. R. The Impacts of Referent Display on Gesture and Speech Elicitation. In IEEE Transactions on Visualization and Computer Graphics, vol. 28, no. 11 (September 2022), pp. 3885–3895. DOI: https://doi.org/10.1109/TVCG.2022.3203090. [IF: 5.226, AR: 26.6%]
- 2022 - Human Factors Journal - Warden, A. C., Wickens, C. D., Rehberg, D., Clegg, B. A., and Ortega, F. R. Information Access Effort: The Role of Head Movements for Information Presented at Increasing Eccentricity on Flat Panel and Head-Mounted Displays. In Human Factors Journal. In Press. [IF: 3.598, 5-IF: 4.212]
- 2022 - ACM SUI - Plabst, L., Oberdorfer, S., Ortega, F. R., and Niebling, F. Push the Red Button: Comparing Notification Placement with Augmented and Non-Augmented Tasks in AR. In Proceedings of the 2022 ACM Symposium on Spatial User Interaction (SUI ’22). Association for Computing Machinery. pp. 1–11. DOI: https://doi.org/10.1145/3565970.3567701.
- 2022 - ACM VRST - Zhou, X., Williams, A. S., and Ortega, F. R. Eliciting Multimodal Gesture+Speech Interactions in a Multi-Object Augmented Reality Environment. In Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology (VRST ’22). Association for Computing Machinery. pp. 1–10. DOI: https://doi.org/10.1145/3562939.3565637. [AR: 26%]
- 2021 - Human Factors Journal - Wickens, C. D., Mifsud, D., Rodriguez, R., and Ortega, F. R. Mitigating the Costs of Spatial Transformations with a Situation Awareness Augmented Reality Display: Assistance for the Joint Terminal Attack Controller. In Human Factors: The Journal of the Human Factors and Ergonomics Society. SAGE (June 2021), pp. 3–17. DOI: https://doi.org/10.1177/00187208211022468. [IF: 3.598, 5-IF: 4.212]