ABSTRACT
On large displays, keyboard and mouse input is challenging because small mouse movements do not scale well with the size of the display or of individual on-screen elements. We present "Large User Interface" (LUI), which coordinates gestural and vocal interaction to expand the range of interactions possible on large displays. The interface leverages real-time continuous feedback of free-hand gestures and voice to control a set of applications, including photos, videos, 3D models, maps, and a gesture keyboard. Using only a single stereo camera and a voice assistant, LUI requires no calibration or additional sensors and can be easily installed and deployed. We report results from user studies in which participants found LUI efficient and learnable with minimal instruction, and preferred it to point-and-click interfaces.
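The coordination of gesture and voice described above can be illustrated with a minimal sketch in the "Put-That-There" tradition: a deictic voice command (e.g., "open that") is resolved against the most recent pointing gesture within a short time window. This is an illustrative assumption, not LUI's actual implementation; all names (`GestureEvent`, `VoiceEvent`, `fuse`) are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GestureEvent:
    target: str       # object under the user's pointing ray
    timestamp: float  # seconds since session start

@dataclass
class VoiceEvent:
    command: str      # recognized utterance, e.g. "open that"
    timestamp: float

def fuse(voice: VoiceEvent, gestures: list[GestureEvent],
         window: float = 1.0) -> Optional[str]:
    """Resolve the deictic word 'that' in a voice command against the
    pointing gesture closest in time, within a small fusion window."""
    if "that" not in voice.command:
        return None  # nothing to resolve
    candidates = [g for g in gestures
                  if abs(g.timestamp - voice.timestamp) <= window]
    if not candidates:
        return None  # no temporally aligned gesture
    nearest = min(candidates,
                  key=lambda g: abs(g.timestamp - voice.timestamp))
    return voice.command.replace("that", nearest.target)

# Pointing at a photo at t=3.2s while saying "open that" at t=3.5s:
print(fuse(VoiceEvent("open that", 3.5),
           [GestureEvent("photo_12", 3.2), GestureEvent("map", 9.0)]))
# → open photo_12
```

A time-windowed fusion of this kind is one common way multimodal systems disambiguate deictic speech; the window width trades off responsiveness against tolerance for recognizer latency.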