ARPen: Mid-Air Object Manipulation Techniques for a Bimanual AR System with Pen & Smartphone

Paper at ACM CHI '19
by Philipp Wacker, Oliver Nowak, Simon Voelker, and Jan Borchers

Abstract

Modeling in Augmented Reality (AR) lets users create and manipulate virtual objects in mid-air that are aligned to their real environment. We present ARPen, a bimanual input technique for AR modeling that combines a standard smartphone with a 3D-printed pen. Users sketch with the pen in mid-air, while holding their smartphone in the other hand to see the virtual pen traces in the live camera image. ARPen combines the pen's higher 3D input precision with the rich interactive capabilities of the smartphone touchscreen. We studied subjective preferences for this bimanual input technique, such as how people hold the smartphone while drawing, and analyzed the performance of different bimanual techniques for selecting and moving virtual objects. Users preferred a bimanual technique casting a ray through the pen tip for both selection and translation. We provide initial design guidelines for this new class of bimanual AR modeling systems.
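The selection and translation technique preferred in the study works by casting a ray from the device camera through the tracked pen tip into the scene. The following is a minimal sketch of that idea in Swift/SceneKit, not the ARPen implementation itself; the camera and pen-tip positions are assumed to come from the app's ARKit tracking, and the function name is hypothetical.

```swift
import SceneKit

/// Sketch of "ray through the pen tip" selection: cast a ray from the camera
/// position through the pen tip and return the first virtual node it hits.
/// `cameraPosition` and `penTipPosition` are assumed to be provided by
/// ARKit tracking in world coordinates.
func selectNode(inScene scene: SCNScene,
                cameraPosition: SCNVector3,
                penTipPosition: SCNVector3) -> SCNNode? {
    // Direction of the ray: from the camera through the pen tip.
    let direction = SCNVector3(penTipPosition.x - cameraPosition.x,
                               penTipPosition.y - cameraPosition.y,
                               penTipPosition.z - cameraPosition.z)

    // Extend the ray well beyond the pen tip so objects farther away can be hit.
    let reach: Float = 10.0
    let farPoint = SCNVector3(cameraPosition.x + direction.x * reach,
                              cameraPosition.y + direction.y * reach,
                              cameraPosition.z + direction.z * reach)

    // Hit-test along the segment and return the closest intersected node, if any.
    let hits = scene.rootNode.hitTestWithSegment(from: cameraPosition,
                                                 to: farPoint,
                                                 options: nil)
    return hits.first?.node
}
```

For translation, the same ray can be reused: once a node is selected, moving the pen tip moves the ray, and the node can be repositioned along it.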

Video

Full Reference

    2019

  • Philipp Wacker, Oliver Nowak, Simon Voelker, and Jan Borchers. ARPen: Mid-Air Object Manipulation Techniques for a Bimanual AR System with Pen & Smartphone. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, CHI '19, pages 619:1–619:10, ACM, New York, NY, USA, May 2019.
    Homepage · Movie · PDF Document · BibTeX Entry

Contact: Philipp Wacker

