Evaluating Menu Techniques for Handheld AR with a Smartphone & Mid-Air Pen

Paper at ACM MobileHCI '20
by Philipp Wacker, Oliver Nowak, Simon Voelker, and Jan Borchers


Adding a mid-air pen to Handheld Augmented Reality creates a new kind of bimanual interaction for which many fundamental interaction design questions have not been answered yet. In particular, menus are an essential component in most visual interfaces, but it is unclear how to best interact with them in this setting: using the pen in mid-air or on a surface, using the touchscreen, or by moving the smartphone itself. We compared basic menus for these methods by analyzing success rates, selection times, device movement, and subjective ratings. Our results indicate that interacting with a mid-air menu using the pen, and operating a menu with the hand holding the smartphone, are competitive enough with the current standard of two-handed touchscreen interaction that interaction designers can freely choose among them based on the interaction context of their application.







Software and Data

These notes explain the supplementary materials provided with the MobileHCI '20 paper "Evaluating Menu Techniques for Handheld AR with a Smartphone & Mid-Air Pen".
Three sets of materials are provided.


01 Software:
The first is the software used in our study (01 Software). We made small adjustments to the functionality so that you can use it without having to build a "full" ARPen to test the interactions. The basis of the implementation is the ARPen system available here: https://github.com/i10/ARPen.

You can build the application for iOS using Xcode. Print the PDF "aruco-marker" to use as a simple ARPen for testing the menu techniques. To use the techniques, select a menu technique from the menu on the left. The scene will display a number of cubes and a target emoji in the top right corner of the screen. Buttons on the top left let you perform a button press. Note that these buttons overlap with the one-handed menu; they would not be needed if a full ARPen were connected. You can toggle the menu on the left with a three-finger swipe-down gesture.
For the "Surface Menu", you need to print out the image "printMeForSurfaceMenu". The AR scene and the menu will align with this image. The physical size of the image is 18.4 cm by default. You can adapt the software to your printout by changing the width value in "ARPen/Assets.xcassets/AR Resources.arresourcegroup/stones.arreferenceimage/Contents.json".
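For reference, an AR reference-image Contents.json generated by Xcode typically looks like the sketch below; the exact keys and filename may differ in your checkout, so treat this as illustrative. The value to change is "width" under "properties", which ARKit interprets in meters (0.184 m = 18.4 cm):

```json
{
  "images" : [
    {
      "filename" : "stones.png",
      "idiom" : "universal"
    }
  ],
  "info" : {
    "author" : "xcode",
    "version" : 1
  },
  "properties" : {
    "width" : 0.184
  }
}
```

If your printout is scaled differently, measure its printed width and enter that value in meters; ARKit uses it to recover the physical pose of the image.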


02 Evaluation:
The study recordings and evaluation scripts are in the folder "02 Evaluation". We used Python scripts to perform the bootstrapping and calculation of confidence intervals. The recordings are in the subfolder /raw/:

  • "ARMenus.csv" contains the per-trial measurements: success, time to open the menu, time to select the item, translation and rotation (per axis and combined), the item position of the target emoji, and the type of error for unsuccessful selections.
  • "ratings.csv" contains the participants' subjective ratings of ease of use, comfort, and the combination of opening the menu and selecting the item.

To perform the evaluation, run the Python scripts "createDataTablesAndInitialPlots.py" and "createDataTablesAndInitialPlotsQual.py". These prepare the data tables and perform bootstrapping to calculate the confidence intervals. The results are stored in a new folder "/DataTables/", with either a "Data" or "CI" prefix on the filename. Based on these data tables, the scripts also prepare initial plots of the results, which are stored in the folder "/Figures/". Note: the scripts require Python 3, NumPy, pandas, and the arch library for bootstrapping (https://pypi.org/project/arch/).


03 Alternative Analysis:
Since the evaluation approach used in the paper (bootstrapped confidence intervals) may be unfamiliar, we also carried out a more traditional analysis. The PDF reports the results of that analysis and compares them to the original analysis from the paper.




  • Philipp Wacker, Oliver Nowak, Simon Voelker, and Jan Borchers. Evaluating Menu Techniques for Handheld AR with a Smartphone & Mid-Air Pen. In Proceedings of the 22nd International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '20), 10 pages, ACM, New York, NY, USA, October 2020.

Contact: Philipp Wacker


