Headed by Prof. Dr. Jan Borchers, we work in Human-Computer Interaction (HCI). Grounded in Computer Science, we develop and study new interaction theories, techniques, and systems in personal digital fabrication and personal design, tangible, mobile, and wearable user interfaces, interactive textiles, multitouch tables and interactive surfaces, augmented reality, and visual coding environments. Our goal is to make the Brave New World of interactive technologies useful by making it usable.
Check out the current overview of our research projects below, or see our older Research Overview for a more detailed text.
Our publications at ACM CHI
Since starting in October 2003, we have become one of Germany's best-published research groups at CHI, the premier international academic conference in the field.
Our Research Areas — Jump to:
Tabletops and Tangibles
Knobology Revisited: A Comparison of User Performance between Tangible and Virtual Rotary Knobs: We present an experimental comparison of tangible rotary knobs and touch-based virtual knobs in three output conditions: eyes-on, eyes-free, and peripheral. Twenty participants completed a simple rotation task on an interactive surface with four different input techniques (two tangibles and two virtual touch widgets) in the three output conditions, representing the distance from the locus of attention. We found that users were on average 20% faster with tangible knobs than with virtual knobs, and that tangible knobs retain their performance even when they are outside the user's locus of attention. We provide four recommendations for choosing suitable knobs based on task and design constraints. (read more)
Note at ITS 2015
PERCs: Persistently Trackable Tangibles on Capacitive Multi-Touch Displays: PERCs use the touch probing signal from capacitive touch screens to determine whether the tangibles are on- or off-screen. This enables robust capacitive tangible widgets whether or not they are being touched. (read more)
Paper and Demo at UIST 2015 and Demo at ITS 2015
Indirect Touch is a multitouch system in which users indirectly manipulate objects on a vertical screen with touches on a horizontal screen. This interaction improves the ergonomics of extended use by reducing gorilla arm (vertical screen) and stiff neck (horizontal screen). (read more)
Full paper at CHI 2013
Passive Untouched Capacitive Widgets (PUCs) are simple physical widgets that can be detected by an unmodified capacitive touch screen without being touched by a user. They can be used on most existing projected-capacitive displays. (read more)
Note and demo at ITS 2013 and demo at UIST 2013
Touch interfaces provide great flexibility in designing a UI. However, the actual experience is often frustrating due to poor touch recognition. We found that the location of the preceding touch has a significant impact on the orientation and position of the touch ellipse. We exploited this effect on an off-the-shelf touch display and showed that, with only minimal preparation, the touch accuracy of standard hardware can be improved, allowing better recognition rates or more UI components on the same screen. (read more)
Note at CHI 2013.
HoloDesk is a novel interactive system combining an optical see-through display and a Kinect camera to create the illusion that users are directly interacting with 3D graphics. (read more)
Full paper at CHI 2012 in collaboration with Microsoft Research Cambridge.
BendDesk is an interactive desk environment that merges a vertical and a horizontal display with a curve. The entire surface is touch sensitive and supports direct manipulation of digital objects. (read more)
Full paper and demo at ITS 2010.
3D displays can be controlled using direct manipulation. This, however, introduces problems: direct manipulation implies that the object should stick to our fingers during interaction, yet on a viewer-centric display, head movements also translate the object. We compared different methods to cope with these constraints. (read more)
Full Paper at ITS 2012, workshop submission at CHI 2012.
We render physical effects in tabletop controls using electromagnetic actuation. While keeping the advantages of low-cost passive tangibles, we allow designers to change physical properties on the fly, such as weight, friction, spring resistance, and detents.
Note at CHI 2011.
FingerFlux is a haptic output technique that provides near-surface haptic feedback on interactive tabletops. It combines an electromagnetic display with a permanent magnet attached to the user’s index finger. (read more)
Paper and demo at UIST 2011. Received 3rd place in the category “Best demo”.
Dynamic Portals is a light-weight touch-based technique that allows users to transfer virtual objects to distant locations on an interactive tabletop. (read more)
Note at ITS 2011. Received Best Note Award.
Hybrid documents ease text corpus analysis for literary scholars. Co-locating physical and digital media on one surface combines their advantages. (read more)
Full Paper at ITS 2010
Madgets are low-cost, untethered, and easy-to-build tangible controls for interactive tabletops. Our tabletop maintains the consistency between physical controls and their digital state by using electromagnetic actuation. (read more)
Full paper at UIST 2010.
We are developing an HCI design pattern language that provides solutions for recurring problems when designing interactive tabletops and applications. (read more)
Paper at EuroPLoP 2010.
Mudpad is a multitouch screen that is enhanced with localized active haptic feedback. (read more)
Note at ITS 2010. Received Best Note Award and Best Demo Award.
SLAP Widgets are tangible general-purpose controls for interactive tabletops, made from silicone and acrylic. They combine the haptic feedback of physical controls with the flexibility of on-screen widgets. (read more)
Full paper at CHI 2009. Demo at CHI 2009 and TEI 2009.
Navigation and Exploration Interfaces
Chronicler: Interactive Exploration of Source Code History: We present Tree Flow, an interactive visualization of structural changes in source code history. Combining history graph navigation with a traditional source code view outperforms traditional file-based history navigation, and users strongly preferred Chronicler for exploring source code. (read more)
Paper at CHI 2016.
Statsplorer is a software tool that allows users to perform statistical analyses by interacting with visualizations. It guides users in selecting appropriate statistical analyses based on their research questions. It also prevents common statistical analysis mistakes and promotes understanding by collocating appropriate visualizations with statistical analysis results. (read more)
Paper at CHI 2015.
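To illustrate the kind of guidance a tool like Statsplorer provides, here is a minimal rule-based sketch that maps a simple study design to a common statistical test. The function name and decision rules are our own illustration, not taken from the actual Statsplorer implementation.

```python
# Hypothetical sketch: map a between/within-groups study design to a
# common statistical test, in the spirit of Statsplorer's guidance.
# The rules below are a simplified textbook heuristic, not Statsplorer's code.

def suggest_test(num_groups: int, paired: bool, normal: bool) -> str:
    """Suggest a statistical test for comparing group means/ranks."""
    if num_groups < 2:
        raise ValueError("Need at least two groups to compare")
    if num_groups == 2:
        if normal:
            return "paired t-test" if paired else "independent t-test"
        return "Wilcoxon signed-rank test" if paired else "Mann-Whitney U test"
    # three or more groups
    if normal:
        return "repeated-measures ANOVA" if paired else "one-way ANOVA"
    return "Friedman test" if paired else "Kruskal-Wallis test"
```

For example, two independent, normally distributed groups yield `"independent t-test"`, while three non-normal repeated measures yield `"Friedman test"`.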
An Augmented Flute for Beginners
Note at NIME 2017.
Personal Orchestra is a series of projects that adapt the centuries-old interaction metaphor of conducting to digital multimedia, so that everyone may enjoy the wonderful experience of conducting world-famous orchestras. (read more)
Paper at CHI 2005. Paper at NIME 2006. Paper at ICMC 2006. Paper at ICMC 2007.
Use the Force Picker, Luke: Space-Efficient Value Input on Force-Sensitive Mobile Touchscreens: Value pickers on smartphones occupy significant gesture and display space. Instead, the Force Picker allows bidirectional force control to select values and requires no more space than the initial touch. (read more)
Paper at CHI 2018 (Received Honorable Mention Award).
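The bidirectional control described above can be sketched as a mapping from touch force to a signed rate of value change: pressing harder than a neutral force increases the value, pressing lighter decreases it. All constants below are invented for illustration and are not the parameters from the paper.

```python
# Illustrative sketch of bidirectional force-to-value mapping as in the
# Force Picker idea. NEUTRAL_FORCE, DEAD_ZONE, and MAX_RATE are assumptions.

NEUTRAL_FORCE = 0.5   # normalized force treated as "rest" (assumption)
DEAD_ZONE = 0.1       # band around neutral with no value change (assumption)
MAX_RATE = 10.0       # value units per second at full deflection (assumption)

def value_rate(force: float) -> float:
    """Return the signed rate of value change for a normalized force in [0, 1]."""
    delta = force - NEUTRAL_FORCE
    if abs(delta) < DEAD_ZONE:
        return 0.0  # within the dead zone: hold the current value
    # scale the deflection beyond the dead zone linearly up to MAX_RATE
    span = (abs(delta) - DEAD_ZONE) / (1.0 - NEUTRAL_FORCE - DEAD_ZONE)
    return MAX_RATE * span * (1.0 if delta > 0 else -1.0)
```

A controller would then integrate this rate over time while the finger stays down, so a single stationary touch can both increment and decrement the value.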
Release, Don't Wait! Reliable Force Input Confirmation with Quick Release: Modern smartphones, like the iPhone 7, feature touchscreens with co-located force sensing. This makes touch input more expressive, e.g., by enabling single-finger continuous zooming when coupling zoom levels to force intensity. Often, however, the user wants to select and confirm a particular force value, say, to lock a certain zoom level. The most common confirmation techniques are Dwell Time (DT) and Quick Release (QR). While DT has been shown to be reliable, it slows the interaction, as the user must typically wait for 1 s before her selection is confirmed. Conversely, QR is fast but reported to be less reliable, although no reference reports how to actually detect and implement it. In this paper, we set out to challenge the low reliability of QR (read more)
Note at ISS 2017.
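To make the DT/QR distinction concrete, here is a minimal sketch of Quick Release detection over a stream of (timestamp, force) samples: the gesture counts as QR if the force falls sharply from its peak within a short window. The window length and drop fraction are invented for illustration; they are not the values from the paper.

```python
# Minimal sketch of Quick Release (QR) detection. Both thresholds are
# assumptions for illustration, not the parameters reported in the note.

RELEASE_WINDOW = 0.15   # max seconds allowed for the force drop (assumption)
DROP_FRACTION = 0.8     # force must fall by this fraction of its peak (assumption)

def is_quick_release(samples):
    """samples: chronologically ordered list of (timestamp, force) tuples.
    Returns True if the force fell from its peak by DROP_FRACTION
    within RELEASE_WINDOW seconds, i.e. a quick-release confirmation."""
    if len(samples) < 2:
        return False
    peak_t, peak_f = max(samples, key=lambda s: s[1])
    for t, f in samples:
        if t > peak_t and f <= peak_f * (1.0 - DROP_FRACTION):
            # first sample after the peak that is low enough:
            # it must arrive quickly for the release to count as QR
            return t - peak_t <= RELEASE_WINDOW
    return False  # force never dropped far enough after the peak
```

A Dwell Time detector, by contrast, would simply check that the force stayed near the selected level for the full dwell duration before confirming.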
BackXPress: Using Back-of-Device Finger Pressure to Augment Touchscreen Input on Smartphones: BackXPress lets users create pressure input with six fingers at the back of landscape-held smartphones to augment their frontal touchscreen interaction. We present design guidelines derived from three studies. (read more)
Paper at CHI 2017.
Understanding Back-to-Front Pinching for Eyes-Free Mobile Touch Input: Investigates the influence of device thickness and tilt angle for back-to-front pinching on mobile touch devices. Pinch error significantly increases with thickness while the angle has no influence. (read more)
Paper at MobileHCI 2016
HaptiCase: Back-of-Device Tactile Landmarks for Eyes-Free Absolute Indirect Touch: An interaction technique for eyes-free absolute indirect touch based on back-of-device tactile landmarks and proprioception. Improves tapping accuracy significantly (14% target size reduction) without software modification. (read more)
Paper at CHI 2015
Evaluating the Benefits of Real-time Feedback in Mobile Augmented Reality with Hand-held Devices
Note at CHI 2012.
Evaluating Swabbing: a Touchscreen Input Method for Elderly Users with Tremor: Swabbing is a touchscreen target selection method designed for users with hand tremor. Instead of touching the target directly, the user slides a finger towards a target placed along the edge of the screen. In collaboration with Alexander Mertens, Chair and Institute of Industrial Engineering and Ergonomics. (read more)
Note and Workshop at CHI 2011.
Bringing Iterative Design To Ubiquitous Computing: Interaction Techniques, Toolkits, and Evaluation Methods. (read more)
Paper at CHI 2003 and 2007, PerCom 2004, IEEE Pervasive Computing 2006 and PERMID 2007.
ARPen: Create digital models in mid-air: You can now track a pen in 3D using just your smartphone! We have built an ARKit-based iOS app that can track a pen. We will analyze how this novel interaction technique enables users to create 3D models. For instructions, head over to the GitHub repo. (read more)
Paper at SUI 2018.