Research



Tabletops and Tangibles | Navigation and Exploration Interfaces | Audio Interfaces | Mobile Interaction

Smart Textiles | Tactile Interfaces | Ubiquitous Computing | AR Applications | Mid-air Gestures | Fabrication

Full list of publications







Featured: Use the Force Picker, Luke (Corsten et al.) | Sketch&Stitch (Hamdan et al.) | Tangible Awareness (Cherek et al.) | StatWire (Subramanian et al.) | Physical Guides (Wacker et al.)







Tabletops and Tangibles

Tangible Awareness: How Tangibles on Tabletops Influence Awareness of Each Other’s Actions: Players using tangibles are significantly more aware of other players’ actions than those using pure multitouch interaction. (read more)

Paper at CHI 2018
Knobology Revisited: A Comparison of User Performance between Tangible and Virtual Rotary Knobs: We present an experimental comparison of tangible rotary knobs and touch-based virtual knobs in three output conditions: eyes-on, eyes-free, and peripheral. Twenty participants completed a simple rotation task on an interactive surface with four different input techniques (two tangibles and two virtual touch widgets) in the three output conditions, representing the distance from the locus of attention. Users were on average 20% faster with tangible knobs than with virtual knobs, and tangible knobs retained their performance even when outside the user’s locus of attention. We provide four recommendations for choosing suitable knobs based on task and design constraints. (read more)

Note at ITS 2015
PERCs: Persistently Trackable Tangibles on Capacitive Multi-Touch Displays: PERCs use the touch probing signal from capacitive touch screens to determine whether the tangibles are on- or off-screen. This enables robust capacitive tangible widgets, regardless of whether they are being touched. (read more)

Paper and Demo at UIST 2015 and Demo at ITS 2015
Combining Direct and Indirect Touch Input for Interactive Workspaces using Gaze Input: Interactive workspaces combine horizontal and vertical touch surfaces into a single digital workspace. However, direct interaction on the vertical surface is cumbersome. To overcome this, indirect touch systems turn the horizontal touch surface into an input device that allows manipulation of objects on the vertical display. If the horizontal touch surface also acts as a display, however, it becomes necessary to distinguish which screen is currently in use by providing a mode switch. In three studies, we investigate the use of gaze tracking to perform these mode switches. (read more)

Full paper at SUI 2015. Received Honorable Mention Award.
An Interaction Model for Grasp-Aware Tangibles on Interactive Surfaces: Tangibles on interactive surfaces enable users to physically manipulate digital content by placing, manipulating, or removing a tangible object. However, the information whether and how a user grasps these objects has not been mapped out for tangibles on interactive surfaces so far. Based on Buxton’s Three-State Model for graphical input, this paper presents an interaction model that describes input on tangibles that are aware of the user’s grasp. (read more)

Note at ITS 2014
Touch interfaces provide great flexibility in UI design. However, the actual experience is often frustrating due to poor touch recognition. We found that the relative location of the preceding touch has a significant impact on the orientation and position of the touch ellipse. We exploited this effect on an off-the-shelf touch display and showed that, with only minimal preparation, the touch accuracy of standard hardware can be improved, allowing better recognition rates or more UI components on the same screen. (read more)

Note at CHI 2013.
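The correction described above could be sketched as follows. This is a minimal illustration, not the method from the note: the offset table and its values are made up, and a real deployment would measure per-device offsets as the study did.

```python
# Hypothetical correction table: mean touch-point offset (in px) observed
# for each coarse direction from the preceding touch. Values are invented
# for illustration; they would have to be measured per device.
OFFSETS = {
    "left":  (3.0, 0.0),    # previous touch lay to the left of this one
    "right": (-3.0, 0.0),
    "above": (0.0, 2.0),
    "below": (0.0, -2.0),
}

def direction(prev, cur):
    """Classify where the previous touch lay relative to the current one."""
    dx, dy = cur[0] - prev[0], cur[1] - prev[1]
    if abs(dx) >= abs(dy):
        return "left" if dx > 0 else "right"
    return "above" if dy > 0 else "below"

def correct(prev, cur):
    """Shift the reported touch point by the offset for its predecessor direction."""
    if prev is None:        # first touch: nothing to correct against
        return cur
    ox, oy = OFFSETS[direction(prev, cur)]
    return (cur[0] + ox, cur[1] + oy)
```

A toolkit would apply `correct` to each raw touch event before hit testing, which is where the improved recognition rate would come from.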
Passive Untouched Capacitive Widgets (PUCs) are simple physical widgets that can be detected by an unmodified capacitive touch screen without being touched by a user. They can be used on most existing projected-capacitive displays. (read more)

Note and demo at ITS 2013 and demo at UIST 2013
Indirect Touch is a multitouch system in which users indirectly manipulate objects on the vertical screen with touches on the horizontal screen. This interaction improves the ergonomics of extended use by reducing gorilla arm (vertical screen) and stiff neck (horizontal screen). (read more)

Full paper at CHI 2013
Instant User Interfaces enable users to repurpose everyday objects as input devices in an ad-hoc manner. We implemented a reliable, marker-free object tracking system that enables users to assign semantic meaning to different poses or to touches in different areas. (read more)

Paper at ITS 2013.
HoloDesk is a novel interactive system combining an optical see through display and Kinect camera to create the illusion that users are directly interacting with 3D graphics. (read more)

Full paper at CHI 2012 in collaboration with Microsoft Research Cambridge.
Understanding Flicking on Curved Surfaces: In this paper, we analyze how people execute flicking gestures, explore what types of errors they make, and derive a first model to predict these errors. (read more)

Full paper at CHI 2012.
3D displays can be controlled using direct manipulation. This, however, introduces problems: the "direct" notion implies that objects should stick to our fingers during interaction, yet on a viewer-centric display, head movements will also translate the object. We compared different methods to cope with these constraints. (read more)

Full Paper at ITS 2012, workshop submission at CHI 2012.
We render physical effects in tabletop controls using electromagnetic actuation. Keeping the advantages of low-cost passive tangibles, we allow designers to change physical properties on the fly, such as weight, friction, spring resistance, and detents.

Note at CHI 2011.
FingerFlux is a haptic output technique that provides near-surface haptic feedback on interactive tabletops. It combines an electromagnetic display with a permanent magnet attached to the user’s index finger. (read more)

Paper and demo at UIST 2011. Received 3rd place in the category “Best demo”.
Dynamic Portals is a light-weight touch-based technique that allows users to transfer virtual objects to distant locations on an interactive tabletop. (read more)

Note at ITS 2011. Received Best Note Award.
Hybrid documents ease text corpus analysis for literary scholars. Co-locating physical and digital media on one surface combines their advantages. (read more)

Full Paper at ITS 2010
BendDesk is an interactive desk environment that merges a vertical and a horizontal display with a curve. The entire surface is touch sensitive and supports direct manipulation of digital objects. (read more)

Full paper and demo at ITS 2010.
Madgets are low-cost, untethered, and easy-to-build tangible controls for interactive tabletops. Our tabletop maintains the consistency between physical controls and their digital state by using electromagnetic actuation. (read more)

Full paper at UIST 2010.
We are developing an HCI design pattern language that provides solutions for recurring problems when designing interactive tabletops and applications. (read more)

Paper at EuroPLoP 2010.
Mudpad is a multitouch screen that is enhanced with localized active haptic feedback. (read more)

Note at ITS 2010. Received Best Note Award and Best Demo Award.
SLAP Widgets are tangible general-purpose controls for interactive tabletops, made from silicone and acrylic. They combine the haptic feedback of physical controls with the flexibility of on-screen widgets. (read more)

Full paper at CHI 2009. Demo at CHI 2009 and TEI 2009.






Navigation and Exploration Interfaces

Chronicler: Interactive Exploration of Source Code History: We present Tree Flow, an interactive visualization of structural changes in source code history. Combining history graph navigation with a traditional source code view outperforms traditional file-based history navigation, and users strongly prefer Chronicler for exploring source code history. (read more)

Paper at CHI 2016.
Vesta leverages runtime information from manual executions performed by developers to support the authoring of documentation and unit tests. (read more)

Note at CHI 2016.
Statsplorer is a software tool that allows users to perform statistical analysis by interacting with visualizations. It guides the user in selecting appropriate statistical tests based on their research questions. It also prevents common statistical analysis mistakes and promotes understanding by co-locating appropriate visualizations with statistical analysis results. (read more)

Paper at CHI 2015.
Live coding environments provide information about a program’s execution immediately after each change to the source code. (read more)

Note at VL/HCC 2014.
We suggest a method to analyse developers' navigation strategies using predictive models and apply it to compare different call graph navigation tools. (read more)

Paper at CHI 2013.
Stacksplorer provides a novel way to visualize potential call stacks within a traditional IDE and allows navigation along these call stacks. (read more)

Paper at UIST 2011.
Fly is a novel presentation tool that breaks with the slide metaphor and assembles information landscapes on a 2D plane. (read more)

Paper at CHI 2009. Note at CHI 2012. Paper at INTERACT 2015.
DRAGON is a direct-manipulation interaction technique for frame-accurate navigation in video scenes. (read more)

Note at CHI 2008 (Received Best Note Award). Note at CHI 2012. Paper at GI 2012.







Audio Interfaces

An Augmented Flute for Beginners

Note at NIME 2017.
The audio guide for the Centre Charlemagne runs on the Aixplorer (read more)

Note at CHI 2016.
Corona lets you revive a medieval coronation feast using spatial audio rendering (read more)

Paper at CHI 2014. Note at CHI 2015. Note at MobileHCI 2016.
DiskPlay simplifies in-track navigation on turntables for DJs (read more)

Note and Demo at CHI 2012. Note and Demo at NIME 2014.
A Semantic Time Framework for Interactive Media Systems. (read more)

Papers at NIME 2005-2007 and ICMC 2006-2007.
Personal Orchestra is a series of projects to adapt this centuries-old interaction metaphor to digital multimedia so that everyone may enjoy the wonderful experience of conducting world-famous orchestras. (read more)

Paper at CHI 2005. Paper at NIME 2006. Paper at ICMC 2006. Paper at ICMC 2007.






Mobile Interaction

Use the Force Picker, Luke: Space-Efficient Value Input on Force-Sensitive Mobile Touchscreens: Value pickers on smartphones occupy significant gesture and display space. The Force Picker instead uses bidirectional force control to select values and requires no more space than the initial touch. (read more)

Paper at CHI 2018 (Received Honorable Mention Award).
Release, Don't Wait! Reliable Force Input Confirmation with Quick Release: Modern smartphones, like the iPhone 7, feature touchscreens with co-located force sensing. This makes touch input more expressive, e.g., by enabling single-finger continuous zooming when coupling zoom levels to force intensity. Often, however, the user wants to select and confirm a particular force value, say, to lock a certain zoom level. The most common confirmation techniques are Dwell Time (DT) and Quick Release (QR). While DT has been shown to be reliable, it slows the interaction, as the user must typically wait for 1 s before her selection is confirmed. Conversely, QR is fast but reported to be less reliable, although no reference reports how to actually detect and implement it. In this paper, we set out to challenge the low reliability of QR (read more)

Note at ISS 2017.
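The two confirmation strategies contrasted above could be sketched roughly as follows. This is a minimal illustration with made-up thresholds, not the detection from the paper: Dwell Time fires after the force holds steady long enough, while Quick Release fires on a steep force drop and confirms the value sampled just before the drop.

```python
DWELL_S = 1.0          # dwell duration in seconds, assumption
RELEASE_SLOPE = -4.0   # force units/second counted as a "quick" drop, assumption

def confirm(samples):
    """samples: list of (t_seconds, force). Returns ('dwell'|'release'|None, force)."""
    band_start = None
    for i, (t, f) in enumerate(samples):
        # Dwell Time: force held within a small tolerance for DWELL_S seconds.
        if band_start is None or abs(f - samples[band_start][1]) > 0.05:
            band_start = i  # force moved: restart the stable window
        elif t - samples[band_start][0] >= DWELL_S:
            return "dwell", samples[band_start][1]
        # Quick Release: a steep negative slope confirms the pre-drop value.
        if i > 0:
            dt = t - samples[i - 1][0]
            if dt > 0 and (f - samples[i - 1][1]) / dt <= RELEASE_SLOPE:
                return "release", samples[i - 1][1]
    return None, None
```

The sketch makes the trade-off visible: dwell detection cannot fire before `DWELL_S` has elapsed, whereas release detection needs only two samples spanning the drop.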
BackXPress: Using Back-of-Device Finger Pressure to Augment Touchscreen Input on Smartphones: BackXPress lets users create pressure input with six fingers at the back of landscape-held smartphones to augment their frontal touchscreen interaction. We present design guidelines derived from three studies. (read more)

Paper at CHI 2017.
Understanding Back-to-Front Pinching for Eyes-Free Mobile Touch Input: Investigates the influence of device thickness and tilt angle for back-to-front pinching on mobile touch devices. Pinch error significantly increases with thickness while the angle has no influence. (read more)

Paper at MobileHCI 2016
HaptiCase: Back-of-Device Tactile Landmarks for Eyes-Free Absolute Indirect Touch: An interaction technique for eyes-free absolute indirect touch based on back-of-device tactile landmarks and proprioception. Improves tapping accuracy significantly (14% target size reduction) without software modification. (read more)

Paper at CHI 2015
Evaluating the Benefits of Real-time Feedback in Mobile Augmented Reality with Hand-held Devices

Note at CHI 2012.
Evaluating Swabbing: A Touchscreen Input Method for Elderly Users with Tremor: Swabbing is a touchscreen target selection method designed for users with hand tremor. Instead of touching the target, the user slides a finger towards a target placed along the edge of the screen. In collaboration with Alexander Mertens, Chair and Institute of Industrial Engineering and Ergonomics. (read more)

Note and Workshop at CHI 2011.






Smart Textiles

Sketch&Stitch: Interactive Embroidery for E-Textiles.... (read more)

Paper at CHI 2018.
Run&Tap: Investigation of On-Body Tapping for Runners... (read more)

Paper at ISS 2017.
Interactive FUrniTURE: Evaluation of Smart Interactive Textile Interfaces for Home Environments (read more)

Paper at ISS 2017.
Grabbing at an Angle: Menu Selection for Fabric Interfaces... (read more)

Paper at ISWC 2016.
FabriTouch: Exploring Flexible Touch Input on Textiles is a textile touchpad integrated into everyday clothing. (read more)

Note at ISWC 2014.
Pinstripe: Eyes-free Continuous Input on Interactive Clothing is a textile sensor that enables eyes-free continuous input anywhere on interactive clothing. (read more)

Paper at CHI 2011.






Tactile Interfaces

A Language of Tactile Motion Instructions for Physical Activities. (read more)

Papers at ICST 2009, CHI 2009 and MobileHCI 2009.






Ubiquitous Computing

Bringing Iterative Design To Ubiquitous Computing: Interaction Techniques, Toolkits, and Evaluation Methods. (read more)

Paper at CHI 2003 and 2007, PerCom 2004, IEEE Pervasive Computing 2006 and PERMID 2007.






AR Applications

ARPen: Create digital models in mid-air: You can now track a pen in 3D using just your smartphone! We have built an ARKit-based iOS app that can track a pen, and we will analyze how this novel interaction technique enables users to create 3D models. For instructions, head over to the GitHub repo. (read more)

Paper at SUI 2018.






Mid-air Gestures

FINS: Understanding Finger Input in Near-Surface Space leverages the mid-air space near devices, e.g., above keyboards or mice, as an additional input modality. (read more)

Paper at CHI 2014.






Fabrication

YAWN: Yet Another Wearable Toolkit is an easy-to-use rapid prototyping toolkit for wearables. Part of the Personal Photonics project. (read more)

Note at ISWC 2018.
CutCAD - An Open-source Tool to Design 3D Objects in 2D is design software for laser-cut 3D objects that are designed as 2D vector art, using automatically calculated finger joints as connections. Part of the Personal Photonics project and Numerika2Neets. (read more)

Note and Demo at DIS 2018.





Created by hamdan. Last Modification: Friday 25 of January, 2019 11:37:06 by hamdan.

Media Computing Group at RWTH Aachen
