Mobile Computing · Interaction
Bumping into street signs and other people while walking on a busy sidewalk is a common problem when people also try to check their email or update their Facebook status from their mobile phones. Interactive glasses have the potential to provide timely information (i.e., notifications) while still allowing people to keep an eye on other pedestrians on the sidewalk.
NotifEye allows a person to receive social network notifications on interactive glasses while walking on a busy street. The prototype uses a minimalistic user interface (UI) for interactive glasses to help people focus their attention on their surroundings and supports discreet interaction by using a finger rub pad to take action on incoming notifications.
-
NotifEye: using interactive glasses to deal with notifications while walking in public
Andrés Lucero, Akos Vetek
ACE '14, Article 17, 10 pages
Study: Mold-It
The pervasive rectangular touchscreen, which has dominated the display industry for decades, is slowly giving way to a future in which devices may have any shape. For example, circular displays are already available on many smartwatches. However, little is known about the interplay between the rationale for the form factors of handheld devices and their interactive content.
We conducted two studies consisting of design sessions using modeling clay props to explore how users may interact with handheld freeform devices. We found three themes central to grasping such devices: freeform dexterity, shape feature discoverability, and shape adaptability. We further explored the interlink between shape dexterity, discoverability, and freeform shape features.
-
Mold-It: Understanding how Physical Shapes affect Interaction with Handheld Freeform Devices
Marcos Serrano, Jolee Finch, Pourang Irani, Andrés Lucero, Anne Roudaut
CHI '22, Article 402, 1–14
Prototype: Twisting Touch
Currently, touch is the dominant input technique for the design of interactions with rigid handheld devices. It is reasonable to predict that future flexible devices will also have touch sensitive surfaces. In this context, the following question arises: can interface deformation and touch co-exist in the same interaction cycle?
This study investigates the potential of combining, within the same interaction cycle, deformation and touch input in a handheld device. Using a flexible, input-only device connected to an external display, we compared a multitouch input technique and two hybrid deformation-plus-touch input techniques (bending and twisting the device, plus either front- or back-touch), in an image-docking task.
-
Twisting touch: combining deformation and touch as input within the same interaction cycle on handheld devices
Johan Kildal, Andrés Lucero, Marion Boberg
MobileHCI '13, 237-246
Prototype: Tilt Displays
We present Tilt Displays, a new type of actuatable display that provides visual feedback combined with multi-axis tilting and vertical actuation. Their ability to physically mutate provides users with an additional information channel that facilitates a range of new applications, including collaboration and tangible entertainment, while enhancing familiar applications such as terrain modelling by allowing 3D scenes to be rendered in a physical-3D manner.
Through a custom-built 3×3 handheld prototype, we examine the design space around Tilt Displays, categorise output modalities, and conduct two user studies. The first study examines users' initial impressions of Tilt Displays and probes potential interactions and uses. The second takes a quantitative approach to understanding interaction possibilities with such displays, resulting in two user-defined gesture sets: one for manipulating the surface of the Tilt Display, the other for conducting everyday interactions.
-
Tilt Displays: Designing Display Surfaces with Multi-axis Tilting and Actuation
Jason Alexander, Andrés Lucero, Sriram Subramanian
MobileHCI '12, 161-170
Prototype: Image Space
The increasing use of digital cameras and cameraphones has changed our behavior regarding the number of photos we take. As a result, growing collections of photos are more difficult to understand, search, and navigate. Helping people make sense of these collections and build an understanding of the world they depict has become a challenging task.
We present the results of a two-month empirical study of geotagged mobile media content use. We equipped 20 people with ImageSpace, an Internet-based service that allows people to automatically share geotagged mobile media content captured with their cameraphones onto 2D and 3D representations of photo collections online. The service also introduces Scenes, which allow people to organize photos according to spatial and/or chronological associations.
-
Image Space: capturing, sharing and contextualizing personal pictures in a simple and playful way
Andrés Lucero, Marion Boberg, Severi Uusitalo
ACE '09, 215-222
Other Publications
-
Stranger Screens: Exploring the Application Themes for Interactive Freeform Devices
Marcos Serrano, Andrés Lucero, Pourang Irani, Anne Roudaut
CHI EA '22, Article 447, 1–5
-
Exploring the interaction design space for interactive glasses
Andrés Lucero, Kent Lyons, Akos Vetek, Toni Järvenpää, Sean White, Marja Salmimaa
CHI EA '13, 1341-1346
-
Image Space: An Empirical Study of Geotagged Mobile Media Content Capture and Sharing
Andrés Lucero, Marion Boberg, Sanna Malinen, Kaisa Väänänen-Vainio-Mattila
MindTrek '12, 187-194
Demos
-
Tilt Display Demonstration: A Display Surface with Multi-axis Tilting & Actuation
Jason Alexander, Andrés Lucero, Sriram Subramanian
MobileHCI '12 demo, 161-170
Workshops
-
Organic experiences: (re)shaping interactions with deformable displays
Jason Alexander, Ryan Brotman, David Holman, Audrey Younkin, Roel Vertegaal, Johan Kildal, Andrés Lucero, Anne Roudaut, Sriram Subramanian
CHI '13 workshop, 3171-3174
-
Interaction with Deformable Displays
Jason Alexander, Johan Kildal, Kasper Hornbaek, Viljakaisa Aaltonen, Andrés Lucero, Sriram Subramanian
MobileHCI '12 workshop, 237-240