Bodily augmentations have been increasingly investigated by HCI researchers due to their associated benefits, such as improving accessibility, offering a novel input space, and sensing the environment. Most of these investigations have adopted a utilitarian perspective, providing support for impairments, mobility, and rehabilitation. However, more recently a subset called bodily extensions has emerged that appears to also embrace experiential aspects. These bodily extensions physically extend the human body, going beyond traditional sensing and accessibility perspectives. Initial designs were mostly rigid, leading to a focus on short-term use and missing out on the opportunity for these bodily extensions to be integrated into everyday life. In our work, we build soft pneumatic-based bodily extensions that users can incorporate into their everyday lives. We detail the design and rationale behind these bodily extensions, along with the novel scenarios they enable. We share our insights from creating these systems and the associated user experiences from our field study with 48 participants, resulting in a design framework that will hopefully aid future designers in facilitating embodiment across everyday life through bodily extensions.
Shared Bodily Fusion: Leveraging Inter-Body Electrical Muscle Stimulation for Social Play
Rakesh Patibanda, Nathalie Overdevest, Shreyas Nisal, and 5 more authors
In Proceedings of the 2024 ACM Designing Interactive Systems Conference, 2024
Traditional games like "Tag" rely on shared control via inter-body interactions (IBIs) – touching, pushing, and pulling – that foster emotional and social connection. Digital games largely limit IBIs, with players using their bodies as input to control virtual avatars instead. Our “Shared Bodily Fusion” approach addresses this by fusing players’ bodies through a mediating computer, creating a shared input and output system. We demonstrate this approach with "Hidden Touch", a game where a novel social electrical muscle stimulation system transforms touch (input) into muscle actuations (output), facilitating IBIs. Through a study (n=27), we identified three player experience themes. Informed by these findings and our design process, we mapped their trajectories across our three experiential spaces – threshold, tolerance, and precision – which collectively form our design framework. This framework facilitates the creation of future digital games where IBIs are intrinsic, ultimately promoting the many benefits of social play.
PsiNet: Toward Understanding the Design of Brain-to-Brain Interfaces for Augmenting Inter-Brain Synchrony
Nathan Semertzidis, Michaela Jayne Vranic-Peters, Xiao Zoe Fang, and 5 more authors
In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 2024
Underlying humanity’s social abilities is the brain’s capacity to interpersonally synchronize. Experimental, lab-based neuropsychological studies have demonstrated that inter-brain synchrony can be technologically mediated. However, knowledge about deploying these technologies in-the-wild and studying their user experience, an area in which HCI excels, is lacking. With advances in mobile brain sensing and stimulation, we identify an opportunity for HCI to investigate the in-the-wild augmentation of inter-brain synchrony. We designed “PsiNet,” the first wearable brain-to-brain system aimed at augmenting inter-brain synchrony in-the-wild. Participant interviews illustrated three themes that describe the user experience of modulated inter-brain synchrony: hyper-awareness; relational interaction; and the dissolution of self. We contribute these three themes to assist HCI theorists’ discussions of inter-brain synchrony experiences. We also present three practical design tactics for HCI practitioners designing for inter-brain synchrony, and hope that our work guides an HCI future of brain-to-brain experiences that fosters human connection.
PneuMa: Designing Pneumatic Bodily Extensions for Supporting Movement in Everyday Life
Aryan Saini, Rakesh Patibanda, Nathalie Overdevest, and 2 more authors
In Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems, 2024
Prior research around the design of interactive systems has highlighted the benefits of supporting embodiment in everyday life. This has resulted in the creation of body-centric systems that leverage movement. However, these advances, while aligning with embodiment theory, have so far focused on sensing movement rather than facilitating it. We present PneuMa, a novel wearable system that can facilitate movement in everyday life through pneumatic-based bodily extensions. We showcase the system through three examples: "Pardon?", moving the ear forward; "Greetings", moving a hand towards a "Bye-bye" gesture; and "Take a break", moving the hands away from the keyboard. In the video, we discuss our findings in relation to prior research around bodily extensions and embodied interaction. Ultimately, we hope that our work helps more people profit from the benefits of everyday movement support.
GazeAway: Designing for Gaze Aversion Experiences
Nathalie Overdevest, Rakesh Patibanda, Aryan Saini, and 2 more authors
In Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems, 2024
Gaze aversion is embedded in our behaviour: we look at a blank area to support remembering and creative thinking, and as a social cue that we are thinking. We hypothesise that a person’s gaze aversion experience can be mediated through technology, in turn supporting embodied cognition. In this design exploration we present six ideas for interactive technologies that mediate the gaze aversion experience. One of these ideas we developed into “GazeAway”: a prototype that swings a screen into the wearer’s field of vision when they perform gaze aversion. Six participants experienced the prototype and based on their interviews, we found that GazeAway changed their gaze aversion experience threefold: increased awareness of gaze aversion behaviour, novel cross-modal perception of gaze aversion behaviour, and changing gaze aversion behaviour to suit social interaction. We hope that ultimately, our design exploration offers a starting point for the design of gaze aversion experiences.
Exploring Shared Bodily Control: Designing Augmented Human Systems for Intra- and Inter-Corporeality
Rakesh Patibanda, Nathalie Overdevest, Aryan Saini, and 5 more authors
In Proceedings of the Augmented Humans International Conference 2024, 2024
The human-computer interaction community has evolved from using body-sensing to body-actuating technologies, transforming the body’s role from a mere input to an input-output medium. With body-sensing, the separation between the human and the computer is clear, allowing for an easy understanding of who is in control. However, with body-actuating technologies, this separation diminishes. These technologies integrate more closely with our bodies, where both the user and the technology can share control over their bodily interactions. In this workshop, we will explore this notion of sharing control, specifically focusing on experiences where users interact with their own bodies (intra-corporeal experiences), and interact with others using technology (inter-corporeal experiences). Our discussions and group activities will focus on brainstorming and designing within human augmentation, examining how this shared control can lead to innovative applications.
PneuMa: Designing Pneumatic Bodily Extensions for Supporting Movement in Everyday Life
Aryan Saini, Rakesh Patibanda, Nathalie Overdevest, and 2 more authors
In Proceedings of the Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction, 2024
Prior research around the design of interactive systems has highlighted the benefits of supporting embodiment in everyday life. This has resulted in the creation of body-centric systems that leverage movement. However, these advances, while aligning with embodiment theory, have so far focused on sensing movement rather than facilitating it. We present PneuMa, a novel wearable system that can facilitate movement in everyday life through pneumatic-based bodily extensions. We showcase the system through three examples: "Pardon?", moving the ear forward; "Greetings", moving a hand towards a "Bye-bye" gesture; and "Take a break", moving the hands away from the keyboard. From a thematic analysis of a field study with 12 participants, we identified three themes: bodily awareness, perception of the scenarios, and anticipating movement. We discuss our findings in relation to prior research around bodily extensions and embodied interaction to provide strategies for designing bodily extensions that support movement in everyday life. Ultimately, we hope that our work helps more people profit from the benefits of everyday movement support.
2023
Pneunocchio: A playful nose augmentation for facilitating embodied representation
Aryan Saini, Srihari Sridhar, Aarushi Raheja, and 5 more authors
In Adjunct Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, Oct 2023
Prior research has offered a plethora of wearables centred around sensing bodily actions ranging from more explicit data, such as movement and physiological response, to implicit information, such as ocular and brain activity. Bodily augmentations that physically extend the user’s body along with altering body schema and image have been proposed recently as well, owing to factors such as accessibility and improving communication. However, these attempts have usually consisted of uncomfortable interfaces that either restrict the user’s movement or are intrusive in nature. In this work, we present Pneunocchio, a playful nose augmentation based on the lore of Pinocchio. Pneunocchio consists of a pneumatic-based inflatable that a user wears on their nose to play a game of two truths and a lie. With our work, we aim to explore expressive bodily augmentations that respond to a player’s physiological state that can alter the perception of their body while serving as an expressive match for a current part of the body.
Fluito: Towards Understanding the Design of Playful Water Experiences through an Extended Reality Floatation Tank System
Maria F. Montoya, YuYang Ji, Ryan Wee, and 5 more authors
Proceedings of the ACM on Human-Computer Interaction, Oct 2023
Water’s pleasant nature and associated health benefits have captivated the interest of HCI researchers. Prior WaterHCI work mainly focused on advancing instrumental applications, such as improving swimming performance, and less on designing systems that support interacting with technology in water in more playful contexts. In this regard, we propose floatation tanks as research vehicles to investigate the design of playful interactive water experiences. Employing somaesthetic design, we developed a playful extended reality floatation tank experience: "Fluito". We conducted a 13-participant study to understand how specific design features amplified participants’ water experiences. We used a postphenomenological lens to articulate eight strategies useful for designers aiming to develop digital playful experiences in water, such as designing to call attention to the water and designing to encourage breathing and body awareness in water experiences. Ultimately, we hope that our work supports people to be playful and benefit from the many advantages of being in water.
Auto-Paizo Games: Towards Understanding the Design of Games That Aim to Unify a Player’s Physical Body and the Virtual World
Rakesh Patibanda, Chris Hill, Aryan Saini, and 6 more authors
Proceedings of the ACM on Human-Computer Interaction, Oct 2023
Most digital bodily games focus on the body as they use movement as input. However, they also draw the player’s focus away from the body as the output occurs on visual displays, creating a divide between the physical body and the virtual world. We propose a novel approach – the "Body as a Play Material" – where a player uses their body as both input and output to unify the physical body and the virtual world. To showcase this approach, we designed three games where a player uses one of their hands (input) to play against the other hand (output) by loaning control over its movements to an Electrical Muscle Stimulation (EMS) system. We conducted a thematic analysis on the data obtained from a field study with 12 participants to articulate four player experience themes. We discuss how participants appreciated engaging with the variety of bodily movements for play and the ambiguity of using their body as a play material. Ultimately, our work aims to unify the physical body and the virtual world.
Fused Spectatorship: Designing Bodily Experiences Where Spectators Become Players
Rakesh Patibanda, Aryan Saini, Nathalie Overdevest, and 8 more authors
Proceedings of the ACM on Human-Computer Interaction, Oct 2023
Spectating digital games can be exciting. However, due to its vicarious nature, spectators often wish to engage in the gameplay beyond just watching and cheering. To blur the boundaries between spectators and players, we propose a novel approach called "Fused Spectatorship", where spectators watch their hands play games by loaning bodily control to a computational Electrical Muscle Stimulation (EMS) system. To showcase this concept, we designed three games where spectators loan control over both their hands to the EMS system and watch them play these competitive and collaborative games. A study with 12 participants suggested that participants could not distinguish whether they were watching their hands play or playing the games themselves. We used our results to articulate four spectator experience themes and four fused spectator types with the behaviours they elicited, and offer one design consideration to support each of these behaviours. We also discuss the ethical design considerations of our approach to help game designers create future fused spectatorship experiences.
Dancing Delicacies: Designing Computational Food for Dynamic Dining Trajectories
Jialin Deng, Humphrey Yang, Aryan Saini, and 4 more authors
In Proceedings of the 2023 ACM Designing Interactive Systems Conference, Jul 2023
Contemporary human-food interaction design is often a technology-driven endeavor in which food’s materiality has been largely underexplored. Building on the concept of “computational food”, this paper explores the design of food as a material realization of computation through a material-centered approach. We engaged with a “Research through Design” exploration by designing a computational food system called “Dancing Delicacies”, which enables food items to be “programmed” and “reconfigured” within dynamic trajectories. Our practice led to a design framework resulting in four original dish designs. Our dishes aim to illustrate the richness of this new design space for computational food. Furthermore, through engaging with expert practitioners from the hospitality industry, we provide a first account of understanding the design of computational food for dynamic dining trajectories and its speculative use contexts in the industry. With this work, we hope to inspire researchers and designers to envision a new future of human-food interaction.
Towards Designing for Everyday Embodied Remembering: Findings from a Diary Study
Nathalie Overdevest, Rakesh Patibanda, Aryan Saini, and 2 more authors
In Proceedings of the 2023 ACM Designing Interactive Systems Conference, Jul 2023
Our bodies play an important part in our remembering practices, for example when we can remember passwords by typing them, even if we cannot verbalise them. An increasing number of technologies are being developed to support remembering. However, they have not yet taken the opportunity to support remembering through bodily movements. To better understand how to design for such embodied remembering, we conducted a diary study with 12 participants who recorded their embodied remembering experiences in everyday life over a three-week period. Our thematic analysis of the diaries and interviews led to the creation of a framework that helps understand embodied remembering experiences (ERXs) based on the level of skilled and conscious movements used. We describe how this ERX framework could help with the design of technologies to support embodied remembering.
DUMask: A Discrete and Unobtrusive Mask-Based Interface for Facial Gestures
Aryan Saini, Arpit Bhatia, Isha Kalra, and 2 more authors
In Proceedings of the Augmented Humans International Conference 2023, Mar 2023
Interactions using the face not only enable multi-tasking but also allow us to create hands-free applications. Previous works in HCI used sensors attached directly to the person’s face or placed inside their mouth. However, a mask, which has now become a norm in our everyday life and is socially acceptable, has rarely been used to explore facial interactions. We designed “DUMask”, an interface that uses the face parts covered by a mask to discreetly enable 14 (+1 default) interactions. DUMask uses an infrared camera embedded inside an off-the-shelf face mask to recognize the gestures, and we demonstrate the effectiveness of our interface through in-lab studies. We conducted two user studies evaluating the experience of both the wearer and the onlooker, which validated that the interface is indeed inconspicuous and unobtrusive.
2022
TouchMate: Understanding the Design of Body Actuating Games using Physical Touch
Shreyas Nisal, Rakesh Patibanda, Aryan Saini, and 2 more authors
In Extended Abstracts of the 2022 Annual Symposium on Computer-Human Interaction in Play, Nov 2022
Body-actuating technologies such as Electrical Muscle Stimulation (EMS) can actuate multiple players simultaneously via physical touch. To investigate this opportunity, we designed a game called “TouchMate”. Here, one guesser and two suspects sit across from each other with their legs hidden under a table. The guesser attaches a ground electrode from one EMS channel, and each suspect attaches one active electrode from the same channel on their forearms. When a suspect touches the guesser’s leg, their bodies complete the electrical circuit, actuating both their hands involuntarily via the EMS. The guesser’s goal is to determine who touched their leg. In this paper, we present the results from our initial study and articulate three player experience themes. Ultimately, we hope our work inspires game designers to create physical touch games using body-actuating technologies.
SomaFlatables: Supporting Embodied Cognition through Pneumatic Bladders
Aryan Saini, Haotian Huang, Rakesh Patibanda, and 3 more authors
In Adjunct Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, Oct 2022
Applying the theory of Embodied Cognition through design allows us to create computational interactions that engage our bodies by modifying our body schema. However, in HCI, most of these interactive experiences have centred on sensing-based systems that leverage our body’s position and movement to offer an experience, such as games using the Nintendo Wii and Xbox Kinect. In this work, we created two pneumatic inflatable-based prototypes that actuate the body to support embodied cognition in two scenarios by altering the user’s body schema. We call these “SomaFlatables” and demonstrate the design and implementation of these inflatable-based prototypes, which can move and even extend our bodies, allowing for novel bodily experiences. Furthermore, we discuss the future work and limitations of the current implementation.
Motor movements are performed while playing hand-games such as Rock-paper-scissors or Thumb-war. These games are believed to benefit both physical and mental health and are considered cultural assets. Electrical Muscle Stimulation (EMS) is a technology that can actuate muscles, triggering motor movements, and hence offers an opportunity for novel play experiences based on these traditional hand-games. However, there is only limited understanding of the design of EMS games. We present the design and evaluation of two games inspired by traditional hand-games, “Slap-me-if-you-can” and “3-4-5”, which incorporate EMS and, unlike traditional games, can be played alone. A thematic analysis of the data collected revealed three themes: 1) gameplay experiences and the influence of EMS hardware, 2) interaction with EMS and the calibration process, and 3) shared control and its effect on playing EMS games. We hope that an enhanced understanding of the potential of EMS to support hand-games can aid the advancement of movement-based games as a whole.
2019
Aesop: Authoring Engaging Digital Storytelling Experiences
Aryan Saini, Kartik Mathur, Abhinav Thukral, and 2 more authors
In Adjunct Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, Oct 2019
Traditional storytelling experiences are often one-dimensional, in that they contain only a single channel of communication with the audience: narration. With advancements in technology, storytelling experiences have been augmented with digital media to become more engaging and immersive. Authoring these scenarios, however, is complicated, as it requires technical knowledge to interface with the means of engagement. In this work, we present Aesop, a system that assists the narrator in authoring engaging storytelling experiences. Aesop provides a block-based interface like Scratch and manifests words of a story, Cues, and Visualizations (Outputs) as blocks that enable the user to create captivating stories. Our system also leverages physical actions performed by the user as Cues. These cues can trigger visualizations such as robot actions, animations, and environment simulation using sound and lighting effects.
Gehna: Exploring the Design Space of Jewelry as an Input Modality
Jatin Arora, Kartik Mathur, Aryan Saini, and 1 more author
In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, May 2019
Jewelry weaves into our everyday lives as no other wearable does. It comes in many wearable forms, is fashionable, and can adorn any part of the body. In this paper, through an exploratory, Research through Design (RtD) process, we tap into this vast potential space of input interaction that jewelry can enable. We do so by first identifying a small set of fundamental structural elements — called Jewelements — that any jewelry is composed of, and then defining their properties that enable the interaction. We leverage this synthesis along with observational data and literature to formulate a design space of jewelry-enabled input techniques. This work encapsulates both the extensions of common existing input methods (e.g., touch) as well as new ones inspired by jewelry. Furthermore, we discuss our prototypical sensor-based implementations. Through this work, we invite the community to engage in the conversation on how jewelry as a material can help shape wearable-based input.
VirtualBricks: Exploring a Scalable, Modular Toolkit for Enabling Physical Manipulation in VR
Jatin Arora, Aryan Saini, Nirmita Mehra, and 3 more authors
In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, May 2019
Virtual Reality (VR) experiences are often limited by the design of standard controllers. This work aims to liberate the VR developer from these limitations in the physical realm, providing an expressive match to the limitless possibilities of the virtual realm. VirtualBricks is a LEGO-based toolkit that enables the construction of a variety of physical-manipulation-enabled controllers for VR by offering a set of feature bricks that emulate as well as extend the capabilities of default controllers. Built on the LEGO platform, the toolkit provides a modular, scalable solution for enabling passive haptics in VR. We demonstrate the versatility of our designs through a rich set of applications, including re-implementations of artifacts from recent research. We share a VR integration package for Unity and the CAD models for the feature bricks, for easy deployment of VirtualBricks within the community.