publications
2024
- PneuExtensio: Designing Pneumatic-based Bodily Extensions to Facilitate Embodiment across Everyday Life Experiences. Aryan Saini, Elise Van Den Hoven, and Florian ‘Floyd’ Mueller. In Companion Publication of the 2024 ACM Designing Interactive Systems Conference, 2024
Bodily augmentations have been increasingly investigated by HCI researchers due to their associated benefits, such as improving accessibility, offering a novel input space, and sensing the environment. Most of these investigations have adopted a utilitarian perspective, providing support for impairments, mobility, and rehabilitation. More recently, however, a subset called bodily extensions has emerged that also embraces experiential aspects. These bodily extensions physically extend the human body, going beyond traditional sensing and accessibility perspectives. Initial designs were mostly rigid, leading to a focus on short-term use and missing out on opportunities for these bodily extensions to be integrated into everyday life. In our work, we build soft pneumatic-based bodily extensions that users can incorporate into their everyday lives. We detail the design and rationale behind these bodily extensions, along with the novel scenarios they enable. We share our insights from creating these systems and the associated user experiences from our field study with 48 participants, resulting in a design framework that we hope will aid future designers in facilitating embodiment across everyday life through bodily extensions.
@inproceedings{saini_pneuextensio_2024, author = {Saini, Aryan and Van Den Hoven, Elise and Mueller, Florian ‘Floyd’}, title = {PneuExtensio: Designing Pneumatic-based Bodily Extensions to Facilitate Embodiment across Everyday Life Experiences}, year = {2024}, isbn = {9798400706325}, publisher = {Association for Computing Machinery}, address = {New York, NY, USA}, url = {https://doi.org/10.1145/3656156.3665136}, doi = {10.1145/3656156.3665136}, booktitle = {Companion Publication of the 2024 ACM Designing Interactive Systems Conference}, pages = {19–23}, numpages = {5}, keywords = {bodily extensions, embodied experiences, embodied interactions, pneumatics}, location = {IT University of Copenhagen, Denmark}, series = {DIS '24 Companion}, }
- Shared Bodily Fusion: Leveraging Inter-Body Electrical Muscle Stimulation for Social Play. Rakesh Patibanda, Nathalie Overdevest, Shreyas Nisal, and 5 more authors. In Proceedings of the 2024 ACM Designing Interactive Systems Conference, 2024
Traditional games like "Tag" rely on shared control via inter-body interactions (IBIs) – touching, pushing, and pulling – that foster emotional and social connection. Digital games largely limit IBIs, with players using their bodies as input to control virtual avatars instead. Our “Shared Bodily Fusion” approach addresses this by fusing players’ bodies through a mediating computer, creating a shared input and output system. We demonstrate this approach with "Hidden Touch", a game where a novel social electrical muscle stimulation system transforms touch (input) into muscle actuations (output), facilitating IBIs. Through a study (n=27), we identified three player experience themes. Informed by these findings and our design process, we mapped their trajectories across our three experiential spaces – threshold, tolerance, and precision – which collectively form our design framework. This framework facilitates the creation of future digital games where IBIs are intrinsic, ultimately promoting the many benefits of social play.
@inproceedings{patibanda_bodily_fusion_2024, author = {Patibanda, Rakesh and Overdevest, Nathalie and Nisal, Shreyas and Saini, Aryan and Elvitigala, Don Samitha and Knibbe, Jarrod and Van Den Hoven, Elise and Mueller, Florian 'Floyd'}, title = {Shared Bodily Fusion: Leveraging Inter-Body Electrical Muscle Stimulation for Social Play}, year = {2024}, isbn = {9798400705830}, publisher = {Association for Computing Machinery}, address = {New York, NY, USA}, url = {https://doi.org/10.1145/3643834.3660723}, doi = {10.1145/3643834.3660723}, booktitle = {Proceedings of the 2024 ACM Designing Interactive Systems Conference}, pages = {2088–2106}, numpages = {19}, keywords = {body-actuating play, electrical muscle stimulation (EMS), movement-based play, social bodily games, wearable interaction}, location = {Copenhagen, Denmark}, series = {DIS '24}, }
- PsiNet: Toward Understanding the Design of Brain-to-Brain Interfaces for Augmenting Inter-Brain Synchrony. Nathan Semertzidis, Michaela Jayne Vranic-Peters, Xiao Zoe Fang, and 5 more authors. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 2024
Underlying humanity’s social abilities is the brain’s capacity to interpersonally synchronize. Experimental, lab-based neuropsychological studies have demonstrated that inter-brain synchrony can be technologically mediated. However, knowledge in deploying these technologies in-the-wild and studying their user experience, an area HCI excels in, is lacking. With advances in mobile brain sensing and stimulation, we identify an opportunity for HCI to investigate the in-the-wild augmentation of inter-brain synchrony. We designed “PsiNet,” the first wearable brain-to-brain system aimed at augmenting inter-brain synchrony in-the-wild. Participant interviews illustrated three themes that describe the user experience of modulated inter-brain synchrony: hyper-awareness; relational interaction; and the dissolution of self. We contribute these three themes to assist HCI theorists’ discussions of inter-brain synchrony experiences. We also present three practical design tactics for HCI practitioners designing inter-brain synchrony, and hope that our work guides an HCI future of brain-to-brain experiences that fosters human connection.
@inproceedings{semertzidis_psinet_2024, author = {Semertzidis, Nathan and Vranic-Peters, Michaela Jayne and Fang, Xiao Zoe and Patibanda, Rakesh and Saini, Aryan and Elvitigala, Don Samitha and Zambetta, Fabio and Mueller, Florian ‘Floyd’}, title = {PsiNet: Toward Understanding the Design of Brain-to-Brain Interfaces for Augmenting Inter-Brain Synchrony}, year = {2024}, isbn = {9798400703300}, publisher = {Association for Computing Machinery}, address = {New York, NY, USA}, url = {https://doi.org/10.1145/3613904.3641983}, doi = {10.1145/3613904.3641983}, booktitle = {Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems}, articleno = {434}, numpages = {18}, keywords = {Brain-to-Brain Interface, EEG, Inter-Brain Synchrony, Neural Synchrony, tES}, location = {Honolulu, HI, USA}, series = {CHI '24}, }
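As an illustration of how inter-brain synchrony can be quantified, a common measure in the neuroscience literature is the phase-locking value (PLV) between one EEG channel from each person. The sketch below is generic and uses synthetic signals; it is not PsiNet's published pipeline, and the channel choice and parameters are assumptions.

```python
# Illustrative sketch only: the phase-locking value (PLV), a common measure of
# synchrony between two EEG signals. NOT PsiNet's actual pipeline; signals here are synthetic.
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(eeg_a: np.ndarray, eeg_b: np.ndarray) -> float:
    """PLV in [0, 1]; 1 means the two signals are perfectly phase-locked."""
    phase_a = np.angle(hilbert(eeg_a))   # instantaneous phase, person A
    phase_b = np.angle(hilbert(eeg_b))   # instantaneous phase, person B
    return float(np.abs(np.mean(np.exp(1j * (phase_a - phase_b)))))

# Toy usage with synthetic 10 Hz signals sampled at 256 Hz.
t = np.arange(0, 2, 1 / 256)
a = np.sin(2 * np.pi * 10 * t)
b = np.sin(2 * np.pi * 10 * t + 0.3) + 0.1 * np.random.randn(t.size)
print(f"PLV: {phase_locking_value(a, b):.2f}")
```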
- PneuMa: Designing Pneumatic Bodily Extensions for Supporting Movement in Everyday Life. Aryan Saini, Rakesh Patibanda, Nathalie Overdevest, and 2 more authors. In Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems, 2024
Prior research around the design of interactive systems has highlighted the benefits of supporting embodiment in everyday life. This resulted in the creation of body-centric systems that leverage movement. However, these advances, while aligning with embodiment theory, have so far focused on sensing movement as opposed to facilitating movement. We present PneuMa, a novel wearable system that can facilitate movement in everyday life through pneumatic-based bodily extensions. We showcase the system through three examples: "Pardon?", which moves the ear forward; "Greetings", which moves a hand towards a "Bye-bye" gesture; and "Take a break", which moves the hands away from the keyboard. In the video, we discuss our findings in relation to prior research around bodily extensions and embodied interaction. Ultimately, we hope that our work helps more people profit from the benefits of everyday movement support.
- GazeAway: Designing for Gaze Aversion Experiences. Nathalie Overdevest, Rakesh Patibanda, Aryan Saini, and 2 more authors. In Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems, 2024
Gaze aversion is embedded in our behaviour: we look at a blank area to support remembering and creative thinking, and as a social cue that we are thinking. We hypothesise that a person’s gaze aversion experience can be mediated through technology, in turn supporting embodied cognition. In this design exploration we present six ideas for interactive technologies that mediate the gaze aversion experience. One of these ideas we developed into “GazeAway”: a prototype that swings a screen into the wearer’s field of vision when they perform gaze aversion. Six participants experienced the prototype, and based on their interviews, we found that GazeAway changed their gaze aversion experience threefold: it increased awareness of gaze aversion behaviour, enabled novel cross-modal perception of that behaviour, and changed gaze aversion behaviour to suit social interaction. We hope that, ultimately, our design exploration offers a starting point for the design of gaze aversion experiences.
@inproceedings{overdevest_gazeaway_2024, author = {Overdevest, Nathalie and Patibanda, Rakesh and Saini, Aryan and Van Den Hoven, Elise and Mueller, Florian ‘Floyd’}, title = {GazeAway: Designing for Gaze Aversion Experiences}, year = {2024}, isbn = {9798400703317}, publisher = {Association for Computing Machinery}, address = {New York, NY, USA}, url = {https://doi.org/10.1145/3613905.3650771}, doi = {10.1145/3613905.3650771}, booktitle = {Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems}, articleno = {177}, numpages = {6}, location = {Honolulu, HI, USA}, series = {CHI EA '24}, }
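To make the trigger logic tangible, here is a minimal sketch of one way a gaze-aversion-driven actuator could work. The gaze coordinates, dwell threshold, and Servo stub are assumptions for illustration and do not reflect the GazeAway prototype's actual implementation.

```python
# Sketch of GazeAway-style triggering: if the wearer's gaze stays outside a central
# "social" region for long enough, a servo swings the screen into view, and it swings
# back when gaze returns. Thresholds, the gaze format, and the Servo stub are assumptions.
def gaze_is_averted(x: float, y: float) -> bool:
    """Treat normalised gaze outside the central region as aversion."""
    return not (0.3 < x < 0.7 and 0.3 < y < 0.7)

class Servo:
    def swing_in(self):  print("servo: screen swings into field of view")
    def swing_out(self): print("servo: screen swings away")

def run(gaze_samples, dwell_needed=5):
    servo, dwell, shown = Servo(), 0, False
    for x, y in gaze_samples:                      # one normalised (x, y) sample per tick
        dwell = dwell + 1 if gaze_is_averted(x, y) else 0
        if dwell >= dwell_needed and not shown:
            servo.swing_in(); shown = True         # gaze held away: show the screen
        elif dwell == 0 and shown:
            servo.swing_out(); shown = False       # gaze returned: hide it again

run([(0.5, 0.5)] * 3 + [(0.1, 0.9)] * 8 + [(0.5, 0.5)] * 2)
```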
- Exploring Shared Bodily Control: Designing Augmented Human Systems for Intra- and Inter-Corporeality. Rakesh Patibanda, Nathalie Overdevest, Aryan Saini, and 5 more authors. In Proceedings of the Augmented Humans International Conference 2024, 2024
The human-computer interaction community has evolved from using body-sensing to body-actuating technologies, transforming the body’s role from a mere input to an input-output medium. With body-sensing, the separation between the human and the computer is clear, allowing for an easy understanding of who is in control. However, with body-actuating technologies, this separation diminishes. These technologies integrate more closely with our bodies, where both the user and the technology can share control over their bodily interactions. In this workshop, we will explore this notion of sharing control, specifically focusing on experiences where users interact with their own bodies (intra-corporeal experiences), and interact with others using technology (inter-corporeal experiences). Our discussions and group activities will focus on brainstorming and designing within human augmentation, examining how this shared control can lead to innovative applications.
@inproceedings{patibanda_shared_ah_2024, author = {Patibanda, Rakesh and Overdevest, Nathalie and Saini, Aryan and Li, Zhuying and Andres, Josh and Knibbe, Jarrod and van den Hoven, Elise and Mueller, Florian 'Floyd'}, title = {Exploring Shared Bodily Control: Designing Augmented Human Systems for Intra- and Inter-Corporeality}, year = {2024}, isbn = {9798400709807}, publisher = {Association for Computing Machinery}, address = {New York, NY, USA}, url = {https://doi.org/10.1145/3652920.3653037}, doi = {10.1145/3652920.3653037}, booktitle = {Proceedings of the Augmented Humans International Conference 2024}, pages = {318–323}, numpages = {6}, keywords = {Bodily Interactions, Body-Actuating Technologies, Human Augmentation, Input-Output Medium, Inter-Corporeal Experiences, Intra-Corporeal Experiences, Physical Computing, Sensory Augmentation, Shared Control, Wearable Robotics}, location = {Melbourne, VIC, Australia}, series = {AHs '24}, }
- PneuMa: Designing Pneumatic Bodily Extensions for Supporting Movement in Everyday Life. Aryan Saini, Rakesh Patibanda, Nathalie Overdevest, and 2 more authors. In Proceedings of the Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction, 2024
Prior research around the design of interactive systems has highlighted the benefits of supporting embodiment in everyday life. This resulted in the creation of body-centric systems that leverage movement. However, these advances, while aligning with embodiment theory, have so far focused on sensing movement as opposed to facilitating movement. We present PneuMa, a novel wearable system that can facilitate movement in everyday life through pneumatic-based bodily extensions. We showcase the system through three examples: "Pardon?", which moves the ear forward; "Greetings", which moves a hand towards a "Bye-bye" gesture; and "Take a break", which moves the hands away from the keyboard. From the thematic analysis of a field study with 12 participants, we identified three themes: bodily awareness, perception of the scenarios, and anticipating movement. We discuss our findings in relation to prior research around bodily extensions and embodied interaction to provide strategies for designing bodily extensions that support movement in everyday life. Ultimately, we hope that our work helps more people profit from the benefits of everyday movement support.
@inproceedings{saini_pneuma_2024, author = {Saini, Aryan and Patibanda, Rakesh and Overdevest, Nathalie and Van Den Hoven, Elise and Mueller, Florian ‘Floyd’}, title = {PneuMa: Designing Pneumatic Bodily Extensions for Supporting Movement in Everyday Life}, year = {2024}, isbn = {9798400704024}, publisher = {Association for Computing Machinery}, address = {New York, NY, USA}, url = {https://doi.org/10.1145/3623509.3633349}, doi = {10.1145/3623509.3633349}, booktitle = {Proceedings of the Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction}, articleno = {1}, numpages = {16}, keywords = {bodily extensions, embodied experiences, embodied interactions, pneumatics}, location = {Cork, Ireland}, series = {TEI '24}, video = {https://youtu.be/MpVxFC3npEc?si=skg393adkMttGE4C} }
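To illustrate the interaction loop, the sketch below shows one way a "Take a break"-style scenario could drive a pneumatic extension: a simple timer stands in for typing detection, and Pump/Valve are hypothetical stubs rather than PneuMa's actual hardware drivers; all timings are assumptions.

```python
# Minimal sketch of a control loop a pneumatic bodily extension might use for the
# "Take a break" scenario (hands gently nudged away after prolonged typing).
# Pump/Valve are hypothetical stand-ins for real actuator drivers; timings are assumptions.
import time

class Pump:
    def on(self):  print("pump: inflating extension")
    def off(self): print("pump: stopped")

class Valve:
    def open(self):  print("valve: deflating extension")
    def close(self): print("valve: sealed")

def take_a_break_loop(typing_seconds_before_break=25 * 60, inflate_seconds=3):
    pump, valve = Pump(), Valve()
    typing_time = 0.0
    while True:
        time.sleep(1)                      # poll once per second (stand-in for key events)
        typing_time += 1
        if typing_time >= typing_seconds_before_break:
            valve.close(); pump.on()       # inflate: push the hands off the keyboard
            time.sleep(inflate_seconds)
            pump.off(); valve.open()       # deflate and reset the typing timer
            typing_time = 0.0
```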
2023
- Pneunocchio: A playful nose augmentation for facilitating embodied representation. Aryan Saini, Srihari Sridhar, Aarushi Raheja, and 5 more authors. In Adjunct Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, Oct 2023
Prior research has offered a plethora of wearables centred around sensing bodily actions, ranging from more explicit data, such as movement and physiological responses, to implicit information, such as ocular and brain activity. Bodily augmentations that physically extend the user’s body while altering body schema and body image have also been proposed recently, owing to factors such as accessibility and improving communication. However, these attempts have usually consisted of uncomfortable interfaces that either restrict the user’s movement or are intrusive in nature. In this work, we present Pneunocchio, a playful nose augmentation based on the lore of Pinocchio. Pneunocchio consists of a pneumatic-based inflatable that a user wears on their nose to play a game of two truths and a lie. With our work, we aim to explore expressive bodily augmentations that respond to a player’s physiological state and can alter the perception of their body while serving as an expressive match for a current part of the body.
@inproceedings{saini_pneunocchio_2023, address = {New York, NY, USA}, series = {{UIST} '23 {Adjunct}}, title = {Pneunocchio: {A} playful nose augmentation for facilitating embodied representation}, isbn = {9798400700965}, shorttitle = {Pneunocchio}, url = {https://dl.acm.org/doi/10.1145/3586182.3616651}, doi = {10.1145/3586182.3616651}, urldate = {2023-11-10}, booktitle = {Adjunct {Proceedings} of the 36th {Annual} {ACM} {Symposium} on {User} {Interface} {Software} and {Technology}}, publisher = {Association for Computing Machinery}, author = {Saini, Aryan and Sridhar, Srihari and Raheja, Aarushi and Patibanda, Rakesh and Overdevest, Nathalie and Wang, Po-Yao (Cosmos) and Van Den Hoven, Elise and Mueller, Florian Floyd}, month = oct, year = {2023}, keywords = {bodily augmentation, embodied interaction, playful experience, pneumatics}, pages = {1--3}, }
- Fluito: Towards Understanding the Design of Playful Water Experiences through an Extended Reality Floatation Tank System. Maria F. Montoya, YuYang Ji, Ryan Wee, and 5 more authors. Proceedings of the ACM on Human-Computer Interaction, Oct 2023
Water’s pleasant nature and associated health benefits have captivated the interest of HCI researchers. Prior WaterHCI work mainly focused on advancing instrumental applications, such as improving swimming performance, and less on designing systems that support interacting with technology in water in more playful contexts. In this regard, we propose floatation tanks as research vehicles to investigate the design of playful interactive water experiences. Employing somaesthetic design, we developed a playful extended reality floatation tank experience: "Fluito". We conducted a 13-participant study to understand how specific design features amplified participants’ water experiences. We used a postphenomenological lens to articulate eight strategies useful for designers aiming to develop digital playful experiences in water, such as designing to call attention to the water and designing to encourage breathing and body awareness in water experiences. Ultimately, we hope that our work supports people to be playful and benefit from the many advantages of being in water.
@article{montoya_fluito_2023, title = {Fluito: {Towards} {Understanding} the {Design} of {Playful} {Water} {Experiences} through an {Extended} {Reality} {Floatation} {Tank} {System}}, volume = {7}, shorttitle = {Fluito}, url = {https://dl.acm.org/doi/10.1145/3611056}, doi = {10.1145/3611056}, number = {CHI PLAY}, urldate = {2023-11-11}, journal = {Proceedings of the ACM on Human-Computer Interaction}, author = {Montoya, Maria F. and Ji, YuYang and Wee, Ryan and Overdevest, Nathalie and Patibanda, Rakesh and Saini, Aryan and Pell, Sarah Jane and Mueller, Florian ‘Floyd’}, month = oct, year = {2023}, keywords = {extended reality, floatation tank, flotation pod, playful experience, postphenomenology, somaesthetic, water, water activities, WaterHCI}, pages = {410:948--410:975}, }
- Auto-Paizo Games: Towards Understanding the Design of Games That Aim to Unify a Player’s Physical Body and the Virtual World. Rakesh Patibanda, Chris Hill, Aryan Saini, and 6 more authors. Proceedings of the ACM on Human-Computer Interaction, Oct 2023
Most digital bodily games focus on the body as they use movement as input. However, they also draw the player’s focus away from the body as the output occurs on visual displays, creating a divide between the physical body and the virtual world. We propose a novel approach – the "Body as a Play Material" – where a player uses their body as both input and output to unify the physical body and the virtual world. To showcase this approach, we designed three games where a player uses one of their hands (input) to play against the other hand (output) by loaning control over its movements to an Electrical Muscle Stimulation (EMS) system. We conducted a thematic analysis on the data obtained from a field study with 12 participants to articulate four player experience themes. We discuss how participants appreciated engaging with the variety of bodily movements for play and the ambiguity of using their body as a play material. Ultimately, our work aims to unify the physical body and the virtual world.
@article{patibanda_auto-paizo_2023, title = {Auto-{Paizo} {Games}: {Towards} {Understanding} the {Design} of {Games} {That} {Aim} to {Unify} a {Player}’s {Physical} {Body} and the {Virtual} {World}}, volume = {7}, shorttitle = {Auto-{Paizo} {Games}}, url = {https://dl.acm.org/doi/10.1145/3611054}, doi = {10.1145/3611054}, number = {CHI PLAY}, urldate = {2023-11-11}, journal = {Proceedings of the ACM on Human-Computer Interaction}, author = {Patibanda, Rakesh and Hill, Chris and Saini, Aryan and Li, Xiang and Chen, Yuzheng and Matviienko, Andrii and Knibbe, Jarrod and van den Hoven, Elise and Mueller, Florian ‘Floyd’}, month = oct, year = {2023}, keywords = {bodily games, body as a play material, electrical muscle stimulation, hand games, integrated play, movement-based play, wearable interaction}, pages = {408:893--408:918}, }
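The "Body as a Play Material" loop (one hand as input, the other actuated as output) can be sketched as a simple game round. The stub below assumes a rock-paper-scissors-style hand game and invented helper names (sense_hand, ems_pose_hand); it illustrates the approach rather than the authors' implementation.

```python
# Illustrative sketch of an Auto-Paizo-style round: one hand supplies the input gesture
# while the other hand is posed by EMS as the opponent. Helpers are hypothetical stubs.
import random

POSES = ["rock", "paper", "scissors"]
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def sense_hand() -> str:
    return random.choice(POSES)                     # stand-in for sensing the input hand

def ems_pose_hand(pose: str) -> None:
    print(f"EMS poses the other hand as: {pose}")   # stand-in for muscle actuation

def play_round() -> str:
    player, ems = sense_hand(), random.choice(POSES)
    ems_pose_hand(ems)
    if player == ems:
        return "draw"
    return "input hand wins" if BEATS[player] == ems else "EMS hand wins"

print(play_round())
```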
- Fused Spectatorship: Designing Bodily Experiences Where Spectators Become Players. Rakesh Patibanda, Aryan Saini, Nathalie Overdevest, and 8 more authors. Proceedings of the ACM on Human-Computer Interaction, Oct 2023
Spectating digital games can be exciting. However, due to its vicarious nature, spectators often wish to engage in the gameplay beyond just watching and cheering. To blur the boundaries between spectators and players, we propose a novel approach called "Fused Spectatorship", where spectators watch their hands play games by loaning bodily control to a computational Electrical Muscle Stimulation (EMS) system. To showcase this concept, we designed three games where spectators loan control over both their hands to the EMS system and watch them play these competitive and collaborative games. A study with 12 participants suggested that participants could not distinguish whether they were watching their hands play or playing the games themselves. We used our results to articulate four spectator experience themes and four fused spectator types, describe the behaviours they elicited, and offer one design consideration to support each of these behaviours. We also discuss the ethical design considerations of our approach to help game designers create future fused spectatorship experiences.
@article{patibanda_fused_2023, title = {Fused {Spectatorship}: {Designing} {Bodily} {Experiences} {Where} {Spectators} {Become} {Players}}, volume = {7}, shorttitle = {Fused {Spectatorship}}, url = {https://dl.acm.org/doi/10.1145/3611049}, doi = {10.1145/3611049}, number = {CHI PLAY}, urldate = {2023-11-11}, journal = {Proceedings of the ACM on Human-Computer Interaction}, author = {Patibanda, Rakesh and Saini, Aryan and Overdevest, Nathalie and Montoya, Maria F. and Li, Xiang and Chen, Yuzheng and Nisal, Shreyas and Andres, Josh and Knibbe, Jarrod and van den Hoven, Elise and Mueller, Florian ‘Floyd’}, month = oct, year = {2023}, keywords = {bodily play, electrical muscle stimulation, hand games, integrated motor play, movement-based design, spectating, spectatorship, watching games, wearable interaction}, pages = {403:769--403:802}, }
- Dancing Delicacies: Designing Computational Food for Dynamic Dining Trajectories. Jialin Deng, Humphrey Yang, Aryan Saini, and 4 more authors. In Proceedings of the 2023 ACM Designing Interactive Systems Conference, Jul 2023
Contemporary human-food interaction design is often a technology-driven endeavor in which food’s materiality has been largely underexplored. Building on the concept of “computational food”, this paper explores the design of food as a material realization of computation through a material-centered approach. We engaged with a “Research through Design” exploration by designing a computational food system called “Dancing Delicacies”, which enables food items to be “programmed” and “reconfigured” within dynamic trajectories. Our practice led to a design framework resulting in four original dish designs. Our dishes aim to illustrate the richness of this new design space for computational food. Furthermore, through engaging with expert practitioners from the hospitality industry, we provide a first account of understanding the design of computational food for dynamic dining trajectories and its speculative use contexts in the industry. With this work, we hope to inspire researchers and designers to envision a new future of human-food interaction.
@inproceedings{deng_dancing_2023, address = {New York, NY, USA}, series = {{DIS} '23}, title = {Dancing {Delicacies}: {Designing} {Computational} {Food} for {Dynamic} {Dining} {Trajectories}}, isbn = {978-1-4503-9893-0}, shorttitle = {Dancing {Delicacies}}, url = {https://dl.acm.org/doi/10.1145/3563657.3596021}, doi = {10.1145/3563657.3596021}, urldate = {2023-11-10}, booktitle = {Proceedings of the 2023 {ACM} {Designing} {Interactive} {Systems} {Conference}}, publisher = {Association for Computing Machinery}, author = {Deng, Jialin and Yang, Humphrey and Saini, Aryan and Gaudenz, Urs Dominic and Yao, Lining and Olivier, Patrick and Mueller, Florian ‘Floyd’}, month = jul, year = {2023}, pages = {244--262}, }
- Towards Designing for Everyday Embodied Remembering: Findings from a Diary Study. Nathalie Overdevest, Rakesh Patibanda, Aryan Saini, and 2 more authors. In Proceedings of the 2023 ACM Designing Interactive Systems Conference, Jul 2023
Our bodies play an important part in our remembering practices, for example when we can remember passwords by typing, even if we cannot verbalise them. An increasing number of technologies are being developed to support remembering. However, they do not yet seem to have taken the opportunity to support remembering through bodily movements. To better understand how to design for such embodied remembering, we conducted a diary study with 12 participants who recorded their embodied remembering experiences in everyday life over a three-week period. Our thematic analysis of the diaries and interviews led to the creation of a framework that helps understand embodied remembering experiences (ERXs) based on the level of skilled and conscious movements used. We describe how this ERX framework could help with the design of technologies to support embodied remembering.
@inproceedings{overdevest_towards_2023, address = {New York, NY, USA}, series = {{DIS} '23}, title = {Towards {Designing} for {Everyday} {Embodied} {Remembering}: {Findings} from a {Diary} {Study}}, isbn = {978-1-4503-9893-0}, shorttitle = {Towards {Designing} for {Everyday} {Embodied} {Remembering}}, url = {https://dl.acm.org/doi/10.1145/3563657.3595999}, doi = {10.1145/3563657.3595999}, urldate = {2023-11-10}, booktitle = {Proceedings of the 2023 {ACM} {Designing} {Interactive} {Systems} {Conference}}, publisher = {Association for Computing Machinery}, author = {Overdevest, Nathalie and Patibanda, Rakesh and Saini, Aryan and Van Den Hoven, Elise and Mueller, Florian ‘Floyd’}, month = jul, year = {2023}, pages = {2611--2624}, }
- DUMask: A Discrete and Unobtrusive Mask-Based Interface for Facial Gestures. Aryan Saini, Arpit Bhatia, Isha Kalra, and 2 more authors. In Proceedings of the Augmented Humans International Conference 2023, Mar 2023
Interactions using the face not only enable multi-tasking but also allow us to create hands-free applications. Previous works in HCI used sensors attached directly to the person’s face or inside their mouth. However, a mask, which has now become a norm in our everyday life and is socially acceptable, has rarely been used to explore facial interactions. We designed “DUMask”, an interface that uses face parts covered by a mask to discretely enable 14 (+1 default) interactions. DUMask uses an infrared camera embedded inside an off-the-shelf face mask to recognize the gestures, and we demonstrate the effectiveness of our interface through in-lab studies. We conducted two user studies evaluating the experience of both the wearer and the onlooker, which validated that the interface is indeed inconspicuous and unobtrusive.
@inproceedings{bhatia_dumask_2023, address = {New York, NY, USA}, series = {{AHs} '23}, title = {{DUMask}: {A} {Discrete} and {Unobtrusive} {Mask}-{Based} {Interface} for {Facial} {Gestures}}, isbn = {978-1-4503-9984-5}, shorttitle = {{DUMask}}, url = {https://dl.acm.org/doi/10.1145/3582700.3582726}, doi = {10.1145/3582700.3582726}, urldate = {2023-11-10}, booktitle = {Proceedings of the {Augmented} {Humans} {International} {Conference} 2023}, publisher = {Association for Computing Machinery}, author = {Saini, Aryan and Bhatia, Arpit and Kalra, Isha and Mukherjee, Manideepa and Parnami, Aman}, month = mar, year = {2023}, keywords = {Facial Gestures, Mask, Wearables}, pages = {255--266}, }
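For readers curious how such a mask interface might map camera frames to discrete commands, the sketch below uses naive nearest-template matching over infrared frames and an invented action table; DUMask's actual recognition pipeline and gesture set are not reproduced here.

```python
# Sketch of dispatching mask-based gestures to hands-free actions. The recognizer is a
# placeholder (nearest-template matching on IR frames); names and actions are assumptions.
import numpy as np

GESTURES = ["default"] + [f"gesture_{i}" for i in range(1, 15)]   # 14 gestures + default

def recognize(ir_frame: np.ndarray, templates: dict) -> str:
    """Return the label of the template closest to the current IR frame."""
    return min(templates, key=lambda g: np.linalg.norm(ir_frame - templates[g]))

def dispatch(gesture: str) -> None:
    actions = {"gesture_1": "play/pause music", "gesture_2": "answer call"}
    print(actions.get(gesture, "no-op"))

# Toy usage with random 64x64 frames standing in for the mask's IR camera feed.
templates = {g: np.random.rand(64, 64) for g in GESTURES}
dispatch(recognize(np.random.rand(64, 64), templates))
```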
2022
- TouchMate: Understanding the Design of Body Actuating Games using Physical Touch. Shreyas Nisal, Rakesh Patibanda, Aryan Saini, and 2 more authors. In Extended Abstracts of the 2022 Annual Symposium on Computer-Human Interaction in Play, Nov 2022
Body-actuating technologies such as Electrical Muscle Stimulation (EMS) can actuate multiple players simultaneously via physical touch. To investigate this opportunity, we designed a game called “Touchmate”. Here, one guesser and two suspects sit across from one another with their legs hidden under a table. The guesser attaches a ground electrode from one EMS channel, and each suspect attaches one active electrode from the same channel on their forearms. When a suspect touches the guesser’s leg, their bodies complete the electrical circuit, actuating both their hands involuntarily via the EMS. The guesser’s goal is to determine who touched their leg. In this paper, we present the results from our initial study and articulate three player experience themes. Ultimately, we hope our work inspires game designers to create physical touch games using body-actuating technologies.
@inproceedings{nisal_touchmate_2022, address = {New York, NY, USA}, series = {{CHI} {PLAY} '22}, title = {{TouchMate}: {Understanding} the {Design} of {Body} {Actuating} {Games} using {Physical} {Touch}}, isbn = {978-1-4503-9211-2}, shorttitle = {{TouchMate}}, url = {https://dl.acm.org/doi/10.1145/3505270.3558332}, doi = {10.1145/3505270.3558332}, urldate = {2023-11-10}, booktitle = {Extended {Abstracts} of the 2022 {Annual} {Symposium} on {Computer}-{Human} {Interaction} in {Play}}, publisher = {Association for Computing Machinery}, author = {Nisal, Shreyas and Patibanda, Rakesh and Saini, Aryan and Van Den Hoven, Elise and Mueller, Florian 'Floyd'}, month = nov, year = {2022}, keywords = {electrical muscle stimulation, physical touch games, movement-based games, bodily games, game design, integrated play, EMS games, motor play, social games}, pages = {153--158}, }
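A toy round of TouchMate-style play can be simulated to show the shared-channel idea: a suspect's touch closes the circuit and actuates both the toucher and the guesser, after which the guesser guesses. The ems_pulse stub and the random guess below are illustrative assumptions, not the study software.

```python
# Toy simulation of TouchMate's round structure: one shared EMS channel means a suspect's
# touch actuates both their own and the guesser's hand; the guesser then guesses who it was.
import random

def ems_pulse(*people: str) -> None:
    print("EMS actuates:", ", ".join(people))   # stand-in for driving the shared channel

def play_round(guesser: str, suspects: list) -> bool:
    toucher = random.choice(suspects)           # one suspect secretly touches the guesser's leg
    ems_pulse(toucher, guesser)                 # circuit completes: both hands jolt involuntarily
    guess = random.choice(suspects)             # stand-in for the guesser's verbal guess
    print(f"{guesser} guesses {guess}; it was {toucher}")
    return guess == toucher

play_round("guesser", ["suspect_a", "suspect_b"])
```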
- SomaFlatables: Supporting Embodied Cognition through Pneumatic Bladders. Aryan Saini, Haotian Huang, Rakesh Patibanda, and 3 more authors. In Adjunct Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, Oct 2022
Applying the theory of Embodied Cognition through design allows us to create computational interactions that engage our bodies by modifying our body schema. However, in HCI, most of these interactive experiences have centred around creating sensing-based systems that leverage our body’s position and movement to offer an experience, such as games using the Nintendo Wii and Xbox Kinect. In this work, we created two pneumatic inflatable-based prototypes that actuate our body to support embodied cognition in two scenarios by altering the user’s body schema. We call these “SomaFlatables” and demonstrate the design and implementation of these inflatable-based prototypes that can move and even extend our bodies, allowing for novel bodily experiences. Furthermore, we discuss future work and the limitations of the current implementation.
@inproceedings{saini_somaflatables_2022, address = {New York, NY, USA}, series = {{UIST} '22 {Adjunct}}, title = {{SomaFlatables}: {Supporting} {Embodied} {Cognition} through {Pneumatic} {Bladders}}, isbn = {978-1-4503-9321-8}, shorttitle = {{SomaFlatables}}, url = {https://dl.acm.org/doi/10.1145/3526114.3558705}, doi = {10.1145/3526114.3558705}, urldate = {2023-11-10}, booktitle = {Adjunct {Proceedings} of the 35th {Annual} {ACM} {Symposium} on {User} {Interface} {Software} and {Technology}}, publisher = {Association for Computing Machinery}, author = {Saini, Aryan and Huang, Haotian and Patibanda, Rakesh and Overdevest, Nathalie and Van Den Hoven, Elise and Mueller, Florian Floyd}, month = oct, year = {2022}, keywords = {Body Actuation, Body Schema, Embodied Cognition, Inflatables, Pneumatics}, pages = {1--4}, }
2021
- Actuating Myself: Designing Hand-Games Incorporating Electrical Muscle Stimulation. Rakesh Patibanda, Xiang Li, Yuzheng Chen, and 4 more authors. In Extended Abstracts of the 2021 Annual Symposium on Computer-Human Interaction in Play, Oct 2021
Motor movements are performed while playing hand-games such as Rock-paper-scissors or Thumb-war. These games are believed to benefit both physical and mental health and are considered cultural assets. Electrical Muscle Stimulation (EMS) is a technology that can actuate muscles, triggering motor movements, and hence offers an opportunity for novel play experiences based on these traditional hand-games. However, there is only limited understanding of the design of EMS games. We present the design and evaluation of two games inspired by traditional hand-games, “Slap-me-if-you-can” and “3-4-5”, which incorporate EMS and can be played alone, unlike traditional games. A thematic analysis of the data collected revealed three themes: 1) gameplay experiences and the influence of EMS hardware, 2) interaction with EMS and the calibration process, and 3) shared control and its effect on playing EMS games. We hope that an enhanced understanding of the potential of EMS to support hand-games can aid the advancement of movement-based games as a whole.
@inproceedings{patibanda_actuating_2021, address = {New York, NY, USA}, series = {{CHI} {PLAY} '21}, title = {Actuating {Myself}: {Designing} {Hand}-{Games} {Incorporating} {Electrical} {Muscle} {Stimulation}}, isbn = {978-1-4503-8356-1}, shorttitle = {Actuating {Myself}}, url = {https://dl.acm.org/doi/10.1145/3450337.3483464}, doi = {10.1145/3450337.3483464}, urldate = {2023-11-10}, booktitle = {Extended {Abstracts} of the 2021 {Annual} {Symposium} on {Computer}-{Human} {Interaction} in {Play}}, publisher = {Association for Computing Machinery}, author = {Patibanda, Rakesh and Li, Xiang and Chen, Yuzheng and Saini, Aryan and Hill, Christian N and van den Hoven, Elise and Mueller, Florian Floyd}, month = oct, year = {2021}, keywords = {Bodily games, Electrical Muscle Stimulation, EMS Games, Game design, Hand games, Integrated Play, Movement-based games}, pages = {228--235}, }
2019
- Aesop: Authoring Engaging Digital Storytelling Experiences. Aryan Saini, Kartik Mathur, Abhinav Thukral, and 2 more authors. In Adjunct Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, Oct 2019
Traditional storytelling experiences are often one-dimensional, containing only a single channel of communication with the audience: narration. With advancements in technology, storytelling experiences have been augmented with the help of digital media to become more engaging and immersive. Authoring these scenarios, however, is complicated, as it requires technical knowledge to interface the means of engagement. In this work, we present Aesop, a system that assists the narrator in authoring engaging storytelling experiences. Aesop provides a block-based interface, similar to Scratch, that manifests the words of a story, Cues, and Visualizations (Outputs) as blocks that enable the user to create captivating stories. Our system also leverages physical actions performed by the user as Cues. These cues can trigger visualizations such as robot actions, animations, and environment simulation using sound and lighting effects.
@inproceedings{saini_aesop_2019, address = {New York, NY, USA}, series = {{UIST} '19 {Adjunct}}, title = {Aesop: {Authoring} {Engaging} {Digital} {Storytelling} {Experiences}}, isbn = {978-1-4503-6817-9}, shorttitle = {Aesop}, url = {https://dl.acm.org/doi/10.1145/3332167.3357114}, doi = {10.1145/3332167.3357114}, urldate = {2023-11-10}, booktitle = {Adjunct {Proceedings} of the 32nd {Annual} {ACM} {Symposium} on {User} {Interface} {Software} and {Technology}}, publisher = {Association for Computing Machinery}, author = {Saini, Aryan and Mathur, Kartik and Thukral, Abhinav and Singhal, Nishtha and Parnami, Aman}, month = oct, year = {2019}, keywords = {authoring tool, digital storytelling, narration, story}, pages = {56--59}, }
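The core of such an authoring tool is a mapping from cues (spoken words or physical actions) to output effects. The sketch below shows one plausible data structure for that mapping, with invented block names and channels rather than Aesop's real block set.

```python
# Sketch of a Cue -> Output mapping that a block-based authoring tool like Aesop implies:
# story words and physical actions act as cues that trigger visualizations
# (sound, lighting, robot moves). Block names and effects here are invented for illustration.
from dataclasses import dataclass

@dataclass
class Cue:
    kind: str        # "word" (spoken during narration) or "action" (physical gesture)
    value: str

@dataclass
class Output:
    channel: str     # e.g. "lighting", "sound", "robot"
    effect: str

story_blocks = [
    (Cue("word", "thunder"), Output("sound", "play thunder clip")),
    (Cue("word", "night"), Output("lighting", "dim to blue")),
    (Cue("action", "wave"), Output("robot", "wave back")),
]

def on_cue(kind: str, value: str) -> None:
    """Fire every output whose cue matches the incoming narration word or gesture."""
    for cue, output in story_blocks:
        if (cue.kind, cue.value) == (kind, value):
            print(f"{output.channel}: {output.effect}")

on_cue("word", "thunder")
```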
- Gehna: Exploring the Design Space of Jewelry as an Input Modality. Jatin Arora, Kartik Mathur, Aryan Saini, and 1 more author. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, May 2019
Jewelry weaves into our everyday lives as no other wearable does. It comes in many wearable forms, is fashionable, and can adorn any part of the body. In this paper, through an exploratory, Research through Design (RtD) process, we tap into the vast potential space of input interaction that jewelry can enable. We do so by first identifying a small set of fundamental structural elements — called Jewelements — that any jewelry is composed of, and then defining their properties that enable interaction. We leverage this synthesis, along with observational data and literature, to formulate a design space of jewelry-enabled input techniques. This work encapsulates both extensions of common existing input methods (e.g., touch) and new ones inspired by jewelry. Furthermore, we discuss our prototypical sensor-based implementations. Through this work, we invite the community to engage in the conversation on how jewelry as a material can help shape wearable-based input.
@inproceedings{arora_gehna_2019, address = {New York, NY, USA}, series = {{CHI} '19}, title = {Gehna: {Exploring} the {Design} {Space} of {Jewelry} as an {Input} {Modality}}, isbn = {978-1-4503-5970-2}, shorttitle = {Gehna}, url = {https://dl.acm.org/doi/10.1145/3290605.3300751}, doi = {10.1145/3290605.3300751}, urldate = {2023-11-10}, booktitle = {Proceedings of the 2019 {CHI} {Conference} on {Human} {Factors} in {Computing} {Systems}}, publisher = {Association for Computing Machinery}, author = {Arora, Jatin and Mathur, Kartik and Saini, Aryan and Parnami, Aman}, month = may, year = {2019}, keywords = {interaction design, jewelry, wearable-based input}, pages = {1--12}, }
- VirtualBricks: Exploring a Scalable, Modular Toolkit for Enabling Physical Manipulation in VR. Jatin Arora, Aryan Saini, Nirmita Mehra, and 3 more authors. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, May 2019
Virtual Reality (VR) experiences are often limited by the design of standard controllers. This work aims to liberate a VR developer from these limitations in the physical realm to provide an expressive match to the limitless possibilities in the virtual realm. VirtualBricks is a LEGO-based toolkit that enables the construction of a variety of physical-manipulation-enabled controllers for VR by offering a set of feature bricks that emulate as well as extend the capabilities of default controllers. Based on the LEGO platform, the toolkit provides a modular, scalable solution for enabling passive haptics in VR. We demonstrate the versatility of our designs through a rich set of applications, including re-implementations of artifacts from recent research. We share a VR integration package for the Unity VR IDE and the CAD models for the feature bricks, enabling easy deployment of VirtualBricks within the community.
@inproceedings{arora_virtualbricks_2019, address = {New York, NY, USA}, series = {{CHI} '19}, title = {{VirtualBricks}: {Exploring} a {Scalable}, {Modular} {Toolkit} for {Enabling} {Physical} {Manipulation} in {VR}}, isbn = {978-1-4503-5970-2}, shorttitle = {{VirtualBricks}}, url = {https://dl.acm.org/doi/10.1145/3290605.3300286}, doi = {10.1145/3290605.3300286}, urldate = {2023-11-10}, booktitle = {Proceedings of the 2019 {CHI} {Conference} on {Human} {Factors} in {Computing} {Systems}}, publisher = {Association for Computing Machinery}, author = {Arora, Jatin and Saini, Aryan and Mehra, Nirmita and Jain, Varnit and Shrey, Shwetank and Parnami, Aman}, month = may, year = {2019}, keywords = {construction toolkit, passive haptics, physical manipulation}, pages = {1--12}, }
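To give a flavour of how a toolkit like this bridges physical bricks and a VR application, the sketch below forwards a feature brick's normalised sensor value as a controller-style event. The brick types, value range, and event bus are invented for illustration; the actual VirtualBricks Unity integration package is not reproduced here.

```python
# Sketch of forwarding a feature brick's physical state to a VR-controller-style event.
# Brick names, value ranges, and the event callback are assumptions for illustration only.
from dataclasses import dataclass
from typing import Callable

@dataclass
class FeatureBrick:
    name: str                        # e.g. "dial", "trigger", "slider"
    read: Callable[[], float]        # returns a normalised sensor value in [0, 1]

def forward_to_vr(brick: FeatureBrick, send_event: Callable[[str, float], None]) -> None:
    """Poll one brick and emit a controller-style event for the VR application."""
    send_event(f"brick/{brick.name}", brick.read())

# Toy usage: a dial brick at 40% rotation, printed instead of sent to a VR runtime.
dial = FeatureBrick("dial", read=lambda: 0.4)
forward_to_vr(dial, lambda topic, value: print(topic, value))
```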