Demonstrations

RetroActivity: Rapidly Deployable Live Task Guidance Experiences

Andrey Konin, Shahram Najam Syed, Shakeeb Siddiqui, Sateesh Kumar, Quoc-Huy Tran, M. Zeeshan Zia

Short abstract

RetroActivity is a system that builds a computational model of a goal-driven physical task by learning from a handful of video demonstrations, without requiring any text-based computer programming effort. Once such a model is built, RetroActivity analyzes live demonstrations of the activity and guides a hands-on worker through the task, providing feedback when a mistake is made.

Radarmin: A Radar-Based Mixed Reality Theremin Setup

Ryo Hajika

Short abstract

The theremin is intricate to master because even the slightest motion anywhere in the body can affect the sound. Radarmin is a musical performance setup consisting of 1) a radar sensor that maps fine-grained finger motion to pitch control, 2) a Leap Motion sensor that captures larger hand movements as complementary instrumental input, and 3) a Magic Leap headset that provides Mixed Reality (MR) guidance cues for hand placement and spatially mapped sound visualization. Participants experience Radarmin through a rhythm game in which they learn how to play the theremin with appropriate visual and audio feedback in MR.

Exploiting ARKit Depth Maps for Mixed Reality Home Design

Kevin Wong, Salma Jiddi, Yacine Alami, Philip Guindi, Brian Totty, Qing Guo, Michael Otrada, Paul Gauthier

Short abstract

Mixed reality technologies are increasingly relevant to the retail industry as it embraces digital commerce. In this work, we demonstrate the use of Apple’s new active Depth API in an ambitious mixed reality experience being developed at IKEA, the world’s largest home furnishings retailer. Our solution offers advantages not widely available to consumers today, including the ability to capture portable and interactive room models that can be virtually furnished from anywhere, alone or collaboratively, with high-fidelity rendering, expansive imagery, and fine occlusions.
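
For context, the sketch below shows, assuming a LiDAR-equipped iPhone or iPad, how ARKit 4's scene-depth frame semantics (the Depth API referenced above) can be enabled and read in Swift. It is an illustrative example of the public API only, not the authors' implementation.

    import ARKit

    /// Minimal ARKit 4 scene-depth reader (requires a LiDAR-equipped device).
    final class DepthCapture: NSObject, ARSessionDelegate {
        private let session = ARSession()

        func start() {
            guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
                return // Scene depth is unavailable on this device.
            }
            let config = ARWorldTrackingConfiguration()
            config.frameSemantics.insert(.sceneDepth)
            session.delegate = self
            session.run(config)
        }

        // Called once per camera frame; sceneDepth bundles a per-pixel depth map
        // (Float32, meters) and a confidence map that can gate which samples are
        // trusted for downstream room reconstruction or occlusion.
        func session(_ session: ARSession, didUpdate frame: ARFrame) {
            guard let depth = frame.sceneDepth else { return }
            let depthMap: CVPixelBuffer = depth.depthMap
            let confidenceMap: CVPixelBuffer? = depth.confidenceMap
            _ = (depthMap, confidenceMap)
        }
    }

Per-frame depth and confidence buffers of this kind are the sort of raw input from which portable room models and fine occlusions can be derived.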

HarpyGame: A Customizable Serious Game for Upper Limb Rehabilitation after Stroke

Gabriel Cyrino, Júlia Tannús, Edgard Lamounier, Alexandre Cardoso, Alcimar Soares

Short abstract

Stroke is the most common disease leading to dexterity impairment of the upper limbs. Serious games have emerged as an advantageous alternative to traditional rehabilitation therapies. This work presents the development of a customizable serious game based on Virtual Reality techniques that provides a more natural and intuitive interface. The system consists of a serious game with a realistic environment, a control panel for patient management and game customization, and a database. Results indicated significant acceptance by patients and physiotherapists, suggesting the system’s potential use in post-stroke rehabilitation of the upper limbs.

An End-to-End Mixed Reality Product for Interior Home Furnishing

Salma Jiddi, Brian Pugh, Qiqin Dai, Luis Puig, Nektarios Lianos, Paul Gauthier, Brian Totty, Angus Dorbie, Jianfeng Yin, Kevin Wong

Short abstract

In this work, we consider the challenge of achieving coherent geometric and photometric blending between real and virtual worlds in the context of an innovative Mixed Reality (MR) application. Using consumer phones, the proposed solution takes color images of a scene as input and produces a magazine-quality stitched panorama, recovers the scene’s dense 3D layout, and infers its lighting model parameters. Our system handles a large variety of real-world scenes whose layout, texture, and material properties vary widely.

Generating Emotive Gaits for Virtual Agents Using Affect-Based Autoregression

Uttaran Bhattacharya

Short abstract

RockemBot Boxing: Facilitating Long-Distance Real-Time Collaborative Interactions with Limited Hand Tracking Volumes

James Campbell, Eleanor Barnes, Jack Douglas Fraser, Bradley Twynham, Xuan Tien Pham, Nguyen Thu Hien, Geert Lugtenberg, Nishiki Yoshinari, Sarah Al Akkad, Andrew Gavin Taylor, Mark Billinghurst, Damien Constantine Rompapas

Short abstract

This demonstration showcases a boxing game that facilitates interactions between two users over a larger-than-arm’s-reach distance. In RockemBot Boxing, users stand two meters apart and use virtual fists to knock the opposing player’s virtual head in an intense matchup. By first re-mapping the user’s hand-tracked input to a virtual model, and representing the user in the collaborative space as a semi-attached avatar, we enable real-time, high-fidelity interactions.

The Visit VR. Understanding the experience of living with dementia

Volker Kuchelmeister, Jill Bennett, Gail Kenning, Natasha Ginnivan, Melissa Neidorf, Chris Papadopoulos

Short abstract

The Visit is an interactive 6-dof real-time Virtual Reality experience, developed from a ground-breaking research project conducted by artists and psychologists working with women living with dementia. Visitors are invited to sit with Viv, a life-sized, realistic and responsive character whose dialogue is created largely from verbatim interviews, drawing us into a world of perceptual uncertainty, while at the same time confounding stereotypes and confronting fears about dementia.

VisuoTouch: Enabling Haptic Feedback in Augmented Reality through Visual Cues

G S Rajshekar Reddy, Damien Constantine Rompapas

Short abstract

Recent rapid advancements in Augmented Reality (AR) include hand tracking frameworks that allow hand interactions with AR content. However, the lack of haptic feedback leaves the user uncertain whether their hand has successfully collided with the virtual content. In this demo, we showcase a system that provides a semblance of haptic feedback through visual cues. The cue illuminates the spot where the finger collides with the object; if the user continues to push through, the virtual finger bends against the object, following real-world physics.

EmbodiMap VR. Extending body-mapping into the third dimension.

Volker Kuchelmeister, Jill Bennett, Gail Kenning, Natasha Ginnivan, Melissa Neidorf.

Short abstract

EmbodiMap is a therapeutic/research tool that enables users to engage with and map their feelings, thoughts and emotions and how these are experienced within the body. Supporting fEEL’s research into felt experience, and drawing on insights from somatic and sensori-motor psychotherapies, EmbodiMap invites participants to engage with a virtual 3D facsimile of the body, entering inside this form and using the tool to paint sensations as they are experienced inside the body. An advance on technologies/media that allow only surface drawing, EmbodiMap promotes a palpable, interactive engagement with the ‘avatar’ body.

Multimedia Information Retrieval for Mixed Interaction Based on Cross-modal Retrieval and Hand Gesture Browsing

Rintaro Yanagi, Ren Togo, Takahiro Ogawa, Miki Haseyama

Short abstract

Physical devices such as keyboards, mice, and mobile devices are among the most essential tools for retrieving desired multimedia content. As the next stage beyond these devices, wearable Mixed Reality (MR) devices are becoming more popular. In this paper, we present a novel multimedia information retrieval system designed to be used jointly with MR devices. By combining two components, cross-modal retrieval and browsing by hand gestures, users can retrieve desired multimedia content in a free-form style.

Project Esky: Enabling High Fidelity Augmented Reality Content on an Open Source Platform

Damien Constantine Rompapas, Daniel Flores Quiros, Charlton Rodda, Bryan Christopher Brown, Noah Benjamin Zerkin, Alvaro Cassinelli

Short abstract

This demonstration showcases a complete open-source Augmented Reality (AR) modular platform capable of high-fidelity natural hand interactions with virtual content, a wide field of view, and spatial mapping for environment interactions. We show this through several live desktop demonstrations. Also included in this demonstration is a complete open-source schematic, allowing anyone interested in our proposed platform to engage with high-fidelity AR. It is our hope that the work described in this demo will be a stepping stone towards bringing high-fidelity AR content to researchers and commodity users alike.

InSight AR. Virtual sculptures presented in a public exhibition.

Volker Kuchelmeister

Short abstract

InSight AR is a site-specific Augmented Reality project and mobile phone app produced for the popular Sculptures by the Sea Bondi exhibition, to be held in Sydney, Australia, in October/November 2020. It forms uncanny relations between the virtual sculptures, visitors, the environment, and the art on site. The encounter with the virtual sculptures is interactive and explorative. Just as in the real world, a visitor can circumambulate a sculpture to examine it from all sides or move closer to reveal more detail.

Aipan VR: A Virtual Reality Experience for Preserving Uttarakhand’s Traditional Art Form

Nishant Chaudhary, Mihir Raj, Richik Bhattacharjee, Anmol Srivastava, Rakesh Sah, Pankaj Badoni

Short abstract

Aipan is a traditional art form practiced in the Kumaon region of the state of Uttarakhand, India. It is typically used to decorate floors and walls at places of worship or the entrances of homes, and it is considered auspicious for the beginning of any work or event. The art carries great social, cultural, and religious significance and is passed from generation to generation. However, in the present era of modernization and technological advancement, this art form stands on the verge of extinction. This study presents a humble attempt to preserve this vanishing art form through the use of Virtual Reality (VR). Ethnographic studies were conducted in the Almora, Nainital, and Haldwani regions of Uttarakhand to trace the origins of the art form and to gain a deeper understanding of it. A total of ten (N = 10) aipan designers were interviewed. These studies revealed several interesting insights with the potential to be incorporated into a VR experience. This paper presents a demonstration of the developed prototype, showcasing a way to preserve the Intangible Cultural Heritage of Uttarakhand.

AQVARIUM: A mixed reality snorkeling system

Juri Platonov, Pawel Kaczmarczyk

Short abstract

The goal of the AQVARIUM project is to create the illusion of snorkeling in an exotic place. The system allows free movement on the water surface, limited only by the borders of the swimming pool. Because virtual reality technology completely replaces reality with artificial content, the system explicitly addresses awareness of other users and of the pool borders. To the best of our knowledge, AQVARIUM is the first mixed reality snorkeling system supporting unconstrained snorkeling by multiple users with mutual awareness.

Augmented Mirrors

Alejandro Martin-Gomez

Short abstract

Flower Factory: A Component-based Approach for Rapid Flower Modeling

Junjun Pan

Short abstract

Mobile3DRecon: Real-time Monocular 3D Reconstruction on a Mobile Phone

Hanqing Jiang

Short abstract

©2020 by ISMAR
Sponsored by the IEEE Computer Society Visualization and Graphics Technical Committee and ACM SIGGRAPH