
lecture: Beyond Virtual and Augmented Reality

From Superhuman Sports to Amplifying Human Senses

With recent developments in capture technology, preserving one's daily experiences and knowledge becomes richer and more comprehensive. Furthermore, recording technologies beyond simple audio/video are emerging: 360° video, tactile recorders, and even odor recorders are becoming available. These new recording technologies and the massive amounts of data they produce require new means of selecting, displaying, and sharing experiences.

Sharing experiences and knowledge has always been essential for human development: it enables skill transfer and empathy. Over the course of history, humankind developed from oral traditions to cultures of writing. With the ongoing digital revolution, the hurdles to sharing knowledge and experiences are vanishing. Already today it is, for example, technically feasible to take and store 24/7 video recordings of one's life. Yet the massive collections of data this creates make it even more challenging to share experiences and knowledge with others in meaningful ways.

A recurring theme in science fiction literature is downloading another human's abilities directly into one's mind. Current cognitive science and neuroscience strongly suggest that this is impossible, as our minds are embodied. Nevertheless, we believe that skill transfer and effective learning will accelerate tremendously given recent technological trends. To name just a few of the enabling technologies: human augmentation using virtual/augmented reality, new sensing modalities (e.g. affective computing), new actuation (e.g. haptics), and advances in immersive storytelling (increasing empathy, immersion, and communication).

The talk starts with sensing and actuation technologies, giving an overview of them and discussing how they can be used.

I will discuss several novel, upcoming sensing modalities for VR and AR: first of all, eye-movement analysis for interaction and activity recognition, introducing the Pupil eye tracker (an open-source eye tracker from Pupil Labs); then AffectiveWear (our research on tracking facial expressions with affordable smart glasses); and J!NS MEME (EOG glasses that can detect how much you are reading and how attentive you are).
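To make the eye-movement part concrete: a common building block for gaze-based activity recognition is fixation detection. Below is a minimal sketch of a dispersion-threshold (I-DT) fixation detector in Python; the sample format and threshold values are illustrative assumptions, not tied to the Pupil or J!NS MEME APIs.

    # Minimal sketch: dispersion-threshold (I-DT) fixation detection.
    # Assumption: gaze samples arrive as (timestamp_s, x, y) tuples in
    # normalized screen coordinates; thresholds are illustrative only.
    def detect_fixations(samples, max_dispersion=0.05, min_duration=0.1):
        """Return (start_t, end_t, center_x, center_y) per fixation."""
        fixations, window = [], []
        for t, x, y in samples:
            window.append((t, x, y))
            xs = [p[1] for p in window]
            ys = [p[2] for p in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                # The new sample broke the spatial threshold; emit the
                # preceding window as a fixation if it lasted long enough.
                done = window[:-1]
                if done and done[-1][0] - done[0][0] >= min_duration:
                    cx = sum(p[1] for p in done) / len(done)
                    cy = sum(p[2] for p in done) / len(done)
                    fixations.append((done[0][0], done[-1][0], cx, cy))
                window = [window[-1]]
        return fixations

    # Hypothetical usage: one second of steady gaze, then a saccade away.
    gaze = [(i / 100, 0.5, 0.5) for i in range(100)] + [(1.0, 0.9, 0.2)]
    print(detect_fixations(gaze))  # -> [(0.0, 0.99, 0.5, 0.5)]

Sequences of such fixations and the saccades between them are the typical features from which activities like reading can then be recognized.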

In the next part of the talk, I go into detail about actuation, focusing especially on haptics: from the TECHTILE Toolkit (a rapid-prototyping haptic toolkit by two of my colleagues, Kouta Minamizawa and Masashi Nakatani) to the REZ Infinite Haptic Suit.
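A nice property of the TECHTILE approach is that tactile signals can be recorded and reproduced with ordinary audio hardware driving voice-coil actuators. As a minimal sketch of that idea, the Python snippet below synthesizes a vibrotactile "tap" (a 200 Hz carrier, near the skin's peak vibration sensitivity, with a decaying envelope) and writes it as a WAV file; the carrier frequency, decay constant, and file name are illustrative assumptions, not part of the toolkit.

    # Minimal sketch: synthesize a vibrotactile "tap" as a mono WAV file,
    # in the spirit of audio-driven haptics (voice-coil actuators on an
    # ordinary audio amplifier). All constants are illustrative assumptions.
    import math, struct, wave

    RATE = 44100        # audio samples per second
    CARRIER_HZ = 200.0  # near the skin's peak vibrotactile sensitivity
    DURATION_S = 0.25
    DECAY = 12.0        # envelope decay rate (1/s): fast fade = "tap" feel

    frames = bytearray()
    for n in range(int(RATE * DURATION_S)):
        t = n / RATE
        envelope = math.exp(-DECAY * t)
        sample = envelope * math.sin(2 * math.pi * CARRIER_HZ * t)
        frames += struct.pack('<h', int(sample * 32767))

    with wave.open('tap.wav', 'wb') as f:  # 'tap.wav' is a placeholder name
        f.setnchannels(1)   # mono: a single actuator channel
        f.setsampwidth(2)   # 16-bit PCM
        f.setframerate(RATE)
        f.writeframes(bytes(frames))

Played through an amplifier into a voice-coil actuator instead of a loudspeaker, the same file becomes a touch sensation rather than a sound.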

Finally, I give an outlook on projects that push the limits of experience sharing and skill transfer: the Swiss Cybathlon and the Japanese Super Human Sports Society.

I am a researcher in the wearable computing, AR, and VR fields, organizing a Dagstuhl Seminar on a similar topic. I am also a founding member of the Japanese Super Human Sports Society.
