What does Apple mean to a bat?

A VR experience of being a bat in Times Square

Project Overview

What Does Apple Mean To A Bat? is a virtual reality experience of being a bat in Times Square, perceiving the space through echolocation.

It is my Master's thesis project at the Interactive Telecommunications Program (ITP), Tisch School of the Arts, NYU.

Timeline

Feb - May 2020

Tools

Unity Game Engine, VRTK, Oculus Rift, Cinema 4D

Scope

Artistic research, storyboard, interaction design, 3D modeling, programming, VR development

Advisors

Marina Zurkow,
Sarah Rothberg

demo
abstract

Our living experiences have been reconfigured by ubiquitous digital content in hypermedia spaces. I wanted to explore what it is like for another being to experience a human-made hypermedia space saturated with symbols, signs and spectacles.

Inspired by Thomas Nagel’s essay What Is It Like to Be a Bat?, I chose the bat as my metaphor. Bats share a similar cognitive mechanism with human beings but have a very distinctive biological construction and perceptual experience. Bats are mammals, and are assumed to have conscious experience. They are highly evolved animals that actively use biological sensing, just as human beings do. Bats mainly use echolocation to navigate and perceive objects.

Through this experience, I hope to let people experience our human-made world from another being’s perspective, and reflect on how human experience is just one kind of experience in nature.

storyboard

It is a semi-narrative, semi-interactive experience. Users follow a bat’s voice-over narration throughout the experience and interact with the environment through echolocation.

The narration style is inspired by Jim Trainor’s animation The Bats. The visual representation of echolocation is inspired by LiDAR technology and by the VR works Notes on Blindness and Scanner Sombre.

There are three chapters in this experience:

01. Memory of Cave

It is the scene where the bat is born: a cave and a park. Users feel what it is like to be a bat in nature, and learn how to echolocate, fly and eat moths.

02. Apple Tree

In this animated scene, the bat will have a conversation with the Universe about its previous life as a human.

03. Destination

In this scene, users wander around Times Square, looking for a hint from the Universe about the bat’s previous life as a human.

Visual

The visual style is based on research into bats’ perception. Bats have cross-modal perception, which means they dynamically switch between modalities: vision for the large-object domain and echolocation for the small-object domain. I intended to represent these two modes of perception with two different visual languages.

Bat’s Visual Perception
Bats have dichromatic colour vision and UV vision, and can see especially well in dim-light environments.

Bat’s Echolocating Perception
When a bat is echolocating, it constructs streams of sound data into real-time acoustic images, similar to LiDAR scanning.

Artistic Visual Reference for echolocating perception

Visual Design

Vision Perception: monotone, sensitive to light
Echolocating Perception: a particle cloud as the acoustic image in the brain (see the sketch below)
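
As a rough illustration of how the two visual languages can be switched at runtime (a minimal sketch, not the project's actual code; the object references, tint colour and method name are assumptions), a small Unity script could toggle the particle-cloud acoustic image and retint the camera whenever the bat starts or stops echolocating:

using UnityEngine;

// Sketch: toggling between the two perception modes described above.
public class PerceptionModeSwitcher : MonoBehaviour
{
    public Camera batCamera;               // the VR camera on the player's head
    public GameObject acousticPointCloud;  // parent object of the echolocation particles
    public Color dimVisionTint = new Color(0.15f, 0.15f, 0.2f); // monotone, dim-light look

    // Called when the bat starts or stops chirping.
    public void SetEcholocating(bool on)
    {
        acousticPointCloud.SetActive(on);                  // show/hide the acoustic image
        batCamera.backgroundColor = on ? Color.black       // echolocation: dark void
                                       : dimVisionTint;    // vision: monotone, light-sensitive
    }
}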
Interaction

In this experience, users will perceive and interact with the environment through echolocating, and move around through flying.

Echolocation Simulation
Users make sounds to activate the environment. When they do, they hear bat chirps in their headset as their “voice”.

The bat’s sonar beam can be likened to an auditory flashlight. Two variables control the “echolocating” effect (see the sketch below):
- The higher the pitch, the narrower the “field of view”, but the more points are activated
- The higher the volume, the more points are activated
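
The sketch below shows one way this mapping could be implemented in Unity: it estimates loudness and dominant pitch from the microphone's AudioSource, narrows or widens a sonar cone accordingly, and raycasts into the cone to activate points in the environment. It is a hedged sketch under assumptions, not the project's actual code; the thresholds, frequency range, ray counts and the "ActivatePoint" message name are all hypothetical.

using UnityEngine;

// Sketch: mapping microphone pitch and volume to the echolocation effect.
public class EcholocationController : MonoBehaviour
{
    public AudioSource micSource;      // AudioSource playing the microphone clip
    public AudioSource chirpSource;    // bat chirp heard in the headset as the "voice"
    public LayerMask environmentMask;  // layers that respond to echolocation
    public int maxRays = 200;

    float[] samples = new float[1024];

    void Update()
    {
        micSource.GetOutputData(samples, 0);

        float volume = Rms(samples);            // rough loudness estimate
        float pitch = DominantFrequency();      // rough pitch estimate

        if (volume < 0.02f) return;             // assumed silence threshold

        if (!chirpSource.isPlaying) chirpSource.Play();  // user hears their chirp

        // Higher pitch -> narrower "field of view"; higher volume -> more points.
        float coneAngle = Mathf.Lerp(60f, 10f, Mathf.InverseLerp(100f, 2000f, pitch));
        int rayCount = Mathf.RoundToInt(Mathf.Lerp(20, maxRays, Mathf.Clamp01(volume * 10f)));

        for (int i = 0; i < rayCount; i++)
        {
            // Random direction inside the sonar cone in front of the head.
            Vector3 dir = Quaternion.Euler(
                Random.Range(-coneAngle, coneAngle) * 0.5f,
                Random.Range(-coneAngle, coneAngle) * 0.5f, 0f) * transform.forward;

            if (Physics.Raycast(transform.position, dir, out RaycastHit hit, 30f, environmentMask))
                hit.collider.SendMessage("ActivatePoint", hit.point,
                                         SendMessageOptions.DontRequireReceiver);
        }
    }

    float Rms(float[] buf)
    {
        float sum = 0f;
        foreach (float s in buf) sum += s * s;
        return Mathf.Sqrt(sum / buf.Length);
    }

    float DominantFrequency()
    {
        float[] spectrum = new float[512];
        micSource.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);
        int maxIndex = 0;
        for (int i = 1; i < spectrum.Length; i++)
            if (spectrum[i] > spectrum[maxIndex]) maxIndex = i;
        // Each FFT bin covers (sampleRate / 2) / spectrum.Length Hz.
        return maxIndex * (AudioSettings.outputSampleRate / 2f) / spectrum.Length;
    }
}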

Fly Simulation
Users move their hands toward the desired direction to fly.
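
A minimal sketch of this locomotion, assuming plain Transform references for the tracked hands and head (the project itself used VRTK for tracking, and the speed value here is an assumption):

using UnityEngine;

// Sketch: hand-directed flight for the bat.
public class BatFlight : MonoBehaviour
{
    public Transform leftHand;   // tracked left controller
    public Transform rightHand;  // tracked right controller
    public Transform head;       // VR camera / head
    public Transform cameraRig;  // the rig that is moved through the scene
    public float speed = 4f;     // flight speed in metres per second

    void Update()
    {
        // Fly in the direction the hands are pushed, relative to the head.
        Vector3 handMidpoint = (leftHand.position + rightHand.position) * 0.5f;
        Vector3 direction = (handMidpoint - head.position).normalized;
        cameraRig.position += direction * speed * Time.deltaTime;
    }
}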

installation

Multi-User View
While the player perceives the scene through the bat’s senses, audiences without headsets see a human version of the view on a screen.

Player's view
Audience's view
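
One way to set this up in Unity, sketched under assumed layer names ("BatSenses", "HumanWorld") rather than the project's exact configuration: the headset camera culls only the bat-perception layers, while a second, non-stereo camera draws the human version of the scene to the monitor.

using UnityEngine;

// Sketch: headset view for the player, human-version view for the audience screen.
public class SpectatorView : MonoBehaviour
{
    public Camera playerCamera;     // renders to the headset
    public Camera spectatorCamera;  // renders to the desktop screen

    void Start()
    {
        // The headset camera only sees the bat-perception version of the scene.
        playerCamera.cullingMask = LayerMask.GetMask("Default", "BatSenses");

        // The spectator camera ignores the HMD and draws the human version on the monitor.
        spectatorCamera.stereoTargetEye = StereoTargetEyeMask.None;
        spectatorCamera.cullingMask = LayerMask.GetMask("Default", "HumanWorld");
    }
}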
References

Influential Artist & Creators
Dr. Klaus Schmitt, Larry Sultan, Thomas Thwaites, Franz Kafka, Susanne Kennedy, Jon Rafman, Guthrie Lonergan, Chris Woebken, Nikki S. Lee, Jim Trainor, Liam Young, David Eagleman

Research Bibliography
Bolter, Jay David, and Richard Grusin. Remediation: Understanding New Media. MIT Press, 2003.
Crary, Jonathan. “Techniques of the Observer.” October, vol. 45, 1988, p. 3., doi:10.2307/779041.
Danilovich, S., and Y. Yovel. “Integrating Vision and Echolocation for Navigation and Perception in Bats.” Science Advances, vol. 5, no. 6, 2019, doi:10.1126/sciadv.aaw6503.
Debord, Guy. The Society of the Spectacle.
Eck, Allison. “Bat-Inspired Tech Could Help Blind People See with Sound.” PBS, Public Broadcasting Service, 23 Oct. 2013, www.pbs.org/wgbh/nova/article/bioinspired-assistive-devices/.
Gualeni, Stefano, et al. Augmented Ontologies: the Question Concerning Digital Technology and Projectual Humanism. Erasmus University of Rotterdam, 2014.
Hoffman, Donald D. The Case Against Reality: How Evolution Hid the Truth from Our Eyes. Penguin Books, 2020.
Horowitz, Alexandra. Being a Dog: Following the Dog into a World of Smell. Simon & Schuster Ltd, 2019.
“Learning to Smell: Using Deep Learning to Predict the Olfactory Properties of Molecules.” Google AI Blog, 24 Oct. 2019, ai.googleblog.com/2019/10/learning-to-smell-using-deep-learning.html.
Merabet, Lotfi, et al. “Feeling by Sight or Seeing by Touch?” Neuron, vol. 42, no. 1, 2004, pp. 173–179., doi:10.1016/s0896-6273(04)00147-3.
Moss. “Probing the Natural Scene by Echolocation in Bats.” Frontiers in Behavioral Neuroscience, 2010, doi:10.3389/fnbeh.2010.00033.
Nagel, Thomas. “What Is It Like to Be a Bat?” The Language and Thought Series, doi:10.4159/harvard.9780674594623.c15.
Sawyer, Eva K., and Kenneth C. Catania. “Somatosensory Organ Topography across the Star of the Star-Nosed Mole (Condylura Cristata).” Journal of Comparative Neurology, vol. 524, no. 5, 2015, pp. 917–929., doi:10.1002/cne.23943.
Simmons, James A., and Alan D. Grinnell. “The Performance of Echolocation: Acoustic Images Perceived by Echolocating Bats.” Animal Sonar, 1988, pp. 353–385., doi:10.1007/978-1-4684-7493-0_40.
Simonite, Tom. “Prepare for the Deepfake Era of Web Video.” Wired, Conde Nast, 4 Oct. 2019, www.wired.com/story/prepare-deepfake-era-web-video/.
Thaler, Lore, and Melvyn A. Goodale. “Echolocation in Humans: an Overview.” Wiley Interdisciplinary Reviews: Cognitive Science, vol. 7, no. 6, 2016, pp. 382–393., doi:10.1002/wcs.1408.
“The Senses of Touch: Haptics, Affects and Technologies.” 2007, doi:10.5040/9781474215831.
Uexküll, Jakob von, and Thure von Uexküll. Jakob von Uexküll’s A Stroll through the Worlds of Animals and Men. Mouton de Gruyter, 1992.
