Inneraction is a virtual reality application for observing the user’s biofeedback in real time. This project is part of my Master’s thesis at the St. Pölten University of Applied Sciences.
The main idea is the development and evaluation of a virtual reality application that enables individuals to observe and reflect on their own physiology through biofeedback interaction. Biosignals such as heart rate, oxygen saturation, respiratory rhythm and the electrical activity of muscles will be visualised in a virtual reality setting. There will be two kinds of scenarios:
- “Self Observation” is a mirror-like, educational visualisation with sound, aimed at helping participants understand the physical processes of their own bodies.
- E.g. a representative 3D object of a human heart can be animated in real time based on the participant’s heart rate signal.
- “Creative Inneraction” is a more expressive and experimental visual language, based on an algorithmic interaction between generative visualisations and sound, driven entirely by the participant’s biofeedback data stream.
Ideas for visualization & sound
Visualisation: Self Observation
The following figure is only meant to give an impression of how the final project could look. The idea is to animate representative 3D objects of organs based on the user’s biosignals in real time.
- The contractions of heart and lungs will be animated based on the real-time signals from the corresponding sensors:
- ECG sensor (electrocardiography) → contraction of the 3D heart
- breathing sensor → contraction of the 3D lungs
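As a rough illustration of this sensor-to-mesh mapping, the minimal sketch below (plain Python, with a simulated sine-wave breathing signal standing in for the real sensor; all names and ranges are hypothetical assumptions, not part of the actual implementation) maps a normalised biosignal sample to a contraction scale factor that a 3D engine could apply to the heart or lung mesh each frame:

```python
import math

def contraction_scale(signal_value, min_scale=0.85, max_scale=1.0):
    """Map a normalised biosignal sample (0..1) to a mesh scale factor.

    0 (full exhale / diastole) yields the most contracted state,
    1 yields the fully expanded state.
    """
    signal_value = max(0.0, min(1.0, signal_value))  # clamp to [0, 1]
    return min_scale + (max_scale - min_scale) * signal_value

def breathing_sample(t_seconds, rate_hz=0.25):
    """Simulated breathing signal: a slow sine wave in [0, 1]."""
    return 0.5 + 0.5 * math.sin(2 * math.pi * rate_hz * t_seconds)

# Each frame, the engine would read the sensor and rescale the lung mesh:
scale = contraction_scale(breathing_sample(1.0))
```

In a real engine the per-frame update would read the live sensor stream instead of the simulated sine wave and apply the returned factor to the mesh transform.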
Visualisation: Creative Inneraction
The current concept for this mode is to generate a more expressive and experimental visual language by combining two visualisation approaches:
- Point of view from inside the organs: In contrast to the “mirror-like” visualisation, the user’s point of view will be placed inside the organs.
- Generative visualisation based on biosignals: This visualisation differs greatly from user to user, so it is not possible to give a precise picture of the final result. It will be created through algorithmic dynamic creation, animation, manipulation and distortion of 3D objects (e.g. dynamically generated organic three-dimensional structures) and through dynamic video effects and manipulations (color, blur and glitch effects, render manipulations).
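One possible shape of such a signal-to-parameter mapping is sketched below. The specific signal ranges and the hue/distortion/blur assignments are illustrative assumptions only; the final mapping will emerge from the generative design process:

```python
def visual_params(heart_rate_bpm, emg_rms, spo2_percent):
    """Map raw biosignal features to generative-visual parameters.

    Hypothetical mapping for illustration:
    - heart rate drives colour hue (calm = blue, excited = red),
    - muscle activity (EMG RMS) drives mesh distortion amplitude,
    - low oxygen saturation increases blur.
    """
    # Normalise heart rate over an assumed 40-180 bpm range.
    hr_norm = min(max((heart_rate_bpm - 40) / 140.0, 0.0), 1.0)
    hue = 0.66 * (1.0 - hr_norm)          # HSV hue: 0.66 ~ blue, 0.0 ~ red
    distortion = min(emg_rms / 100.0, 1.0)  # assumed EMG RMS range 0-100
    blur = max(0.0, (98.0 - spo2_percent) / 10.0)
    return {"hue": hue, "distortion": distortion, "blur": blur}
```

Each returned parameter would then feed the corresponding shader or mesh-manipulation stage of the generative scene.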
Self observation: The sound is based on real recordings of the corresponding organs (e.g. heartbeat and breathing sounds).
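A crude way to keep such a recorded heartbeat sample in sync with the live signal is to trigger playback on each detected beat. The sketch below uses a simple rising-edge threshold as a stand-in for proper R-peak detection (production systems typically use an algorithm such as Pan–Tompkins); the threshold value and sample data are made up for illustration:

```python
def detect_beats(ecg_samples, threshold=0.6):
    """Return sample indices where the ECG crosses the threshold upward.

    Each returned index is a rising edge where the application would
    trigger playback of the recorded heartbeat sound.
    """
    beats = []
    above = False
    for i, v in enumerate(ecg_samples):
        if v >= threshold and not above:
            beats.append(i)   # rising edge -> trigger heartbeat sound
            above = True
        elif v < threshold:
            above = False
    return beats

ecg = [0.1, 0.2, 0.9, 0.3, 0.1, 0.8, 0.2]
print(detect_beats(ecg))  # -> [2, 5]
```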
Inneraction: The sound design of this mode will be driven by the biofeedback data stream. A specially prepared sound file will be modulated and filtered using the five biofeedback signals. Every signal operates in its own sound layer; e.g. the respiratory rhythm modulates the amplitude envelope of the sound.
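The respiratory amplitude-envelope layer could be sketched as follows, assuming a normalised breathing curve sampled at a much lower rate than the audio; the nearest-index resampling and the `depth` parameter are illustrative assumptions, not the final sound-design implementation:

```python
def modulate_amplitude(audio, breath, depth=0.8):
    """Scale each audio sample by the (resampled) breathing signal.

    `audio`  - list of audio samples.
    `breath` - normalised respiration curve (0..1) at a lower sample rate,
               linearly index-mapped onto the audio.
    `depth`  - how strongly breathing affects loudness; the resulting
               gain lies in [1 - depth, 1].
    """
    out = []
    n, m = len(audio), len(breath)
    for i, s in enumerate(audio):
        b = breath[min(i * m // n, m - 1)]  # nearest-index resampling
        gain = (1.0 - depth) + depth * b
        out.append(s * gain)
    return out
```

The other signal layers (e.g. EMG driving a filter cutoff) would follow the same pattern: resample the slow biosignal onto the audio timeline, then map it to one synthesis parameter per layer.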