SPACE AS A SCORE:
CORPORATE UNCONSCIOUS AND THE VIDEOWALK FORMAT


1. Introduction
1.1 Background and Context
1.2 Research Question
1.3 Objectives and Scope
1.4 Structure of the Thesis


2. Corporate Unconscious
2.1 Credits
2.2 Description
2.3 Video Documentation
2.4 Synopsis
2.5 Full Text

3. Theoretical Framework
3.1 Videowalk: Exploring the Format
3.1.1 Walking as a type of art
3.1.2 Audiowalks
3.1.3 The emergence of Videowalk
3.1.4 Choosing the format
3.2 Site-Specific Art and Spatial Narratives
3.3 Engaging Audiences in a Constructed Reality
3.3.1 Illusion and Engagement: The Rubber Hand Effect in Theater
3.3.2 We should invent reality before filming it
3.3.3 Simul Entertainment GmbH
3.4 Meta-Score

4. Creative Process
4.1 Concept Development
4.1.1 Synchronicity and simultaneity
4.1.2 Corporate Language as a Narrative Tool
4.2 Space research
4.3 Development of visual, auditory and performative identity
4.3.1 Corporate Identity
4.3.2 Art Direction and Stage Design
4.3.3 Performativity
4.3.4 Costumes
4.3.5 Music composition
4.3.6 Cinematography
4.4 Dramaturgy and Script Development
4.4.1 Narrative Layers
4.4.2 Storytelling
4.4.3 Dramaturgical arc
4.4.4 Space Score and Timing
4.5 Videowalk Production phases
4.5.1 Creation of Fake Historical Footage
4.5.2 Videowalk Filming
4.5.3 3D Modeling and Scanning of the Space
4.5.4 VFX Development and 3D Animated Scenes
4.5.5 Documentary Development
4.6 Performance and Participation
4.6.1 Installations & self-reflective moments
4.6.2 Leveled performances
4.6.3 Fake participants and recursive participation
4.6.4 Easter eggs
4.7 Multimedia Techniques
4.7.1 LiDAR Scanning and As-Built Modeling
4.7.2 On-site shading and texturing
4.7.3 Character and animations
4.7.4 Camera tracking and VFX compositing
4.7.5 Virtual production and "inverse virtual production"
4.7.6 Video Game development
4.7.7 Spatial audio
4.7.8 AI text models
4.7.9 iOS playback app


5. Conclusion
6. Acknowledgments
7. References
4.7.2 On-site shading and texturing

In the shading and texturing phase, we used Unreal Engine for its advanced real-time rendering capabilities and its growing adoption in the film industry for virtual production. This stage involved applying textures and shaders to the as-built 3D models, transforming them from bare structures into detailed, lifelike environments. Textures created from high-resolution photographs of the actual environment covered everything from wall surfaces to floor patterns, ensuring that the digital space accurately reflected its real-world counterpart. Leonhard Onken-Menke was fundamental in this process: he kindly introduced me to the world of post-process materials and gave me insights into how Lumen, Unreal Engine's real-time global illumination system, can generate realistic environments.
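To make this concrete, the following C++ fragment sketches how a photo-derived texture might be assigned to an as-built mesh through a dynamic material instance in Unreal Engine. It is a minimal sketch, not the project's actual code; the parameter names "BaseColorTex" and "RoughnessValue" are hypothetical and would have to exist in the parent material.

#include "Materials/MaterialInstanceDynamic.h"
#include "Components/StaticMeshComponent.h"
#include "Engine/Texture2D.h"

// Sketch: map a high-resolution photograph onto an as-built room mesh.
void ApplyPhotoTexture(UStaticMeshComponent* RoomMesh,
                       UMaterialInterface* ParentMaterial,
                       UTexture2D* WallPhotograph)
{
    // A dynamic instance lets each surface carry its own photographic
    // texture without duplicating the parent material asset.
    UMaterialInstanceDynamic* Mid =
        UMaterialInstanceDynamic::Create(ParentMaterial, RoomMesh);

    // Feed the photograph into the base color slot (hypothetical name).
    Mid->SetTextureParameterValue(TEXT("BaseColorTex"), WallPhotograph);

    // Approximate the matte finish of a painted office wall.
    Mid->SetScalarParameterValue(TEXT("RoughnessValue"), 0.85f);

    RoomMesh->SetMaterial(0, Mid);
}

In a production like this, such assignments are more typically made in the editor's material instance workflow than in code; the fragment only illustrates how photographic textures and shader parameters meet on a single surface.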

Unreal Engine's physically based rendering (PBR) shaders were employed so that materials such as metal, glass, and fabric interacted with light in authentic ways, enhancing the realism of the captured textures. This process was not limited to static architectural features; it extended to dynamic elements such as furniture and unique fixtures, which were vital for aligning the virtual experience with the physical site. Such details were essential for participants to recognize and correlate their actual surroundings with what they saw on their iPads. A critical consideration throughout was ensuring that the digital environment replicated the aesthetic and feel of the actual space. Unlike the conventional approach in video game design, where "realism" is often stylized according to established gaming conventions, our objective was a level of realism that corresponds directly to the actual environment. This was pivotal because the experience invited audiences to compare the real space directly with its digital counterpart on the iPad. Since this comparison was central to the project's immersive experience, maintaining exactitude in every detail of the 3D spaces was imperative.
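As an illustration of how PBR parameters separate the surface families mentioned above, the sketch below lists plausible metallic and roughness values for metal, glass, and fabric. The values are assumptions chosen for illustration, not measurements from the production's calibrated materials.

#include "Containers/Map.h"
#include "UObject/NameTypes.h"

// Illustrative PBR presets: Metallic marks conductors (1) versus
// dielectrics (0); Roughness runs from mirror-like (0) to diffuse (1).
struct FPbrPreset
{
    float Metallic;
    float Roughness;
};

static const TMap<FName, FPbrPreset> GSurfacePresets = {
    { FName(TEXT("Metal")),  { 1.0f, 0.25f } }, // tinted, sharp reflections
    { FName(TEXT("Glass")),  { 0.0f, 0.05f } }, // clear, near-mirror highlights
    { FName(TEXT("Fabric")), { 0.0f, 0.90f } }, // matte, diffusely scattered light
};

Values like these are only a starting point: for a one-to-one comparison with the real room, each preset would still have to be checked against the physical surface under the site's actual lighting.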

MFS Room after the process of adding textures in Unreal Engine.