SPACE AS A SCORE:
CORPORATE UNCONSCIOUS AND THE VIDEOWALK FORMAT


1. Introduction
1.1 Background and Context
1.2 Research Question
1.3 Objectives and Scope
1.4 Structure of the Thesis


2. Corporate Unconscious
2.1 Credits
2.2 Description
2.3 Video Documentation
2.4 Synopsis
2.5 Full Text

3. Theoretical framework
3.1 Videowalk: Exploring the Format
3.1.1 Walking as a type of art
3.1.2 Audiowalks
3.1.3 The emergence of Videowalk
3.1.4 Choosing the format
3.2 Site-Specific Art and Spatial Narratives
3.3 Engaging Audiences in a Constructed Reality
3.3.1 Illusion and Engagement: The Rubber Hand Effect in Theater
3.3.2 We should invent reality before filming it
3.3.3 Simul Entertainment GmbH
3.4 Meta-Score

4. Creative process
4.1 Concept Development
4.1.1 Synchronicity and simultaneity
4.1.2 Corporate Language as a Narrative Tool
4.2 Space research
4.3 Development of visual, auditory and performative identity
4.3.1 Corporate Identity
4.3.2 Art Direction and Stage Design
4.3.3 Performativity
4.3.4 Costumes
4.3.5 Music composition
4.3.6 Cinematography
4.4 Dramaturgy and Script Development
4.4.1 Narrative Layers
4.4.2 Storytelling
4.4.3 Dramaturgical arc
4.4.4 Space Score and Timing
4.5 Videowalk Production phases
4.5.1 Creation of Fake Historical Footage
4.5.2 Videowalk Filming
4.5.3 3D Modeling and Scanning of the Space
4.5.4 VFX Development and 3D Animated Scenes
4.5.5 Documentary Development
4.6 Performance and Participation
4.6.1 Installations & self-reflective moments
4.6.2 Leveled performances
4.6.3 Fake participants and recursive participation
4.6.4 Easter eggs
4.7 Multimedia Techniques
4.7.1 LiDAR Scanning and As-built modeling
4.7.2 On-site shading and texturing
4.7.3 Character and animations
4.7.4 Camera tracking and VFX compositing
4.7.5 Virtual production and "inverse virtual production"
4.7.6 Video Game development
4.7.7 Spatial audio
4.7.8 AI text models
4.7.9 iOS playback app


5. Conclusion
6. Acknowledgments
7. References
4.7.4 Camera tracking and VFX compositing

In the production of Corporate Unconscious, camera tracking and VFX compositing were fundamental to integrating the digital elements seamlessly with the filmed footage. Camera tracking, also known as matchmoving, is a visual effects technique that allows computer graphics to be inserted into live-action footage with the correct position, scale, orientation, and motion relative to the photographed objects in the shot.

We used SynthEyes, specialized camera-tracking software, to analyze the movement of the camera during the videowalk filming. SynthEyes translates real-world camera movements into a digital environment, enabling precise alignment of virtual and real-world scenes. This was crucial for maintaining the illusion that the digital elements naturally belonged in the physical space captured in the video. The process involves identifying a series of tracking points across sequential frames, each representing the same physical point in real-world space. From these points, SynthEyes deduces the camera's path as well as the lens characteristics. This information was pivotal, especially given our use of real architectural spaces with their inherent geometric and textural complexities.
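The core principle behind this solve can be illustrated with a toy sketch. This is not SynthEyes' actual algorithm (which solves full 3D pose and lens parameters from many automatically detected features); it is a minimal, assumed example showing how 2D tracks of static scene points encode camera motion. All numbers (focal length, point coordinates, translation) are invented for illustration.

```python
import numpy as np

# Pinhole projection: a static 3D point (X, Y, Z) seen by a camera
# translated by t along the x-axis projects to u = f * (X - t) / Z.
def project(points, t, f=35.0):
    return f * (points[:, 0] - t) / points[:, 2]

# Static scene points (the "tracking points" on walls, props, etc.).
scene = np.array([[1.0, 0.5, 4.0],
                  [-2.0, 1.0, 6.0],
                  [0.5, -1.0, 3.0],
                  [3.0, 0.0, 8.0]])

true_t = 0.8                  # camera translation we pretend not to know
u0 = project(scene, 0.0)      # 2D track positions in frame 1
u1 = project(scene, true_t)   # the same tracks, one frame later

# Each track satisfies u0 - u1 = f * t / Z, so the camera translation
# can be recovered from the tracks by least squares.
A = (35.0 / scene[:, 2]).reshape(-1, 1)
b = u0 - u1
t_est, *_ = np.linalg.lstsq(A, b, rcond=None)
print(round(float(t_est[0]), 3))   # recovers the translation: 0.8
```

A real matchmove solves six degrees of freedom per frame plus lens distortion, but the logic is the same: enough reliable tracks overdetermine the camera path.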

One challenge we encountered was dealing with homogeneous surfaces, such as the plain walls of HfMT, which provide few unique features for tracking points. In such cases tracking becomes less reliable, and we often had to supplement the automated tracking with manual adjustments to ensure accuracy. This was particularly true for shots with minimal camera movement, or for shots taken from a tripod, where traditional tracking markers were less effective or absent.
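Why plain walls defeat automatic tracking can be shown with a toy "trackability" check. Feature detectors need local intensity variation to lock onto a point; a near-constant patch has essentially none. The function, threshold, and patches below are invented for illustration and are far cruder than any real detector.

```python
import numpy as np

# A crude trackability test: a patch is usable for a tracking point
# only if its pixel intensities actually vary.
def trackable(patch, threshold=5.0):
    return float(np.var(patch)) > threshold

rng = np.random.default_rng(0)
plain_wall = np.full((8, 8), 128.0)                      # homogeneous surface
textured = 128.0 + 40.0 * rng.standard_normal((8, 8))    # detailed surface

print(trackable(plain_wall))   # False: nothing for the tracker to lock onto
print(trackable(textured))     # True: enough variation for a tracking point
```

This is also why physical tracking markers are placed on set: they manufacture the variation that featureless architecture lacks.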

After camera tracking, the next step was VFX compositing, carried out primarily in Adobe Premiere Pro due to time and resource constraints. While After Effects offers more advanced compositing features, Premiere Pro allows direct integration with the editing timeline, which made for a more streamlined workflow on this project. The compositing process involved layering the 3D elements and VFX over the original footage so that they appeared to exist within the real environment. This required careful adjustment of lighting, shadow, scale, and perspective to ensure that all elements appeared cohesive. Realistic interaction between real and virtual elements, such as shadows cast by virtual objects onto real surfaces or the alignment of virtual objects with real-world counterparts, was crucial for maintaining the suspension of disbelief.
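At its core, this layering is the standard "over" operation: each pixel of the rendered element is blended onto the background plate according to its alpha (opacity). The sketch below, with invented 2x2 images, shows the arithmetic that Premiere Pro performs internally when a VFX layer sits above the footage; it is an illustration of the principle, not of our actual pipeline.

```python
import numpy as np

# "Over" compositing: foreground (the rendered VFX layer) with an alpha
# mask is blended per pixel onto the background plate.
def composite_over(fg_rgb, fg_alpha, bg_rgb):
    a = fg_alpha[..., None]          # broadcast alpha over RGB channels
    return a * fg_rgb + (1.0 - a) * bg_rgb

bg = np.full((2, 2, 3), 0.5)                  # background plate (filmed footage)
fg = np.zeros((2, 2, 3))
fg[..., 0] = 1.0                              # a red virtual element
alpha = np.array([[1.0, 0.5],
                  [0.0, 0.0]])                # opacity mask of the element

out = composite_over(fg, alpha, bg)
# out[0, 0] is fully the virtual element (red), out[1, 0] is fully the
# plate, and out[0, 1] is a 50/50 blend of the two.
```

Shadows cast by virtual objects work the same way: a semi-transparent dark layer, matted to the real surface, is composited over the plate.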

Camera tracking tests, using SynthEyes and Unreal Engine.