SPACE AS A SCORE:
CORPORATE UNCONSCIOUS AND THE VIDEOWALK FORMAT


1. Introduction
1.1 Background and Context
1.2 Research Question
1.3 Objectives and Scope
1.4 Structure of the Thesis


2. Corporate Unconscious
2.1 Credits
2.2 Description
2.3 Video Documentation
2.4 Synopsis
2.5 Full Text

3. Theoretical framework
3.1 Videowalk: Exploring the Format
3.1.1 Walking as a type of art
3.1.2 Audiowalks
3.1.3 The emergence of Videowalk
3.1.4 Choosing the format
3.2 Site-Specific Art and Spatial Narratives
3.3 Engaging Audiences in a Constructed Reality
3.3.1 Illusion and Engagement: The Rubber Hand Effect in Theater
3.3.2 We should invent reality before filming it
3.3.3 Simul Entertainment GmbH
3.4 Meta-Score

4. Creative process
4.1 Concept Development
4.1.1 Synchronicity and simultaneity
4.1.2 Corporate Language as a Narrative Tool
4.2 Space research
4.3 Development of visual, auditory and performative identity
4.3.1 Corporate Identity
4.3.2 Art Direction and Stage Design
4.3.3 Performativity
4.3.4 Costumes
4.3.5 Music composition
4.3.6 Cinematography
4.4 Dramaturgy and Script Development
4.4.1 Narrative Layers
4.4.2 Storytelling
4.4.3 Dramaturgical arc
4.4.4 Space Score and Timing
4.5 Videowalk Production phases
4.5.1 Creation of Fake Historical Footage
4.5.2 Videowalk Filming
4.5.3 3D Modeling and Scanning of the Space
4.5.4 VFX Development and 3D Animated Scenes
4.5.5 Documentary Development
4.6 Performance and Participation
4.6.1 Installations & self-reflective moments
4.6.2 Leveled performances
4.6.3 Fake participants and recursive participation
4.6.4 Easter eggs
4.7 Multimedia Techniques
4.7.1 LiDAR Scanning and As-Built Modeling
4.7.2 On-site shading and texturing
4.7.3 Character and animations
4.7.4 Camera tracking and VFX compositing
4.7.5 Virtual production and "inverse virtual production"
4.7.6 Video Game development
4.7.7 Spatial audio
4.7.8 AI text models
4.7.9 iOS playback app


5. Conclusion
6. Acknowledgments
7. References
4.7.1 LiDAR Scanning and As-Built Modeling

To obtain a precise digital representation of the space, I collaborated with the architect and composer Nicolás Fuentes, with whom I had worked before in our site-specific company Oído Medio. Nicolás guided me in using LiDAR (Light Detection and Ranging) scanning, a technology that uses laser pulses to measure distances to surfaces, generating precise three-dimensional information about the shape of objects and their surroundings. This method was chosen for its accuracy in capturing the architectural nuances of the space: the scan produces a detailed point cloud, a set of data points in space that each carry XYZ position and RGB color information.
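To illustrate the density problem discussed below, a point cloud can be thought of as an N x 6 array of XYZRGB values. A standard way to thin it is voxel-grid downsampling: space is divided into small cubes, and all points falling inside one cube are averaged into a single point. The sketch below is a minimal numpy illustration of this idea, not the actual pipeline used in the project; the function name and the example coordinates are invented for demonstration.

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Reduce an N x 6 point cloud (XYZRGB) by keeping one averaged
    point per occupied voxel of the given edge length."""
    # Map each point to the integer index of the voxel it falls in.
    coords = np.floor(points[:, :3] / voxel_size).astype(np.int64)
    _, inverse = np.unique(coords, axis=0, return_inverse=True)
    inverse = inverse.ravel()  # flatten for numpy-version safety
    n_voxels = inverse.max() + 1
    # Sum all points per voxel, then divide by the per-voxel count.
    sums = np.zeros((n_voxels, points.shape[1]))
    np.add.at(sums, inverse, points)
    counts = np.bincount(inverse, minlength=n_voxels)[:, None]
    return sums / counts

# Two near-duplicate scan points collapse into one averaged point;
# the distant third point keeps its own voxel.
cloud = np.array([
    [0.01, 0.02, 0.00, 200, 200, 200],
    [0.02, 0.01, 0.01, 210, 205, 195],
    [1.50, 0.00, 0.00, 100, 100, 100],
])
reduced = voxel_downsample(cloud, voxel_size=0.1)
print(len(reduced))  # 2
```

Dedicated point-cloud libraries perform this same operation at scale; the numpy version only makes the grouping-and-averaging logic visible.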

However, point clouds can be overwhelmingly dense and cluttered with superfluous detail that detracts from the narrative focus, especially when converted to meshes. Nicolás addressed this by transitioning from the raw point cloud data to an As-Built model. Unlike the raw point cloud, As-Built modeling refines the data into a cleaner, more navigable 3D CAD model that represents the building's precise dimensions and architecture. The process eliminates unnecessary noise, keeping only essential elements such as walls, floors, and the critical landmarks needed for orientation within the space. The decision to opt for As-Built modeling over a simple point cloud was strategic: it reduced visual distraction and aligned more closely with our narrative goals, providing a "clean slate" that was still rooted in the actual physical environment.
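One elementary step in cleaning scan data before modeling is statistical outlier removal: points whose mean distance to their nearest neighbours is unusually large (stray reflections, passers-by caught mid-scan) are discarded. The following is a minimal numpy sketch of that idea under invented example data; it is not the specific cleanup Nicolás performed, and real scans would use a k-d tree rather than a full pairwise distance matrix.

```python
import numpy as np

def remove_outliers(points, k=2, std_ratio=1.0):
    """Drop points whose mean distance to their k nearest neighbours is
    more than std_ratio standard deviations above the cloud-wide mean."""
    xyz = points[:, :3]
    # Full pairwise distance matrix (fine for small demo clouds only).
    d = np.linalg.norm(xyz[:, None, :] - xyz[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)  # ignore self-distance
    knn_mean = np.sort(d, axis=1)[:, :k].mean(axis=1)
    threshold = knn_mean.mean() + std_ratio * knn_mean.std()
    return points[knn_mean <= threshold]

# A tight cluster of wall points plus one stray reflection far away.
scan = np.array([
    [0.0, 0.0, 0.0, 255, 255, 255],
    [0.1, 0.0, 0.0, 255, 255, 255],
    [0.0, 0.1, 0.0, 255, 255, 255],
    [5.0, 5.0, 5.0,   0,   0,   0],  # noise point
])
cleaned = remove_outliers(scan, k=2, std_ratio=1.0)
print(len(cleaned))  # 3 -- the stray point is dropped
```

An As-Built model goes much further than such filtering, replacing the points with clean parametric geometry, but noise rejection of this kind is the first rung of that ladder.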

This approach allowed us to emphasize specific architectural features important for user navigation and story progression, such as distinctive markings, emergency exits, or unique lighting fixtures, ensuring participants could correlate the digital representation with their real-world context. Additionally, As-Built models are significantly lighter in data than dense point clouds, which facilitated smoother integration and manipulation in subsequent stages such as camera tracking and VFX compositing within Unreal Engine.

Point cloud captured at the MFS Room using a LiDAR scanner.
As-Built 3D model of the MFS Room, based on the point cloud data.