360°

Reality to Virtuality

GROUP 3


A Concert in 3D

Katherina Henggeler, Nico Lang, Nicolas Übersax

Terrestrial laser scanners are high-end measurement instruments, typically used to produce accurate and dense 3D models of the surrounding environment. A common application in the field of Geomatics is the modeling of buildings, bridges, dams, and other structures for extensive monitoring purposes. In other fields, such as Architecture, laser scanners are deployed for visualization. Since a single recording takes a few minutes, the objects of interest are usually assumed to be at rest during the measurement. Several challenges arise when the application requires a complete model of a 3D object. The major cause of data gaps is occlusion, so the object is usually scanned from several positions and the resulting point clouds are merged into a complete model by registration. In addition, the point clouds contain noisy data points caused by reflective surfaces or by laser beams that bounce off multiple surfaces (mixed pixels). Another source of noisy artifacts is moving objects, such as walking people. These noisy data points are usually removed to obtain a clean model of the static environment.
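The cleaning step mentioned above is often implemented as statistical outlier removal: points whose average distance to their nearest neighbours is far above the cloud-wide average are dropped. The sketch below is a minimal illustration of that idea with NumPy (a brute-force neighbour search; real scans would use a KD-tree, and the function name and parameters are ours, not from the project):

```python
import numpy as np

def remove_outliers(points, k=8, std_ratio=2.0):
    """Drop points whose mean distance to their k nearest neighbours
    exceeds the global mean by more than std_ratio standard deviations."""
    # Full pairwise distance matrix (fine for small demo clouds only).
    diff = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diff, axis=-1)
    dists.sort(axis=1)
    # Skip column 0, which is each point's zero distance to itself.
    mean_knn = dists[:, 1:k + 1].mean(axis=1)
    threshold = mean_knn.mean() + std_ratio * mean_knn.std()
    return points[mean_knn <= threshold]

# A dense cluster plus a few stray "mixed pixel"-style points far away.
rng = np.random.default_rng(0)
cloud = rng.normal(0.0, 0.05, size=(200, 3))
strays = np.array([[5.0, 5.0, 5.0], [-4.0, 6.0, 2.0]])
cleaned = remove_outliers(np.vstack([cloud, strays]))
print(cleaned.shape)  # the stray points are filtered out
```

In practice, libraries such as Open3D or PCL provide tuned versions of this filter, but for this project the interesting twist is the opposite choice: keeping such artifacts as part of the image.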


Within the scope of the 360° LAB, we seized the opportunity to explore the appealing visual effects of artifacts in laser scans. Our goal is to develop a graphic language in which this imperfection becomes part of the expression. The distortion caused by moving objects in particular inspired us to focus on visualizing non-static scenes in point clouds recorded with a terrestrial laser scanner. Specifically, we aim to visualize a concert of the band "Finger Finger" in the form of a video. To this end, the concert was recorded with a laser scanner and was filmed at the same time. With these two recordings in different media, the goal was to fuse the real video footage with the point cloud into a video clip for the song "When it's done" by Finger Finger.


© 2018, ETH Zürich, All Rights Reserved