AUTOCONSTRUCCION is a live-coded audiovisual concert using video game animations and algorithmic music to explore speculative architecture and informal housing globally.
The project employs live coding to narrate audiovisual stories in a fluid and granular way. It incorporates machine learning for the collaborative creation of 3D scenarios, using personalized datasets to represent the dreams of non-architects. The performance introduces “live cinema coding,” merging cinematographic language with sound patterns to produce an extreme audiovisual synesthesia.
In this piece I am interested in how AI can process datasets of existing architecture to generate unexpected visions of environments.