Today, I want to introduce a project that combines music, media, and artificial intelligence. A neural network generates visualizations from melodies played by visitors; the result is projected onto a wall, giving viewers a unique visual experience.
In the exhibition space, there is a synthesizer or piano that anyone can play. The neural network processes the incoming sound and renders the visualization in real time as a projection on the wall, so each melody produces its own distinct imagery.
Depending on the desired mood and style, different visualization styles with varying content can be selected, as in the sketch below.
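To make the idea concrete, here is a minimal sketch of such an audio-to-visual pipeline in Python. It is an illustration only: the `MelodyToImage` model, the mel-spectrogram features, the style embedding, and the use of `sounddevice`, `librosa`, and PyTorch are assumptions made for the sketch, not the actual network behind the installation.

```python
# Illustrative sketch only: the model, feature choices, and libraries below are
# assumptions, not the installation's real implementation.
import numpy as np
import sounddevice as sd
import librosa
import torch
import torch.nn as nn

SAMPLE_RATE = 22050
BLOCK_SECONDS = 1.0          # length of each audio chunk fed to the network
N_MELS = 64                  # mel bands used as the network's input features
IMAGE_SIZE = 256             # side length of the generated frame

class MelodyToImage(nn.Module):
    """Toy generator: maps a mel-spectrogram summary plus a style id to an RGB frame."""
    def __init__(self, n_mels=N_MELS, n_styles=4):
        super().__init__()
        self.style_emb = nn.Embedding(n_styles, 16)
        self.net = nn.Sequential(
            nn.Linear(n_mels + 16, 512),
            nn.ReLU(),
            nn.Linear(512, IMAGE_SIZE * IMAGE_SIZE * 3),
            nn.Sigmoid(),
        )

    def forward(self, mel_vector, style_id):
        x = torch.cat([mel_vector, self.style_emb(style_id)], dim=-1)
        return self.net(x).view(-1, IMAGE_SIZE, IMAGE_SIZE, 3)

def capture_block():
    """Record one block of audio from the default input device (the synth or piano)."""
    frames = int(SAMPLE_RATE * BLOCK_SECONDS)
    audio = sd.rec(frames, samplerate=SAMPLE_RATE, channels=1, dtype="float32")
    sd.wait()
    return audio[:, 0]

def audio_to_features(audio):
    """Summarize the block as a mel spectrogram averaged over time."""
    mel = librosa.feature.melspectrogram(y=audio, sr=SAMPLE_RATE, n_mels=N_MELS)
    mel_db = librosa.power_to_db(mel, ref=np.max)
    return torch.tensor(mel_db.mean(axis=1), dtype=torch.float32).unsqueeze(0)

if __name__ == "__main__":
    model = MelodyToImage().eval()   # in the installation this would be a trained model
    style = torch.tensor([0])        # style/mood selected for the current session
    with torch.no_grad():
        features = audio_to_features(capture_block())
        frame = model(features, style)[0].numpy()  # HxWx3 image in [0, 1], ready to project
    print("generated frame:", frame.shape)
```

In a running installation, this loop would repeat continuously and the generated frames would be sent to the projector rather than printed, with the style id chosen to match the mood of the exhibition.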
If anyone is interested in this project for their exhibition space, please email serge@kozintcev.ru