Autonomous media pipelines · AI · Biosignals · Production systems
I build systems where signals become cinema —
combining AI, biosignals and production infrastructure into deployable outcomes.
Built across research labs, theatres, museums, film productions and startups.
Primary focus: AI-native production systems for film, synthetic media and biosignal interfaces.
Because intelligence without output is just research. Output without intelligence is just content.
→ AI-native production pipeline for synthetic media and VFX
→ Biosignal-driven creative interface system
→ Automation infrastructure for film and content workflows
AI Systems Architecture
Workflow automation, orchestration, retrieval systems, fine-tuning, evaluation.
BioSignal & Research Systems
EEG / HRV / multimodal signal analysis, experimental HCI, neurointerfaces.
Cinematic & Media Systems
AI filmmaking pipelines, synthetic media, character systems, generative VFX.
Embedded & Interactive Systems
ESP32, sensor integration, physical computing, real-time interaction devices.
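The biosignal work above relies on standard time-domain HRV metrics. As an illustrative sketch only (not code from any of these projects), RMSSD over a series of RR intervals can be computed like this:

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences (in ms),
    a standard time-domain heart-rate-variability metric."""
    if len(rr_intervals_ms) < 2:
        raise ValueError("need at least two RR intervals")
    # Successive differences between adjacent RR intervals
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical resting RR series (ms)
print(round(rmssd([800, 810, 790, 805, 815]), 2))
```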
Dancers' brainwaves generated real-time audiovisual output on stage.
Developed with Tallinn University researchers. Presented at Bozar, Brussels.
→ TLU · → Bozar
Neurotheatre EEG Real-time AV Performance Systems
Permanent interactive audiovisual sculpture at the Ivan Pavlov Memorial Museum, Koltushi.
Neural network trained on real primate psychophysiological data, generating autonomous visitor-responsive behavior.
Developed with the Group of Physiology of Sensory Systems of Primates.
→ Pavlov Museum
Neural Networks Interactive Systems Art-Science Permanent Installation
Neuro-performance system premiered at the Alexandrinsky Theatre, St. Petersburg.
Live EEG-driven media with multichannel spatial audio. Featured on Habr.
→ Habr · → Facebook
Neurotechnology Live Systems EEG Interactive Media
Tactile learning system bridging physical interfaces, cognition and adaptive intelligence.
Published by Springer (2023).
HCI Embedded Systems Cognitive Research
Neurotheatre and lighting design research developed with ITMO University, CLD (2017).
Neurotheatre Lighting Design Interactive Systems Research-to-Practice
Feature-length sci-fi film listed on IMDb.
AI-assisted VFX workflows and generative post-production systems.
→ IMDb
Film Production AI VFX Pipeline Architecture Sci-Fi
Sensor-based intelligent object for gesture interaction and spatial learning.
Developed with the Center of Usability and Mixed Reality, ITMO University.
ESP32 BLE IMU Embedded Firmware
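Devices like this typically stream compact binary IMU frames over BLE notifications. As a hedged sketch (the packet layout, scale factors and field names here are hypothetical, not the project's actual protocol), a host-side decoder might look like:

```python
import struct

# Hypothetical BLE notification layout for a 6-axis IMU:
# little-endian, 3x int16 accel (mg), 3x int16 gyro (0.1 dps), uint16 sequence
IMU_FMT = "<6hH"

def decode_imu_packet(payload: bytes) -> dict:
    """Unpack one raw IMU frame into engineering units."""
    ax, ay, az, gx, gy, gz, seq = struct.unpack(IMU_FMT, payload)
    return {
        "accel_g": (ax / 1000, ay / 1000, az / 1000),  # milli-g -> g
        "gyro_dps": (gx / 10, gy / 10, gz / 10),       # 0.1 dps -> dps
        "seq": seq,                                    # rolling counter
    }

# Example frame: device at rest, 1 g on the Z axis
raw = struct.pack(IMU_FMT, 0, 0, 1000, 0, 0, 0, 42)
print(decode_imu_packet(raw))
```

A fixed little-endian struct keeps the ESP32 firmware side trivial: the sensor readings can be sent as one `struct`-packed notification per sample.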
Internal LLM-orchestrated infrastructure for retrieval, automation and production systems.
n8n PostgreSQL pgvector AI Automation
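Retrieval on this stack typically means a k-nearest-neighbour query against a pgvector column. As a minimal sketch (table and column names are illustrative, not the actual schema), the query can be built like this; `<=>` is pgvector's cosine-distance operator:

```python
def knn_query(table: str, embed_col: str, k: int) -> str:
    """Build a pgvector cosine-distance k-NN query string.
    The embedding itself is passed separately as the %(q)s parameter."""
    return (
        f"SELECT id, content, {embed_col} <=> %(q)s::vector AS dist "
        f"FROM {table} "
        f"ORDER BY dist LIMIT {k}"
    )

# Top-5 most similar rows in a hypothetical 'documents' table
print(knn_query("documents", "embedding", 5))
```

Parameterising the query vector (rather than interpolating it) keeps the statement plan-cacheable and safe to execute from an automation node.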
Many can build tools.
Fewer can turn tools into outcomes people remember.
Demultiplexia was performed at Bozar. Cyber Monkey is still running.
Signals become narrative. Narrative becomes system. System becomes experience.
Available for: film & VFX projects · research grants · AI production infrastructure
aicinema.online · linkedin.com/in/yuri-didevich · github.com/jurididevich
70% engineering intelligence · 30% cinematic identity