An AI orchestration system inside TouchDesigner that interprets rolling live brain-signal summaries (OpenBCI → TD → Python) and produces guiding cues for the meditating user: video, voice, light, and text. It runs autonomously, deciding if, when, and how to interact with the user, given a particular set of available tools.
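A minimal sketch of the decision loop described above, assuming the Python side receives a rolling numeric summary of the brain signal (here a hypothetical alpha/theta ratio; the real feature, threshold, and cue channels are placeholders, not the author's actual implementation):

```python
from collections import deque
from dataclasses import dataclass
from typing import Optional

@dataclass
class Cue:
    channel: str   # one of the output tools: "video", "voice", "light", "text"
    message: str

def summarize(window) -> float:
    """Rolling summary: mean of the samples currently in the window."""
    return sum(window) / len(window)

def decide_cue(summary: float, threshold: float = 1.2) -> Optional[Cue]:
    """Decide *if* and *how* to interact; return None to stay silent."""
    if summary >= threshold:
        return None  # user appears settled; no intervention needed
    # hypothetical choice of tool and content for a drifting user
    return Cue("voice", "Gently return attention to the breath.")

# Simulated stream standing in for OpenBCI -> TouchDesigner -> Python samples
window = deque(maxlen=8)  # rolling window of recent feature values
cues = []
for sample in [1.4, 1.3, 1.1, 0.9, 0.8, 1.0, 1.5, 1.6]:
    window.append(sample)
    cue = decide_cue(summarize(window))
    if cue is not None:
        cues.append(cue)
```

In the real system each emitted `Cue` would be routed back into TouchDesigner to drive the corresponding video, voice, light, or text output.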