Illy is a semi-intelligent system powered by the Web Audio API, WebGL, WebVR, and sometimes Kinect. It understands what is spoken to it in a primitive way and responds in a primitive, emotional, and simple way, both sonically and visually. Unlike cutting-edge AIs and interfaces such as Siri, Cortana, and OK Google, Illy does not understand natural language, defined commands, or even language at all. It does not respond with fine-tuned intonations to deliver information gathered from the cloud, and it is not integrated with any services. It does not show maps or pictures about the topic. Instead, as an antithesis, it only tries to understand the sonic properties of spoken language, such as attack, loudness, roughness, and pitch, and responds sonically in a non-linguistic, musical, psychoacoustic way: not with the symbolic sounds we call language, but with tonal sounds, timbre, frequency, consonance, and rhythm. Visually, it shapes abstract visuals whose size, speed, and color are determined by its understanding. Ignoring symbolic data, like an infant, Illy tries to understand the emotion in the human voice, free from the symbols of language, and responds to you with your own emotions, like a sonic mirror to the soul. Illy learns, but do not expect it to crack human languages yet. The intended interactive experience raises questions about human sound and the human-machine interface that we sometimes ignore in this age of information.
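As a minimal sketch of how features like loudness and pitch can be extracted from a frame of voice samples (the function names here are illustrative, not Illy's actual code; in the browser the frame would come from a Web Audio `AnalyserNode` via `getFloatTimeDomainData`):

```javascript
// Estimate loudness as the root-mean-square amplitude of a frame of samples.
function loudness(samples) {
  let sum = 0;
  for (const s of samples) sum += s * s;
  return Math.sqrt(sum / samples.length);
}

// Rough pitch estimate via autocorrelation: find the lag (in samples)
// with the strongest self-similarity, then convert that lag to Hz.
function estimatePitch(samples, sampleRate) {
  const minLag = Math.floor(sampleRate / 1000); // ~1000 Hz ceiling
  const maxLag = Math.floor(sampleRate / 60);   // ~60 Hz floor
  let bestLag = -1;
  let bestCorr = 0;
  for (let lag = minLag; lag <= maxLag; lag++) {
    let corr = 0;
    for (let i = 0; i + lag < samples.length; i++) {
      corr += samples[i] * samples[i + lag];
    }
    if (corr > bestCorr) {
      bestCorr = corr;
      bestLag = lag;
    }
  }
  return bestLag > 0 ? sampleRate / bestLag : 0;
}

// In the browser, a frame would be filled from an AnalyserNode, e.g.:
//   const frame = new Float32Array(analyser.fftSize);
//   analyser.getFloatTimeDomainData(frame);
```

A sine tone at 220 Hz fed through these functions yields an RMS near 0.707 and a pitch estimate near 220 Hz; real voice frames are noisier, which is part of what gives the responses their "primitive" character.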
For the VR version, at the moment a Vive or Oculus (with Oculus Touch) headset is required for the optimal experience. Press and hold one of the controllers' trigger buttons to talk/sing and release to hear the response. Press either thumbpad/thumbstick to call Illy.
For the desktop version, Chrome is the suggested browser. Press and hold the left Shift key to talk/sing and release to hear the response.
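The press-and-hold interaction in both versions can be sketched as a tiny push-to-talk state machine (names are illustrative, not Illy's actual code); in the browser it would be wired to `keydown`/`keyup` or controller-button events, ignoring key repeat while the key is held:

```javascript
// Minimal push-to-talk state: hold to record, release to respond.
class PushToTalk {
  constructor(onStart, onStop) {
    this.talking = false;
    this.onStart = onStart; // e.g. begin capturing microphone input
    this.onStop = onStop;   // e.g. analyze the captured voice and respond
  }
  press() {
    if (this.talking) return; // ignore key-repeat events while held
    this.talking = true;
    this.onStart();
  }
  release() {
    if (!this.talking) return;
    this.talking = false;
    this.onStop();
  }
}

// Illustrative browser wiring for the left Shift key:
//   window.addEventListener('keydown', e => { if (e.code === 'ShiftLeft') ptt.press(); });
//   window.addEventListener('keyup',   e => { if (e.code === 'ShiftLeft') ptt.release(); });
```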
Both versions are better with headphones!
The Kinect version is not available as a website, but if you have a Kinect and want to try it out, drop me a line.
(July, 2017) Illy: A Primitive Intelligence (The Entity - VR) was exhibited at the 2017 ACM SIGGRAPH Digital Art Community WebVR Online Exhibition – Immersive Expressions
(May, 2017) Illy: A Primitive Intelligence (The Entity - VR) was exhibited at Bay Area Maker Faire
(March, 2017) Illy: A Primitive Intelligence (The Entity - VR) was exhibited at UploadVR Art and VR: A Soiree of Immersive Art and Tech
(November, 2016) Illy: A Primitive Intelligence (The Eye - Motion-tracked installation) participated in DANCE HACK DAY
(November, 2016) Illy: A Primitive Intelligence (The Entity - VR) was exhibited at CODAME ARTEX
(September, 2016) Illy: A Primitive Intelligence (The Eye - Motion-tracked installation) was exhibited at CuriOdyssey Museum as part of Index of Probabilities
(June, 2016) Illy: A Primitive Intelligence was shown at 3D Web Fest