ChromaChord: A Virtual Musical Instrument

IDIA Lab: Ball State University

ChromaChord combines the Oculus Rift headset with an attached Leap Motion controller to create an immersive 3D VMI environment. The Oculus Rift provides a three-paneled visual interface, built in Unity, that lets users access the instrument's different component windows simply by turning their head, while the Leap Motion controller lets the user interact with the visual interface to musical effect (processed and generated in Max/MSP).


ChromaChord uses the Oculus Rift DK2, a virtual reality headset, to immerse the performer within the VMI. Using two cameras in the virtual environment, the device relays a stereoscopic image to the wearer, creating a convincing sense of depth. A gyroscope, accelerometer, and magnetometer detect real-world head rotation and apply it to the virtual camera, while a peripheral infrared camera provides limited-range positional tracking.


Affixed to the front of the Oculus Rift is a Leap Motion controller, which provides infrared hand tracking. In our application, the Leap Motion tracks the user's real-world hands and simulates them in the virtual environment, mimicking hand positions and finger movements. The user's own hands are therefore the primary input mechanism.

Custom Software


  • Key note triggering and glow by Leap Motion hand touches
  • Particle generation triggered by Leap Motion hand touches
  • Button click and glow by Leap Motion hand touches
  • Particle destruction and orbit physics
  • OSC Server and Client configurator / Message Handler / Data Handler
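The OSC messaging listed above can be illustrated with a minimal sketch of how Unity-side code might pack and send a note event to Max/MSP. The `/note` address, the port number, and the integer payload are illustrative assumptions, not ChromaChord's actual message schema:

```python
import socket
import struct

def osc_message(address: str, *args: int) -> bytes:
    """Pack an OSC message with int32 arguments (OSC 1.0 binary format)."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    type_tags = "," + "i" * len(args)
    packet = pad(address.encode()) + pad(type_tags.encode())
    for value in args:
        packet += struct.pack(">i", value)  # big-endian int32
    return packet

# Example: report that MIDI note 64 was touched (1 = note on).
# The /note address and port 7400 are hypothetical placeholders.
msg = osc_message("/note", 64, 1)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 7400))
```

Because OSC rides on UDP, the sender needs no handshake with Max; the patch simply listens on the agreed port.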


Our graphical user interface is composed of three sections: left, center, and right panels. On the left, the user sees the modulation sphere. Touching this sphere generates orbital particles; the number of particles controls the depth of live processing applied to the notes. The user can remove particles by simply swiping them away. Turning to the right reveals the root note and scale type controls. To select a root note and scale type, the user presses the desired button, and the interface responds by highlighting the selected options.


The center module contains the keys, a chromatic scale presented in two colored columns based on the Solfège. This is the core performance area of the UI. In addition, there are up and down arrows that the user can touch to raise or lower the octave. All buttons and keys respond to touches by glowing white.


The default window is the center palette, where the instrument itself is located. Within this window, two columns of colored keys (four per column) sit to the left and right of the screen. To create sound, the user has but to hold their hands vertically in front of them and ‘touch’ the keys, which are placed both for ease of use and for the ability to contact multiple notes simultaneously to create chords. The notes themselves are stacked in intervals of a third; consequently, neighboring notes under the same hand create conventional harmonies. Between the columns, at their top- and bottom-most extremities, are two arrows that, when touched, move the current note set up or down an octave. When the user touches a note, Unity sends a UDP message to Max/MSP indicating which note has been touched; the note remains active until contact ceases. Similarly, touching the octave buttons sends UDP messages to Max to reassign the MIDI note values. Within Max, MIDI values are converted to their equivalent frequencies and assigned to their respective oscillators before being processed and output through Max’s audio output.
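The MIDI-to-frequency conversion in Max follows the standard equal-temperament relationship (this is what Max's built-in mtof object computes); a minimal sketch:

```python
def midi_to_hz(note: int) -> float:
    """Convert a MIDI note number to its equal-tempered frequency in Hz.
    A4 = MIDI 69 = 440 Hz; each semitone is a factor of 2**(1/12)."""
    return 440.0 * 2.0 ** ((note - 69) / 12.0)
```

For example, MIDI 60 (middle C) maps to roughly 261.6 Hz, and moving an octave up or down doubles or halves the frequency, which is exactly what the octave arrows accomplish by shifting every MIDI value by 12.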


When looking to the right, the user accesses the settings window, where note assignment takes place. First, the user selects the root note of the scale they wish to play in. The second parameter lets the user select the mode of the scale, the options being major, natural minor, and harmonic minor. By default, the instrument opens with a root note of C in the major mode. The UDP messages sent from the settings window control a series of gates within Max that then send adjusted MIDI values to the instrument’s oscillators.
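Taken together with the tertian key layout described in the center-palette section, the note assignment could be reconstructed as follows. This is an illustrative sketch of the mapping, not the actual Max gate logic:

```python
# Interval patterns (semitones above the root) for the three supported modes
MODES = {
    "major":          [0, 2, 4, 5, 7, 9, 11],
    "natural_minor":  [0, 2, 3, 5, 7, 8, 10],
    "harmonic_minor": [0, 2, 3, 5, 7, 8, 11],
}

def key_notes(root: int, mode: str, keys: int = 8) -> list[int]:
    """MIDI notes for the instrument's eight keys, stacked in diatonic
    thirds (every other scale degree) above the chosen root."""
    scale = MODES[mode]
    notes = []
    for k in range(keys):
        degree = 2 * k  # thirds: skip every other scale degree
        octave, step = divmod(degree, len(scale))
        notes.append(root + 12 * octave + scale[step])
    return notes
```

With a root of C (MIDI 60) in major mode this yields C-E-G-B-D-F-A-C, so any two adjacent keys under one hand form a conventional third.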


The performer accesses the modulation window to the left, which controls the frequency and Q of a filter, as well as the effect level. When the user touches the button in the center, Unity spawns floating particles that surround the user’s hands. The longer the user touches the button, the more particles accumulate; they can be removed by making shaking or throwing motions. Without particles, the filter’s frequency is at its lowest audible setting, creating a much darker, mellower timbre; as more particles accumulate, the filter opens up, allowing higher frequencies to pass through and creating a brighter, richer timbre. The particles also move the dry/wet ratio for effects towards the ‘wet’ end of the spectrum, deepening the audibility of the effects. The UDP message sent from the modulation window carries a single numeric value, the total number of active particles; this value is then scaled in Max/MSP to suit each parameter being altered.
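The per-parameter scaling stage might look like the following sketch; the cutoff range (200 Hz to 8 kHz) and the 50-particle ceiling are illustrative assumptions rather than values from the actual patch:

```python
def scale_particles(count: int, max_particles: int = 50,
                    low_hz: float = 200.0, high_hz: float = 8000.0):
    """Map the active particle count to a filter cutoff (exponential
    sweep, which sounds perceptually even) and a dry/wet mix (linear
    0.0-1.0). Counts above max_particles are clamped."""
    t = min(count, max_particles) / max_particles
    cutoff = low_hz * (high_hz / low_hz) ** t
    wet = t
    return cutoff, wet
```

An exponential cutoff sweep is chosen here because pitch perception is logarithmic: equal particle increments then produce roughly equal audible changes in brightness.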



Displaced Resonance v3 proposal for Prix Ars Electronica 2014. Linz, Austria.

Displaced Resonance Website:

Displaced Resonance Multiuser Virtual World:

Displaced Resonance v2

IDIA Lab was invited to exhibit their virtual and hybrid artworks at the Third Art and Science International Exhibition at the China Science and Technology Museum in Beijing, China.

Displaced Resonance v2 is an interactive installation consisting of sixteen reactive forms networked in a grid of light and sound. Interaction within the sculptural field is based on a participant’s presence and proximity to each sculpture. The installation is connected to a mirrored instance of the field in a virtual environment – bridging physical and virtual visitors within a shared hybrid space. Visitors to the virtual space are represented by avatars and, through their proximity, affect the light and sound of each sculpture. Participants in each space are aware of one another, uniting both instances within a singular hybrid environment.

A computer system using a thermal camera tracks the movement of visitors and responds by controlling the distribution of sound and dynamic RGB data to the LED lights within the sculptural forms. The installation utilizes custom live processing software to transform these sources through the participants’ interaction – displacing the interaction from its physical and virtual contexts to the processed framework of physical and virtual resonances. The two environments are linked to send and receive active responses from both sides of the installation via messaging, sensors, hardware and scripting.
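The proximity-to-light mapping performed by the tracking system can be sketched as below; the 1.5 m interaction radius and the blue-to-white color ramp are illustrative assumptions, not the installation's actual values:

```python
def proximity_color(distance_m: float, radius_m: float = 1.5):
    """Return an (r, g, b) triple (0-255) that warms from a dim blue
    idle glow to full white as a visitor approaches a sculpture.
    The radius and colors are hypothetical stand-ins."""
    closeness = max(0.0, 1.0 - distance_m / radius_m)  # 0 = far, 1 = touching
    r = int(255 * closeness)
    g = int(255 * closeness)
    b = int(64 + 191 * closeness)  # idle pillars keep a faint blue base
    return (r, g, b)
```

The same closeness value could drive the sound level for that pillar, so light and audio respond to a visitor in lockstep; a mirrored call with an avatar's virtual distance would drive the corresponding physical column.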

In prototyping this work, the process included both physical and virtual models to design and build the form and the interaction. The physical prototypes were interpreted in a virtual simulation environment to investigate the spatial interaction of the structure. The interactive functionality was tested through scripting before the form was brought into the sensor-equipped, camera-based version. After several virtual iterations, the form was re-interpreted and fabricated.

John Fillwalk

with Michael Pounds, David Rodriguez, Neil Zehr, Chris Harrison, Blake Boucher, Matthew Wolak, and Jesse Allison.

Third Art and Science International Exhibition

China Science and Technology Museum in Beijing

TASIE Press release

Ball State artists create “forest” of light on display in China and Internet

Muncie, Ind. — Ball State University electronic artists have created a “forest” of light and sound that will be on exhibit in Beijing, China through November, yet also accessible to visitors from Indiana or anywhere else in the world.

That’s possible because “Displaced Resonance,” as the interactive art exhibit is known, has both real-life and virtual components.

The physical portion has been installed in a gallery of the China Science and Technology Museum in Beijing. There, in-person visitors can negotiate a thicket of 16 interactive sculptures spaced 1.5 meters apart that will change colors and emit music as they approach.

A digital replica of the layout, meanwhile, resides on the Internet, accessible through the museum’s website. Online visitors can wander the virtual exhibit using an avatar, and the digital pillars will change colors and produce sounds, just like their physical counterparts.

But that’s not all — the two pieces interact with each other, says John Fillwalk, director of Ball State’s Institute for Digital Intermedia Arts (IDIA) and Hybrid Design Technologies (HDT), which created the work in collaboration with IDIA staff, students, and BSU composer Michael Pounds.

When an online avatar approaches a virtual pillar, the corresponding real-life column also will change colors, and vice versa. In-person and virtual visitors will produce different colors, however, allowing them to track each other through the exhibit.

“It’s what we call hybrid art,” says Fillwalk. “It’s negotiating between the physical world and the virtual. So it’s both sets of realities, and there’s a connection between the two.”

The physical pillars are two meters (more than 6 feet, 6 inches) tall. Each consists of a wooden base containing a sound system and a translucent pillar made of white corrugated plastic with computer-controlled lighting.

A thermal camera mounted on the museum’s ceiling keeps track of visitors and feeds its data to a computer program that directs the columns to change color and broadcast sounds when someone draws near.

“It’s a sensory forest that you can navigate,” Fillwalk says.

A video screen mounted on a wall overlooking the exhibit allows museum visitors to watch avatars move around the virtual version, while Internet patrons can affect the physical counterpart.

“Displaced Resonance” is the centerpiece of Ball State’s contributions to the Beijing museum’s 3rd Art and Science International Exhibition and Symposium, a month-long celebration of technology and the arts. Ball State was invited to participate because museum curators discovered some of IDIA’s work and liked what they saw, Fillwalk said.

In addition to “Displaced Resonance,” IDIA contributed four other pieces of digital art that museum visitors can view at a kiosk.

Those pieces are:

  • “Proxy”, in which visitors create, color and sculpt with floating 3D pixels.
  • “Flickr Gettr,” in which visitors can surround themselves with photos from the Flickr web service that correspond to search terms they submit.
  • “Confluence,” in which users create virtual sculptures by moving around the screen and leaving a path in their wake.
  • “Survey for Beijing,” in which real time weather data from Beijing is dynamically visualized in a virtual environment.

(Note to editors: For more information, contact John Fillwalk, director of the Institute for Digital Intermedia Arts, at 765-285-1045 or; or Vic Caleca, media relations manager, at 765-285-5948, or For more stories, visit the Ball State University News Center at


An interactive intermedia installation

Michael Pounds

John Fillwalk

Trans|duce is an interactive sound sculpture consisting of five metal plates with attached mechanical transducers that vibrate the metal according to audio signals sent from the computer. Signals sent from the computer are resonated according to the physical properties of the metal plates. Additionally, contact microphones attached to the plates provide feedback to the computer, allowing it to gather information about the specific resonances of each metal plate. This information can be used to further exploit those resonances by adjusting the audio signals sent to the plates.

Interaction is provided through a capacitive sensing mechanism. When viewers move their hands near the plates or touch them, signals are sent to the computer, allowing the viewers to trigger sounds or modify the properties of sounds through their hand movements. Five metal plates of varying physical dimensions and thicknesses are used. Different transducers are attached to each plate according to its physical properties, and each plate resonates at different frequencies.
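The feedback loop described above, in which the computer estimates each plate's resonance from its contact microphone, amounts to finding the dominant spectral peak in a captured buffer. A minimal sketch using a naive DFT (a real-time system would use an FFT, and this is illustrative rather than the installation's actual code):

```python
import math

def dominant_frequency(samples: list[float], sample_rate: float) -> float:
    """Estimate the strongest frequency in an audio buffer by scanning
    DFT bins for the largest magnitude. O(n^2), for illustration only."""
    n = len(samples)
    best_bin, best_mag = 0, 0.0
    for k in range(1, n // 2):  # skip DC, stop below Nyquist
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_bin, best_mag = k, mag
    return best_bin * sample_rate / n
```

Once a plate's peak is known, the synthesis engine can center its output energy on that frequency to drive the plate at resonance.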


oneWAPIHANNE is an interactive artwork visualizing live streaming data from water stations along the White River in Indiana. The stations are installed and monitored by the United States Geological Survey [USGS] – reporting the current gauge height and water speed over time. The visualization graphically represents each station’s data, in its entirety or individually, with overlays for station locations and record river levels. Wapihanne is the name for the White River in the Delaware language.

The chromatic tiles in the upper left represent the gauge stations along the White River. The monochromatic tiles in the upper right represent supplemental data overlays – displaying a line graph of the river height over time, a compressed map of the White River with the approximate location of the gauge stations, and the record levels for each station. In the bottom right of the screen is a gray tile that refreshes the data and restarts the program.

Each flow of station data represents river height and velocity over the course of the last thirty days. The top of the flow is the height, and the size of the flow is related to velocity. Touching and dragging a specific flow reveals detailed information about that particular reading – creating a sound whose volume is tied to river velocity.
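The sound mapping described above, volume tied to river velocity, can be sketched as a simple clamped normalization; the 10 ft/s full-scale value is an assumption for illustration, not a bound taken from the USGS feed:

```python
def velocity_to_volume(velocity_fps: float, max_fps: float = 10.0) -> float:
    """Map a river velocity reading (ft/s) to playback amplitude 0.0-1.0,
    clamped so readings beyond the assumed maximum stay at full volume."""
    return max(0.0, min(velocity_fps / max_fps, 1.0))
```

Faster water thus literally sounds louder, and flood-stage readings saturate at full amplitude instead of clipping.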

Events Wednesday, November 30th, 2011 at 20:29
Indian Institute of Technology Mumbai, India

The Indian Institute of Technology in Mumbai, India has invited Ball State University’s Institute for Digital Intermedia Arts, of Upgrade! Indiana, to showcase our hybrid reality and human-computer interface artworks. This year we are exhibiting a work by John Fillwalk and IDIA Lab: an interactive virtual installation using Unity 3D with a Microsoft Kinect for gestural input. In its 14th year, this international festival is the largest of its kind in Asia, with over 70,000 attendees. Past invitees have included CERN, Oxford, Nissan, Airbus and Google in this important emerging-technology showcase. The festival runs January 6-8, 2012 in Mumbai.

Events Wednesday, November 30th, 2011 at 20:16
Broad Art Museum Michigan State University, Lansing, MI

John Fillwalk and IDIA Lab of Upgrade! Indiana have been commissioned to design artworks and build a virtual presence for the upcoming Eli and Edythe Broad Art Museum at Michigan State University. The physical museum in East Lansing, Michigan, designed by architect Zaha Hadid, is slated to finish construction in spring 2012. Museum Director Michael Rush commissioned Fillwalk and IDIA to envision and curate a dynamic program of exhibitions and events including virtual installations, visiting artists, lectures and even artist studio space for the Virtual Broad Art Museum project (VBAM). Fillwalk’s works in collaboration with IDIA will be exhibited in the inaugural premiere of the VBAM project in early 2012. The physical museum opens in April 2012. The Institute is also working with project co-director and IDIA Lab fellow Adam Brown of MSU to virtualize one of his current works for inclusion in the VBAM program. IDIA is developing the project in the Unity 3D game engine, which allows the virtual world to be experienced entirely within a web browser. Next steps include augmented reality and mobile art applications for future VBAM projects.

Events Monday, March 21st, 2011 at 17:25
2000 W University Ave. Muncie, IN

IDIA Presents: Christiane Paul “New Media in the Mainstream” 3/24/11 at 4:00pm, AJ 175
Christiane Paul
New media art curator and author

“New Media in the Mainstream”
Thursday, March 24 4:00pm

Art and Journalism Building, AJ 175

Christiane Paul will explore the challenges that new media art presents to institutions and the art market. Also discussed will be museum exhibitions – media and traditional – that responded to technological culture, as well as the historical roots of the complex relationship between new media and the mainstream art world.

Over the past decade, contemporary art has increasingly been shaped by concepts of participation, collaboration, social connectivity, performativity, and ‘relational’ aspects. One could argue that art responded to contemporary culture, which is shaped by digital and other new technologies and the changes they have brought about. While art institutions and organizations now commonly use digital technologies in their infrastructure—“connecting” and distributing through their websites, Facebook pages, YouTube channels, and Twitter tours—they still place emphasis on exhibiting more traditional art forms that reference technological culture rather than art that uses these technologies as a medium. The talk explores the complexities behind this “state of the art world,” including the challenges that new media present to institutions and the art market. Also discussed will be museum exhibitions—media and traditional—that responded to technological culture, as well as the historical roots of the complex relationship between new media and the mainstream art world.

Christiane Paul is the Director of the Media Studies Graduate Programs and Associate Professor of Media Studies at The New School, NY, and Adjunct Curator of New Media Arts at the Whitney Museum of American Art. She has written extensively on new media arts and lectured internationally on art and technology. An expanded new edition of her book Digital Art (Thames & Hudson, UK, 2003), as well as her edited anthology New Media in the White Cube and Beyond – Curatorial Models for Digital Art (UC Press) were published in 2008. At the Whitney Museum, she is responsible for artport, the museum’s online portal to Internet art, and curated the shows Profiling (2007), Data Dynamics (2001), and the net art selection for the 2002 Whitney Biennial. Other recent curatorial work includes the Incheon Digital Art Festival (Korea, 2009); “Scalable Relations” (Beall Center for Art and Technology, Irvine, CA; gallery@CalIT2, San Diego, CA; CN(S)I, University of California Los Angeles; MAT University of California Santa Barbara, 2008-09); SOS 4.8 festival (Murcia, Spain, 2008); and Feedback (co-curated with Jemima Rellie and Charlie Gere; LABoral Art Center, Gijon, Spain, 2007). Dr. Paul has previously taught in the MFA computer arts department at the School of Visual Arts in New York (1999-2008); the Digital+Media Department of the Rhode Island School of Design (2005-08); the San Francisco Art Institute and the Center of New Media at the University of California at Berkeley (2008).


Events Exhibitions Saturday, July 17th, 2010 at 16:17
1201 South Figueroa Street Los Angeles, CA 90015

Upgrade! Indiana’s John Fillwalk and Jesse Allison are exhibiting an interactive installation at SIGGRAPH 2010 in Los Angeles. Final Wisdom I is an interactive installation engaging viewers in a sensory exploration of temporal and spatialized poetry. Participants manipulate imagery, sound and language through their gesture, touch and proximity. The work is engaged through a framework of cameras and sensors that react to heat, position, and capacitance – presenting a shifting environment of reactive media and haptics. Final Wisdom I is the work of artists Hans Breder and John Fillwalk, with poetry by critic Donald Kuspit and music by composers Carlos Cuellar Brown and Jesse Allison. This project is produced through the Institute for Digital Intermedia Arts at Ball State University in collaboration with the Institute for Digital Fabrication.


Conferences Events Exhibitions Saturday, July 17th, 2010 at 16:13
Yeditepe Ünv., Istanbul Province, Turkey


IDIA’s director, John Fillwalk, presented the institute’s art and design work in hybrid realities at the 7th Annual International Symposium of Interactive Media Design, held at Yeditepe University in Istanbul, Turkey. Fillwalk’s video work was also screened during the conference.

Events Wednesday, March 24th, 2010 at 15:24
N McKinley Ave. Muncie, IN

Upgrade! Indiana and the Institute for Digital Intermedia Arts


At Ball State University

IDIA Presents:

The History of Interactivity in the Fine Arts

A lecture by new media curator George Fifield.

Wednesday, March 17 at 4:00pm

Ball State University

Art and Journalism Building

Atrium Auditorium, AJ 175

George Fifield is a new media arts curator, writer, teacher, artist, and the founder of the Boston Cyberarts Festival. The Boston Cyberarts Festival is the first and largest festival in North America of artists working in new technologies in all media, encompassing visual arts, dance, music, electronic literature, web art, and public art. In addition, Fifield has written on a variety of media, technology and art topics for Artbyte, Bomb, Communication Arts, Digital Fine Arts and The Independent Film and Video Monthly, among others. More recently he curated Act React: Interactive Installation Art at the Milwaukee Art Museum.

This lecture will examine the path artists took in the 20th century that blurs the distinction between the reader/viewer and the author. The interactive artist has voluntarily relinquished a degree of control over the viewer’s experience in a way that the linear artist has not. In digital literature, web art, interactive installations, and even the popular art form of computer gaming, the audience has the ability to control its experience within the work. Visitors to a work of interactive art choose the path they take through it, giving them a hand in determining their own experience. New environments grow from simple and branching hyperlinked structures to become complex dynamic systems: virtual worlds in which people engage with their surroundings and with one another.

Meet George Fifield today between 11:30am and 12:30pm at the Atrium Studio for an informal lunch to sit and talk with the artist.

The Institute for Digital Intermedia Arts

At Ball State University