
Georgia Tech Researchers Translate Data Into Music to Help Blind Individuals Experience Solar Eclipse

by Holly Beilin

Monday’s once-in-a-century solar eclipse can be experienced, in a very different way, by those who may not normally be able to tell the difference between day and night.

A team of researchers at the Georgia Tech Sonification Lab used data to create an original music composition mimicking the path of the total eclipse. Sonification, the process of turning data into audible cues, helps visually impaired and blind individuals experience events that others would process with vision.
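The core idea of sonification can be sketched in a few lines of code. This is a minimal, illustrative example, not the Lab's actual method: it assumes a simple linear mapping of data values to MIDI-style pitch numbers, so that rising data produces a rising melody.

```python
# Minimal sketch of sonification: map numeric data points to pitches.
# The pitch range and linear mapping are illustrative assumptions,
# not the Sonification Lab's actual algorithm.

def sonify(values, low_pitch=48, high_pitch=84):
    """Linearly map each value to a MIDI-style pitch number."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero for constant data
    return [round(low_pitch + (v - lo) / span * (high_pitch - low_pitch))
            for v in values]

# Example: brightness that rises then falls becomes a melody
# that rises then falls.
brightness = [0.1, 0.4, 0.9, 0.6, 0.2]
print(sonify(brightness))
```

Any numeric stream — fish positions, stock prices, weather readings — can be fed through a mapping like this, which is what makes the technique so broadly applicable.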

For example, the Lab has sonified fish movements at the Georgia Aquarium. They have converted numerical data into sounds to analyze stock market prices, elections, and weather. And in 2012, they collected data on two stars from a NASA telescope to compose a melody for a reggae rock band.

Now, they have developed a musical composition that represents the eclipse moving from partial to totality and back again. To do so, the team watched multiple videos of total eclipses (the last one seen in the U.S. was in 1979) and spoke to two blind individuals, one of whom had seen an eclipse.

“Our main motive was to use music and sound to demonstrate what’s going on in the sky,” says researcher Avrosh Kumar, a Georgia Tech grad who received a master’s in music technology. Kumar composed much of the melody. “At the same time, we wanted to create a pleasing, dramatic composition. It was a fine line to walk in order to achieve both goals.”

The project came about when AT&T approached Sonification Lab Director Bruce Walker. The company has been working with a startup on IoT-connected smart glasses that let visually impaired users connect to a remote agent who acts as a set of eyes. They intend to use the glasses to help blind persons experience the eclipse.

AT&T asked Walker, a professor in Georgia Tech’s College of Computing and College of Sciences, if he could create an audio track to accompany the experience.

Neurological studies show that individuals blind since birth or childhood have a more developed auditory cortex than sighted individuals, meaning they respond more strongly to auditory stimuli. This is due to neural plasticity, in which brain synapses rewire themselves to strengthen certain areas.

Moreover, music is often a particularly strong way for visually impaired individuals to communicate, learn, or experience the world. Pitch can help blind people identify colors, navigate their surroundings, and learn educational concepts.

“All people need access. Sonification, or turning data into audible cues, is one way to provide it,” says Walker.

There are two slightly different versions of the composition: one based on an unobstructed view of the total eclipse in Hopkinsville, KY, and the other on the 97 percent coverage expected in Atlanta. Both will be finished in real time on Monday; during the eclipse, the researchers will gather data such as temperature changes and brightness levels from the Weather Channel and use a musical algorithm to add instruments and tones. For example, trumpets will symbolize temperature changes.
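The real-time step described above can be sketched as an event loop: as each data sample arrives, the algorithm emits musical events. The thresholds, instrument names, and sample readings below are hypothetical stand-ins; only the trumpet-for-temperature mapping comes from the article.

```python
# Sketch of a data-driven musical algorithm. The mapping of brightness
# to a string drone pitch is an invented illustration; the article only
# confirms that trumpets symbolize temperature changes.

def eclipse_events(readings):
    """Turn (brightness_pct, temp_f) samples into (instrument, detail) events."""
    events = []
    prev_temp = None
    for brightness, temp in readings:
        # Darker sky -> lower drone pitch (illustrative mapping).
        events.append(("strings", 40 + brightness // 5))
        # A temperature change triggers a trumpet figure, per the article.
        if prev_temp is not None and temp != prev_temp:
            events.append(("trumpet", "down" if temp < prev_temp else "up"))
        prev_temp = temp
    return events

# Simulated samples approaching totality: brightness falls, temperature drops.
samples = [(100, 88), (60, 86), (5, 82), (60, 84)]
for event in eclipse_events(samples):
    print(event)
```

Running the two city-specific versions would then just mean feeding the same algorithm two different data streams: Hopkinsville's readings for one track, Atlanta's for the other.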

In Kentucky, AT&T will play the soundtrack for a visually impaired eclipse observer using the smart glasses. And back in Atlanta, the composition will be streamed live during Georgia Tech’s eclipse telescope feed.

