The 19th-century physicist Michael Faraday was known not only for his pioneering experimental contributions to electromagnetism but also for his public speaking. His annual Christmas lectures at the Royal Institution grew into a holiday tradition that continues to this day. One of his most famous Christmas lecture series concerned the chemical history of a candle. Faraday illustrated his arguments with a simple experiment: he placed a candle inside a lamp glass to block any breeze and achieve “a steady flame.” He then showed how the shape of the flame flickered and changed in response to disturbances.
“You must not imagine, because you see these tongues all at once, that the flame is of this particular shape,” Faraday remarked. “A flame of that shape is never so at any one time. Never is a body of flame, like that which you just saw rising from the ball, of the shape it appears to you. It consists of a multitude of different shapes, succeeding each other so fast that the eye is only able to take cognizance of them all at once.”
Now, MIT researchers have brought Faraday’s simple experiment into the 21st century. Markus Buehler and his postdoc Mario Milazzo combined high-resolution imaging with deep machine learning to sonify a single candle flame. They then used that single flame as a building block, creating “music” from its flickering dynamics and designing novel structures that could be 3D printed into physical objects. Buehler described this and other related work at the American Physical Society meeting in Chicago last week.
As we previously reported, Buehler specializes in developing AI models to design new proteins. He is perhaps best known for using sonification to shed light on structural details that would otherwise prove elusive. Buehler found that the hierarchical elements of music composition (pitch, range, dynamics, tempo) are analogous to the hierarchical elements of protein structure. Much as music uses a limited number of notes and chords in different combinations to compose pieces, proteins have a limited set of building blocks (20 amino acids) that can be combined in countless ways to create novel protein structures with unique properties. Each amino acid has a specific sound signature, similar to a fingerprint.
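The mapping idea can be illustrated with a toy sketch. The code below assigns each of the 20 standard amino acids a pitch on a chromatic scale; this is purely an assumption for illustration, since the actual MIT mapping derives each amino acid’s “sound signature” from molecular vibrations, not from a musical scale.

```python
# Illustrative sketch only: give each of the 20 standard amino acids a pitch.
# The scale-based mapping is an assumption; the real sonification uses
# molecular vibrational signatures.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # one-letter codes

def pitch_hz(index, base=220.0):
    """Map index 0..19 onto an equal-tempered chromatic scale starting at A3."""
    return base * 2 ** (index / 12)

# One "sound signature" (frequency in Hz) per amino acid.
SONIFICATION = {aa: round(pitch_hz(i), 1) for i, aa in enumerate(AMINO_ACIDS)}

def sonify(sequence):
    """Turn a protein sequence into a list of frequencies -- its 'melody'."""
    return [SONIFICATION[aa] for aa in sequence]

print(len(SONIFICATION), "amino acids mapped; 'ACE' ->", sonify("ACE"))
```

Played back in order, a sequence of such pitches gives a protein its melody; conversely, editing the melody and mapping it back suggests a new sequence, which is the core of the design loop described below.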
A few years ago, Buehler led a team of MIT scientists that mapped the molecular structure of proteins in spider silk threads onto musical theory to produce the “sound” of silk. The hope was to find a radically new way to make designer proteins. This work inspired a sonification art exhibition, “Spider’s Canvas,” in Paris in 2018. Artist Tomas Saraceno collaborated with MIT engineers to create an interactive harp-like instrument based on the web of a Cyrtophora citricola spider, with each strand in the “web” tuned to a different pitch. Combining those notes in different patterns within the web’s 3D structure yields melodies.
In 2019, Buehler’s team developed an even more advanced system to make music out of a protein structure — and then convert the music back to create novel proteins not found in nature. The goal was to learn to make similar synthetic cobwebs and other structures that mimic the spider’s process. And in 2020, Buehler’s team applied the same approach to model the vibrational properties of the spike protein responsible for the high contagion rate of the novel coronavirus (SARS-CoV-2).
Buehler considered whether this approach could be extended to study fire. “Flames, of course, are silent,” he said during a news conference. But “fire has all the elements of a vibrating string or molecule, but in an interesting dynamic pattern. If we could hear them, what would they sound like? Can we materialize fire? Can we push the envelope to create bio-inspired materials that you can actually feel and touch?”
Like Faraday centuries before, Buehler and Milazzo began with a simple experiment involving a single candle flame. (A larger fire has so many disturbances that it becomes computationally intractable, but a single flame can be considered the fire’s basic building block.) The researchers lit a candle in a controlled environment, with no air movement or other outside signals, reproducing Faraday’s “steady flame.” They then played sounds from a speaker and used a high-speed camera to capture how the flame flickered and deformed over time in response to these acoustic signals.
“This creates characteristic shapes, but they’re not always the same shapes,” says Buehler. “It’s a dynamic process, so what you see [in our images] is just a snapshot of it. In reality, there are thousands upon thousands of frames for each instance of the acoustic signal – a circle of fire.”
Next, he and Milazzo trained a neural network to identify, from a given flame shape, the original audio signal that produced it. The researchers effectively sonified the vibrational frequencies of the fire: the more violently a flame is deflected, the more dramatically the audio signal changes. The flame becomes a kind of musical instrument that can be “played” by, for example, exposing it to air currents to make it flicker in a certain way – a form of musical composition.
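The shape-to-sound classification step can be sketched in miniature. The code below is a hedged toy version under stated assumptions: it fabricates synthetic flame contours in which each “acoustic tone” deforms a circle at a different spatial frequency, extracts Fourier-mode features from each contour, and fits a simple softmax classifier. The synthetic data, feature choice, and linear model are all illustrative stand-ins, not the researchers’ actual deep-learning pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
N_CLASSES, N_PER_CLASS, N_POINTS = 3, 60, 64
theta = np.linspace(0, 2 * np.pi, N_POINTS, endpoint=False)

def flame_contour(label):
    """Toy flame outline: a circle deformed at a label-dependent mode, plus noise."""
    mode = label + 2  # pretend each acoustic tone excites mode 2, 3, or 4
    return 1.0 + 0.3 * np.sin(mode * theta) + 0.05 * rng.standard_normal(N_POINTS)

# Dataset: features are magnitudes of the contour's low-order Fourier modes.
X, y = [], []
for label in range(N_CLASSES):
    for _ in range(N_PER_CLASS):
        spec = np.abs(np.fft.rfft(flame_contour(label)))[1:8]
        X.append(spec / spec.sum())
        y.append(label)
X, y = np.array(X), np.array(y)

# Minimal softmax classifier trained by gradient descent (a stand-in for the
# deep network described in the article).
W = np.zeros((X.shape[1], N_CLASSES))
b = np.zeros(N_CLASSES)
onehot = np.eye(N_CLASSES)[y]
for _ in range(300):
    logits = X @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = (p - onehot) / len(X)  # gradient of cross-entropy loss
    W -= X.T @ grad
    b -= grad.sum(axis=0)

acc = (np.argmax(X @ W + b, axis=1) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Because each tone leaves its energy in a distinct Fourier mode of the contour, even this linear model separates the classes; the real system faces far messier dynamics, hence the deep network.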
“Fire is vibrant, rhythmic, repetitive and ever-changing, and that’s what music is all about,” said Buehler. “Deep learning helps us get the data and specific fire patterns, and with different fire patterns you can create this orchestra of different sounds.”
Buehler and Milazzo also used the various shapes of flickering flames as building blocks to design novel structures on the computer and then 3D print those structures. “It’s a bit like being able to freeze the flame of a fire in time and see it from different angles,” said Buehler. “You can touch it, rotate it, and you can also look inside the flames, which no human has ever seen.”