I made music from auroras
When I sang along to music playing from my earphones, my mom would say, "Bhavya, you need to think of our plight too."
So how did I end up making music now?
I can't play instruments well and I can't sing. But our lives are becoming more and more digital, and there is a limit to how sane you feel when your eyes can't look away from screens. I started thinking of ways to intersperse mindful physical experiences into my digital time.
Enter TwoTone. It's a data sonification tool: you give it a dataset, it gives you music. Naturally, I was convinced I could finally become a musician. So I went to Kaggle and found a dataset of aurora sightings from 1913.

I simply fed this dataset into TwoTone to see what would happen. It's cute.
Maybe more than one instrument would be more fun and fuller sounding. I tweaked the dataset a bit by mapping the text columns to numeric values, keeping the range the same so there wouldn't be notes flying all over the place.
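If you want to do the same trick yourself, here is a minimal sketch of that mapping in pandas. The column names and values are hypothetical stand-ins, not the actual Kaggle dataset; the idea is just to turn each text category into an evenly spaced number inside the range of an existing numeric column, so the new instrument plays in the same pitch band.

```python
import pandas as pd

# Hypothetical aurora-sightings columns; the real dataset's names differ.
df = pd.DataFrame({
    "color": ["red", "green", "red", "white", "green"],
    "intensity": [3, 7, 5, 2, 9],
})

# Turn each text category into an integer code (0, 1, 2, ...).
codes = df["color"].astype("category").cat.codes

# Rescale the codes into the range of the numeric column, so the
# notes TwoTone generates for this column stay in the same band.
lo, hi = df["intensity"].min(), df["intensity"].max()
span = codes.max() or 1  # avoid dividing by zero for a single category
df["color_num"] = lo + (codes / span) * (hi - lo)

print(df)
```

Export the result back to CSV and feed it to TwoTone as before; each text column becomes one more track you can assign an instrument to.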

I chose jazz instruments because I had been to a Dwayne Dolphin concert the day before (at City of Asylum! 25000/10! highly recommend), made a few changes to the scales, octaves, and tempo, and threw in arpeggios because I could.

LISTEN to this.


Bonus content: Auroras I drew earlier this year
Experimenting with AI
The auroras dataset was a gem on its own; its randomness created music that didn't sound bad at all. Making a dataset that leads to good music should be possible, but remember, I don't know what the instruments should be doing.
What if I reverse engineered 'Snookie' by Dwayne Dolphin to get to something resembling actual jazz? But no matter what I tried, I couldn't get Perplexity to generate a dataset with the variations that would lead to pleasant-sounding music. The time this took wasn't worth the reverse-engineering plan. I'd rather learn music from scratch.
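For the curious, the reverse-engineering idea itself is simple, even if getting good results isn't. A sketch, assuming you had transcribed a melody into MIDI note numbers (the notes below are made up, not from "Snookie"): write them as a one-column CSV, and a sonification tool like TwoTone can map that column back onto a pitch scale.

```python
import csv

# Hypothetical MIDI note numbers transcribed by ear from a bass line
# (NOT the actual notes of "Snookie").
melody = [43, 45, 47, 50, 47, 45, 43, 38]

# Write a one-column CSV that a data sonification tool can ingest.
with open("melody.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["note"])
    writer.writerows([n] for n in melody)
```

The hard part, as I found out, is the other direction: generating a dataset whose shape produces music you'd actually want to listen to.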
I'm going to leave you with the Snookie song, no AI. Enjoy!