Tetraslam 9 hours ago

Hi! I'm a college freshman at Northeastern, and this is my first project for a major hackathon (MIT Media Lab). It won the Unconventional Computing track prize, so I'm pretty happy! I made this because I'm a major conlanger, and I was wondering whether it's possible to think in terms of sound when looking at an image. Fractals have easy-to-map parameters, so I created SHFLA, a language that takes in music and generates fractals from 0.1-second chunks of it (the chunk length is configurable). It's Turing-complete, so you can technically encode an arbitrary amount of information and computation in the system, although writing it out as music might take a while.
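
Roughly, the idea is something like this (a simplified sketch; the feature choices and constants here are just for illustration, not the exact SHFLA mapping): each 0.1-second chunk is reduced to a couple of audio features, those features pick a Julia-set parameter c, and the frame for that chunk is the escape-time image of z -> z^2 + c.

    import numpy as np

    def chunk_to_julia_c(chunk, sample_rate=44100):
        # Illustrative mapping only: the chunk's dominant frequency and
        # loudness choose the Julia-set parameter c.
        spectrum = np.abs(np.fft.rfft(chunk))
        freqs = np.fft.rfftfreq(len(chunk), d=1.0 / sample_rate)
        dominant = freqs[np.argmax(spectrum)]      # loudest frequency (Hz)
        loudness = np.sqrt(np.mean(chunk ** 2))    # RMS amplitude
        re = -0.8 + 0.6 * (dominant / (sample_rate / 2))
        im = 0.156 + 0.3 * min(loudness, 1.0)
        return complex(re, im)

    def julia_escape_times(c, size=256, max_iter=64):
        # Escape-time render of the Julia set for z -> z^2 + c.
        xs = np.linspace(-1.5, 1.5, size)
        z = xs[None, :] + 1j * xs[:, None]
        counts = np.zeros(z.shape, dtype=int)
        alive = np.ones(z.shape, dtype=bool)
        for i in range(max_iter):
            z[alive] = z[alive] ** 2 + c
            escaped = alive & (np.abs(z) > 2.0)
            counts[escaped] = i
            alive &= ~escaped
        return counts

Rendering one frame per chunk gives the animation, and interpolating c between consecutive chunks smooths the transitions.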

Hope you find the project cool :)

  • compressedgas 9 hours ago

    I hope they teach you how to describe things better.

    What you have here is a music visualizer that uses interpolated Julia set images for its rendering.

    • Tetraslam 9 hours ago

      I know what you mean, and I'll work on it :)

      SHFLA is Turing-complete though, so as I keep improving it I'm hoping people see it has more potential than just a music visualizer! I'm currently rewriting it in Nim using SDL2, and once it's performant I'm going to implement an information-as-music encoder and all that.
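
      Just to illustrate what I mean by information-as-music (a hypothetical toy scheme, not the encoder I'm actually building): you could map each 4-bit nibble to one of 16 pitches and emit one 0.1-second note per nibble, so any byte string becomes a playable melody that a decoder can reverse.

          import numpy as np

          NOTE_HZ = [261.6 * 2 ** (n / 12) for n in range(16)]  # 16 chromatic pitches from middle C

          def bytes_to_tones(data, chunk_s=0.1, rate=44100):
              # Hypothetical toy encoder: one 0.1 s sine tone per 4-bit nibble.
              t = np.arange(int(chunk_s * rate)) / rate
              notes = []
              for byte in data:
                  for nibble in (byte >> 4, byte & 0x0F):
                      notes.append(np.sin(2 * np.pi * NOTE_HZ[nibble] * t))
              return np.concatenate(notes)

          signal = bytes_to_tones(b"hi")  # 4 nibbles -> 0.4 seconds of audio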

indiraschka 8 hours ago

this is pretty cool! consider including a demo video and images at the beginning when explaining the project