I’ve been trying to teach myself a little Python, and here’s what I came up with for my first small project. Using their respective APIs, I’ve built a plugin for the Prismatik ambilight software that maps live data from the iRacing simulator. A video demonstrates the end result far better than I can explain it: The plugin itself is open source and hosted on GitHub. Click here to download the latest version.
Enough with the in-depth boring stuff. Let’s do something fun with Adalight! Aside from the relatively straightforward color data, each Adalight frame is preceded by a small six-byte header. Since this header data is mixed in with a lot of RGB color data, I got to thinking… if this data were pushed to the LEDs, what would it look like?
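For reference, the standard Adalight header is the ASCII magic word “Ada”, the LED count minus one as a big-endian 16-bit value, and a one-byte checksum (the two count bytes XORed with 0x55). A minimal Python sketch of building that header:

```python
def adalight_header(led_count):
    """Build the 6-byte Adalight frame header for a strip of led_count LEDs."""
    count = led_count - 1          # protocol sends (LED count - 1)
    hi = count >> 8                # high byte of the count
    lo = count & 0xFF              # low byte of the count
    checksum = hi ^ lo ^ 0x55      # checksum over the two count bytes
    return b"Ada" + bytes([hi, lo, checksum])
```

The RGB payload (three bytes per LED) follows immediately after this header in each frame.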
Now that I’ve calculated the theoretical framerate limits, it’s time to measure the actual framerates my Adalight device is putting out. Using a logic analyzer and an Arduino Nano, I’m going to measure the framerate at varying Prismatik “Grab Intervals” and baud rates, and compare those numbers to what my calculations predict will happen.
Since I’m experimenting with increasing Adalight framerate, the first step was to try driving the Arduino Nano with a faster serial baud rate. Unfortunately, Prismatik only supports three baud rates: 9600, 57600, and 115200. But after talking with Patrick Siegler, he pointed out a way to use your own custom baud rate for Adalight or Ardulight devices.
I recently posted a few ideas about how to improve the framerate of an ambilight driven using the Adalight protocol. Before trying to implement some of those options, I thought it would be worthwhile to actually calculate the theoretical framerate limitations.
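To sketch the calculation: with 8N1 serial framing, each byte costs 10 bits on the wire, and an Adalight frame is a 6-byte header plus 3 bytes per LED, so the theoretical ceiling is baud / (10 × (6 + 3 × LEDs)). A quick Python sketch (the 80-LED strip is just an example count):

```python
def max_fps(baud, led_count):
    """Theoretical max Adalight framerate: 6-byte header + 3 bytes/LED,
    10 bits per byte on the wire (8 data bits + start + stop)."""
    frame_bytes = 6 + 3 * led_count
    return baud / (10 * frame_bytes)

# e.g. 80 LEDs at 115200 baud: 246 bytes/frame, roughly 47 fps ceiling
print(round(max_fps(115200, 80), 2))
```

Real-world numbers will land below this ceiling, since it ignores grab time, processing, and any gaps between frames.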
As cool as I think ambilights are, using Adalight with my DIY setup has one major limitation: framerate. Video technology works on a principle called persistence of vision, which means that our brains still “see” an image briefly after it’s taken away. If you replace the images quickly enough, our brains interpolate the differences between them and we get an illusion of motion.
We’re done! The ambilight is in place behind my monitor and has been running great. To finish up, I wanted to reflect a bit on what I learned and talk about where to go from here.
Project complete! The LEDs are in place, the code is done, the PCB is built, and everything is installed and running. So what is there left to do? Shoot some videos of everything in action! In all of these videos, the ambilight is generating colors in real time based on the monitor’s image. The monitor image is as-filmed and is not superimposed.