Peggy 2.0 + UART + Quartz Composer = Video Peggy!
June 29, 2008 (last updated October 24, 2008)
A sneak preview
More videos after the long-winded explanation of how it works
how it all started
When I saw the 2.0 version of the Peggy kit designed by Windell over at Evil Mad Science, I knew it was only a matter of time before I succumbed to the urge to buy one. LEDs are cool on anything. More LEDs are cooler. But 625(!) addressable LEDs... that's geek nirvana.
If you're not already familiar with "Peggy 2.0", go over to Evil Mad Science and check it out. Before I explain what I did with it, let me first just say, this is a very well thought out kit. The board is intelligently laid out, and doesn't feel "cramped" when you're working on it. The instructions that come with it are clear and concise. Mine even came with printed instructions! Nice touch, Windell.
It is physically large for a PCB kit. Did I mention it was big? It just barely fit in my large Panavise PCB holder, with the rails spread as far apart as they would go. And it only fit one way. The most difficult thing about soldering the kit was its physical size; it's quite unwieldy compared to the euro-card sized stuff I normally play with. But otherwise, it's a fairly straightforward kit to build, if you already have some soldering experience.
While I was debating the purchase, the most nagging thought I had was: "What am I going to do with this thing?" I started thinking about what could be done with a display this size, and then I hit upon the idea of using it as an outboard display for a PC. That was quickly followed by the idea of pumping a low-res video stream to the board.
It seemed to me that you could stream 25x25 pixel video to the AVR microcontroller via Serial, I²C, or SPI, since the AVR has hardware support for all three. Although the ATmega168 is a fairly low-end device, its 1 KB of RAM should be enough to buffer a frame or two of packed 625-pixel data. And if you can get the data to it quickly enough, that's all you need....
"Getting the data to it quickly enough"
Some back-of-the-napkin calculations showed that I could get slightly better than 30 frames per second if the data is transmitted efficiently at 115k baud. For 16 brightness levels, 4 bits per pixel are required. Since there is an odd number of pixels per row, the last byte of each row contains just one pixel; its last 4 bits are discarded as a convenience. This gives 25 rows * 13 bytes per row = 325 bytes per frame. At 30 frames per second, this means a minimum of 9750 bytes per second need to be sent to the AVR, which can be done at 115k baud with some room to spare.
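The napkin math can be sanity-checked in a few lines of Python. (I'm assuming standard 115200 baud and an 8N1 serial frame, i.e. ten bit times per byte; the original post just says "115k".)

```python
# Back-of-the-napkin check: can 115200 baud carry 30 fps of 25x25, 4-bit video?

PIXELS_PER_ROW = 25
ROWS = 25
BITS_PER_PIXEL = 4

# Two pixels per byte; the lone 25th pixel forces a 13th byte per row.
bytes_per_row = (PIXELS_PER_ROW * BITS_PER_PIXEL + 7) // 8   # 13
bytes_per_frame = ROWS * bytes_per_row                        # 325

# 8N1 serial spends 10 bit times (start + 8 data + stop) per byte.
bytes_per_second_link = 115200 // 10                          # 11520
needed_at_30fps = bytes_per_frame * 30                        # 9750

max_fps = bytes_per_second_link / bytes_per_frame
print(bytes_per_frame, needed_at_30fps, round(max_fps, 1))    # 325 9750 35.4
```

So the link tops out around 35 frames per second, leaving a little headroom above the 30 fps target.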
Given this information, serial transmission seemed like the simplest choice. The AVR has a hardware buffered UART, which is capable of speeds quite a bit higher than 115k. The only problem is, the RX/TX pins for this port are already used on the Peggy board to drive some of the chips that turn the LEDs on and off. Doh!
I played around with the idea of using the I²C (AKA Two-Wire Interface) port on the AVR, but this would require a second off-board chip to convert from serial to I²C, and probably a lot more code. Luckily, the Peggy 2.0 board is designed to be hacked: it has PCB pads brought out for all of the pins on the AVR.
The hardware mod
Standard disclaimers apply: Don't try this at home. You could blow up your Peggy 2.0 board, or shut down all of Boston if you do something wrong.
In theory, the change is pretty simple: We want to free up the PD0 and PD1 pins (RxD/TxD), and use some other pins to do the job of driving the 74HC154 that PD0 and PD1 used to do. I chose PC4 and PC5, since one of those is unused by default on the Peggy board, and the other is used to read a button (b5) which I felt like I could live without.
One of my goals was to not make any changes to the Peggy board that weren't easily reversible, so after some thought I came up with the idea to pull the AVR out of its socket and replace it with a daughter card. The daughter card has a socket for the AVR and maps all of the pins as before, except for the ones I need to exchange. The card plugs into female headers that I soldered into the breakout pads on the Peggy board, alongside the original AVR socket. I started to do this with a single-sided PCB, when I realized I would need to solder headers from the top and the socket from the bottom (a head-slapping moment). I dug around in my parts bin and found a protoboard called the SchmartBoard. Not only does this protoboard have thru-plated holes that can be soldered on either side, but it also has handy traces that run horizontally across the board on .05" centers. It makes soldering a little tricky (the pads are very small; it requires a fine-tipped iron), but it made rerouting the I/O quite simple.
Software Theory of Operation (AVR)
For the purposes of this explanation, the terms "LED" and "pixel" will be used interchangeably. One LED on the Peggy board should be thought of as one pixel in a 25 x 25 pixel display.
The Peggy 2.0 allows for individual addressing of LEDs on each row, and by cycling through each row, all of the LEDs can be turned on or off. By using some high-speed strobing, it can be made to display 16 gray levels (actually brightness levels) for each LED. This requires fairly accurate timing to achieve good results.
So, again, the idea is to send data to the Peggy via a serial cable (or serial bluetooth), at a fast enough rate to support animation (or really really low resolution video). As described above, 115k baud serial is just fast enough.
The AVR code is broken into two parts: a serial receiver and a display refresh routine. Both sets of code share a "frame buffer" that is 325 bytes long.
For the serial receiver, the main method contains a loop that simply checks to see if there is a new byte of data from the serial UART, and if there is, it stores this in the frame buffer, and advances the pointer to the frame buffer to the next byte. Very simple.
The code looks for a "magic string" (0xdeadbeef) to mark the beginning of a new frame of data. This gives the AVR something to re-sync on should communication get out of step. The AVR watches for those four bytes, and when it sees them, it proceeds to copy the next 325 bytes into the frame buffer. If erroneous data is received at the beginning of a frame, it is ignored until the start-of-frame marker is received.
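Here's a host-side Python sketch of that framing logic (the helper names are mine, not from the AVR source): hunt for the 0xdeadbeef marker, then treat the next 325 bytes as a frame.

```python
# Sketch of the start-of-frame sync: skip garbage until the magic marker,
# then copy exactly one frame's worth of bytes.

MAGIC = bytes([0xDE, 0xAD, 0xBE, 0xEF])
FRAME_BYTES = 325

def frame_packet(frame: bytes) -> bytes:
    """Prefix a raw 325-byte frame with the start-of-frame marker."""
    assert len(frame) == FRAME_BYTES
    return MAGIC + frame

def extract_frames(stream: bytes):
    """Yield complete frames, ignoring noise until a marker is seen."""
    i = 0
    while True:
        j = stream.find(MAGIC, i)
        if j < 0 or len(stream) - (j + len(MAGIC)) < FRAME_BYTES:
            return
        start = j + len(MAGIC)
        yield stream[start:start + FRAME_BYTES]
        i = start + FRAME_BYTES

# Example: some line noise, then two valid packets back to back.
f1, f2 = bytes(325), bytes([1]) * 325
data = b"\x00garbage" + frame_packet(f1) + frame_packet(f2)
frames = list(extract_frames(data))
print(len(frames))  # 2
```

On the AVR this runs incrementally, one received byte at a time, but the re-sync behavior is the same: anything before the marker is discarded.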
The display refresh routine is handled by a high speed timer interrupt that reads data out of the frame buffer and updates a row of pixels on the display. By using a timer interrupt for display refresh, very accurate timing of when the pixels are turned on or off can be achieved.
The timer interrupt is called roughly 25000 times per second, and during each call one row is updated. Each row is actually refreshed 16 times per frame to give the 16 brightness levels. Therefore, at a rate of 25000 interrupts per second, you get 25000 / (25 rows * 16 passes) ≈ 62 full refreshes per second. The interrupt needs to complete in less time than it takes to receive one byte, or else the serial UART buffer will be overrun. There also needs to be enough CPU time left over between interrupts for the serial loop to check the UART buffer for a new byte and write it to the frame buffer. As it turns out, at 16 MHz, an AVR can handle this with some CPU cycles to spare.
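A rough timing budget makes the "cycles to spare" claim concrete (assuming a 16 MHz clock, a 25 kHz timer interrupt, and 115200-baud 8N1 serial; these are my numbers for illustration, not measurements from the actual firmware):

```python
# How many CPU cycles does each interrupt get, and how fast do bytes arrive?

F_CPU = 16_000_000      # AVR clock, Hz
IRQ_RATE = 25_000       # timer interrupts per second

cycles_per_irq = F_CPU // IRQ_RATE        # 640 cycles between interrupts
irq_period_us = 1e6 / IRQ_RATE            # 40 us per interrupt
byte_time_us = 10 / 115200 * 1e6          # ~86.8 us per 8N1 serial byte

print(cycles_per_irq, irq_period_us, round(byte_time_us, 1))
# Roughly two interrupts fire while each serial byte arrives, so the ISR
# plus the main-loop UART poll must together fit well inside 640 cycles.
```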
Note that the AVR doesn't care how many frames per second are transmitted, as long as each frame is transmitted at 115k baud. This code is therefore very general-purpose: it can be used to update a static display at a much slower rate. If the serial transmission stops, it will simply display whatever was copied into the buffer last.
At slower transmission speeds, it would probably be necessary to double (or triple) buffer in order to avoid a "shearing" effect: when the display routine starts drawing one frame, it will eventually catch up to and pass the last received byte of serial data and start to display part of the previous frame. My prototype doesn't bother with this; at 115k baud, the effect is mostly unnoticeable to the naked eye. (This is not the same as the strobing horizontal bar effect that you might notice in the video. That is caused by a lack of synchronization with the camera I was using, similar to what you'd get trying to videotape a TV set.)
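If you did want to double buffer, the idea is simple enough to sketch in a few lines (this is my illustration, not code from the project): serial bytes land in a "back" buffer while the refresh routine reads the "front" one, and the two swap only when a complete frame has arrived.

```python
# Minimal double-buffering sketch: the display never sees a half-new frame.

FRAME_BYTES = 325

class DoubleBuffer:
    def __init__(self):
        self.front = bytearray(FRAME_BYTES)   # read by the display refresh
        self.back = bytearray(FRAME_BYTES)    # written by serial RX
        self.write_pos = 0

    def receive_byte(self, b: int):
        self.back[self.write_pos] = b
        self.write_pos += 1
        if self.write_pos == FRAME_BYTES:     # frame complete: swap buffers
            self.front, self.back = self.back, self.front
            self.write_pos = 0

buf = DoubleBuffer()
for b in [7] * FRAME_BYTES:                   # receive one full frame
    buf.receive_byte(b)
print(buf.front[0], buf.write_pos)            # 7 0
```

On the AVR this costs a second 325-byte buffer, which still fits comfortably in the ATmega168's 1 KB of RAM.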
Software Theory of Operation (transmitting computer)
Since I've been using Macs for the past year or so, my options were fairly wide open as far as how to transmit data to the AVR. For testing purposes, I created a quick Python app that used pySerial to send test pattern data to the AVR, just to verify that everything was functioning properly.
I decided that the most flexible thing I could do for transmitting "video", and the like, was to create a Quartz Composer plugin. If you are not familiar with Quartz Composer, it's a nifty application/framework for OS X that allows for graphics programming using a graphical nodal interface. QC "Patches" are wired together to make complex scenes. It's a very powerful tool, well integrated into the OS, but sort of hard to describe. You can get a better description from here or here. Windell has a great tutorial for it. But until you play around with it, it's hard to appreciate what can be done with it.
The QC plugin I wrote does little more than take an image as input and send a 25 x 25 pixel region to the serial port using the format described above. Writing this plugin turned out to be easier than I thought it would be: Apple had sample code for both serial communications and QC plugins, and it was mainly an exercise in mashing the two samples together.
The only mildly difficult part, as an Objective-C neophyte, was figuring out how to access the image data at a pixel level. Quartz Composer uses opaque image types that can be cached in the video memory, which means that it must be transformed into another form in order to "read the individual pixels". Rendering the image into a CGContext allowed me to iterate through each pixel, converting it to 16 levels using something resembling the following pseudo code:
byte b = (((redEven + grnEven + blueEven) / 3) & 0xf0) | (((redOdd + grnOdd + blueOdd) / 3) >> 4);
The red/grn/blue Even and Odd values represent pixels in either even or odd horizontal positions. A color average is a very poor way to convert to monochrome; it's a good idea to put a "Color Monochrome" patch in the QC pipeline for color sources.
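Applied across a whole frame, that nibble packing looks like the sketch below (a Python illustration of the wire format; I'm assuming row-major order, with each row's lone 25th pixel padded into the high nibble of its 13th byte, per the format described earlier):

```python
# Pack a 25x25 frame of 8-bit gray values into the 325-byte wire format:
# two 4-bit pixels per byte, even pixel in the high nibble.

WIDTH = HEIGHT = 25

def pack_frame(gray):
    """gray: HEIGHT rows of WIDTH 0-255 values -> 325 packed bytes."""
    out = bytearray()
    for row in gray:
        for x in range(0, WIDTH - 1, 2):
            out.append((row[x] & 0xF0) | (row[x + 1] >> 4))
        out.append(row[WIDTH - 1] & 0xF0)   # odd pixel out; low nibble unused
    return bytes(out)

# A simple horizontal ramp as a test pattern.
frame = [[(x * 10) % 256 for x in range(WIDTH)] for _ in range(HEIGHT)]
packed = pack_frame(frame)
print(len(packed))  # 325
```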
Once I wrote the QC plugin, composing "scenes" in QC was rather easy. The screen grab below shows a composition that will play a movie file to both the Peggy and the Quartz Composer viewer.
In the snippet above, you'll notice an Image Crop and an Image Resize patch. The Serial-to-Peggy patch only examines the top-left corner (25 x 25 pixels) of an image, so these are used to crop and resize the input down to the appropriate size. By stringing these together, it's easy to send a movie, animated moving text, or even webcam input to the Peggy board. The demonstration videos show the results.
The posterization that happens in the plugin could be slightly improved, and the brightness of the LEDs better calibrated to the input. Since there are only 625 pixels in the final image, a histogram could be calculated to use for contrast stretching, which would help for some image types (mainly video).
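A simple version of that idea, sketched in Python (my illustration of contrast stretching, not something the plugin currently does): stretch the observed min..max gray range to the full 0..255 span before quantizing to 16 levels, so dim source material uses all the brightness steps.

```python
# Linear contrast stretch over the 625 pixel values of one frame.

def stretch(pixels):
    lo, hi = min(pixels), max(pixels)
    if hi == lo:                       # flat image: nothing to stretch
        return list(pixels)
    return [(p - lo) * 255 // (hi - lo) for p in pixels]

dim = [40, 50, 60, 70, 80]             # a low-contrast patch of gray values
print(stretch(dim))  # [0, 63, 127, 191, 255]
```

A full histogram would let you clip outliers before stretching, which is more robust for noisy video than the simple min/max above.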
The code that runs on the AVR is generic, so any number of desktop clients can be written to send data to the display. The data doesn't need to be video; it can be a static or semi-static display.
The Quartz Composer plugin is also fairly generic, in that most of what Quartz Composer can render can be sent to the Peggy.
I've already done proofs of concept for (large) scrolling text, movie, and webcam output. Other possibilities are RSS or Twitter feeds, stock quotes, or other information displays.
I plan on eventually doing something in Python or Java so non-Mac users can play with this. I should at least be able to do scrolling text without too much effort.
Update: Bluetooth serial works! I wound up using a BlueSMiRF Silver module from SparkFun. It does 115k with minimal "pausnia" at close range.
We stand on the shoulders of giants
Much credit goes to Windell at EvilMadScientist.com and EvilMadScience.com. Not only did he make a really nice and hackable kit, but he was quite helpful with a few issues that I encountered along the way.
Windell also pointed me to this site: http://www.solivant.com/peggy2/, one of possibly many people who have demonstrated using a timer interrupt to refresh the Peggy display. Although it was my intent from the start to use a timer, it was nice to see that someone else had already proven it workable.
There are, of course, plenty of things that are arguably cooler for the übergeeks out there than what I did.
This code should be considered experimental! Use at your own risk! Wear eye protection! Don't cross the streams! (It would be bad).
AVR source code. Includes a sample Python script that sends an animated test pattern.
This code may damage Peggy boards that have not been modified as described above!