Hi everyone,
I've gotten images from the 8230 successfully, so I thought I'd document the approach I took. My goal was to use as few components as possible and do things in software where I could. I used an XMega (ATXMEGA16D4) as the main chip and a 4Mbit SRAM (AS6C4008) for the framebuffer. 19 of the 34 GPIOs on the xmega are dedicated to the SRAM address lines. The camera, xmega, and SRAM are all connected to an 8-bit parallel data bus.
Because I wanted to do things in software where possible (so no bus buffers or glue logic), the main issue initially was avoiding bus contention, since the camera has the unusual property (at least to me) of having its bus pins low impedance (LoZ) on start-up. This meant the first thing to do after pin setup was to use I2C to tell the camera to go HiZ. After setup, the xmega waits for a shoot event. More on that in a sec…
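Roughly, that first step looks like this. The camera address and register names here are placeholders (substitute the real ones from the datasheet), but the shape of it is just a write with Fleury's software master:

#include "i2cmaster.h"                 /* Peter Fleury's software I2C master */

/* Placeholders only -- use the real write address and output-control
 * register/value from the 8230 datasheet. */
#define CAM_ADDR_W           0x6E
#define CAM_REG_OUTPUT_CTRL  0x00
#define CAM_OUTPUT_HIZ       0x01

/* First thing after pin setup: tell the camera to tri-state its data pins
 * so it stops fighting everything else on the shared bus. */
static void camera_bus_release(void)
{
    i2c_init();
    i2c_start_wait(CAM_ADDR_W);        /* blocks until the camera ACKs */
    i2c_write(CAM_REG_OUTPUT_CTRL);
    i2c_write(CAM_OUTPUT_HIZ);
    i2c_stop();
}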
The xmega acts as the TWI bus master when talking to the camera, but to be controlled by other devices it needs to act as a slave. I also wanted to be able to hook up multiple cameras to one TWI bus, but all the camera chips use the same address, so this was a way around that too. So the xmega is configured as a TWI slave while idle (which is most of the time); it's only a master when talking to the camera.
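The idle-time slave setup is just the normal xmega TWI slave configuration, something like this (I'm using TWIC and an arbitrary board address of 0x30 as examples; each camera board would get its own address):

#include <avr/io.h>
#include <avr/interrupt.h>

#define BOARD_TWI_ADDR  0x30   /* example only; each board gets a unique address */

static void twi_slave_init(void)
{
    TWIC.SLAVE.ADDR  = BOARD_TWI_ADDR << 1;
    TWIC.SLAVE.CTRLA = TWI_SLAVE_INTLVL_LO_gc     /* low-priority interrupt      */
                     | TWI_SLAVE_DIEN_bm          /* data interrupt              */
                     | TWI_SLAVE_APIEN_bm         /* address/stop interrupt      */
                     | TWI_SLAVE_ENABLE_bm;
    PMIC.CTRL |= PMIC_LOLVLEN_bm;                 /* enable low-level interrupts */
}

ISR(TWIC_TWIS_vect)
{
    /* decode the host's command bytes here (e.g. set a shoot-requested flag)
       and work through the TWI slave state machine / flag clearing */
}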
So now back to the shoot event. A shoot event occurs when either the shoot switch is pressed or a TWI message is received from the host microcontroller (an Arduino, Raspberry Pi, BeagleBone, whatever). Once it gets the message, the xmega switches to being an I2C master and starts communicating with the camera. I'll try to use "I2C" when talking about communication between the camera and the xmega, and "TWI" for communication between the xmega and the host (also, the I2C side uses Peter Fleury's software implementation, and the TWI side uses the AVR hardware).
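The top level then ends up being a pretty boring loop. setup_pins() and capture_frame() below are just stand-ins for the real routines:

#include <stdbool.h>
#include <avr/interrupt.h>

static volatile bool shoot_requested;   /* set by the TWI slave ISR or the switch */

int main(void)
{
    setup_pins();            /* stand-in: GPIO directions, pull-ups, etc.           */
    camera_bus_release();    /* the HiZ command from above                          */
    twi_slave_init();        /* sit on the host bus as a slave while idle           */
    sei();

    for (;;) {
        if (shoot_requested) {
            shoot_requested = false;
            capture_frame(); /* stand-in: software-I2C master setup + timed capture */
        }
    }
}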
When a shoot event occurs, the xmega does some setup and then puts itself to sleep waiting for an interrupt. The typical way to handle the three signals coming from the camera (VD, HD, DCLK) on fast hardware is to treat each DCLK edge as a separate interrupt. Low-speed micros like the AVRs obviously can't do this. I even had trouble doing what Justin Shumaker described, which is to handle HD only and time the rest.
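The sleep-until-VD part is ordinary xmega port-interrupt stuff. Here's the flavour of it, with VD assumed on PE0 purely for illustration (not the actual wiring):

#include <avr/io.h>
#include <avr/interrupt.h>
#include <avr/sleep.h>

/* VD assumed on PE0 for this sketch only. */
static void arm_vd_and_sleep(void)
{
    PORTE.PIN0CTRL = PORT_ISC_RISING_gc;      /* VD rising edge marks a new frame */
    PORTE.INT0MASK = PIN0_bm;
    PORTE.INTCTRL  = PORT_INT0LVL_HI_gc;      /* highest priority                 */
    PMIC.CTRL     |= PMIC_HILVLEN_bm;

    set_sleep_mode(SLEEP_MODE_IDLE);          /* idle: peripheral clocks keep running */
    sleep_mode();                             /* wake on VD                       */
}

ISR(PORTE_INT0_vect)
{
    /* in the real code the cycle-counted capture routine runs from here,
       one interrupt for the whole frame */
}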
Instead, I ended up getting essentially one interrupt for the whole image, triggered by VD, and very carefully cycle-timing the entire assembly routine so that the inner loop is basically just incrementing an address. It turns out you can do this in 8 clock cycles per DCLK, which is great because it allows the camera to be driven at 16 MHz, well above the minimum in the datasheet. So the fire hose of data being spewed out of the 8230 is timed exactly against the software address increment, and the SRAM fills up until the camera is finished.
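To show what I mean by the inner loop being basically just an address increment, here's a stripped-down sketch of the idea. It's not the real routine: the pin mapping is hypothetical, the carry into the upper address bytes is ignored, and the real thing is a hand-counted assembly routine, but it shows the 8-cycles-per-DCLK shape (the xmega virtual ports are what make single-cycle OUT/SBI/CBI possible):

#include <avr/io.h>
#include <stdint.h>

/* Sketch only: A0..A7 on virtual port 0 (mapped to PORTA), /WE on bit 0 of
 * virtual port 1 (mapped to PORTB).  The real routine handles line structure
 * and the address carries between passes. */
static void capture_pixels(uint8_t count)
{
    PORTCFG.VPCTRLA = 0x10;   /* VP0 -> PORTA, VP1 -> PORTB (VPCTRLA nibbles) */

    uint8_t addr = 0;
    asm volatile(
        "1:                     \n\t"
        "sbi  %[we], 0          \n\t"  /* 1  /WE high: SRAM latches previous pixel */
        "inc  %[adr]            \n\t"  /* 1  next address (low byte only here)     */
        "out  %[alo], %[adr]    \n\t"  /* 1  drive A0..A7                          */
        "nop                    \n\t"  /* 1  padding to hit the budget             */
        "cbi  %[we], 0          \n\t"  /* 1  /WE low: start writing current pixel  */
        "dec  %[cnt]            \n\t"  /* 1                                        */
        "brne 1b                \n\t"  /* 2  taken -> 8 cycles per DCLK            */
        : [adr] "+r"(addr), [cnt] "+r"(count)
        : [alo] "I"(_SFR_IO_ADDR(VPORT0.OUT)),
          [we]  "I"(_SFR_IO_ADDR(VPORT1.OUT))
    );
}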
It's actually possible to do all of this with regular AVRs, assuming they have enough data pins. I had this going with an 8535 early on, but since that AVR was only running at 16 MHz the motion blur in the pictures was pretty bad (and I was driving the camera below its minimum clock speed anyway). It worked fine for static images, though.
One thing that drove me nuts: at 160x120 and 320x240 there's a weird blue-line bug at the bottom of the frame. I've seen it in other people's images too, from completely different approaches, so I figured it's probably just a quirk of the camera itself. It's a single scan line, and it doesn't happen at 640x480. Does anyone else have any info on that? You can actually set a register to clear it up (0x22 to 0x28, I think), but then it adversely affects the image in other ways.
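If you want to try that workaround yourself, it's just an ordinary register write over the camera's I2C; the address below is a placeholder and the 0x22/0x28 pair is from memory, so double-check it:

#include "i2cmaster.h"

#define CAM_ADDR_W  0x6E              /* placeholder write address */

static void camera_try_blue_line_fix(void)
{
    i2c_start_wait(CAM_ADDR_W);
    i2c_write(0x22);                  /* register -- from memory, verify first */
    i2c_write(0x28);                  /* value    -- from memory, verify first */
    i2c_stop();
}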
If you want to look at the design files, the source code, or the PC viewing software, all of that is available for free at hacromatic.com. Disclaimer: it's a commercial product, but if you just want the data files for everything, they're at the bottom of the page.
If anyone knows more about that blue line bug I’d love to hear from you!