Got an AT91SAM7S devboard today. That should be enough horsepower to get this thing working. Also, the devboard has full port breakout, unlike the silly STM32 board with 49 I/Os on the CPU and not a single complete 8-bit port available for development!
Hi,
I’ve ordered a couple of these little cameras to play with…currently in transit.
I’m hoping to make a little helmet cam, capturing stills and possibly movies in Motion JPEG format to an SD card.
Before getting all electrical and experimental with it I had a few thoughts about getting a picture out of the device onto a card.
Firstly, it looks like pulling the data into a micro and then out to the card will obviously be too slow. So I thought another approach might be to use the micro to set up the camera over the I2C interface: JPEG mode, brightness, resolution and other settings.
A shift register could be connected to the data output and the serial stream of the shift register could be fed into the SD card inputs. Some sort of clocking circuit that would run independently of the micro and clock X many bytes out of the device would be needed. This could run very fast.
The micro would also be connected to the SD card inputs.
To take a picture the micro would set up the JPEG/AVI header information on the SD card, then start the shift register/clocking circuit to cycle the image out of the device into the card, perhaps closing the file at the end of the process by taking back control of the SD card inputs.
Sound feasible? I haven’t read the datasheet completely yet…I’m hoping the data output is suitably compatible with the JPEG file format. Alternatively, the card could be read by a PC and the output data converted into an appropriately formatted JPEG file.
This approach would require a little bit of external logic, but if the camera data could be streamed directly onto the SD card this way, it’d save the processor dealing with the data at all, and hence be fast.
Phil
Good approach Phil. I’ve been thinking of the same myself.
However, the cam outputs continuously, so a buffer with tristate outputs would be needed. Also, the camera has no idea how to let the SD card ack each block, or do the rest of the handshaking needed for streaming to the card.
In JPEG mode the cam delivers around 8 MB/s.
What is needed is to get the cam to actually take a picture and not just send silly edge data or garbage. The only clear pictures I get from it are when I set it to test-pattern mode.
I can etch and send breakout boards to anyone who wants em.
I guess the more that are fiddling with the registers, the bigger the chance of getting it to work right.
Around $4 a card plus $4 shipping should cover postage almost anywhere in the world. The cards would just be etched and drilled, with printed docs for pinout, parts list and assembly instructions. I can throw in the regulators too, as they might not be stocked at all EE shops.
Other stuff is standard caps/resistors.
I would much rather someone produced the boards professionally though.
mmm
Looking a bit closer at the way you write to an SD card, you:
1. send a write command, with a byte address aligned to an SD card block boundary
2. wait for a response from the card to proceed
3. send a start token, then a block of data
4. finish with the CRC-16 word of the data
5. wait for an accept/reject response from the card
6. (the DO line of the card is held low while the card is busy)
So steps 1-3 should be easy enough for the micro plus simple logic; however, calculating the CRC-16 code for the data block might be difficult without pushing the data through the micro.
If the DO line can be used to throttle the camera output, or perhaps to stop the clock, the data could possibly be written without overloading the card.
So unfortunately simple shifting logic doesn’t go far enough. I wonder if there is a way to perform the CRC-16 calculation in logic. I’m guessing it is possible, but not practical with simple logic chips, and an FPGA/CPLD would be necessary. However, apparently the CRC requirement can be turned off with a special command.
Alternatively there might be special-purpose chips out there that take a stream of data, do the appropriate buffering and stream it to an SD card. I’ve looked at the Vinculum chip (to use USB thumb drives); however it has a serial interface, which would be too slow, as again the processor would have to suck in the data, process it and output it to the Vinculum chip.
Possibly the only alternative is to move to a high end microcontroller like the dsPIC range.
A conventional solution would be to squirt the JPEG data from a frame (via a simple clocking logic circuit) into a RAM chip; then the micro could interrogate the RAM buffer at its leisure and send it to the SD card. I presume this is how older cameras worked, given they usually had a delay after taking a picture. However, I’d prefer a direct streaming approach to keep open the option of taking video instead of single frames, particularly given the cheap abundance of fast SD cards.
Anyway, just my current thoughts on possible solutions.
Phil
Yeah rgbphil, it is fiddly.
What makes it worse is that the cam does not deliver data over I2C as I had hoped. If it did, the master would be responsible for the clock and there would be no issue.
Using buffer logic is not too good either, as it adds bulk to the design.
I for one want this to be very, very small, as I will be flying tiny RC planes with it on board.
A FIFO chip could help: it could be enabled with a latch by the MCU, then the cam clocks data in, combining HBLK with DCLK so that only actual data gets clocked in.
The process stops when the VBLK goes low again thus clearing the latch.
If the FIFO reports full, you then have an overflow.
The reason this isn’t practical is the amount of data… Even in JPEG mode you can expect 0.5 to 1 MB files from 1.3 Mpix.
I’ll let you know how it goes using a faster MCU. (60 MIPS vs a max of 20 should be good.)
We’ll get there eventually.
I contacted FTDI about the feasibility of a streaming input for their Vinculum chips…they don’t have any plans for one though.
However they did mention that the Vinculum chip has a parallel input (as well as RS232, I2C etc)…so maybe that will be suitable. The micro could send the setup/command codes then hand over to a streaming clocking circuit to dump the image file, then clean up afterwards.
Will have to look closely at the Vinculum datasheet…not the easiest of things.
I apologise in advance if I get laughed out of here, but I have never before used an 8-bit parallel interface. I have no idea how to even start connecting it to my micro (LPC2138). I can connect my micro to my uSD card no problem, and I feel confident that if I can find a camera I can use, I’m a few short steps away from my own home-made camera. Could any kind soul point me to a link that gives a breakdown of this interface, please?
The interface for the data stream on this cam is simply 8 parallel wires, each carrying one bit simultaneously. There are separate signals to tell you when the bits are valid.
Look at the datasheet for full details.
It will not be easy to interface this to an LPC2138. You need something faster, or with a DMA-enabled parallel interface. I’m working on doing it with a Blackfin; will keep you guys updated.
KreAture, would you be interested in sharing your Eagle files? If so, I’ll get a dozen professional boards made up with solder mask and mail them out for $4 each shipped.
I guess that shows how little I know about this kind of project; I’ve always found my LPC to have plenty of headroom. What uCs is everyone using?
The trouble is that the camera module acts as a master: it sends data out at a high rate, around 10 MB/s depending on clocking, and it controls the clock. So really, any microcontroller without a dedicated parallel interface will have trouble keeping up, because if your controller is doing say 40 MIPS you only have 4 instructions to receive each byte, save it somewhere, and keep the clock in sync. And few microcontrollers have enough memory to hold a whole frame at a time, even with JPEG.
The easiest solution to interface to a regular micro is to use a FIFO buffer IC big enough to hold a whole frame, so the micro can clock the frame out as slow as it likes.
The sad part is the price of such a FIFO…
I am not sure if the ARM7 DMA can help or not.
It doesn’t look like it can be clocked from an external pin directly; it might need an interrupt, which would make it too slow.
Maybe two AVR microcontrollers working in time-sharing and writing to two separate SD cards could help. That way we would have half of the time to set up writing a new portion to the SD card, and during that time the second micro would write the missing part of the image. The sync could be done by counting the clocks from the camera with a timer. Just an idea.
Another idea is to get a (big) 8-bit SDRAM and a counter. Connect the clock to the counter, which acts as the address, the data bus to… duh… the data bus, and another clock trace to the write strobe of the memory.
I don’t think the ARM7 DMA is useful for this.
I remember finding a FIFO that was suitable and in the $15 price range, I’ll try to find it again.
I’m presently using an AT91SAM7S64 to access the TCM8230MD, which poses the same problem as the one you are facing with the TCM8240MD. Keep in mind that you have some breathing room between each horizontal line that comes in. I’m working at 128x96, so a 256-element buffer should suffice… I just finished writing some C/ASM hooks into my dev environment for the FIQ. At best I’m seeing about 100 ns latency to reach 0x1C in the ARM7 FIQ handler and perform a single instruction. I’m using the PWM hardware to drive the clock on the camera, and I’m about to see how slowly I can clock it. If I can get the clock to be stable at 3-4 MHz (well outside the specification) then I think the FIQ plus my ASM routine will work. This would give me about 250 ns between DCLKs on the camera… I made my protoboard on the LPKF-S62 protomill I have here.
Assen
At 20 MIPS you would only have 3 instructions per camera clock when clocking the cam at spec speed. This is too slow to even check sync.
Twingy
Great that someone else has gotten this far!
Are you getting proper data?
I am having trouble getting the cam to deliver RGB images correctly. Mostly I get either black, or half a pic with only green.
In 128x96 I use a row buffer of 256 bytes, as each pixel per row is 2 bytes in RGB mode.
As for clocking, enabling the PLL and fiddling a bit with the upper bits in that byte register can give something like 5.7 MHz if you underclock the system. The odd thing is, it looks like when you turn on the PLL, the cam outputs a clock on EXTCLK??? I will have to verify this by turning off my EXTCLK signal after enabling the PLL…
I have experimentally determined that on the TCM8320MD I must feed EXTCLK of the camera at least a 6 MHz signal (the datasheet says 11.9 MHz minimum) in order to send the TWI (I2C) commands to activate the camera and begin producing a DCLK signal. I also determined that I can then change the DUTY and PERIOD of the PWM from 6 MHz to something much lower and maintain the DCLK signal. WARNING: just because the DCLK signal is OK and the TWI (I2C) still works does not mean the data coming off the camera is OK. I do not know if the camera uses DRAM that requires a minimum clock speed, or if the minimum (11.9 MHz) was defined as the slowest speed you want the camera to run at to minimize blur. At the moment I am writing straight ASM at the 0x1C address of the ARM’s FIQ vector to read the PIOA ODSR for the upper 8 bits of the GPIO, shove them into a 256-entry pixel-component buffer (128x96 resolution… 2*128 = 256…) and increment a counter, all within 300-400 ns. Note to you other masochists: there are ARM7/ARM9 processors (Liminaire) with camera DMA circuitry on them if you want to take the easy way out *grin*. I am running my AT91SAM7S64 at a frequency of 86,016,000 Hz.
Nice. I wish I could get the proper docs for this AT91 so I can configure the internal PLL. Right now it runs at 18 MHz.
If only I can get the speed up and see if I can get the PIO or DMA controller to do my bidding