Has anyone used the CM bit in register 0x03 of the TCM8230MD?
I think it's supposed to give either a color or a black & white image, but there's something I'm not understanding.
If you set that bit to get a black & white image, will there be only 1 byte per pixel, instead of the 2 bytes used in YUV422 or RGB565?
Regards
The output data format does not change. In B/W mode there are still 2 bytes per pixel.
In YUV the U and V values stay constant at 128; only the Y value changes. It would be possible to just read every other byte in YUV mode to obtain a single-byte-per-pixel frame.
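The every-other-byte trick can be sketched like this. Note the function name and the assumption that Y is the first byte of each 2-byte pair are mine; the actual byte order depends on how the camera is configured, so verify it against your setup:

```c
#include <stddef.h>
#include <stdint.h>

/* Extract a grayscale frame from a YUV422 buffer by keeping only the
 * Y bytes. Assumes Y comes first in each 2-byte pair (Y U Y V ...);
 * check the TCM8230MD's configured byte order before relying on this. */
void yuv422_to_gray(const uint8_t *yuv, uint8_t *gray, size_t pixels)
{
    for (size_t i = 0; i < pixels; i++)
        gray[i] = yuv[2 * i];   /* skip the interleaved U/V byte */
}
```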
Looking into buying this camera, but I need JPEG compression together with a small resolution. I have searched through this thread and the forum but have not found any clear data on this. There was one post in this thread suggesting that it might not work ( viewtopic.php?f=5&t=10314&p=61065&hilit … ion#p61065 ), but no hard data.
WickedMinds: Lets plug it into the camera/color mapping board
Nice work WickedMinds! I was looking around to see what was going on with CMUCAM3 and it looks like nobody is selling it anymore. I haven’t looked at this thread for a while, but just noticed your posts. I saw your schematic for the frame buffer. Looks great. I’m also interested in color/object tracking, so I’m curious about your color mapping board. You mentioned it in a couple of posts. Is this another piece of hardware that you have not described in detail? I was thinking it might be some type of hardware-based color transform / lookup? I’m thinking of doing something similar, so thought I would try to learn as much as I can about your experience.
WickedMinds: Lets plug it into the camera/color mapping board
I’m also interested in color/object tracking, so I’m curious about your color mapping board. You mentioned it in a couple of posts. Is this another piece of hardware that you have not described in detail? I was thinking it might be some type of hardware-based color transform / lookup?
Yes, it is a separate board from the frame buffer. The color mapping board can route the YUV data through an EEPROM which acts as a hardware lookup. You’ll notice in the screenshots of my UI that there is a visual representation of the color map on the left. This lets me assign a color index (Segment) to any given YUV value just by painting on the grid, using the defined Segments as a palette. When I have the color map the way I like it, it is sent over USB serial to an Arduino, which then forwards it to a couple of I2C I/O expanders connected to the EEPROM to flash it.
When in raw mode, the color map board bypasses the EEPROM and raw YUV data is stored to the frame buffer allowing me to view and analyze raw frame dumps. This gives me the information I need to create/modify the color map.
When in color map mode, the upper 4 bits of Y and upper 6 bits of U and V are used to address the EEPROM (16 x 64 x 64 = 64k Color Map/EEPROM). The color mapped output of the EEPROM is written to the frame buffer. Since the raw YUV 4 byte pixels are mapped to single byte color indexes, the total memory size of a frame is 1/4th the size of a raw frame. This also means that the color data is output at 1/4th the pixel clock. This brings us closer to being able to read the frame data real time by an Arduino etc.
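The 4/6/6-bit addressing described above works out to exactly 16 address bits. A small sketch of how those bits could be packed into an EEPROM address (the actual bit ordering on WickedMinds' board is not stated, so the layout here is an assumption):

```c
#include <stdint.h>

/* Build a 16-bit EEPROM address from 8-bit Y, U, V samples:
 * upper 4 bits of Y, upper 6 bits of U and V -> 4 + 6 + 6 = 16 bits,
 * covering the full 64K (16 x 64 x 64) color map. The Y-high/U-mid/V-low
 * bit layout is an assumption for illustration. */
uint16_t colormap_addr(uint8_t y, uint8_t u, uint8_t v)
{
    uint16_t y4 = y >> 4;   /* upper 4 bits of Y */
    uint16_t u6 = u >> 2;   /* upper 6 bits of U */
    uint16_t v6 = v >> 2;   /* upper 6 bits of V */
    return (uint16_t)((y4 << 12) | (u6 << 6) | v6);
}
```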
Next thing I am attempting to do is add hardware RLE encoding of the color mapped data before it hits the frame buffer. This would result in the MCU only needing to read two bytes (color index and run length) each time the color changes. Well within the abilities of a 16MHz MCU etc.
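The (color index, run length) pair format described above can be sketched in software like this; the function name and byte layout (index first, then length) are my assumptions, and runs are capped at 255 so the length fits in one byte:

```c
#include <stddef.h>
#include <stdint.h>

/* Run-length encode a row of color indexes into (index, length) byte
 * pairs, the format a hardware RLE stage might emit. Runs are capped
 * at 255 so the length fits in one byte. Returns the number of bytes
 * written to out (out must hold up to 2*n bytes in the worst case). */
size_t rle_encode(const uint8_t *idx, size_t n, uint8_t *out)
{
    size_t w = 0;
    for (size_t i = 0; i < n; ) {
        uint8_t c = idx[i];
        size_t run = 1;
        while (i + run < n && idx[i + run] == c && run < 255)
            run++;
        out[w++] = c;                  /* color index */
        out[w++] = (uint8_t)run;       /* run length  */
        i += run;
    }
    return w;
}
```

On a mostly uniform scene this is why the MCU only has to read two bytes per color change rather than a byte per pixel.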
So far the only drawback is that I am colorblind and my wife has to help me with defining color maps :geek:
WickedMinds:
When in color map mode, the upper 4 bits of Y and upper 6 bits of U and V are used to address the EEPROM (16 x 64 x 64 = 64k Color Map/EEPROM). The color mapped output of the EEPROM is written to the frame buffer. Since the raw YUV 4 byte pixels are mapped to single byte color indexes, the total memory size of a frame is 1/4th the size of a raw frame. This also means that the color data is output at 1/4th the pixel clock. This brings us closer to being able to read the frame data real time by an Arduino etc.
Next thing I am attempting to do is add hardware RLE encoding of the color mapped data before it hits the frame buffer. This would result in the MCU only needing to read two bytes (color index and run length) each time the color changes. Well within the abilities of a 16MHz MCU etc.
Thanks for the explanation (using the upper 4/6/6 bits for EEPROM addressing). Sounds like you are going to have a really nice solution for color/object tracking with low-power embedded systems. I haven’t been able to find any similar products commercially available. The closest thing I found was the CMUCAM, but it looks like those have all been discontinued. Do you have any plans for producing a product after prototyping? RLE encoding in hardware sounds like a really great idea too. Please post again when you get that working! I would certainly be interested in anything else you are willing to share (especially schematics). Thanks!
tvelliott:
Do you have any plans for producing a product after prototyping?
I think there will be a few vision subsystem breakout boards/modules as a result of the R&D I am having to do to get this working for my main project.
I finally threw the cam and all its support circuitry, including the clock and 5V->2.8V/1.5V power supplies, onto a single board.
5V I2C level translation is done with a SparkFun BoB. Vertical Sync, Horizontal Sync and Pixel Clock are all brought up to 5V using a 74HCT08 with each gate’s inputs tied together so they act as buffers. The data lines are brought up to 5V using a 74HCT574 latch clocked by the pixel clock.
(For those who don’t know, HCT versions of chips have a much lower input threshold for a logic-level 1: 2 V in this case, versus 3.15 V for the HC versions.)
Next adventure is inserting the next rev color mapping board between the two.
p.s. Don’t you just hate it when you realize you forgot to route a wire after you’ve put the board together :think:
Your posts let me know it was possible to get an image out of this camera, I had been using a CAM130 and all I could get was the syncs and snow.
My setup is a Hitex LPC4300 evaluation board, with the LPC4300 LCD driver driving a VGA display and using the Serial GPIO peripheral of the 4300 to capture the camera data (being stored in an SDRAM). I know this is not a typical hobbyist setup, but SparkFun is not just for the DIY crowd.
The red arrow is the camera on a board with a 74LVC244 doing level shifting (3.3 -> 2.8) in both directions: for the CLK to the camera, and data and syncs to the 4300. Right is a standard VGA monitor driven from the 4300 with an inserted QVGA image (it could do higher resolution).
I am interfacing a TCM8240MD camera to a PIC24FJ128GB106, but got confused by the datasheet. I get a square wave on the VBLK & HBLK lines but no DCLK :? . My aim is to get the JPEG image data from the camera and display it. Can anyone tell me what steps I should follow to achieve this? Please help me as soon as possible …
thanks to all
Hi ak_embedded
If you look back through the past posts you’ll see a lot of init routines for that camera; there is one bit that enables the output on the data lines.
Looking at the datash*t, I think it is bit 7 of register 4.
If you look here there are some initialized registers with awesome results
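A minimal sketch of what setting that bit looks like. The register/bit (bit 7 of register 0x04) is only the guess above, and `tcm_write_reg` is a placeholder for your MCU’s I2C write routine (PIC24, Arduino, whatever), stubbed out here so the bit logic can be checked on a PC; verify everything against the datasheet and the init routines posted earlier in this thread:

```c
#include <stdint.h>

static uint8_t last_reg, last_val;   /* stub state for PC-side checking */

/* Placeholder: replace with a real I2C register write on your target. */
static void tcm_write_reg(uint8_t reg, uint8_t val)
{
    last_reg = reg;
    last_val = val;
}

/* Set bit 7 of register 0x04 (guessed data-output enable), preserving
 * the other bits of the current register value. */
static void tcm_enable_data_output(uint8_t current_reg4)
{
    tcm_write_reg(0x04, current_reg4 | 0x80);
}
```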
I’m not using this camera; I’m using its little friend, the 8230, with a dsPIC33, and I only have enough RAM to store one 128×96 grayscale image, so depending on your requirements that processor may or may not work for your application.