artem88:
Hmm… I suppose the duty cycle of your EXTCLK is not equal to 50%. It is 1/3 or 2/3, isn’t it? I’ll be happy if so, because I can’t produce 50% at the desired frequency.
That's correct, the duty cycle is not 50%. It even worked at 10% when I increased the period without caring about the duty cycle, so it is not critical.
artem88:
I noticed that the resulting DCLK and EXTCLK are equal in frequency in the higher-resolution modes; DCLK is no longer half of EXTCLK.
Let’s send these 23 pages of the topic to Toshiba as the best replacement for their datash*t.
This seems to be an everlasting problem with good and cheap things coming from Asia: they are offered undocumented.
KreAture, I don’t like YT for this purpose because YT will re-compress the already compressed file.
Dmitri:
I don’t like YT for this purpose because YT will re-compress the already compressed file.
Dmitri, I've uploaded H.264-encoded video to Vimeo, and then re-downloaded the file (you can allow users to do that) and compared them byte-for-byte. They were identical, despite Vimeo's post-processing of the video after upload.
rmann:
Dmitri, have you had any luck using the camera module’s JPEG compression? I’m considering building a board around an Atmel AT91SAM3U4C (ARM Cortex-M3), but I need to send slow video across a slow wireless link (sequential still video). I’d like to take advantage of the JPEG compression.
Also, can you talk more about how you interfaced to the PIC? I haven’t found where you talk about that, but this is a rather long thread at this point. I’d like to know what approach to take when designing my board.
Thank you!
Rick,
I have not tried the TCM8240MD yet, and I wonder if I ever will, after so many people have had no luck with it.
I will update you all on my design very soon. I have managed to achieve 24 frames per second with audio (it was 18 fps without audio before). Everything is streamed to an SD card, as before.
As far as JPEG compression is concerned, you can do a DCT for an 8x8-pixel area in 25 microseconds (0.025 ms) with a PIC32. A 320x240 picture will fit into the RAM of the newest PIC32.
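For a feel of that per-block cost, here is a minimal sketch of the transform in question, a naive floating-point 8x8 DCT-II in C. It is illustrative only: a real PIC32 port hitting ~25 µs would use a factored fixed-point algorithm such as AAN or Loeffler, and the function name is mine, not from Dmitri's code.

    #include <math.h>

    #define PI 3.14159265358979f

    /* Naive forward 8x8 DCT-II with orthonormal scaling, the per-block
       transform JPEG needs. Input samples are level-shifted by -128 as
       the JPEG standard prescribes. */
    void dct8x8(const unsigned char in[8][8], float out[8][8])
    {
        for (int u = 0; u < 8; u++) {
            for (int v = 0; v < 8; v++) {
                float cu = (u == 0) ? 0.35355339f : 0.5f; /* sqrt(1/8), sqrt(2/8) */
                float cv = (v == 0) ? 0.35355339f : 0.5f;
                float sum = 0.0f;
                for (int x = 0; x < 8; x++)
                    for (int y = 0; y < 8; y++)
                        sum += (in[x][y] - 128)
                             * cosf((2 * x + 1) * u * PI / 16.0f)
                             * cosf((2 * y + 1) * v * PI / 16.0f);
                out[u][v] = cu * cv * sum;
            }
        }
    }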
Hi, I’m using the TCM8230MD and have some problems with the colours in my images. I seem to get a lot of purple, like Kreature had earlier… just wondering if anyone knows the cause of this? So far I’ve only grabbed QQVGA (full) images due to RAM limits.
My colour-bar test looks perfect, except that the bars are vertical. From this forum it seems the 8240’s bars are horizontal; is that right?
enakf:
Hi, I’m using the TCM8230MD and have some problems with the colours in my images. I seem to get a lot of purple, like Kreature had earlier… just wondering if anyone knows the cause of this? So far I’ve only grabbed QQVGA (full) images due to RAM limits.
Do you grab YUV? If so, U and V might be swapped in your case.
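A swap like that is cheap to rule out in software before touching the register setup. A minimal sketch, assuming a Y-U-Y-V byte order on the bus (the actual ordering depends on how the D0–D7 bytes are latched):

    /* Swap U and V in place in one YUYV line; len is the line length
       in bytes and must be a multiple of 4. */
    void swap_uv(unsigned char *line, int len)
    {
        for (int i = 0; i < len; i += 4) {
            unsigned char t = line[i + 1]; /* old U */
            line[i + 1] = line[i + 3];     /* U takes V's value */
            line[i + 3] = t;               /* V takes the old U */
        }
    }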
Thanks for the reply, Dmitri. I left out some information: I only tested RGB, and I only use the data as 16-bit pixels in a BMP file. Do you think it could be related to how I use the pixel data? I might try one of the earlier suggestions to mirror the LSbs and create 24-bit data.
More information: I use 1.5 V and 2.8 V supplies from LDOs.
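In case it helps with the purple tint: a minimal sketch of the 16-bit-to-24-bit expansion with the LSb mirroring suggested earlier, assuming RGB565 with red in the top bits. (If the two bytes of each pixel are stored in the wrong order before this step, the colours will be wrong no matter what, so that is worth checking too.)

    /* Expand one RGB565 pixel to 8 bits per channel by replicating the
       top bits into the vacated LSbs, so full scale maps to 0xFF
       instead of 0xF8/0xFC. */
    void rgb565_to_rgb888(unsigned short px,
                          unsigned char *r, unsigned char *g, unsigned char *b)
    {
        unsigned char r5 = (px >> 11) & 0x1F;
        unsigned char g6 = (px >> 5)  & 0x3F;
        unsigned char b5 =  px        & 0x1F;

        *r = (unsigned char)((r5 << 3) | (r5 >> 2)); /* 5 -> 8 bits */
        *g = (unsigned char)((g6 << 2) | (g6 >> 4)); /* 6 -> 8 bits */
        *b = (unsigned char)((b5 << 3) | (b5 >> 2)); /* 5 -> 8 bits */
    }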
I am planning to explore using this camera for capturing still images… the resolution doesn’t matter to me, but the maximum processor speed I can get is 16 MHz. I have perused this thread and the datasheet.
I have some questions. If we want to capture just images, we shouldn’t worry about fps. My idea is to use some kind of memory such as an EEPROM, and maybe a counter that will indicate the address in that memory. Initialize the camera (I shouldn’t need to worry about EXTCLK; maybe clock it at 6 MHz), take the picture, disable the camera, and then copy over the entire pixel data (128x96x2 = 24,576 bytes if we are using the 128x96 resolution). As I visualise it, the pixel data should remain in the camera’s memory until the copy operation is finished. Moreover, I wouldn’t need 6-7 instructions for copying each byte, since I have a dedicated counter and memory (EEPROM) for it. I agree the counter clock will still be at 16 MHz, since that is the maximum clock frequency of my system, but since I just want to capture an image, it should be fine.
Do let me know if I am thinking along the right lines.
nachox2002:
We are developing a DVR with microSD storage and a future Ethernet connection for our UAV; we will release the schematics when finished, if anybody wants them.
I am having big trouble configuring the TCM8230 camera. My external clock input is 11 MHz at 2.8 V, IOVDD and PVDD are also 2.8 V, DVDD is 1.5 V, and the reset pin is pulled high.
When I try to write to the I2C registers, the part acknowledges when I send it 0x78. What's very strange is that the next byte should be able to address whatever config register you want, but for some reason I only receive an ACK when I send the part 0x1F as the second byte. The part will then ACK anything for the third byte, but again there is no ACK for a fourth byte.
Has anyone come across behaviour like this before? I have been working on this for a week, trying to see some clock pulses on DCLK. Another thing to note is that my camera draws about 7 mA from the 2.8 V supply.
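For comparison, the expected transaction is the usual three-byte register write: the device address byte 0x78 (7-bit address 0x3C plus the write bit), then the register sub-address, then the data. A minimal sketch; i2c_start(), i2c_write() and i2c_stop() are hypothetical wrappers around whatever your MCU's I2C peripheral provides, with i2c_write() assumed to return 1 on ACK:

    extern void i2c_start(void);
    extern int  i2c_write(unsigned char b); /* returns 1 on ACK, 0 on NACK */
    extern void i2c_stop(void);

    /* Write one TCM8230 configuration register over I2C. */
    int cam_write_reg(unsigned char reg, unsigned char val)
    {
        int ok = 1;
        i2c_start();
        ok &= i2c_write(0x78); /* device address + write bit */
        ok &= i2c_write(reg);  /* register sub-address       */
        ok &= i2c_write(val);  /* data byte                  */
        i2c_stop();
        return ok;             /* 0 if any byte was NACKed   */
    }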
I could provide some videos made with this little thingie above, if someone wants to see them.
As I explained a long time ago, I write the raw YUV stream directly to the SD card. The only modification is the reduction from YUV4:2:2 to YUV4:1:1. Thus, one line becomes 512 bytes long (the size of a block to write) instead of 640.
A simple PC program dumps the data to the HD as a series of BMPs, converting from YUV to RGB. These BMPs are put together by VirtualDub.
After accepting 30 frames seamlessly, the SD card takes a longer break, which results in about 10 lines being dropped. So every 30th frame has a white stripe; I can decide whether or not to drop this frame.
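For anyone who wants to reproduce the 640-to-512-byte line reduction: a minimal sketch of one way to do it, assuming a YUYV input order and simple chroma dropping. Dmitri has not posted his exact packing, so the output layout below (Y Y Y Y U V per four pixels, padded to a 512-byte SD block) is an assumption:

    /* Reduce one 640-byte YUV4:2:2 line (YUYV...) to YUV4:1:1 by keeping
       all 320 Y samples and only every second U/V pair, then pad the
       result to one 512-byte SD block. */
    void pack_yuv411_line(const unsigned char *in422, unsigned char *out512)
    {
        int o = 0;
        for (int i = 0; i < 640; i += 8) { /* 8 input bytes = 4 pixels */
            out512[o++] = in422[i + 0];    /* Y0 */
            out512[o++] = in422[i + 2];    /* Y1 */
            out512[o++] = in422[i + 4];    /* Y2 */
            out512[o++] = in422[i + 6];    /* Y3 */
            out512[o++] = in422[i + 1];    /* U0 (second pair's U dropped) */
            out512[o++] = in422[i + 3];    /* V0 (second pair's V dropped) */
        }                                  /* o == 480 here */
        while (o < 512)
            out512[o++] = 0;               /* pad to the block size */
    }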
I would like to encode the YUV output from this camera to MPEG-4. What hardware could do this in real time? An FPGA? Are there any free cores available that could do this?
Dmitri:
I could provide some videos made with this little thingie above, if someone wants to see them.
As I explained a long time ago, I write the raw YUV stream directly to the SD card. The only modification is the reduction from YUV4:2:2 to YUV4:1:1. Thus, one line becomes 512 bytes long (the size of a block to write) instead of 640.
A simple PC program dumps the data to the HD as a series of BMPs, converting from YUV to RGB. These BMPs are put together by VirtualDub.
After accepting 30 frames seamlessly, the SD card takes a longer break, which results in about 10 lines being dropped. So every 30th frame has a white stripe; I can decide whether or not to drop this frame.
I've made a very similar project. It's based on a PIC24HJ128GP204 running @ 40 MIPS (FNOSC_FRCPLL). I have implemented multi-block write + 16-bit SPI + DMA (one-shot, ping-pong mode disabled) in ChaN's FatFs library, and I'm using the PMP (master mode 2, no additional wait cycles, PMRD/PMWR port enabled), which reads the data coming out of an AL422B. It's recording raw YUV422. The TCM8230's EXTCLK is fed from the PIC's PWM output, providing 20 MHz.
At this point I have only been able to achieve ~16 fps @ 160x120 (currently with a class 10 SDHC 8 GB SD card). The main issue I found is a bottleneck while writing to the SD card: SPI is configured to work @ 10 MHz, which is theoretically the maximum clock speed allowed per the Microchip datasheet.
I'm using a 3-input NAND (max tpd of 3.8 ns at 3.3 V) whose inputs are connected to the cam's VD and HD; the remaining input is used to control frame capture. The NAND output is connected to the FIFO's write enable.
At this point my aim is to get rid of the FIFO, so I'm wondering: how did you manage to achieve frame rates higher than 16 fps, and at resolutions higher than QQVGA?