I have written a camera motion detector in Matlab to be used to enhance UAV stability without GPS support. It doesn’t have to refresh very often, but the more the merrier.
The algorithm consists of an edge detector, a corner detector and some correlation matchers; most of the processing is small-window convolutions.
In Matlab this takes around 0.4 s on my 2.5 GHz Core 2 Duo, or about 3.5 s on my 1 GHz Atom.
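For reference, here is a rough sketch of the kind of per-frame load involved (the kernels and sizes below are illustrative stand-ins, not the actual filters in my code):

    % Illustrative per-frame load: a few small-window convolutions
    % on a grayscale frame (kernel choices are arbitrary here).
    frame  = rand(480, 640, 'single');          % stand-in for a camera frame
    sobelX = single([-1 0 1; -2 0 2; -1 0 1]);  % edge-detection kernels
    sobelY = sobelX';
    box5   = ones(5, 'single') / 25;            % small smoothing window

    tic;
    gx  = conv2(frame, sobelX, 'same');         % horizontal gradients
    gy  = conv2(frame, sobelY, 'same');         % vertical gradients
    mag = sqrt(gx.^2 + gy.^2);                  % edge magnitude
    pre = conv2(mag, box5, 'same');             % pre-filter for corner/correlation steps
    toc;                                        % conv2 calls like these dominate the run time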
Does anyone have experience with how Matlab computing speed relates to the speed of an embedded ARM7 for these kinds of calculations?
So, basically, I’d like to know whether this will take 2 seconds or 2 minutes.
It depends on the ARM. Matlab has tons of built-in routines to help out, and even a 1 GHz Atom is quite powerful (and has tons of RAM).
It depends on your implementation of the algorithm, but I would say 2 minutes is optimistic.
You’re probably right. After some code-digging I find that the conv2 function (2D convolution), which is responsible for 90% of the execution time, eventually ends up in a call to a built-in function. This probably means natively compiled code, which is hard to compete with.
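For anyone wanting to check the same thing, this is roughly the kind of check involved (a sketch; detectCameraMotion is just a hypothetical placeholder for the top-level function):

    % Profile one pass of the detector to see where the time goes
    % (detectCameraMotion is a hypothetical placeholder for the top-level function).
    profile on
    motion = detectCameraMotion(frame);
    profile viewer          % conv2 accounts for roughly 90% of the time here

    % Check whether conv2 is a built-in (native) function rather than M-code.
    which conv2             % reports "built-in (...)" if it is native code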
I will have an embedded PC on the UAV, so maybe that one should be used for this. I really don’t like running parts of the autopilot on a high-level platform like Windows, but as it’s not a core component maybe it will have to do.
Maybe it would even be possible to run the convolutions on the graphics hardware, using DirectX and shaders.
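A quicker experiment than writing shaders might be to push conv2 onto the GPU from within Matlab, assuming the Parallel Computing Toolbox and a supported GPU are available (just a sketch, with a placeholder kernel):

    % Sketch: run a 2D convolution on the GPU via gpuArray
    % (assumes the Parallel Computing Toolbox and a supported GPU).
    frame  = gpuArray(rand(480, 640, 'single'));          % upload a stand-in frame
    kernel = gpuArray(single([-1 0 1; -2 0 2; -1 0 1]));  % placeholder kernel

    gx     = conv2(frame, kernel, 'same');                % runs on the GPU
    result = gather(gx);                                  % copy back to the CPU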
Or… you could send a video link to the ground and do the processing there. The hardware required on the UAV would be much simpler and lighter, and you could even use the digital link for something else.
Have you considered using a DSP for this kind of work?
Check out the Blackfin BF533; it could do the trick in less than 2 minutes. It really depends on the resolution of your camera and whether the image is color, grayscale (B&W), or binary black-and-white.
I would second the suggestion of the Blackfin. For real-time video stuff they are amazing.