Hello,
I am making a science exhibit that uses a depth camera to sense what is on a table top and run a simulation using that depth information, not dissimilar to the AR Sandbox.
Up until now, for the prototype we have been using an old MacBook Pro (2017 Intel model) and a Kinect v2, and the simulation runs in Processing 4. However, both these bits of hardware are old and this exhibit needs to last years, so we would like to upgrade the hardware. Both the Kinect and an Asus Xtion we tried do not work with a MacBook Pro M1.
We are looking to run the exhibit on a new Mac Mini with an M2 chip.
So, I need a depth camera that is going to work with a Mac M1/M2 chip. The camera should provide a depth image that we can pull into either Processing 4 or a webpage, and then use to influence the simulation.
We are interested in whether the eYs3D https://www.sparkfun.com/products/14725 and the OAK-D Lite and OAK-1 DepthAI hardware https://www.sparkfun.com/products/17769 will meet these requirements.
Many thanks for your technical advice,
David
The eYs3D might not be compatible with Mac (https://github.com/eYs3D), but maybe! The Luxonis cameras look like they would be good choices, as they have very good documentation: https://docs.luxonis.com/projects/sdk/e … uickstart/
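If it helps, reading a depth frame from an OAK camera in Python is only a few lines with the depthai library. This is just a minimal sketch based on Luxonis' standard stereo-depth examples (untested on Apple silicon; OpenCV is used purely to display the frame):

```python
import depthai as dai
import cv2

# Minimal pipeline: two mono cameras feeding a StereoDepth node,
# with the depth output streamed back to the host.
pipeline = dai.Pipeline()

mono_left = pipeline.create(dai.node.MonoCamera)
mono_right = pipeline.create(dai.node.MonoCamera)
stereo = pipeline.create(dai.node.StereoDepth)
xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("depth")

mono_left.setBoardSocket(dai.CameraBoardSocket.LEFT)
mono_right.setBoardSocket(dai.CameraBoardSocket.RIGHT)
mono_left.out.link(stereo.left)
mono_right.out.link(stereo.right)
stereo.depth.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue(name="depth", maxSize=4, blocking=False)
    while True:
        frame = q.get().getFrame()  # uint16 depth in millimetres
        # Normalise only for display; the raw frame is what would drive the simulation.
        vis = cv2.convertScaleAbs(frame, alpha=255.0 / max(int(frame.max()), 1))
        cv2.imshow("depth", vis)
        if cv2.waitKey(1) == ord("q"):
            break
```

From there the depth array could be passed on to Processing 4 or a web page however you prefer (file, socket, etc.).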
That is very helpful, thank you for the reply, TS-Russell.
I have posted my query on the Luxonis forum for confirmation.
The OAK-D Lite works very well with any Raspberry Pi, and I have it running nicely on the Pi Zero W, which fits right onto the back of the camera for an extremely compact, standalone setup. A single, shared 5V 3A power supply is the only other part you need. Depth information from the scene can be sent and received anywhere via the local WiFi network.
Luxonis even offers an installation image of the RPi Linux OS with all the camera software preinstalled, so setup troubles are nonexistent. It runs right out of the box.
Hi @jremington,
Thanks for this response. I’m not super technical on operating systems; are you suggesting that because it works on the Raspberry Pi it will work on a Mac Mini M2, or are you saying that I could run my setup using a Raspberry Pi with the OAK-D Lite and then pipe the depth image to the Mac Mini over WiFi?
Thanks, David
I don’t know anything about the Mac Mini M2; take a look at the Luxonis downloads page to see how it is supported.
It is trivial to send the depth image over the local network. Currently I’m using the setup with remote desktop on a Windows PC. The Pi Zero W has its power and USB ports in parallel, so a single power supply can run both the Pi and the camera. Screenshot over the local network (the depth frame update rate is a few per second).
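If you ever want to skip remote desktop, pushing the raw depth array over a plain TCP socket works too. A rough sketch is below; the Mac’s IP address and the get_depth_frame() helper are placeholders for whatever your Pi-side capture loop produces, so treat it as an outline rather than a drop-in script:

```python
import socket
import struct
import numpy as np

# --- On the Pi: push each depth frame to the Mac over TCP ----------------
def send_frames(host="192.168.1.50", port=5005):   # hypothetical Mac address
    sock = socket.create_connection((host, port))
    while True:
        frame = get_depth_frame()                   # placeholder: uint16 array from the OAK-D Lite
        payload = frame.astype(np.uint16).tobytes()
        # Prefix each frame with its shape and byte length so the receiver can rebuild it.
        header = struct.pack("!HHI", frame.shape[0], frame.shape[1], len(payload))
        sock.sendall(header + payload)

# --- On the Mac: receive and rebuild the frames ---------------------------
def recv_frames(port=5005):
    server = socket.socket()
    server.bind(("", port))
    server.listen(1)
    conn, _ = server.accept()
    while True:
        rows, cols, length = struct.unpack("!HHI", recv_exact(conn, 8))
        frame = np.frombuffer(recv_exact(conn, length), dtype=np.uint16).reshape(rows, cols)
        # ...feed `frame` into the simulation here...

def recv_exact(conn, n):
    """Read exactly n bytes from the connection."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("sender closed the connection")
        buf += chunk
    return buf
```

The same code works over WiFi or Ethernet; it only needs the two machines to be on the same local network.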
Thanks. Could I avoid WiFi and do the same over an Ethernet connection? I’m hoping that would be just as easy and would save me having to deal with the venue’s WiFi.
You would need an RPi with an Ethernet port for that.
I see the Raspberry Pi 4 has an Ethernet port.