
Luxonis DepthAI

Bringing the power of the Movidius Myriad X to Raspberry Pi
(and your design!)

The Myriad X is a vision processor capable of real-time object detection and stereo depth at over 30 FPS.

Let's unleash this power!

How?
We're making a Myriad X System on Module (SoM) which allows embedding the power of the Myriad X into your own products, with firmware support for 3D object detection/location.

And we're making a carrier board that includes all the cameras, the Myriad X, and the Raspberry Pi Compute Module all together to allow you to get up and running in seconds.

This allows:
1. The video data path to skip the Pi, eliminating that additional latency and bottleneck
2. The stereo depth capability of the Myriad X to provide 3D object position
3. A significant reduction in the CPU load on the Raspberry Pi

So you, the Python programmer, now have real-time 3D position of all the objects around you - on an embedded platform - and backed by the power of the Raspberry Pi Community!
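To make "real-time 3D position" concrete: given a 2D bounding box from the object detector and a depth map from the stereo pair, recovering an object's 3D position is just the pinhole camera model. Here's a minimal Python sketch of that math (the intrinsics below are made-up placeholder values; real ones come from camera calibration):

```python
# Minimal sketch: 2D detection + depth map -> 3D object position.
# fx/fy/cx/cy are hypothetical intrinsics; real values come from calibration.
import numpy as np

fx, fy = 860.0, 860.0   # focal lengths in pixels (placeholder)
cx, cy = 640.0, 360.0   # principal point in pixels (placeholder)

def detection_to_3d(bbox, depth_map):
    """bbox = (xmin, ymin, xmax, ymax) in pixels; depth_map in meters."""
    xmin, ymin, xmax, ymax = bbox
    u = (xmin + xmax) / 2.0                                # box center, x
    v = (ymin + ymax) / 2.0                                # box center, y
    z = float(np.median(depth_map[ymin:ymax, xmin:xmax]))  # robust depth pick
    return ((u - cx) * z / fx, (v - cy) * z / fy, z)       # (X, Y, Z) meters

depth = np.full((720, 1280), 2.0)            # fake "2 m everywhere" depth map
print(detection_to_3d((600, 320, 680, 400), depth))  # ~ (0.0, 0.0, 2.0)
```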

Start With the Why

  • There’s an epidemic in the US of injuries and deaths of people who ride bikes
  • The majority of cases involve distracted driving caused by smartphones (social media, texting, e-mailing, etc.)
  • We set out to try to make people safer on bicycles in the US
    • We’re technologists
    • Focused on AI/ML/Embedded
    • So we’re seeing if we can make a technology solution

Commute Guardian

(If you'd like to read more about CommuteGuardian, see here)

DepthAI Platform

  • In prototyping the Commute Guardian, we realized how powerful the combination of Depth and AI is.
  • And we realized that no such embedded platform existed
  • So our milestone on the path to CommuteGuardian is to build this platform – and sell it as a standard product.
  • We’re building it for the Raspberry Pi (Compute Module)
    • Human-level perception on the world’s most popular platform
    • Adrian’s PyImageSearch Raspberry Pi Computer Vision Kickstarter sold out in 10 seconds – validating demand for Computer Vision on the Pi (that, and validating that Adrian is AWESOME!)

Development Steps

The first thing we made was a dev board for ourselves.  The Myriad X is a complicated chip, with a ton of useful functionality... so we wanted a board where we could explore this easily, try out different image sensors, etc.  Here's what that looks like:

BW0235

We made the board with modular camera boards so we could easily test out new image sensors w/out the complexity of spinning a new board.  So we'll continue to use this as we try out new image sensors and camera modules.

While waiting on our development boards to be fabricated, populated, etc., we brainstormed how to keep costs down (working w/ fine-pitch BGAs that necessitate laser vias means prototypes are EXPENSIVE), while still allowing easy experimentation w/ various form-factors, on/off-board cameras, etc.  We landed on making ourselves a Myriad X System on Module, which is the board w/ all the crazy laser vias, stacked vias, and overall High-Density Interconnect (HDI) board stuff that makes these boards expensive.  This way, we figure, we can use this as the core of any Myriad X designs we do, without having to constantly prototype w/ expensive boards.

BW1099

We exposed all that we needed for our end-goal of 3D object detection (i.e. MobileNet-SSD object detection + 3D reprojection off of stereo depth data).  So that meant exposing a single 4-lane MIPI for handling high-res (e.g. 12MP) color camera sensors and 2x 2-lane MIPI for cameras such as ~1MP global-shutter image sensors for depth.

And we threw a couple of other interfaces, boot methods, etc. on there for good measure; these are de-populated by default to save cost, and can be populated when needed.
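As an aside, for anyone wondering how two global-shutter cameras become depth: stereo matching produces a per-pixel disparity, and depth falls out of depth = focal length × baseline / disparity. A quick sketch, with a made-up baseline and focal length (real values come from the board geometry and calibration):

```python
# Sketch of disparity -> depth, the core of what the stereo camera pair buys us.
# baseline_m and fx_px are assumptions; real values come from the board
# geometry and camera calibration.
import numpy as np

baseline_m = 0.075   # distance between the two cameras (assumed 7.5 cm)
fx_px = 860.0        # focal length in pixels (assumed)

def disparity_to_depth(disparity_px):
    """depth (m) = focal length (px) * baseline (m) / disparity (px)"""
    d = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        return np.where(d > 0, fx_px * baseline_m / d, np.inf)

print(disparity_to_depth([64, 32, 8]))  # nearer objects -> larger disparity
```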

So of course in making a module, you also need to make a board on which to test the module.  So in parallel to making the SoM, we started attacking a basic breakout carrier board:

BW1098


It's basic, but pulls out all the important interfaces, and works with the same modular camera-board system as our development board.  So it's to some degree our 'development board lite'.

And once we got both of these ordered, we turned our attention to what we set out to build for you: the DepthAI for Raspberry Pi system.  And here it is, in all its Altium-rendered glory:


  • 1 × Intel Movidius Myriad X Vision/AI Processor
  • 1 × Raspberry Pi Compute Module 3B+ (CM3B+)
  • 2 × OV9282 Global-shutter camera modules optimized for disparity depth
  • 1 × IMX378 High-resolution color camera module supporting 4K video and 12MP stills

  • Compute Module (BW1097) Second Revision

    Brandon • 6 days ago • 0 comments

    Hi Everyone,

    So we received the second revision of DepthAI for Raspberry Pi Compute Module, and everything works as hoped (and it looks a lot cooler because we made the design more compact and ordered it in black).  Pictures to follow.  :-)

    We're continuing to integrate the depth portions we've written (here) with the direct-from-image-sensor neural inference (object detection, in this case) above.

    Cheers!

    The Luxonis Team

  • Power Use Comparison

    Brandon • 10/01/2019 at 17:09 • 0 comments

    Hi everyone,

    So we took a bit of time to see what our power use is now that we have neural inference (real-time object detection, in this case) running directly from image sensors on the Myriad X.  So this was actually inspired by this great article by Alasdair Allan, here, and we use his charts below in the comparison as well.

    Idle Power:

    Active Power (MobileNet-SSD at ~25FPS):

    So the DepthAI platform running MobileNet-SSD at 300x300:
    Neural Inference time (ms): 40
    Peak Current (mA): 760
    Idle Current (mA): 150
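    For context, assuming the stock 5 V supply (our assumption; current was measured at the supply input), those currents work out to:

    ```python
    # Back-of-the-envelope power from the measured currents, assuming a 5 V rail.
    SUPPLY_VOLTS = 5.0
    print("peak:", SUPPLY_VOLTS * 0.760, "W")  # 760 mA -> 3.8 W
    print("idle:", SUPPLY_VOLTS * 0.150, "W")  # 150 mA -> 0.75 W
    ```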

    So judging from Alasdair's handy comparison tables (here, and reproduced below), this isn't too shabby.  In fact it means we're the second fastest of all embedded solutions tested, and we're the lowest power.

    And what's not listed in these tables is the CPU use of the host (the Raspberry Pi).  Here are our results on that (tested on a Raspberry Pi 3B):

                            RPi + NCS2    RPi + DepthAI
    Video FPS               30            60
    Neural Inference FPS    ~8            25
    RPi CPU Utilization     220%          35%

    So this allows way more headroom for your applications on the Raspberry Pi to use the CPU; the AI and computer vision tasks are much better offloaded to the Myriad X with DepthAI.
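    (If you want to reproduce the CPU numbers yourself, here's a minimal sketch of how we'd measure them; psutil reports per-process CPU, where >100% means more than one core is busy. The 10-second window is arbitrary:)

    ```python
    # Minimal sketch: measuring host CPU load of an inference process on the Pi.
    import time
    import psutil  # pip install psutil

    def average_process_cpu(pid, seconds=10, interval=1.0):
        """Average CPU% of one process; >100% means more than one core busy."""
        proc = psutil.Process(pid)
        proc.cpu_percent(None)                 # first call primes the counter
        samples = []
        for _ in range(int(seconds / interval)):
            time.sleep(interval)
            samples.append(proc.cpu_percent(None))
        return sum(samples) / len(samples)
    ```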

    Thoughts?

    Thanks,

    Brandon

  • Outperforming the NCS2 by 3x

    Brandon • 09/29/2019 at 05:58 • 4 comments

    Hi everyone,

    So last week we got object detection working direct from image sensor (over MIPI) on the Myriad X.  MobileNet-SSD to be specific.  

    So how fast is it?

    • 25FPS (40ms per frame)... when connected to a powerful desktop

    According to 'The Big Benchmarking Roundup' (here) that's actually quite good.   

    But, this is connected to a big powerful computer... how fast is it when used with a Pi?

    To find out, we ran it on our prototype of DepthAI for Raspberry Pi Compute Module:

    And how did it fare?

    • 25FPS (40ms per frame)...  when connected to a Raspberry Pi Compute Module 3B+

    Woohoo!  

    Unlike the NCS2, which sees a drastic drop in FPS when used with the Pi, DepthAI doesn't see any at all.

    Why is this?  

    • The video path is flowing directly from the image sensor into the Myriad X, which then runs neural inference on the image data, and exports the video and neural network results to the Pi.
    • So this means the Raspberry Pi isn't having to deal with the video stream; it's not having to resize video, to shuffle it from one interface to another, etc. - all of these tasks are done on the Myriad X.

    In this way the Myriad X doesn't technically even need to export video to the Pi - it could simply output detected objects and positions over a serial connection (UART) for example. 
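    (To illustrate that serial-only mode, here's a hedged sketch of what reading detections over UART could look like on the Pi side. The wire format below is invented for illustration; we haven't defined the actual protocol yet:)

    ```python
    # Hypothetical sketch: consuming object detections over UART, no video needed.
    # The CSV message format is made up for illustration purposes only.
    import serial  # pip install pyserial

    with serial.Serial("/dev/ttyS0", baudrate=115200, timeout=1.0) as port:
        while True:
            line = port.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue
            # e.g. "person,0.92,0.31,-0.05,1.80" -> label, confidence, x, y, z (m)
            label, conf, x, y, z = line.split(",")
            print(f"{label} ({float(conf):.2f}) at ({x}, {y}, {z}) m")
    ```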

    Let's compare this to results of the Myriad used with the Raspberry Pi 3B+ in its NCS2 form factor, thanks to the data and super-awesome/detailed post courtesy of PyImageSearch here.  (Aside: PyImageSearch is the best thing that ever happened to the internet):


    So this is a bump from 8.3 FPS with the Pi 3B+ and NCS2 to 25.5 FPS with the Pi 3B+ and DepthAI.

    When we set out, we expected DepthAI to be 5x faster than the NCS2 when used with the Raspberry Pi 3B+, and it looks like we've hit 3.18x faster instead... but we still think we can improve from here!

    And it's important to note that when using the NCS2, the Pi CPU is at 220% to get the 8 FPS, and with DepthAI the Pi CPU is at 35% at 25 FPS.  So this leaves WAY more room for your code on the Pi!

    And as a note, this isn't using all of the resources of the Myriad X.  We're leaving enough room in the Myriad X to perform all the additional calculations of disparity depth and 3D projection in parallel to this object detection.  So if there are folks who only want monocular object detection, we could probably push the framerate higher by dedicating more of the chip to the neural inference... but we need to investigate to be sure.

    Anyways, we're pretty happy about it:

    Now we're off to integrate the code for depth, filtering/smoothing, 3D projection, etc. we have running on the Myriad X already with this neural inference code.  (And find out if we indeed left enough room for it!)

    Cheers,

    Brandon & The Luxonis Team

  • First Inference Running

    Brandon • 09/27/2019 at 16:03 • 0 comments

    Hi everyone,

    So we have some great Friday-evening news (Europe time!):


    This is our first output of neural inference (in this case MobileNet-SSD, which is the core of what we're targeting) from DepthAI.

    So we're still refining, simplifying, and also profiling this AI side, but it's a great first step!  And once we have all this locked down we're going to be integrating this piece (which is specifically architected to idle about 1/2 of the rest of the Myriad X to leave room for Depth work) with the Depth pieces (and smoothing, reprojection, etc.) that we have working in a separate code-base.

    We also have a flurry of new form-factors that we're working on and will be ordering soon.  Boards that adapt our Myriad X module for other applications, such as movable camera modules that connect over FFC (flexible flat cable - think of the ribbon cables on the RPi cameras).

    Best,

    Brandon and the Luxonis Team

  • Raspberry Pi HAT, Compute Module, Smaller

    Brandon • 09/17/2019 at 21:51 • 4 comments

    Hi everyone!

    So we've got every last bit of the RPi Compute Module version of DepthAI working, which is exciting.  So now that it's verified, we made the design smaller.  We had it big on the first prototype so that white-wire fixes would be easier... and we're glad we did that!  We ended up with white wires for:

    • Ethernet
    • 3.5mm Audio
    • I2C for the DSI and CSI display/camera for the Raspberry Pi.

    Anyways, these are all corrected, and the revised, now smaller, design was ordered on Friday!  Here's what it looks like (thanks, Altium 3D view):

    And now that this is ordered, we turned our sights to another incarnation of DepthAI that we think will be useful.

    But before we dive into that: why did we make the Myriad X Module, well, a module?

    It's so that we - and you - can easily turn designs that leverage it.  It allows the Myriad X to be used on simple, easy, 4-layer boards (instead of the 8-layer HDI, with the BGA tolerancing and yield optimization, that's required if you use the Myriad X directly).

    So, leveraging this module, here's the HAT that we're working on:

    You can see the spot for the Myriad X Module right there on top.  And then on the left are the FFC (flexible flat cables) connectors to connect to camera modules:

    That's right, we're making it so the cameras can be mounted where -you- want them.  We'll have some other follow-on Myriad X boards (e.g. non-RPi-specific) that take advantage of this as well.  Oh, and on FFC: think of the little ribbon cable on the Raspberry Pi camera... FFC is the general acronym for that type of cable.

    Speaking of which, here are the camera boards:

    OV9282 - 720p global-shutter mono camera, used for disparity depth:

    IMX378 - 12 megapixel color image sensor, supporting 4K video!

    And you can see the FFCs on the back bottom of these respectively:

    And as usual feel free to ping us with any questions, comments, needs or ideas!

    Cheers,

    The Luxonis Team

  • Our Crowd Supply Pre-Launch Page is Up

    Brandon • 09/04/2019 at 17:23 • 0 comments

    Hi everyone,

    So if you're interested in buying DepthAI when it's ready, head over to Crowd Supply and sign up!

    https://www.crowdsupply.com/luxonis/depthai

    The pre-launch page is up, and having signed up there will allow you to get notifications when the campaign goes live.

    (The advantage of Crowd Supply is that it lets us place a larger order than we otherwise could - which helps get the cost down for everyone ordering.)

    Best,

    The Luxonis Team

  • Whirlwind Hardware Arrivals

    Brandon • 09/04/2019 at 17:17 • 0 comments

    Hi everyone!

    So we got a TON of hardware in over the past couple weeks, and have been heads-down testing, characterizing, etc. as well as in parallel making good strides on the firmware.

    So what did we get?  

    1.  The Myriad X Modules

    2.  The first (oversized) version of DepthAI for Raspberry Pi Compute Module

    3.  Our modularized development/test platform (which, in an earlier post, we thought had some sort of error - it doesn't... it was a PEBKAC error).

    So all of these work great, which is awesome!  We do have a small white-wire fix on the power-on-reset circuit on our Myriad X module, a result of simplifying that circuit (and the power-distribution circuit) A TON from our previous design.  It's an easy layout fix though, which is already done and ordered.  And with the rewire in place, the boards boot up fine.

    And on the first DepthAI for Raspberry Pi Compute Module, we apparently accidentally hit the space bar, which, in Altium, is the rotate command, when selecting 3 of the ethernet pins on the footprint.  So some white wires there to correct that.

    So far we've tested/verified all of the following to work as designed/desired:

    1. IMX378 (12MP camera, 4K video)
    2. 2 x OV9282 (1MP global-shutter for depth)
    3. HDMI (1920x1080p works well)
    4. Ethernet (10/100)
    5. USB (2x external, and 1x internal to Myriad X Module)
    6. microSD (including boot with and without NOOBS)
    7. Headphone jack (we initially populated the wrong line driver... an open-collector variant instead of the push-pull one, which we then thought was an error w/ our implementation of a custom Linux Device Tree... so that was a couple days of head-banging trying to get Linux to behave - and it turned out our Device Tree was right all along!)

    We've yet to verify the following (it's on the docket for today):

    a.  Raspberry Pi Camera

    b.  DSI Display functionality

    Anyways, here's the initial version of DepthAI software/firmware running on this board.  It ran first try with no issues, errors, or warnings, which was super satisfying:

    And on the firmware side, we're hard at work getting depth reprojection (which is what produces a point cloud like the one below) to run directly on the SHAVES in the Myriad X, along with depth filtering etc.  Below it's running off of our DepthAI Development Board, with OpenGL rendering on the host for now.

    And the second image above shows depth filtering experiments being run directly on the Myriad X SHAVES.
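    (For anyone curious what "depth reprojection" means concretely: it's applying the inverse pinhole model to every pixel of the depth map. A minimal NumPy sketch, with placeholder intrinsics, of what the SHAVES are doing in optimized form:)

    ```python
    # Sketch of depth reprojection: depth map -> point cloud via the pinhole model.
    # Intrinsics are placeholders; real ones come from calibration.
    import numpy as np

    fx, fy = 860.0, 860.0   # focal lengths in pixels (placeholder)
    cx, cy = 640.0, 360.0   # principal point (placeholder)

    def depth_to_pointcloud(depth):
        """depth: (H, W) array in meters -> (H*W, 3) array of XYZ points."""
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        x = (u - cx) * depth / fx
        y = (v - cy) * depth / fy
        return np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    ```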

    Cheers, and more to come!

    The Luxonis Team

  • Bad News

    Brandon • 08/28/2019 at 23:23 • 0 comments

    So we've been digging into the firmware on the Myriad X as we implement more of the core of DepthAI and we just discovered a show-stopper that we didn't anticipate.

    First, the bad news: We won't have the 5x increase in FPS that we were expecting compared to the NCS2 with the Raspberry Pi.  As of now we aren't sure what the framerate will be, but it's not looking good and could actually be slower than the NCS2, particularly when depth is enabled in addition to the neural inference.

    Of course the platform still has the benefit of providing depth information, offloading the whole datapath from the main CPU, all of those advantages.

    But it may actually have a disadvantage compared to the NCS2 in terms of framerate.  

    We'll keep everyone posted as we learn more.  We'll be experimenting over the next ~2 months to see what the framerate is direct from image sensor, and see if it's better or worse than the 8FPS seen with NCS2 + MobileNet-SSD + Raspberry Pi 3B+.  We're of course hoping for better, but at this point we're unsure.

    Best,

    The Luxonis Team

  • Myriad X Modules | Good News and Bad News

    Brandon • 08/21/2019 at 06:12 • 2 comments

    Good news:

    Our Myriad X modules did ship, they arrived, and they do work!

    The bad news:

    They look like this:

    So in simplifying boot sequencing and the power rails required, we swapped some of the timing and control signals... which are fixed above w/ the white-wires (well, green wires, but you know what I mean).

    Also, funnily enough, JTAG isn't communicating... so we're debugging that as well.  The devices are USB-booting however and running (uncalibrated) disparity depth from the global-shutter sensors.  (The color sensor isn't tested yet.)

    And currently this board runs at a lower framerate than our BW0235... so we're investigating the root of that as well.  We're thinking it has to do w/ the mix-up on the boot/reset signaling, which is causing some code to repeatedly time out - but either way we'll find out soon!

    Cheers,

    The Luxonis Team

  • Myriad X Modules Shipped!

    Brandon • 08/19/2019 at 23:30 • 0 comments

    Hi again!

    So our first Myriad X modules _finally_ shipped!  So we're expecting to have these in-hand tomorrow, Tuesday August 20th.

    So the back-story on these is that we ordered them on June 26th w/ a 3-week turn from MacroFab (who we really like, and have used a lot), and the order was unlucky enough to fall in w/ 2 other orders that were subject to a bug in MacroFab's automation (which is super-impressive, by the way).

    So what was the bug?  (You may ask!)

    Well, their front-end and part of the backend were successfully initiating all the correct actions (e.g. component orders, bare-PCB order, scheduling of assembly) at the correct times.  However, the second half of the backend was apparently piping these commands straight to /dev/null, meaning that despite the system showing (and thinking) that all the right things were being done, nothing was actually happening.

    So on July 15th, when the order was supposed to ship, and despite our every-2-day prodding up until then, it was finally discovered that the automation had done nothing at all.  So then this was debugged, the actual status was discovered, and the boards were actually started around July 22nd.

    Fast-forward to today, and this 3-week order is now an 8-week order - which should arrive tomorrow!

    Unfortunately, the only photo we got of the units was from a confirmation that the JTAG connector was populated in the right orientation (it was), so here's a reminder of what the module looks like, rendered in Altium:

    And for good measure, the only photo we have of the boards so far, which is of the JTAG connector:

    So hopefully tomorrow we'll have working modules!  And either way we'll have photos to share.

    Cheers!

    The Luxonis Team



Discussions

Andrey V wrote 08/30/2019 at 17:53

I think it's going to be really hard to get this working IRL. But if you do it, your LVL = 80 :)


Brandon wrote 08/30/2019 at 17:59

Heh.  Thanks.  It's definitely a hard project - but we're hoping it will provide a lot of value!


Andrey V wrote 08/30/2019 at 18:02

Good luck!!!


Alan wrote 08/04/2019 at 00:40

Does your SoM break out the PCIe lanes on the Movidius? I was looking at the UP AI Core but they convert PCIe to USB on board.


Brandon wrote 08/04/2019 at 01:03

Great question, thanks.

It does not.  The SoM uses USB3 (or 2) to connect to the host.  Less power, fewer parts, and also works with hosts that don’t have PCIE or don’t want to tie up a PCIE slot for this.

On the DepthAI for Raspberry Pi version there's a USB hub chip on the carrier board, which is what allows the Pi Compute Module to talk to the Myriad X and also the outside world.

And yes the PCIE boards do convert from PCIE to USB on-board.

If you don’t mind me asking, why do you ask?  Do you need direct PCIE connection?

Thoughts?

Thanks again,

Brandon


Alan wrote 08/05/2019 at 05:47

I can't think of any reason right away, but it would be great to have as much of the I/O exposed as possible.  I think at least the SPI and Ethernet should be exposed, because those would both be useful for someone who wanted to use the SoM as a standalone device.

P.S. What did you guys have to do in order to acquire the chips? I can't seem to find anywhere to order them online.


Brandon wrote 08/06/2019 at 22:35

Hey Alan,

So for some reason I can't reply to your comment, so I'm replying here instead.  We don't know how to make Ethernet work yet from a firmware standpoint (and don't have a clear path to figuring it out), so we left it off of this SoM.

That said, we are making a SoM w/ an i.MX 8M alongside the Myriad X, which would provide Ethernet and a whole slew of other interfaces like USB3, WiFi, etc.

Thoughts?

Thanks,

Brandon


psykhon wrote 03/07/2019 at 12:19

Hi Brandon, awesome project! 

How hard was it to get the Myriad X chips? Can you share some info on how you did it? Price?


Tegwyn☠Twmffat wrote 02/01/2019 at 14:53

I checked out that link above - I wonder how their larger kit compares to the Jetson TX2 in terms of performance?

I realise performance is not everything, and the Intel model zoo is pretty useful. The Nvidia software seems to be a bit behind, in that they only have bvlc_googlenet as an 'out of the box' solution for detection.

What do you think your price point will be for a single Myriad X carrier board? I'm presuming about $100?


Brandon wrote 02/01/2019 at 15:02

Great question!  So we've actually done a decent amount of stuff on the TX2 as well.  The Myriad X, in terms of straight neural-inference performance (e.g. object detection, semantic segmentation, etc.), is about the same as the TX2.  The Myriad X neural engine is 1 TOPS, and the TX2 peaks in ideal conditions at 2 TOPS, but judging from the thread below, in most conditions it's effectively 1 TOPS:

https://devtalk.nvidia.com/default/topic/1024825/cuda-programming-and-performance/jetson-tx2-performance/

But!  If your application is depth vision + neural inference, the Myriad X is equivalent to about 2 or 3 Jetson TX2s, mainly because of the 16 SHAVE cores in the Myriad X, which together can handle 6 cameras as 3 pairs of depth streams.

The neural-inference part of the Myriad X is only 1 TOPS of the total 4 TOPS the device can do.  The remaining TOPS are for image-processing functions like depth vision.

So this board won't really even tax the Myriad X, as there will be just one depth stream.  That said, we can use the extra Myriad X 'head room' to run fancier/more-processing-intensive depth calculation on just these 2 cameras - to produce a better set of depth information.


Tegwyn☠Twmffat wrote 01/31/2019 at 22:58

Hello Brandon! Does the Myriad X chip get put on the carrier board or does it stay in the USB stick?

If it goes on the board, how many of them?


Brandon wrote 02/01/2019 at 12:37

The Myriad X would be directly on the carrier board.  We could make versions with multiple Myriad X, for sure.  Is that of interest?  

These guys did that for their PCIE version:

https://www.crowdsupply.com/up/ai-core-x

I have 2 of those on order, by the way.  They're useful as well, for sure - just a different application, and not applicable to the Pi community (which is what this board should serve).

