
Hack Chat Transcript, Part 1

An event log for the Machine Learning with Microcontrollers Hack Chat

Arduinos and Pis and AI, oh my

Lutetium 09/11/2019 at 20:15

OK, big crowd today, let's get started. Welcome to the Hack Chat everyone, thanks for coming along for a tour of what's possible with machine learning and microcontrollers. We've got @pt and @limor from Adafruit, along with @Meghna Natraj, @Daniel Situnayake, and Pete Warden from the Google TensorFlow team.

We've also got a livestream for demos -

Also, we hear that a class of sixth-graders is tuned in. Let's make sure we make them feel welcome - we always support STEM!

limor12:00 PM
heyyyy everybooooody

Daniel Situnayake12:01 PM
Hey everyone!! I am so jealous of that sixth-grade class; they have an awesome teacher

Can everyone on the Adafruit and Google side just introduce themselves briefly?

C.R. Myers joined  the room.12:01 PM

Hotmspam joined  the room.12:01 PM

Pete Warden joined  the room.12:01 PM

nu2life joined  the room.12:01 PM

Cormac Harkin joined  the room.12:01 PM

limor12:02 PM
hi everyone itsa me- ladyada

Pete Warden12:02 PM
Hi! This is Pete from the TensorFlow team at Google

Juan Manuel joined  the room.12:02 PM

Alex Colello joined  the room.12:02 PM

limor12:02 PM
check out the youtube video for the live demoooos

limor12:02 PM
and more chitchat :)

Meghna Natraj12:03 PM
Hi everyone! I'm Meghna from the Google Tensorflow Lite team. Excited to be a part of this event! :)

omarmung joined  the room.12:03 PM

pt12:03 PM
i'm pt, work with limor at adafruit, and founded hackaday 15 years ago which is a coincidence

limor12:04 PM
lets drop some LINKS

Evan Juras12:04 PM
Hey @Pete Warden and google folks ! When is TensorFlow 2.0 officially releasing?? :D

limor12:04 PM
while y'all think of the questions ya wanna ask

Larry Bank joined  the room.12:04 PM

A happy coincidence ;-)

Thomas Marsh joined  the room.12:04 PM

Daniel Situnayake12:04 PM
Hey!! I'm Dan, and I work on the TensorFlow Lite team at Google. TF Lite is a set of tools for deploying machine learning inference to devices, from mobile phones all the way down to microcontrollers. Before I was at Google, I worked on an insect farming startup called Tiny Farms :)

Boian Mitov12:04 PM
Hello everyone... sorry for the late appearance...

happyday.mjohnson12:05 PM
Hey Dan, so it's a Bug's Life?

Christopher Bero12:05 PM
Can we get a brief overview of ML on uCs for those of us unfamiliar with the space? What problems is it solving, and on generic microcontrollers or special parts?

SimonAllen12:05 PM
So when does it start?

Andres Manjarres12:05 PM
@pt What neural network do you use in this process?

Dave joined  the room.12:05 PM

Daniel Situnayake12:05 PM
OK so, if you're not familiar with ML on MCUs, I can share a few thoughts!

happyday.mjohnson12:06 PM
Can we make our own models (e.g.: using an LSTM for timeseries accelerometer readings)?

Arthur Lorencini Bergamaschi12:06 PM
@pt What problems can this TinyML solve?

Tegwyn☠Twmffat12:07 PM
Who invented the term 'confusion matrix'?

Duncan12:07 PM
@Daniel Situnayake What is the size of a typical TensorFlow Lite network (in kB) and what is the inference latency?

Paco joined  the room.12:07 PM

3gmarquardt joined  the room.12:07 PM

limor12:07 PM
@andres for raspberry pi we're using the mobilenet v2 models that are optimized for mobile device use - it's really a ton of work they put into optimizing them and we wouldn't be able to do a better job!

Pete Warden12:07 PM
Re: TensorFlow 2.0 - I honestly don't know, but we did just publish a release candidate I believe, so it won't be long

pt12:08 PM
@Christopher Bero while @Daniel Situnayake is answering, here are 2 examples, one for the pi 4 and one for a samd https://learn.adafruit.com/running-tensorflow-lite-on-the-raspberry-pi-4?view=all & https://learn.adafruit.com/tensorflow-lite-for-microcontrollers-kit-quickstart?view=all

happyday.mjohnson12:08 PM
i'm interested in fall detection for my (very elderly) parents.

Andres Manjarres12:08 PM
Okey thanks!!!

Don't forget to tune into the livestream everyone:

Christopher Bero12:08 PM
Thanks!

jcradford joined  the room.12:09 PM

Pete Warden12:09 PM
Re: fall detection - We're working on an accelerometer-based example for gesture recognition, and hope to release that soon, it might be a good starting point

berg.erik joined  the room.12:10 PM

PaulG12:10 PM
@happyday.mjohnson I'm interested in home electricity consumption patterns for older people too, ditto water usage. (eg nobody used the toilet today)

Pobzeb joined  the room.12:10 PM

happyday.mjohnson12:10 PM
thank you. i'm interested also in the preprocessing. For example, one bunch-oh-data for human activity w/ accel used 4th order Butterworth Filter...whatever that means...

pt12:10 PM
@Arthur Lorencini Bergamaschi for us, we can do non-net-connected voice recognition for microcontrollers, for example - we made a low cost example that can control a servo to move up or down, based on voice only, good start for folks with mobility issues
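The glue between a keyword-spotting model and a servo, along the lines of the demo pt mentions, can be sketched in a few lines. The label names, angles, and score threshold below are made up for illustration; the recognition model itself is assumed to exist elsewhere:

```python
# Sketch only: maps a keyword-spotting model's output to a servo position.
# "up"/"down", the 0.8 threshold, and the pulse range are illustrative values,
# not the actual demo code.
ACTIONS = {"up": 180, "down": 0}   # recognized word -> servo angle (degrees)
THRESHOLD = 0.8                    # ignore low-confidence predictions

def angle_to_pulse_us(angle):
    """Typical hobby servo: 0-180 degrees maps to a 1000-2000 us pulse."""
    return 1000 + round(angle / 180 * 1000)

def handle_prediction(label, score, current_angle):
    """Return the new servo angle given one model prediction."""
    if score < THRESHOLD or label not in ACTIONS:
        return current_angle       # unknown word or uncertain: hold position
    return ACTIONS[label]

angle = handle_prediction("up", 0.93, 90)
print(angle, angle_to_pulse_us(angle))   # drives the servo to the top
```

The threshold matters in practice: a hotword model emits scores continuously, so without one the servo would twitch on every bit of background noise.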

Arthur Lorencini Bergamaschi12:10 PM
@pt and @limor thanks for answering!

Duncan12:10 PM
@Pete Warden Accelerometer only detection would be impressive. I've found that a gyro adds a lot of valuable signal with that type of task.

Daniel Situnayake12:11 PM
OK, so for a brief overview of ML on microcontrollers, I guess we should introduce ML first?

pt12:11 PM
here is the demo for that -

limor12:11 PM
@Arthur Lorencini Bergamaschi using TFlite means you dont have to do a lot of the optimizations required for heuristic-based pattern recognition

Russell Dicken joined  the room.12:11 PM

happyday.mjohnson12:11 PM
so there is preprocessing, training, making the model...maintaining...all these steps. Not sure what goes on cloud what goes in edge. e.g.: I plop an accel/gyro on my parents....somehow get the readings...then do analysis in-dah-cloud? Then "compile" for edge?

Russell Dicken12:11 PM
I'm wondering, how much further development do Google see for Tensorflow.js? MCUs can use that easily with a web server.

Daniel Situnayake12:11 PM
ML is the idea that you can "train" a piece of software to output predictions based on data. You do this by feeding it data, along with the output you'd like it to produce.

Pete Warden12:12 PM
@Duncan - definitely, we're trying to keep it very simple so that you don't need an IMU, but so far using just accelerometer data seems to be effective for our use case

limor12:12 PM
instead you can take advantage of the optimizations that ARM & google have done, and you dont need to re-invent the matching algorithms - you just need to tune your model

happyday.mjohnson12:12 PM
what about keras? It seems far easier to me than TF?

Daniel Situnayake12:12 PM
2: Once your model is trained, you can feed in new data that it hasn't seen before, and it will output predictions that are hopefully somewhat accurate!

Pete Warden12:12 PM
@happyday.mjohnson we love Keras, and TF 2.0 is based around it as an API

Sébastien Vézina12:12 PM
Can you run Keras on micro/circuitpython?

Tara joined  the room.12:13 PM

happyday.mjohnson12:13 PM
yah - keras on micro/cp?

Dick Brooks12:13 PM
Question for LadyAda: What software and hardware was used to build the demo and how many people hours did it take to reach a stable state?

Ken joined  the room.12:13 PM

limor12:13 PM
right now the code for TF Lite on microcontrollers is super streamlined

Daniel Situnayake12:13 PM
3: This second part, making predictions, is called inference. When we talk about ML on microcontrollers, we're generally talking about running the second part on microcontrollers, not the training part. This is because training takes a lot of computation, and only needs to happen once, so it's better to do it on a more powerful device
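To make Dan's split concrete: once training (done on a powerful machine) has produced a set of weights, inference on the device is nothing but fixed multiply-adds, which is why it fits on a microcontroller. A toy forward pass, with hand-picked weights standing in for a trained model, computing XOR:

```python
# Inference only: these constants play the role of weights trained elsewhere
# (e.g. on a desktop) and baked into firmware. Hand-picked here to solve XOR;
# a real model's weights would come out of a training run.
W1 = [[1.0, 1.0], [1.0, 1.0]]   # hidden-layer weights
B1 = [0.0, -1.0]                # hidden-layer biases
W2 = [1.0, -2.0]                # output-layer weights
B2 = 0.0

def relu(v):
    return v if v > 0 else 0.0

def infer(x):
    """Forward pass: just multiply-adds plus a ReLU - no training code."""
    hidden = [relu(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, B1)]
    return sum(w * h for w, h in zip(W2, hidden)) + B2

for a in (0.0, 1.0):
    for b in (0.0, 1.0):
        print((a, b), "->", infer((a, b)))
```

Note there is no learning anywhere in this code path, which is exactly the part TF Lite ships to MCUs.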

Christopher Bero12:13 PM
Ah, OK that makes sense.

george joined  the room.12:14 PM

Endgam3r joined  the room.12:14 PM

Sébastien Vézina12:14 PM
You guys think TPUs will make continuous training on the edge possible?

Pete Warden12:14 PM
@happyday.mjohnson @Sébastien Vézina we're focused on running models that have already been trained on MCUs, and Keras is all about training, so Keras isn't a great fit for our embedded use cases

Daniel Situnayake12:14 PM
4: So, why would we want to run inference on microcontrollers?! "Traditionally", meaning the last few years (since this stuff is all pretty new), ML inference has been done on back-end servers, which are big and powerful and can run models and make predictions really fast

limor12:14 PM
@Dick Brooks to get the micro speech demos working took maybe 2 weekends of work, about 20 hours total - but it was in a much less stable state. now the code is nearly ready for release so ya dont have to relearn all the stuff i did. also, i wrote a guide on what i learned on training new speech models

happyday.mjohnson12:14 PM
@daniel that makes sense. I hope it is as easy as making the model on desktop and pushing a "compile"/run/restart to go to CP?

@Daniel Situnayake - so how compact can the trained models be? In terms of storage, processing power needed, etc.

pt12:14 PM
(chuck from youtube asks) "Are you limiting yourself to only neural networks? ML methods based on decision trees like Haar Classifiers might be more appropriate for microcontrollers"

Daniel Situnayake12:15 PM
5: But this means that if you want to make a prediction on some data, you have to send that data to your server. This involves an internet connection, and has big implications around privacy, security, and latency. It also requires a lot of power!

pt12:15 PM
from discord "can I use my Google AIY Vision setup for this?"

somenice joined  the room.12:15 PM

Endgam3r12:15 PM
Hello!

happyday.mjohnson12:16 PM
please lemme know when there is a tutorial/stuff on accell/gyro human activity recognition....

Matteo Borri12:16 PM
we did some stuff with a Parallax Propeller for this, but it was in like 2009. A GA generating a learning-capable proggie

happyday.mjohnson12:17 PM
or perhaps human activity via processing videos (like when a "smart camera" is watching my mother and it notifies me she fell).

Andres Manjarres12:17 PM
Do you do the training phase on the microcontroller?

Pete Warden12:17 PM
Re: Neural-networks only? We're fans of other types of ML methods (this paper from Arm on using Bonsai Trees is great for example https://www.sysml.cc/doc/2019/107.pdf ) but for now we have our hands full with NN approaches, so that's what we're focused on

Daniel Situnayake12:17 PM
6: If we can run inference on the device that captures the data, we avoid all these issues! If the data never leaves the device, it doesn't need a network connection, there's no potential for privacy violations, it's not going to get hacked, and there's no need to waste energy sending stuff over a network

There is a scroll bar over to the right - it only appears when you're over it

Darryl N12:17 PM
its hard for a computer to wreck a nice beach. plus, auto car wreck is sum thymes a problem

limor12:17 PM
for chuck's questions, we're using NN here because it lets us take advantage of the huge resources available with tensorflow! you can do tests on cpython to tune things and then deploy to a smaller device

pt12:18 PM
@happyday.mjohnson we can show people detection now on vid... it's just a pi 4 too...

Daniel Situnayake12:18 PM
7: But on tiny devices like microcontrollers, there's not a lot of memory or CPU compared to a big back-end server, so we have to train tiny models that fit comfortably in that environment
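One of the standard tricks behind those tiny models is 8-bit quantization: each float32 weight (4 bytes) is stored as an int8 (1 byte) plus a shared scale and zero-point, a roughly 4x size cut with only a small accuracy cost. Here's a hand-rolled sketch of the usual affine scheme (not TF Lite's actual implementation):

```python
# Affine int8 quantization sketch: map the weight range [lo, hi] onto the
# 256 int8 levels. A hand-rolled illustration, not TF Lite's converter code.
def quantize(weights):
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / 255.0                  # float value of one int8 step
    zero_point = round(-128 - lo / scale)      # int8 code representing 0.0
    q = [max(-128, min(127, round(w / scale + zero_point))) for w in weights]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """What the MCU does at inference time to recover approximate floats."""
    return [(v - zero_point) * scale for v in q]

weights = [-1.5, -0.2, 0.0, 0.7, 1.1]          # pretend these are model weights
q, scale, zp = quantize(weights)
restored = dequantize(q, scale, zp)
print(q)                                       # 1 byte each instead of 4
print(max(abs(w - r) for w, r in zip(weights, restored)))  # small error
```

The error per weight is bounded by about one quantization step (`scale`), which for well-behaved networks barely moves the predictions, while the model shrinks to a quarter of its float size.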

Prof. Fartsparkle12:18 PM
have you tried tiny-yolov3 yet on the Pi 4?

happyday.mjohnson12:18 PM
@pt i saw image detection (cat, parrot..) but not movement detection (ooh - looky - the parrot fell off its perch and is rolling around the floor...).

Daniel Situnayake12:19 PM
8: Despite the need for tiny models, it's possible to do some pretty amazing stuff with all types of sensor input: think image, audio, and time-series data captured from all sorts of different places

jhonattanrivera joined  the room.12:19 PM

somenice12:19 PM
Besides the great Adafruit resources where can you find other available micro-ML models?

Daniel Situnayake12:19 PM
9: Hopefully that's a useful intro!! I will now stop going on and on and answer some questions :)

PaulG12:19 PM
@happyday.mjohnson check what Bristol University are doing re: fall detection https://www.irc-sphere.ac.uk/100-homes-study

happyday.mjohnson12:19 PM
@daniel i think of those possibilities....i just don't understand the preprocessing -> training -> model building -> compiling into CP...

Christopher Bero12:19 PM
@Daniel Situnayake Awesome, thank you!

pt12:19 PM
@happyday.mjohnson yah, that is possible, could look for parrot and then look for movement

limor12:20 PM
yes the google AIY vision uses the movidius accelerator and a pi zero - totally will work with tensorflow lite. we wanted to experiment with non-accelerated raspi 4s cause they're available, low cost, and we think that these SBCs will only get faster!

Daniel Situnayake12:20 PM
@somenice we're busy adding examples to the TensorFlow Lite repo, you can see them here at http://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/experimental/micro/examples/

happyday.mjohnson12:20 PM
@PaulG - thank you.

Is it possible to train for particular people? IOW, differentiate between pt and limor when you're in frame together?

Duncan12:20 PM
@Daniel Situnayake @Pete Warden How tiny are the models in kB? What is the inference latency on a SAMD level mcu for something like gesture recognition?

Tara12:20 PM
2 Questions:

1. Is this only targeted for Cortex M boards?

2. Will there be any additional guidance on setting up MCUs that are not already available/vetted on TFLite repo or in book. A coworker and I set up the TFLite for Azure Sphere and we got stuck on changes needed for the linker. If we didn't have the ability to ping a guy on Sphere team we wouldn't have made progress. I would love guidance on trying this on different chipsets.

i.e., not just "person"

Daniel Situnayake12:20 PM
we'll soon have models and training scripts for speech hotword detection, image classification, and gestures captured with accelerometer data

john joined  the room.12:20 PM

cet joined  the room.12:20 PM

Evan Juras12:21 PM
Hey everyone - hope you don't mind if I make a shameless plug while we're here! If you're interested in running TensorFlow's Object Detection API on the Raspberry Pi to detect, identify, and locate objects from a live Picamera or webcam stream, I've made a video/written guide that gives step-by-step instructions on how to set it up. It's very useful for creating a portable, inexpensive "Smart Camera" that can detect whatever you want it to. Check it out here:
