ÓSK Squirrel

An Open Smart Kitchen (OSK) assistant for saving resources and meeting dietary goals by better utilizing the food you squirrel away

Whether squirrel or human, the food cycle of foraging, stashing, preparing, consuming, sharing and foraging again is a complicated, time-consuming process. Coordinating with the whole family makes for some dramatic chatter.

There needs to be a "hub" that processes data locally, gives users control, and respects user privacy. An open solution needs to be created before big business dominates the kitchen. We are working on a software framework that will run locally on a smart kitchen gadget. While we will provide an example hardware gadget, we hope others can use this framework to hack the kitchen with networked devices from the pantry to the fridge.

We created a Swift web service with an API for a characteristic kitchen gadget and are working on adding computer vision and machine learning. Our work continues as we refine our minimum viable product (currently a fruit basket) to demonstrate the usefulness of open-source hardware in the kitchen.


The food cycle consumes our time and attention across a range of touch points. It directly impacts how we plan what to eat, where we purchase food, how we store items, how we share our food, and our health and well-being.

The challenge is to provide a hub that enables users to better manage the food in their lives. This hub is a physical device that hosts a framework for managing the food cycle. We will work on a nominal hardware system and create a software framework that can integrate with other pantry-inventory and recipe software.


We aim to create an example of a minimum viable product that serves as a host device for a lightweight, open-source framework for kitchen IoT devices. We want our device to serve as an example of what could be created and as a template for other devices. Our key contribution is really the software framework that ties different projects together. We don't want to recreate another load cell-based smart container or another grocery-list app (these are cool and useful; we just want to connect them all together to be more useful!). We want to create a common interface layer that ties these smart kitchen products together into a useful system.

Forming a bridge between sensors and advanced computing algorithms to connect with users must be accomplished in a private, convenient way. Any such approach needs to provide practical solutions to food lifecycle management. Part of the challenge is to explore which emerging sensors and computer technologies can aid the process.


  • Any “solution” that requires changing user habits or enforcing a specific routine is not a solution. Flexible, adaptable systems are necessary to satisfy the needs of different people.
  • While a hackable API should be made available for advanced users, the basic device should improve the life of a neophyte (a technologically challenged squirrel).
  • The kitchen is a complicated and dynamic environment with a scurry (multiple) of users, thus information should be shareable.
  • Products should be designed to give users control over their own data and objectives.


Often, technology that simplifies grocery shopping is created by businesses that gather user data. While some businesses strive to provide convenient shopping-list and inventory-tracking applications, their real profit comes from user data. Data gathered about a user's behavior patterns is used to maximize the business's profits rather than to genuinely improve the user's quality of life and related objectives. (Convincing you to buy another donut may be profitable for them, but not good for you!)


Create a “hub” that processes product data locally, gives users control, and respects their privacy. An open solution needs to be created before big business dominates the kitchen.

For example, the NVIDIA Jetson Nano is a candidate for affordable, local, high-end data capture and processing. Such a small, energy-efficient, GPU-enabled platform could keep the data in a private network without requiring a remote server for data processing.

Our focus is on creating a scalable architecture for smart kitchen appliances where each kitchen gadget (with its own unique set of devices) can communicate with other gadgets and make life easier.


SquirrelAssembly v10.f3z

This is the Fusion 360 CAD file for our second prototype

f3z - 4.43 MB - 08/25/2019 at 05:29



  • Technical Log: Computer Vision and Machine Learning

    mepix, 08/25/2019 at 06:43

    Since we are using the Jetson Nano as our primary board, we wanted to explore what could be done with the GPU.  We started our explorations with OpenCV and tried a little bit of machine learning with TensorFlow.

    Though we knew that our needs would likely be best served with a neural network to classify different food items, we started with the OpenCV fundamentals.  We wrote a basic image segmentation algorithm using a background subtraction technique included in OpenCV.  This algorithm is sufficient to isolate objects placed on a shelf or in a fruit basket.
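    Stripped of the OpenCV machinery, the background-subtraction idea is just per-pixel differencing against a reference frame of the empty shelf. A toy sketch in pure Python (the function name, threshold, and pixel values are ours, for illustration only, not the project's actual code):

```python
# Toy sketch of background subtraction (pure Python, no OpenCV;
# names and values are illustrative, not the project's code).
def segment(background, frame, threshold=30):
    """Return a binary mask: 1 where the frame differs from the background."""
    return [[1 if abs(f - b) > threshold else 0
             for f, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

# Empty shelf (background) vs. the same shelf with an object added
background = [[10, 10, 10],
              [10, 10, 10]]
frame      = [[10, 200, 10],
              [10, 210, 10]]
mask = segment(background, frame)   # the mask isolates the new object's pixels
```

    OpenCV's built-in background subtractors do essentially this per pixel, plus a statistical model of the background that adapts over time.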

    OSK-SegmentationTest from Mepix Bellcamp on Vimeo.

    Next, we began exploring ways to improve food identification using TensorFlow. TensorFlow is an open-source framework developed by Google for creating deep-learning models and for numerical computation. It is Python-friendly, which made it easy to get started with machine learning.

    TensorFlow programs are expressed as graphs: structures that describe how data moves through a series of processing nodes. We used the Fruit 360 dataset to train a TensorFlow model to identify fruit. Code was based on a file found here. The code to create the model and graph, and to freeze the model into .pb and .pbtxt formats, is on GitHub.

    While we successfully trained on the test dataset on our x86 computers, we had some difficulties getting TensorRT to work on the Nano. Our future work will close this gap.

  • Technical Log: Swift Webserver on the Nano

    ManyHats, 08/25/2019 at 06:37

    A key part of the OSK infrastructure is the embedded webservice, which allows users to check the status of the device and provides an API for gadget-to-gadget communication. The webservice provides GUI-based web pages that an end user can check from either a computer or a mobile device. Different appliances can communicate using device-to-device JSON data transfer via RESTful APIs.
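    As a sketch of what such a device-to-device JSON exchange might carry (all field names here are illustrative, not the actual OSK API):

```python
import json

# Hypothetical status payload a gadget's REST endpoint might return;
# the field names are illustrative stand-ins, not the real OSK API.
status = {
    "gadget": "fruit-basket",
    "devices": {
        "scale": {"weight_g": 412.5},
        "camera": {"last_capture": "2019-08-25T06:30:00Z"},
    },
}

body = json.dumps(status)     # what goes over the wire
decoded = json.loads(body)    # what the peer gadget parses back out
```

    The payload stays self-describing, so any gadget that speaks JSON can consume it without linking against another gadget's code.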

    The selected Swift-based webservice, Vapor, provides the following features:

    • supports the ARM aarch64 architecture used by the NVIDIA Nano and other embedded computers (see buildSwiftOnArm, swift-arm64)
    • links directly with C/C++ via the LLVM toolchain
    • is designed to be memory and CPU resource efficient

    A standalone example and test for Swift/C/C++ integration is provided in OSK-Bridge and OSK-Bridge-Mock. OSK-Bridge provides integration with the OSK Gadget. Our current implementation of a characteristic OSK Gadget is our squirrel shaped fruit basket, which includes both a strain gauge weight sensor and a video camera sensor.  The OSK-Bridge-Mock is used to provide a minimal API interface to support development and test.

    The OSK-Bridge includes the device software which is independently developed in the OSK-Core. OSK-Core provides a C/C++ testbed for working directly with the sensor devices.

    Below are some screen shots of the OSK-WebUI which is hosted on the NVIDIA Jetson Nano.  (Note: the software environment is cross platform and code can be tested and validated on multiple operating systems before deployment on an embedded system.) The web pages shown below use the Bootstrap CSS/JS toolkit to provide responsive pages which dynamically scale for both mobile devices and desktop computers. Pages can be easily added and customized.


    Provides a description and a placeholder where additional information can be provided to the user.

    Fruit Basket
    Displays the current contents of the fruit basket.  This could be expanded to show additional data beyond our prototype or adapted to another gadget.

    Since we are using Bootstrap for the template, the webpage easily adapts to mobile platforms.

    Settings Page
    The Settings Page allows a user to change their device preferences.

  • Technical Log: Manufacturing

    mepix, 08/25/2019 at 05:57

    After the CAD was completed, we moved forward with the construction of our second prototype since we wanted to show something a little more "thematic" for the Hackaday Prize entry! 

    We used the CAM tool in Fusion 360 to carve the squirrel to hold the bracket.  During the CNC operation, we left 1mm of material at the bottom of the cut to save the waste board on the machine. This extra material was removed with a Dremel and then sanded.  (Here is a fun tutorial for those who want to get started with CNC and CAM using Fusion 360)

    After the squirrel was carved, we had to mount it vertically on a test board.  We chose some typical pine stock from Home Depot and some angle brackets.  We used bolts to hold it together since we thought we might have to change or move some components around as we built the second prototype.

  • Technical Log: Squirreling Around with CAD

    Makayla, 08/19/2019 at 04:46

    We've done rough testing setups, so now it's time to work toward making something that actually looks... well... nice, and isn't just scrap slapped together. We decided to go with a hanging fruit-basket design to more easily incorporate a camera and to let the weight self-center for accurate readings.

    After squirreling around in Fusion 360, we decided to go with the design pictured below.  CAD rendering is here and the file is in the project files.

  • Design Thinking 4: Prototype

    ManyHats, 08/18/2019 at 22:49

    The fourth step in the design thinking process is Prototype. This stage is where the team developed scaled down versions and experimented with specific features found within a potential product.  We decided to initially focus on a fruit basket that could sense when fruit like an apple or banana is removed or added to the basket and provide feedback to a server that could interact with a larger software ecosystem.

    We first tested how the software interacted with the different sensors to determine which might work with the features we were considering. Then we played around with the load cell and developed a "scale" to determine whether it could detect weight differentials.

    Following the load cell tutorial at SparkFun, we assembled a testing rig out of scrap wood. To simulate a fruit basket's use and to test the calibration and data-collection process, we used an apple and kitchen weights.

    We changed the code to calibrate the scale and to add a ring buffer such that the data would only be returned if there's a change in weight over a fixed set of data.
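    A minimal sketch of that calibrate-plus-ring-buffer idea (pure Python; the calibration constants, window size, and threshold are illustrative values, not our actual calibration):

```python
from collections import deque

class Scale:
    """Sketch: convert raw load-cell counts to grams and report the
    weight only when it changes over a fixed window of readings.
    All constants are illustrative, not our actual calibration."""

    def __init__(self, offset=0, counts_per_gram=420.0, window=5, threshold_g=5.0):
        self.offset = offset                    # raw reading at zero load
        self.counts_per_gram = counts_per_gram  # calibration slope
        self.buf = deque(maxlen=window)         # ring buffer of recent readings
        self.threshold_g = threshold_g
        self.last_reported = None

    def push(self, raw):
        """Add a raw reading; return the new weight if it changed, else None."""
        self.buf.append((raw - self.offset) / self.counts_per_gram)
        if len(self.buf) < self.buf.maxlen:
            return None                         # window not yet full
        avg = sum(self.buf) / len(self.buf)
        if self.last_reported is None or abs(avg - self.last_reported) >= self.threshold_g:
            self.last_reported = avg
            return avg
        return None

s = Scale()
# Empty basket, then an apple (~150 g, i.e. 63000 raw counts) is added
readings = [0, 0, 0, 0, 0, 63000, 63000, 63000, 63000, 63000]
reports = [s.push(r) for r in readings]
# reports stays None until the window fills, then ramps up and
# settles at 150.0 g once the whole window sees the apple
```

    The deque-based ring buffer means a single noisy sample can't trigger a report; only a sustained change over the window does.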

    Next we built on that initial prototype. We used threaded rods to provide a breadboard area to mount the camera and the NVIDIA® Jetson Nano™ above the scale. The camera is placed to identify items below in the fruit basket.


    Photos: the load cell "scale," and the scale with threaded rods added.

  • Design Thinking 3: Ideate

    ManyHats, 07/29/2019 at 21:45

    The next step in design thinking is Ideation: idea generation. We came up with several ideas to solve the problems we identified in the Define phase. Here are some of our ideas, as depicted in these photos.


    After this session, we came up with a list of sensors to try. Some of the sensors are pictured here:

    Some of the sensors and supplies include:

    • Raspberry Pi Camera Module V2
    Can be used to take photos of food as it leaves a pantry (for AI) or to scan bar codes.

    • SparkFun Qwiic HAT for Raspberry Pi
    Can be used to stack as many sensors as you’d like to create a tower of sensing power!

    • SparkFun Qwiic Adapter
    Can be used to turn an I2C board into a Qwiic-enabled board.

    • SparkFun Qwiic Cable Kit
    Who doesn't need cable supplies for hacking?!

    • SparkFun Load Cell - 10kg, Straight Bar (TAL220)
    This bar load cell (sometimes called a strain gauge) can translate up to 10 kg of pressure into an electrical signal. SparkFun had what seemed to be a great tutorial for learning about this load cell.

    • SparkFun Load Cell Amplifier - HX711
    This amplifier is a small breakout board that lets the user easily read load cells to measure weight. It also improves accuracy.

    • SparkFun Qwiic Scale - NAU7802
    This scale lets the user easily read load cells to accurately measure the weight of an object. By connecting the board to a microcontroller, users can read the changes in the resistance of a load cell.

    • Square Force-Sensitive Resistor
    This sensor lets you detect physical pressure, squeezing, and weight.

    • Magnetic Contact Switch (Door Sensor)
    This sensor detects when a door or drawer is opened.

  • Design Thinking 2: Define

    ManyHats, 07/29/2019 at 21:37

    The next step in design thinking is Define. We spent time further defining the problems in the kitchen based on the Empathize step. Define means determining users' needs, their problems, and related insights.

    Some of the things we came up with:

    • Overworked and tight-budget users need an assistant to help save money and time because they want to take advantage of sales but don't have enough time to run all over the place to a bunch of different stores.

    • Environmentally conscious chefs want ways to prevent waste of food and waste of food packaging to help them feel like they are contributing to saving the environment. 

    • Parents need help making quick cooking decisions when they're hungry so they can avoid the pitfalls of making decisions that don't meet their goals.

    • People want group collaboration/input in meal planning and help making menu decisions, so that the family can enjoy what they eat and actually be able to sit down together and eat.

    • Parents need fast and easy shopping assistance because bringing the kids to the store is difficult. Speeding up the food shopping and delivery processes is important.

    • Shoppers need notification when products are used up by others. They are not the only one consuming the product and don't necessarily know when items are consumed/emptied.

    • Cooks with restricted diets need an assistant to help stick with their diet so they can be healthy and live longer.

    • Cooks with small space need to remember what is in the pantry so they have enough of the ingredients to cook specific recipes.

    • Cooks want fast and easy meals to save money by eating out less and purchasing fewer prepared foods.

    • Cooks need some way to track what is consumed (eaten or spoiled) so that they have up-to-date information on what's available in the pantry.

  • Technical Log: Software Structure

    ManyHats, 07/28/2019 at 20:28

    As we began prototyping, we wanted to create a scalable software architecture.

    Each smart kitchen contains smart gadgets (appliances, pantries, etc.). Each smart gadget is composed of one or more devices (cameras, weight sensors, humidity sensors, etc.). These devices all store some data and can act as an input and/or an output for the gadget.

    Different gadgets in the OSK ecosystem can talk to each other, but they cannot directly access another gadget's device. If data needs to be shared, the appropriate API should be incorporated into the host device's logic.
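    A rough sketch of that rule in code (the class and method names are ours, for illustration only, not the actual OSK framework):

```python
class Device:
    """A sensor or actuator owned by exactly one gadget."""
    def __init__(self, name, value=None):
        self.name = name
        self.value = value

class Gadget:
    """Gadgets expose data through an API; their devices stay private."""
    def __init__(self, name):
        self.name = name
        self._devices = {}   # private: other gadgets never touch this directly

    def attach(self, device):
        self._devices[device.name] = device

    def api_read(self, device_name):
        """The only sanctioned way another gadget reads this gadget's data."""
        return self._devices[device_name].value

basket = Gadget("fruit-basket")
basket.attach(Device("scale", 412.5))

pantry = Gadget("pantry")
# The pantry asks the basket through its API rather than
# reaching into the basket's _devices dictionary.
weight = basket.api_read("scale")
```

    Keeping device access behind the gadget's API is what lets each gadget swap or upgrade its sensors without breaking the rest of the ecosystem.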

    This is our diagram for the gadgets. Software development has started, but is still very minimal as we want to keep the framework flexible at this stage. More information can be found on our GitHub.

  • Technical Log: Pi Camera on the Jetson Nano

    ManyHats, 07/28/2019 at 20:21

    A quick test to verify the view angle for the Raspberry Pi camera.

    We need a camera in our project to perform computer vision and other typical AI functionality. GitHub link with full information here.

    Technical Parameters

    The eLinux Wiki provides the technical information about the camera, including the Angle of View: 62.2 x 48.8 degrees.

    Empirical Results

    By measuring the distance between the camera and a calibrated target, we determined that the camera has a view angle of ~45 degrees, which is close enough to corroborate the 48.8 degrees in the spec.
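    The geometry behind that measurement can be sketched as follows (the 25 cm target at 30 cm is an illustrative stand-in, not our actual bench numbers):

```python
import math

def view_angle_deg(target_width, distance):
    """If a target of width W just fills the frame at distance D,
    the view angle is 2 * atan(W / (2 * D))."""
    return math.degrees(2 * math.atan(target_width / (2 * distance)))

# A 25 cm wide target filling the frame at 30 cm distance
angle = view_angle_deg(25.0, 30.0)   # roughly 45 degrees
```

    The same formula works vertically or diagonally; it only matters that the width and distance are measured in the same units.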

  • Design Thinking 1: Empathy

    --marc, 05/21/2019 at 00:39

    As we start playing with and testing the Jetson Nano, we are also trying to understand how this technology fits in the kitchen by conducting a mini design-thinking exercise. The first step in design thinking is Empathy. We spent time understanding the different habits, attitudes, and problems experienced in the kitchen.

    Common themes were: variety, health, inspiration/recommendation, time, budget, waste, privacy, and social.

    Our three key takeaways are:

    1. While food underpins everything we do (we need energy to work, play, and create), people often dread spending excessive amounts of time in the kitchen.

    2. People strive to eat and live healthy lives, but often trade a healthy diet for convenience.

    3. Though people feel bad creating extra waste (excess packaging, forgotten leftovers, and rotten fruits), chefs make choices that save time and money in the short term in an effort to reduce stress in the kitchen.

    We've included photographs from our brainstorming sessions and a text summary of our findings.

    Is there anything else you think we should add?

    We would love your input.

