
Machine Eye

A low-cost smart camera using cognitive programming to see more than what we can perceive. Inspired by RoboCop!

The goal of the project is to use the Raspberry Pi computer and camera to capture more than the human eye can see and present some out-of-this-world data. The heading picture is my best pal, Mac the Dog.

Details

  • Preliminary Testing of Camera

    Brenda Armour, 02/05/2017 at 14:48

    It is time to become familiar with the camera and all its options. I will be hacking the code to make this camera smarter than my own interpretation of what I see in a picture. I have been egged on by friends to add voice to the camera; I'm not sure that is in scope, but I am intrigued by the idea. I'm going to use PuTTY to enter commands, so I need the IP address of the Pi. I'm using Ctrl+Alt+F1 on my keyboard to get to the command line on the touchscreen.

    Use the command ifconfig to get your IP address:

    ifconfig

    Now I can open PuTTY and invoke the camera for testing.

    cd adafruit-pi-cam-master
    sudo python cam.py

    The screen should boot to a live preview of what the camera is seeing. Okay that is my duck on the power supply.

    To take a picture you just have to touch the screen.
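
    For anyone following along, here is a minimal sketch of the preview-and-capture flow using the picamera library. This is not the Adafruit cam.py (which adds the touchscreen UI and draws its own preview), only an assumption of the basic camera calls it builds on; the resolution and file path are example values.

    # Minimal preview-and-capture sketch with the picamera library.
    # Not the Adafruit cam.py -- just the basic calls it builds on.
    import time
    import picamera

    with picamera.PiCamera() as camera:
        camera.resolution = (1920, 1080)      # example resolution
        camera.start_preview()                # preview overlay (cam.py renders to the PiTFT differently)
        time.sleep(2)                         # let auto-exposure settle
        camera.capture('/home/pi/photo.jpg')  # save a still to disk
        camera.stop_preview()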

  • Kit has arrived and Assembly Completed

    Brenda Armour, 02/04/2017 at 14:23

    After a delay, I received the kit on Wednesday. The kit exceeded my expectations, and I was able to put the camera together and install the software in one night. Below is a picture of the box. It also came with a case, faceplate, and power supply.

    I already had a Raspberry Pi 3, but it jumped off the table and the SD card holder broke beyond repair (RIP). I have several cameras, but this one is the V2: 8 megapixels and 1080p video!

    Below is a picture of my assembled camera. I am using the Version 1 Raspberry Pi camera for now; I have a sad history of breaking things and will reserve the new camera for final testing. I also have the case for the camera to protect it from me. The final setup will probably look different, but for now I'm happy with the on-board WiFi and Bluetooth that come with the Pi 3.

    Software installation went really smoothly, with no major errors or issues at all. I followed Adafruit's comprehensive guide, found here. The type of screen that was sent to me was clearly stated on the box, which is important because it determines which image you download.

    I used the Easy Install method because the complex programming is still ahead of me. The image did boot nicely, but for some reason the camera and Dropbox support did not install. No problem, as the code is included in the guide.

    Now, my friend laughed when he saw how small the screen was: how are you going to program with that? Well, most of us have a wireless keyboard and mouse, so if you place the dongle in the Pi you can use those, or you can program from the terminal. I love using PuTTY as a terminal; it is free and makes it easy to connect to the Pi from my laptop. You can also switch the touchscreen to the command line with Ctrl+Alt+F1.

    Dropbox will send the pictures to the cloud, where they will be accessible on my laptop and BlackBerry. It is free, and Adafruit details how to set up a Dropbox account and configure your Pi.
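
    Just to illustrate the idea, here is a minimal sketch of pushing a photo to Dropbox with the official Python SDK. The Adafruit guide sets up its own uploader, so treat this as an alternative illustration; the access token and paths below are placeholders.

    # Sketch only: upload a captured photo with the Dropbox Python SDK.
    # The access token and paths are placeholders, not real values.
    import dropbox

    ACCESS_TOKEN = 'YOUR-DROPBOX-ACCESS-TOKEN'   # placeholder

    dbx = dropbox.Dropbox(ACCESS_TOKEN)
    with open('/home/pi/photo.jpg', 'rb') as f:
        # The file then shows up on any device signed into the account.
        dbx.files_upload(f.read(), '/MachineEye/photo.jpg')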

    I am happy that I took the time to set up a prototype for the API I will be using. This is my first time using a touchscreen on a Pi, and the resolution is awesome. Before I start hacking the camera code, I need to do a bit of camera testing.

  • The Delivery Machine entered my Postal Code wrong

    Brenda Armour, 01/27/2017 at 20:11

    Looks like they entered one digit of my postal code incorrectly, so after some considerable time on the phone we finally figured out where the mystery shipping address came from. The delivery machine failed :C but now I should receive the kit on Monday or Tuesday.

    So I'm doing some prep work to get the code set up. The PiFace Display was an awesome first start, but I want to modify the code to use the touchscreen for more robust output. The programming so far has been in Python 3. I'll upload the PiFace Display version to GitHub, and if I have time I may tweak it so a remote control can trigger a picture.

    At this point I'm just running the code I have, dragging numerous toys in front of the camera; I will be targeting humans, animal companions :) and outside scenery soon. Okay, last video of my toys parading in front of the camera. I'm giving this effort 9/10: the machine thought my lamp bottom was a vase, but it was bang on with the sitting cat.

    My initial setup follows the excellent Adafruit learning guide, which can be found here. I have downloaded the image and will continue to set up my SD card for the display.

  • First Prototype

    Brenda Armour, 01/26/2017 at 21:37

    I am waiting on my kit from Arrow for a more advanced camera setup. In the meantime, I am very pleased with the initial results from my Raspberry Pi 1 and PiFace Display. Okay, it was correct about "sitting on a table" but decided to call the fish a "parrot". Then my machine decided this was a blurry picture of a bird feeder? Must train the machine!
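
    The logs don't name the recognition service yet, so purely as a hypothetical sketch, this is the general shape of sending a captured image to an image-description REST API with the requests library. The endpoint URL, header name, and response field are placeholders and not the actual service used in this project.

    # Hypothetical sketch: POST a photo to an image-description API and
    # print the caption it returns. The endpoint, key, and response field
    # names are placeholders, not the real service used here.
    import requests

    API_URL = 'https://example.com/vision/describe'   # placeholder endpoint
    API_KEY = 'YOUR-API-KEY'                          # placeholder key

    with open('/home/pi/photo.jpg', 'rb') as f:
        response = requests.post(
            API_URL,
            headers={'Api-Key': API_KEY,
                     'Content-Type': 'application/octet-stream'},
            data=f.read(),
        )

    response.raise_for_status()
    result = response.json()
    print(result.get('caption'))   # e.g. "a cat sitting on a table"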
