
Lepton 3.5 Thermal Imaging Camera

Documenting my experiments with the FLIR Lepton 3.5 thermal imaging camera.

I bought a FLIR Lepton 3.5 mounted on the Pure Engineering breakout after using a friend's thermal imaging camera to analyze heat generation on a PCB I designed. The Lepton 3.5 can output absolute temperature data (radiometric) at 160x120 pixel resolution. I decided a thermal imaging camera would be a good tool to have and chose to build my own. I have started with a Teensy 3.2 based test platform but eventually want to turn a Beaglebone Black with a 7" LCD cape into a full-featured, networked camera, using the PRUs to handle the real-time video SPI feed from the Lepton. I hope the documentation in this project is helpful to others who might also want to play with these amazing devices.

The long-term goal is to create a capable thermal imaging camera, matching some of the features of high-end commercial products, using the Beaglebone Black and a 7" LCD cape as the platform.  However, when I started reading the documentation and playing with the various demo codebases, it soon became obvious that I'd need a simpler platform to learn how to use the Lepton module.  It is a very capable device with a moderately complex interface, both firmware and hardware.  Although the device has good default settings, I found that enabling some features wasn't well documented and that the video SPI interface (VoSPI) was challenging to implement due to its real-time constraints.


There are a lot of other great projects online to help get going with the FLIR sensors.  Pure Engineering is to be commended for making these devices available to makers; it provides a wealth of code examples, many designed to work with the previous Lepton models.  Max Ritter's DIY Thermocam is probably the most mature and well known.  He has done a great job and I pored over his code.  Damien Walsh's Leptonic is also really well done and works with the Lepton 3.5 as well.  Both Max and Damien were very gracious when I sent them various questions.

I decided to follow Max's lead and build a test platform using a Teensy 3.5 that I had (selected for the multiple SPI interfaces and copious RAM).  Unfortunately after soldering the Teensy to a Sparkfun breakout board I stressed the processor BGA package and made the board unreliable.  So I replaced it with a Teensy 3.2 hoping it would have enough resources to successfully interface to the Lepton.  It does, barely, and the next project log describes the test platform hardware.

teensy_schematic.pdf

Test platform schematic

Adobe Portable Document Format - 26.67 kB - 07/10/2018 at 19:11


View all 18 components

  • tCam video

    Dan Julio06/24/2020 at 15:58 0 comments

    I entered tCam into a contest sponsored by Tech Briefs Media Group.  Here's a video made for the entry that shows the camera in operation.

  • tCam progress and some setbacks

    Dan Julio06/23/2020 at 05:08 0 comments

    The gCore-based camera has now become tCam and has probably outgrown its gCore base.  I added code to enable WiFi and a json-based command processor for remote access and control, with a command structure like FireCAM's.

    Two interesting side-effects surfaced.  There is a little noticeable noise in the data from the Lepton when the ESP32 is acting as an access point (but not when it is a client).  The LCD controller also gets confused on occasion when WiFi is enabled and the Lepton performs a FFC.  I suspect the current required on the 3.3V rail exceeds the ability of the gCore TPS63051 converter and the voltage sags enough to confuse the LCD controller.  I expect this will be an even bigger problem when adding support to read and write data to a Micro-SD card.

    Probably will need to design a new PCB with a better power supply.

    In the meantime I also wrote the first version of the desktop application that will be used for remote viewing, analysis and file management in the camera.

    Eventually I want to add support for multiple selectable sense regions and a plotting module so I can monitor temperature changes over time (e.g. monitoring several areas of a PCB).

    I also added a new screen to tCam to allow setting emissivity from a list.

  • FireCAM

    Dan Julio05/22/2020 at 18:45 0 comments

    My work with the Lepton led to an opportunity to design a timelapse camera for a scientist working for the U.S. Forest Service Southern Research Station.  He plans to use the radiometric (temperature) data from the camera to study forest fires.  He also requires visual image data.  The result is a device called FireCAM based around the Lepton 3.5, an ArduCAM 2MP, an ESP32 and a GUI using LittlevGL.  The project also includes a simple utility program for Mac or Windows to control the camera via a WiFi interface.  Because it is a government project I was able to release the complete design as open-source.  It can be found on github.  Images are stored as json-formatted files (or sent via the wireless interface), which leads to all kinds of interesting possibilities I think.
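    FireCAM's actual schema isn't reproduced here, but as a rough sketch of the json-image idea (all field names below are my own invention, not the project's real format), packing raw radiometric data plus metadata into json might look like this:

    ```python
    import base64
    import json

    # Hypothetical json image file - field names are illustrative only,
    # not FireCAM's actual schema.
    raw_frame = bytes(160 * 120 * 2)   # one 160x120 frame of 16-bit radiometric data
    image = {
        "metadata": {
            "camera": "FireCAM",
            "timestamp": "2020-05-22T18:45:00",
            "emissivity": 0.95,
        },
        # json can't hold raw bytes, so binary data is base64-encoded
        "radiometric": base64.b64encode(raw_frame).decode("ascii"),
    }
    text = json.dumps(image)           # ready to write to a file or send over WiFi

    # Reading it back recovers the sensor data unchanged
    restored = base64.b64decode(json.loads(text)["radiometric"])
    assert restored == raw_frame
    ```

    Storing the raw data this way means an image can be re-rendered with any palette, or re-analyzed for temperature, long after capture - which is presumably part of the appeal.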

  • A change in plans

    Dan Julio05/09/2020 at 20:11 0 comments

    My poor thermal imaging camera has been neglected while life went on and my time was spent elsewhere.  I even designed a thermal imaging time-lapse camera for a client that was based on the ESP32.  As I worked on that project I became more disillusioned with my Pocketbeagle linux camera solution, primarily because of the long boot time for linux.  In addition, the PRU pipeline would have to be reworked to support 16-bit radiometric data (it currently only supports 8-bit AGC data) and the 320x240 pixel LCD display seemed constraining.  I wanted to display a reasonable size image along with some controls.  The easiest image size to make is 320x240 because it requires a simple doubling of the Lepton's 160x120 pixel output.  However this leaves no space for controls unless they sit on top of the image.

    Long story short, I ended up deciding to remake the camera using an ESP32.  To that end I designed an ESP32 development board called gCore (documented here) that uses a 480x320 pixel display and started work on a new camera design with the Espressif ESP-IDF and LittlevGL.

    The hardware is pretty simple: a Lepton 3.5 connected to gCore via a swivel mount comprised of two simple 3D printed mounts and a screw, and a Maxim/Dallas DS3232 RTC with 236 bytes of SRAM for settings and parameter storage.  The RTC is powered directly from the LiPo batteries using its own ultra-low quiescent current LDO and communicates over I2C.

    Firmware is a work-in-progress.  At the moment the camera functionality is pretty much done.  It supports both radiometric and AGC modes.  The radiometric mode allows extraction of image temperature anywhere.  AGC can be used for higher image quality.  There's a spot meter and a bunch of palettes.  The Lepton's image can be smoothly interpolated up to 320x240 pixels.  I plan to add the ability to store images or videos on a MicroSD card.  However instead of using an image format, such as png or jpg, I will store json-formatted files with the raw radiometric and telemetry data from the Lepton and other camera meta-data.  This will allow generating images or performing thermal analysis on images later.  I also plan to add a wifi-based command interface for talking to an application for display, control and analysis of images.  I haven't figured out how to do this yet but I'd like the application to be able to further calibrate the camera for more accurate readings (FLIR claims at best +/- 5C accuracy with a stock Lepton).
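    For reference, when the Lepton 3.5's radiometric (TLinear) mode is enabled, each 16-bit pixel is an absolute temperature in centikelvin (0.01 K per count in high-gain mode), so converting a raw pixel to Celsius is a one-liner - a quick sketch:

    ```python
    def raw_to_celsius(raw: int) -> float:
        """Convert a Lepton 3.5 radiometric pixel (TLinear mode,
        0.01 K per count) to degrees Celsius."""
        return raw / 100.0 - 273.15

    # 29815 counts = 298.15 K, i.e. about 25 C (room temperature)
    assert abs(raw_to_celsius(29815) - 25.0) < 1e-6
    ```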

    I haven't completely forgotten the Pocketbeagle camera.  I think I'll turn it into a web cam with a basic interface on the LCD and remote viewing through a web interface.  Not a huge amount of code I hope.

  • littlevgl GUI running

    Dan Julio07/26/2019 at 04:59 0 comments

    I wanted to see if the littlevgl GUI library could be used on my camera to provide a GUI on top of and around the video stream.  One great thing about this library is a well-defined hardware abstraction layer that already includes support for the linux frame buffer and linux touchscreen event driver.  I already had the frame buffer running on my LCD module and I finally figured out how to use the LCD's resistive touch controller with new device tree entries.  This allowed me to quickly get the littlevgl demo program running.

    littlevgl also has a direct SPI-based driver for the TSC2046 resistive touch controller IC that I originally thought I might use.  Typically this would be used on a bare-metal embedded system (such as an Arduino or Teensy).  However as I researched it became clear that using the built-in linux touchscreen driver would be superior because I could make use of tslib's ability to provide calibration and de-jitter for the resistive touchscreen.  I did have to make a hardware change and add the pendown IRQ to a GPIO (GPIO27/P2.19) on the Pocketbeagle so my wiring diagram is updated.

    Architecturally the system looks as follows.  The tslib utility program ts_calibrate is run before the application to generate the pointercal file used subsequently by the daemon.

    The resistive touch screen works fairly well and will probably be sufficient for my application.  But we are spoiled by our capacitive touch screens; using something like the Adafruit Cap Touch LCD Module would eliminate the need for tslib and provide much better control, at a substantial increase in cost (USD14 -> USD45).

    The updated device tree files, uEnv.txt and configured littlevgl library and demo are available on github.  I also got this running on a Beaglebone Black.  Even if you never build this camera perhaps the ability to have a sophisticated GUI running on a cheap LCD without needing the entire window system might be useful.

  • Pocketbeagle Camera Design

    Dan Julio06/07/2019 at 20:44 0 comments

    This poor project has languished too long.  I apologize if anyone has been waiting for the info I promised months ago.  I have actually used the camera, but the documentation was waiting to be pulled together, as was quashing a niggling bug in the PRU VoSPI code that prevented it from properly syncing with the Lepton on occasion.  Over the last couple of weeks I finally had some spare time and updated my github repository with a detailed description of the camera, the modifications to the Solar Pi Platter, the enclosure design and initial software.  Please head over there to see a bunch of new info in the "pocketgbeagle" directory describing the camera build.

    Next up will be to start on what I hope to be the real application software for the camera, including a touch interface for the LCD display and remote access through a web browser as well as a custom desktop application.  I'm going to try to make a modular architecture so that parts of the code can be used on existing Beaglebone and Pocketbeagle computers with the Lepton 3.5 without having to build a bunch of hardware if one doesn't want to - for example to make a thermal-imaging webcam.

    Here's the schematic of the camera.

    And an exploded view of the OpenSCAD designs for the 3D printed case.

    The existing program, pru_rpmsg_fb, used to display the video stream on the LCD display, and the necessary support files for the Pocketbeagle, are also in the repository.

  • An external PMIC makes for a boxy camera

    Dan Julio02/19/2019 at 05:22 0 comments

    But I have a camera now!  After the disappointment with the Pocketbeagle's built-in power-management system I decided to just hack my own Solar Pi Platter to act as a battery charger, power supply, RTC and USB Hub and then 3D printed a case.  After a lot of spray painting I now have a brick of a camera!

    This is a quick update.  In case anyone is interested I will shortly update github with the schematic, new code for the Pi Platter and the OpenSCAD/STL files for the enclosure.  Currently it's just running the existing PRU code ported to the Pocketbeagle.

    I had one huge panic.  I was manipulating the PCBs and bumped the Lepton on the desk.  The shutter came apart and I almost did too.  Fortunately I didn't lose any small parts.  But it didn't go back together in a functional state.  The shutter stuck whenever it closed.  After contemplating another $300 I started hacking.  Finally removing an internal shield and taping the whole thing back together yielded a functional part again.  At least for now.

  • Beaglebone PMIC - so close and yet so far

    Dan Julio12/21/2018 at 04:57 1 comment

    The Beaglebone boards have an additional capability, aside from the PRUs, that differentiates them from many other SBCs like the Raspberry Pi.  They have a true power-management controller (PMIC) that generates all the various on-board DC voltages and supports dual input voltage sources (DC in and USB in) as well as a 3.7V LiPo or Li-Ion battery, including a charger.  In addition the PMIC, a TI TPS65217 designed specifically for the AM335x processor, can be configured to supply power to the AM335x built-in real-time clock while the system is powered down, and sports a power button that interacts with the kernel for controlled shut down - features that are perfect for a portable device like a thermal imaging camera.

    Or at least in theory.  It turns out that a hardware bug in the first revision AM335x processor and decisions about how to wire the PMIC to the processor in the original Beaglebone Black made it so the "RTC-only" mode would never work, and under certain circumstances could actually damage the processor (a current leakage path through the processor to the USB serial interface).  However based on a lot of reading of various specs and errata I ascertained that the pocketbeagle shouldn't be at risk so I was excited to see how it ran on battery power (the battery signals are helpfully broken out).

    I connected an old camera battery to the pocketbeagle and added a 5V step-up boost converter to power a USB WiFi dongle attached to the pocketbeagle's host USB port.  The PMIC wants a temperature sensor in the battery pack and the camera battery had the right kind (10K).  Unfortunately the PMIC doesn't generate a 5V output when running on the battery, so the boost converter was necessary.  I noticed that the 3.3V output was only powered when the pocketbeagle was turned on, so I used it to feed the boost converter.  Success!  Or at least I thought at first.  The pocketbeagle ran happily from the battery with a reasonable current draw (< 200 mA w/ WiFi).  And it charged the battery too when plugged in via the USB Micro-B interface - at least while the system was running.  The problem came when the system was shut down.  Charging stopped.  This is hardly ideal for a portable device as not only do we expect them to charge while off, but turning the system off makes more power available for battery charging.

    The TI PMIC has the capability to charge the battery when its DC output supplies are disabled (system off) but clearly something was disabling that when the system was shut down.  I dug into kernel source with help from a friend at my hackerspace and finally figured out the sequence of events that prevented charging when the system was shut down.  It turns out that the device driver for the PMIC (tps65217.c) configures the PMIC to enter "off" state when powered down instead of "sleep" state which would let it charge the battery.  A fuller explanation is documented in the pocketbeagle's google group.

    I thought this was good news because the driver configures the PMIC during probe at boot time and the PMIC can be reconfigured via its I2C bus after the system boots.  I tried it.  Success again - kind of.  The battery charged after the system shut down.  But there was also a ~15 mA load on the battery from the switched-off system - enough of a load to discharge a battery pretty quickly if the system was disconnected from the charger.  Unfortunately the pocketbeagle uses an Octavo system-in-a-package and I couldn't understand what could be causing the additional discharge.  Fortunately Octavo replied to a question on their support forum.  When the PMIC is configured to "sleep" mode enabling battery charging, it also enables one LDO regulator that is designed to feed the RTC input on the processor.  Octavo also connected another power-rail (plus an enable to the 3.3V LDO) to that output per TI's spec and...

    Read more »

  • Finally! PRUs in the picture

    Dan Julio12/07/2018 at 04:54 0 comments

    My post-holiday obsession continued until I am - finally - reading data from the Lepton using both PRUs in a Beaglebone Black.  Given how - relatively - easy it is to use the PRUs, I have to confess I pored over a lot of web postings before figuring it out.  I'm not even the first person to document using the PRUs to read a Lepton camera.  That honor, as near as I can tell, goes to Mikhail Zemlyanukha who used a PRU and a custom kernel driver to get image data on a 4.9 kernel system.  Unfortunately, as I found out, programming PRUs is an evolving paradigm and what worked in the past, including Mikhail's code and old methods such as the UIO interface, no longer works on current kernels.  Finally I found Andrew Wright's tutorials, started reading TI's remote processor and rpmsg framework source, and started to get code running on the PRUs.

    Although it was a slog, I have been converted.  I think the real-time possibilities offered by the PRUs in close cooperation with the Linux system are amazing.  The PRU is my current favorite embedded micro-controller because it has easy access to an entire Linux system without the baggage of any OS - and on something as small and cheap as the pocketbeagle.

    In case the following looks TL;DR, the code is in the github repository.

    Failed First Attempt

    I was daunted trying to get Mikhail's kernel driver running on my 4.14 system but understood his use of one PRU to capture packets and send them upward to user-land.  I also successfully built and ran the rpmsg "hello" demos.  The rpmsg system is built on top of the kernel's virtio framework to allow user-land and kernel processes to talk to remote devices (e.g. embedded cores or co-processors).  TI has adopted it as the "official" mechanism for their OMAP processors to talk to the on-board co-processors (including the PRUs and the power-management ARM core).  It is probably used in every smart phone as I found Qualcomm's contributions to the source.  The kernel's rpmsg driver makes a co-processor available as a simple character device file that user-land processes can read or write just like any other character device.

    So I put together a simple program that got non-discard packets from the PRU and sent them to the kernel using rpmsg.  The PRU bit-banged a simulated SPI interface at about 16 MHz, buffered one packet's worth of data (164 bytes) and then copied it to kernel space via rpmsg.  My idea was to essentially replace the calls to the SPI driver in earlier programs with a call to the rpmsg driver to get SPI packets through it via the PRU.  I did have the sense to try to filter out discard packets but still, BOOM.  My BBB was brought to its knees by a message from the PRU about every 95 uSec (basic packet rate at ~16 MHz SPI + a very quick buffer copy - the PRUs are excellent at pushing data to main system memory).  The system was 100% pegged and unresponsive, and my application seemed to be getting about 1 out of 100 packets.

    I didn't know it at the time but I was way overrunning the capability of the rpmsg facility and the kernel was bogging down trying to write an error message for each rpmsg from my PRU (that was overflowing its virtio queues) into several log files.  I saw later the hundred megabytes of log files that had accumulated.  The poor micro-SD card.

    Taming rpmsg

    Clearly I had to reduce the frequency at which messages were sent to the kernel driver - and also increase the amount of data sent at a time.  Quickly I found that the maximum rpmsg message size is 512 bytes, of which 16 bytes are used for message overhead.  It took a long time - this stuff doesn't seem to be documented anywhere - to understand that the kernel could manage a maximum of 32 entries in a queue for messages for one rpmsg "channel".  At least I had some parameters to work with.  The Lepton is...

    Read more »
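    A quick back-of-the-envelope check of the numbers in this log (the constants come from the text above; this is just arithmetic, not the actual PRU code):

    ```python
    PACKET_BYTES = 164           # one VoSPI packet
    SPI_HZ = 16_000_000          # approximate bit-banged SPI clock rate
    RPMSG_MAX = 512              # maximum rpmsg message size in bytes
    RPMSG_OVERHEAD = 16          # bytes consumed by the rpmsg header

    # Time to clock in one packet: 82 us, consistent with the observed
    # ~95 us per message once the buffer copy and setup are added.
    packet_time_us = PACKET_BYTES * 8 / SPI_HZ * 1e6

    # How many whole packets fit in one rpmsg message?
    payload = RPMSG_MAX - RPMSG_OVERHEAD        # 496 usable bytes
    packets_per_msg = payload // PACKET_BYTES   # batching 3 packets per message
                                                # cuts the message rate to 1/3

    assert abs(packet_time_us - 82.0) < 1e-9
    assert packets_per_msg == 3
    ```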

  • [Finally] Playing with Beaglebone Black

    Dan Julio11/24/2018 at 17:34 0 comments

    The American Thanksgiving holiday finally gave me some time to attend to this project again.  I even planned to use the Teensy version of the camera to look at my daughter's house to help her and her husband identify where it was losing heat but sadly I left it at home in a rush to get out the door for the holiday.

    My end goal has been to make a wifi-enabled camera using the Beaglebone Black, with one or both of its PRUs handling the Lepton VoSPI data stream, to get the full 9 FPS out of the camera and be able to view it and access photos remotely.  I looked into using the ESP32 like Damien Walsh did and although it is an incredibly capable embedded system, I ultimately decided that I want a full Linux environment to build the camera application on.  Experiments with the Raspberry Pi show that it's hard for a user process to keep up with the VoSPI data flow - although as someone from my Makerspace pointed out, it's quite possible that applying the Linux real-time patches might make that feasible (something to be investigated at a future point).  The PRU subsystem was the first solution that occurred to me but I also investigated writing a kernel driver or seeing if the v4l2 project supported the Lepton.  All of the solutions are intimidating to me because of the need to dive into some moderately complex low-level linux programming, so it's been easy to put off getting started.

    I have an old 7" 4D-systems Beaglebone Black touch LCD cape that I thought I'd use for the camera display, but over time have decided it's too bulky and power hungry.  I have also become interested in the Pocketbeagle board because of its small size and lower power consumption - characteristics better suited for a portable camera.  This led to an investigation of using the Linux framebuffer driver and one of the ubiquitous ILI9341-based small LCDs using a SPI interface.  Because I also want to support another camera that requires an SPI interface (for near IR - or night vision) I finally made the commitment to using the PRUs for the Lepton (over a low-latency software solution), because I can then dedicate the two built-in SPI interfaces on the Pocketbeagle to the LCD and Arducam near IR camera.

    Making that commitment, and the time afforded by the holiday, finally got me moving on building a prototype.

    The first prototype simply connects the LCD to SPI0.0 and the Lepton to SPI1.0.  I am taking things one step at a time, first getting the Linux FB talking to the LCD and then getting some user-space code talking to the Lepton.  About twelve hours of hacking later (most spent figuring out how to use the Device Tree system on the Beaglebone Black to configure IO and enable the frame buffer) yielded a hacked version of leptonic directly driving the frame buffer.  It works pretty well.  I'm not sure what the frame rate is but it's definitely higher than the 4.5 FPS I get on the Teensy 3.2 test platform.  It also occasionally stutters and gets confused reading the VoSPI datastream, resulting in a garbage display for a frame.  The hacked code is pretty ugly - near midnight I was in full-on hack-hack-hack mode - so I'm not sure I'll post it (although I'm happy to share if you'd like a copy).  There's also a version that can send packets over the ZMQ network interface.  However I'm posting relevant parts of the /boot/uEnv.txt file below and a link to the Device Tree source file that finally worked to make my LCD a display (thanks Bruce!).  I modified the dts file to fit my rotation (90°) and lowered the frame rate some (since my practical limit is 9 FPS).
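    The stutters come from losing sync with the VoSPI stream.  Per the Lepton datasheet, each 164-byte packet starts with a 2-byte ID, and packets whose ID nibble is 0xF are discard packets carrying no image data.  A minimal sketch of that classification (Python for clarity here, though the camera code itself is C):

    ```python
    def is_discard(packet: bytes) -> bool:
        """Discard packets have 0xF in the low nibble of the first ID byte."""
        return (packet[0] & 0x0F) == 0x0F

    def packet_number(packet: bytes) -> int:
        """12-bit packet number (line 0-59 within a segment) from the ID field."""
        return ((packet[0] & 0x0F) << 8) | packet[1]

    # A discard packet and a valid "line 7" packet (payloads zeroed for brevity)
    discard = bytes([0x0F, 0x00]) + bytes(162)
    line_7 = bytes([0x00, 0x07]) + bytes(162)

    assert is_discard(discard)
    assert not is_discard(line_7)
    assert packet_number(line_7) == 7
    ```

    The real-time difficulty is that the reader must classify and discard packets at the full SPI rate; fall behind and the stream de-synchronizes, which shows up as exactly the kind of garbage frames described above.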

    The following uEnv.txt changes will make sense if you've messed with the Beaglebones much and maybe they'll help someone trying to get an ILI9341-based LCD to work as a display.

    #uboot_overlay_addr4=/lib/firmware/.dtbo
    uboot_overlay_addr4=/lib/firmware/BB-SPIDEV1-00A0.dtbo
    #uboot_overlay_addr5=/lib/firmware/.dtbo
    ...
    Read more »

View all 18 project logs

Discussions

Richard wrote 05/30/2020 at 16:15 point

Hi Dan,

According to Flir's datasheets, the Lepton uses 3.0V for IO functions, but the Teensy's on board Vreg runs at 3.3V. Did you do any level translation on the VOSPI and I2C interfaces or modify the Teensy to run at 3.0V, or just put 3.3V on the buses? I'm about to start a Lepton project of my own and want to be sure I don't kill my expensive sensor!

Thanks!


Dan Julio wrote 05/30/2020 at 16:49 point

Hey Richard,

You're right, they specify a 2.8-3.1 volt VDDIO.  But they also specify that the maximum on an IO pin is the lesser of VDDIO + 0.6 volts or 4.8 volts (table 18 - Absolute Maximum Ratings).  The Pure Thermal breakout boards have a 2.8 volt regulator feeding VDDIO so the maximum input voltage should be 3.4 volts.

To be honest I didn't pay attention to this spec when I first started.  I saw that the Pure Thermal boards were used with many different 3.3V SBCs so I jumped right in.  It wasn't until later I saw the specs you are referring to.

Although it doesn't seem like good engineering practice, I have run several Lepton 3.5 modules this way for long periods of time and it does not seem to have affected their functionality.  One could design their own breakout board for the Lepton and provide 3.1 volts to VDDIO, which would lessen the over-voltage condition even more, or provide resistor dividers on the SCLK and CSN signals (and pull SDA/SCL to the VDDIO rail).  As I work on what I hope is my final thermal imaging camera I am contemplating a full custom PCB (eliminating the breakout boards) and this is probably what I'll do, especially after your timely email :-)
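For anyone weighing the resistor-divider option, the arithmetic is simple.  A sketch (the resistor values below are illustrative, not a vetted design):

```python
def divider_out(vin: float, r_top: float, r_bottom: float) -> float:
    """Output of a resistive divider: vin across r_top + r_bottom,
    tapped at the junction above r_bottom."""
    return vin * r_bottom / (r_top + r_bottom)

# 1k over 10k drops a 3.3V SCLK/CSN signal to 3.0V, within the
# Lepton's 2.8-3.1V VDDIO range.
vout = divider_out(3.3, 1_000, 10_000)
assert abs(vout - 3.0) < 1e-9
```

One caveat: at SPI clock speeds the divider's RC time constant against the input capacitance matters, so the resistor values can't be made arbitrarily large without rounding off the clock edges.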


Avi Cohen wrote 11/14/2019 at 11:45 point

Hi Dan, 

Is the code open to download?


Dan Julio wrote 11/14/2019 at 17:09 point

Hey Avi, 

Yes, it's in the github repository.

https://github.com/danjulio/lepton


Sophi Kravitz wrote 07/10/2018 at 17:21 point

hi Dan! This is a cool project! This is happening pretty soon: https://hackaday.io/contest/159585-battery-powered-radiometric-lepton-contest


Dan Julio wrote 07/10/2018 at 18:09 point

Hi Sophi!  I saw that.  I'm not sure I'll have anything for the Lepton 2.5 but I will contribute a port of FLIR's LeptonSDKEmb32OEM library for the Teensy which might be applicable to the ESP32.

