
Hardware Data Logger

Easily extendable data logging platform featuring STM32F103RBTx, WiFi, microSD storage, LCD with four buttons, UART, and pulse counters.

• Modular and Reusable Design: The mainboard features multiple connectors wired in parallel, into which cards are plugged. These cards are used for data acquisition, handling, and storage, but other cards can be added as needed. For example, if a larger MCU is required to handle more processing, only that specific card needs to be replaced, saving time and money.
• The device uses cheap, easily accessible components and is easy to build.
• High-level logic (e.g., layouts with data presented on the LCD) can be completely developed, tested, and visualized on a PC simulation, which is a build variant.
• The development environment is containerized in Docker, meaning faster setup and no need to install tools manually.

GitHub: https://github.com/RobertGawron/HardwareDataLogger/tree/develop

Hardware

The idea is to have a mainboard with parallel slots where cards can be placed. These cards can serve various purposes, such as hosting microcontrollers, sensors, storage, or other functionality as needed.

For the next version (the first to use this approach), work is being done on the following modules:

  • A main card for data processing with an STM32F103RBT6 chip.
  • A module for storing data on an SD card and transferring it via WiFi (ESP8266MOD) to other devices, such as a Raspberry Pi.
  • A user communication module with an LCD and four push buttons; the LCD adjusts its brightness based on ambient light.
  • An acquisition card with four pulse counter inputs and two UART sockets.

Additionally, the mainboard exposes all STM32F103RBT6 pins, enabling the addition of new cards using GPIO, I2C, SPI, CAN, and other interfaces.

Due to the availability of low-cost, high-quality PCB manufacturing, home-etched PCBs have become largely obsolete. While the PCB design for this project may be difficult to etch at home, it is still possible. Subparts of the circuit can be assembled on a breadboard, making the process much easier, and the modular software design allows for easy reuse of these components.

Software

STM32F103RBTx

The STM32F103RBTx serves as the main microcontroller, handling data acquisition, processing, storage, and user interaction.

  • Toolchain: C++17, C, STM32 VS Code Extension, CMake.
  • More info.

ESP8266

The ESP8266 is currently used for data transfer via WiFi and will support FOTA (Firmware Over-The-Air) in the future.

  • Toolchain: TBD, currently using Arduino IDE.
  • More info.

DevOps

It's good to let the machine handle the tedious work of checking code quality, freeing up more time for the useful and interesting parts of software development.

  • Toolchain: Unit tests (Google Test, Google Mock), code coverage (LCOV), static code analysis (Cppcheck), Docker (for both local development and CI), GitHub Actions (CI).
  • More info.

Simulation

Embedded development is cool, but constantly flashing the target device for non-hardware-related logic, like the human-machine interface, can be time-consuming and frustrating. To streamline this, a simulation was developed that isolates the firmware not directly tied to hardware, adds stubs for drivers, and includes a GUI. This allows all high-level aspects, such as what’s displayed on the LCD, user interaction via buttons, and data parsing, to be tested without the need for hardware.

While this simulation handles the firmware, speed of execution isn't a concern since it focuses solely on high-level logic. For hardware or driver-related issues, traditional methods like using an oscilloscope or logic analyzer are still necessary, as the simulation cannot be used.

Below is a screenshot from the simulation. Note that it comes from an earlier version, when the device was designed to work exclusively with a Geiger counter; the current version of the simulation is not yet working.

Documentation

UML diagrams were made using PlantUML.

  • Simulating STM32 and ESP8266 Firmware on a PC

    Robert Gawron, 12/27/2024 at 10:38

    This project involves an STM32 and an ESP8266 microcontroller, which communicate with each other via UART. Previously, I created a simulator for the STM32's firmware that allows me to run and test it on a PC. In this post, I will share my progress in simulating the ESP8266 firmware.

    Below you can see emulated ESP code that simply echoes what it receives on UART and blinks an LED (the yellow circle at the bottom of the display represents an LED connected to an ESP pin).

    The Idea

    I think the approach is always the same:

    1. Identify Code to Simulate: Check which parts of the code need to be simulated. Likely, this includes all project-specific code but excludes libraries that handle hardware communication.
    2. Write Mock Implementations: Take the code and ALL header files it includes (libraries too). Then, create .cpp files for the .hpp files that were taken without their source code; these will be the mocks.

    Why is it better to compile the code with .hpp files of the libraries we want to stub? They could just be copied and modified; maybe this would be easier?

    Well, no. If the .hpp file is modified (for example, if a new version of the library is used), the simulation build will simply fail to compile (the implementation in .hpp and .cpp will not be aligned), and we will know that our simulation is not up to date.

    However, this is problematic here because (and this is not good practice) many Arduino library headers define method bodies directly in the .hpp rather than in a .cpp. For example, in HardwareSerial.h:

    class HardwareSerial: public Stream
    {
    public:
        size_t getRxBufferSize()
        {
            return uart_get_rx_buffer_size(_uart);
        }
    };

     uart_get_rx_buffer_size() comes from uart.h, so now we need to stub not only HardwareSerial.h but also uart.h. If uart.h has method bodies in the .h file rather than in a .c file, this process repeats. In the end, there was too much work, and I just copied the headers and cleaned them a bit. It’s not perfect but good enough.

    The diagram below presents the results. In yellow are mocked files from Arduino libraries, in green the emulated code, and in grey additional classes to provide an API for the simulation. The code is compiled into a .so library that is used by the GUI in Python.

    Functionalities

    There are not many functionalities for now, only simulation of GPIO and UART (both send and receive). How is it done?

    To send data to the firmware via UART, there is a method in the .so library:

    HAL_StatusTypeDef LibWrapper_OnSerialRx(
        const std::uint8_t *pData,
        std::uint16_t Size,
        std::uint32_t Timeout);

     It puts the data into a queue, and then HardwareSerial reads this data. The objective is that the simulated main.cpp code uses HardwareSerial but doesn’t know whether the data comes from a real UART or this simulated method.
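
    As a rough illustration, the Python side of the simulation could push bytes through this entry point with ctypes. The library file name and the assumption that the function is exported with C linkage are mine, not taken from the project:

    import ctypes

    # Load the simulation build of the firmware (hypothetical file name).
    firmware = ctypes.CDLL("./libfirmware_simulation.so")

    # Mirror the C signature: (const uint8_t*, uint16_t, uint32_t) -> status.
    firmware.LibWrapper_OnSerialRx.argtypes = [
        ctypes.POINTER(ctypes.c_uint8),
        ctypes.c_uint16,
        ctypes.c_uint32,
    ]
    firmware.LibWrapper_OnSerialRx.restype = ctypes.c_int  # HAL_StatusTypeDef

    def send_uart(payload: bytes, timeout_ms: int = 100) -> int:
        """Queue bytes as if they had arrived on the real UART."""
        buffer = (ctypes.c_uint8 * len(payload)).from_buffer_copy(payload)
        return firmware.LibWrapper_OnSerialRx(buffer, len(payload), timeout_ms)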

    It’s a bit trickier when the simulated code needs to change GPIO state or send some data back. How would the simulation know that the state has changed (to show it on the screen to the user)? There are at least two ways:

    • Polling: The simulation periodically checks the state of the UART and GPIO mocks. This is not ideal because we would need to poll very frequently to get good precision.
    • Callback: The simulation registers a callback function with the mocks. On state change, the mock calls the callback. The callback points to a method in the simulation that, in the end, updates the screen. This is not ideal because code with callbacks can be difficult to debug and may crash at runtime if poorly written.

    I used the second method. It’s implemented in the HmiEventHandlers class.
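
    To sketch the callback direction in the same spirit (the function and callback names are made up; the real implementation lives in HmiEventHandlers), the GUI could register a ctypes callback like this:

    import ctypes

    firmware = ctypes.CDLL("./libfirmware_simulation.so")  # hypothetical name

    # Callback type: void (*)(uint8_t pin, uint8_t state).
    GpioCallback = ctypes.CFUNCTYPE(None, ctypes.c_uint8, ctypes.c_uint8)

    def on_gpio_change(pin: int, state: int) -> None:
        # The real GUI would repaint the LED widget instead of printing.
        print(f"GPIO {pin} -> {'HIGH' if state else 'LOW'}")

    # Keep a reference so the callback isn't garbage-collected while registered.
    gpio_callback = GpioCallback(on_gpio_change)
    firmware.LibWrapper_RegisterGpioCallback(gpio_callback)  # hypothetical export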

    That’s all. I’m pretty happy with the results, although it is not complete.

    Link to the commit.

  • Automated Raspberry Pi Deployments with Docker Buildx and Ansible

    Robert Gawron, 12/23/2024 at 18:21

    In this post, I will describe how I set up a Docker container on my PC (x86) to cross-compile Docker containers for Raspberry Pi. I will also discuss Ansible, a tool for automating the deployment and configuration of remote machines. I use it to configure the Raspberry Pi and install the images I build locally.

    Project Setup

    For this cross-compilation, we need two Docker configurations: one for the builder container that produces the target images, and another that defines those target images themselves, i.e., what is installed in them. Here's the project structure:

    ├── Host
    │   ├── Dockerfile
    │   ├── README.md
    │   ├── ansible
    │   │   ├── README.md
    │   │   ├── files
    │   │   │   ├── docker-compose.yml -> /workspace/docker-compose.yml
    │   │   ├── inventory
    │   │   ├── playbook.yml
    │   │   └── roles
    │   ├── bake.hcl
    │   ├── docker-compose.yml
    │   └── scripts
    │       ├── docker_export.sh
    │       ├── entrypoint.sh
    │       └── mount_ssh.sh
    ├── Target
    │   ├── Dockerfile
    │   ├── README.md
    │   ├── Test
    │   │   ├── README.md
    │   │   └── test.py
    │   └── docker-compose.yml

    The idea is that in Host/ we build our builder container; once we are logged into it, the configs for building the target images (the Target folder) are mounted inside it.

    Host Container Setup

    Surprisingly, not much needs to be done for the host container. Docker is already equipped with an image for cross-compilation (docker:dind). I created this simple Dockerfile (along with docker-compose.yml to specify which external files are available inside the container), and it works perfectly:

    FROM docker:dind
    
    # Install QEMU for ARM cross-platform builds and other necessary packages, along with Ansible
    RUN apk add --no-cache \
        qemu qemu-system-x86_64 qemu-system-arm \
        bash curl git python3 py3-pip \
        ansible \
        rsync \
        dos2unix
    
    COPY ./scripts/*.sh /workspace/
    RUN dos2unix /workspace/*.sh
    RUN chmod +x /workspace/*.sh
    
    # Set the working directory inside the container
    WORKDIR /workspace
    
    # Set the default command to bash
    CMD ["/bin/bash"]

    Once I log into the container, I have access to all the necessary files:

    .
    ├── Dockerfile
    ├── README.md
    ├── ansible
    │   ├── README.md
    │   ├── files
    │   │   ├── docker-compose.yml -> /workspace/docker-compose.yml
    │   ├── inventory
    │   ├── playbook.yml
    │   └── roles
    ├── docker-compose.yml
    ├── docker_export.sh
    ├── entrypoint.sh
    └── mount_ssh.sh

    Building for Raspberry Pi

    Inside this builder container, I use a docker-compose.yml configured for building images for the Raspberry Pi. The only modification from my last post (when I tested them locally on a PC) is that I hardcoded the platform: linux/arm64 directive for each service, specifying the target architecture for Buildx (this could also be configured in a .hcl file but I didn’t look into that).

    To build and extract the images into .tar files, I run the following command:

    docker buildx bake --file docker-compose.yml && docker_export.sh

    Now we are ready to deploy those exported .tar images.

    Deploying Images with Ansible

    Ansible is a tool that executes a list of tasks (called a playbook) on remote machines via SSH, much like a person would. Instead of redoing the configuration by hand each time, the tool does it automatically.

    These tasks can be grouped into reusable blocks (roles), many of which are open source. For example, to set up Docker and Docker Compose on the Raspberry Pi, I used the ansible-docker role.

    Here is an example of how I run Ansible:

    ansible-playbook -i ansible/inventory ansible/playbook.yml

    What Ansible Does on the Raspberry Pi

    Ansible performs the following steps on the Raspberry Pi (steps are skipped automatically if no modifications are needed):

    • Sets up the necessary directories.
    • Installs Docker.
    • Uploads the generated .tar image files.
    • Starts the Docker containers with the uploaded images.

    It's all automated!

    Notes


    I find Ansible to be slow, however,...


  • Integrating Grafana for Measurement Visualization: Part 1

    Robert Gawron, 12/21/2024 at 12:10

    I've been setting up Grafana to run it on a Raspberry Pi in the future, and it's an amazing tool. If you have ESP modules sending data to your Pi, you should definitely try it. The best part is there's no need for custom software - just some configuration (well, actually, a lot of configuration!).

    I came up with this idea (a modified diagram from a previous post):

    All the components running on the Raspberry Pi are pre-existing Docker images, managed via Docker Compose. Configuration is centralized in a single file. For example:

    influxdb:
      image: influxdb:latest
      ports:
        - "8086:8086"
      volumes:
        - influxdb_config:/etc/influxdb       # Persist configuration
        - influxdb_data:/var/lib/influxdb     # Persist data
      networks:
        - app-network

    This setup creates a Docker container with InfluxDB, so there's no need to `apt-get install` all the tools. Even an older Raspberry Pi should handle these lightweight Docker images well, as they're not resource-intensive. For now, this is running on my PC.

    One issue is that data inside Docker containers isn't persistent, meaning data stored directly in them is lost when the container is recreated. Each container must be configured to communicate with its peers. For example, in Grafana, we need to add an authentication token to get data from InfluxDB. After a container reboot, it would be lost. Here are a few ways to handle this (there are probably more I'm not aware of):

    • Use persistent volumes in Docker Compose to retain specific directories. It's easy but doesn't store configurations in Git, so the initial setup must be manual and isn't tracked.
    • Mount configuration files and directories outside the container and track them in Git. This works but isn't ideal for sensitive data like tokens, as it's not good to save them in Git.
    • Store configurations in a private Git repo, adding complexity.
    • Automate everything with Ansible, but that's a lot more work.

    I chose the first option, which works well enough. I've documented the setup in a README.md; it's detailed, but as usual I probably missed half the details. :)

    Another challenge was testing without actual measurement data, as the firmware for my ESP isn’t ready. To simulate data, I created a simple Linux-based Docker image running a Python script to send fake data to MQTT. It works!
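
    The generator boils down to a loop publishing random values. Here is a minimal sketch, assuming the paho-mqtt client library; the broker address, topic, and payload format are made up for illustration:

    import json
    import random
    import time

    import paho.mqtt.client as mqtt

    # Note: paho-mqtt 2.x additionally requires a CallbackAPIVersion argument.
    client = mqtt.Client()
    client.connect("raspberrypi.local", 1883)
    client.loop_start()  # handle network traffic in a background thread

    while True:
        payload = json.dumps({"pulses": random.randint(0, 200)})
        client.publish("datalogger/pulse_counter", payload)
        time.sleep(5)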

    Here’s the final version (with fake data) of the Grafana page showing measurements. By the way, I really love Grafana’s graphical interface and how polished the graphs look:

    Note that this is not all. I’ll need Buildx to create Docker images on my PC that I can then deploy to the Raspberry Pi. I can’t use the image I’m running on my PC since it has a different architecture than the Raspberry Pi (x86 vs ARM).

    I’ve created a new Git repo for all Raspberry Pi-based parts of this project to separate it from the hardware, which will remain in its original repo.

  • Gluing Firmware and Simulation and Exploring Ideas (Grafana)

    Robert Gawron, 12/18/2024 at 18:58

    I have been working on the firmware and simulation, and it’s starting to look good. Below, I’ve made a short video. Here’s what you can see in the video (note: there is no sound):

    • In the simulator, the slidebars on the right simulate data from external pulse devices (for me, it’s my Semiconductor Radioactivity Detector and Geiger Counter - both featured on Hackaday, feel free to check them out if you’re interested).
    • On the screen (which emulates a real LCD screen), you can see a layout based on the MUI and U8G2 libraries displaying data from the simulated pulse counter. The numeric value corresponds to what is set on the slidebar on the right. Strictly speaking, it shouldn’t work this way—the device should wait, e.g., 60 seconds, then read data from the pulse counter driver (or the simulated driver in this case) and update the screen. But that’s a minor detail.
    • Just for fun, I’ve added a small feature to the simulation that visualizes changes over time for all the simulated pulse counter data. This part is visible at the end of the video.


    Link to commit.

    While I will continue to work on improving it, I also have some other cool ideas. The device is equipped with an ESP8266MOD, which is used alongside the STM32F103RBTx to send data via Wi-Fi to a remote host (I plan to use a Raspberry Pi). In short, I want to use Docker, InfluxDB, and Grafana to create web-based diagrams accessible from my local PC. The architecture is shown below:

    I was thinking that I could extend my simulation tool to also run the firmware for the ESP8266MOD and tunnel communication between the STM32F103RBTx -> ESP8266MOD -> Docker image running InfluxDB and Grafana. This is just a concept for now, and I plan to investigate it further.

  • Improving Code Quality with CodeChecker

    Robert Gawron, 12/16/2024 at 19:02

    I was looking for a better open-source static analysis tool than the very limited Cppcheck, and I came across two interesting options:

    • CodeChecker, built on top of Clang Static Analyzer, Clang-Tidy, and Cppcheck.
    • Infer, a tool developed by Facebook. I haven’t tested it yet.

    In this post, I’ll share my experience with CodeChecker, which I find amazing. The number of checks this tool provides is astonishing. For example, while Cppcheck identified 5–10 issues in the project’s code (excluding all libraries), CodeChecker uncovered roughly 1,000. (I added more files for analysis, including PC simulations and unit tests, but the difference is still enormous.)

    The tool provides many checks that can be enabled or disabled depending on the project. It’s unlikely anyone would need all of them. I found it works best to enable everything, generate a report, and then disable the checks that aren’t useful. For example, there’s a checker for validating C++98 compatibility, which doesn’t matter to me since I use C++17.

    Here’s the list of checks I’ve disabled in my setup:

    --enable-all
    --disable clang-diagnostic-c++98-compat
    --disable modernize-use-trailing-return-type
    --disable readability-identifier-length
    --disable readability-uppercase-literal-suffix
    --disable modernize-avoid-c-arrays
    --disable modernize-use-auto
    --disable altera-unroll-loops
    --disable cppcheck-missingIncludeSystem
    --disable cppcheck-toomanyconfigs
    --disable clang-diagnostic-padded
    --disable altera-struct-pack-align
    --disable clang-diagnostic-weak-vtables
    --disable altera-id-dependent-backward-branch
    --disable bugprone-easily-swappable-parameters
    

    This snippet is part of my CMake configuration, which you can probably adapt for your project if it’s useful. However, CodeChecker relies heavily on CMake for configuration. For projects using other build systems, like SCons or Makefiles, it may not work well.

    The tool is also very CPU-intensive and slows down my computer significantly, especially when running in Docker. There’s a flag to limit CPU usage, but it didn’t work well for me—or maybe I didn’t configure it correctly.

    Since my project is hosted on GitHub and has CI configured, I’ve found a better way to use the tool. Static analysis builds are triggered automatically on every push. While the analysis runs remotely, I work on fixing other bugs based on the last report. Once the CI build finishes, I download the results and see if my previous changes fixed problems. From time to time, I rebase and squash those commits, then fix the history with git push --force.

    CodeChecker is a great tool!

    PS: I only have 92 warnings in the code now :-)

  • Advancing Firmware Testing Automation with Pytest and PC Simulation

    Robert Gawron, 12/14/2024 at 16:57

    The simulator, which uses the PC variant of the firmware, works great but has one drawback for quick testing: interacting with the device is manual. This means that each time, I have to click on the buttons (which represent real buttons) to navigate through the menus and observe how various layouts are rendered.

    I started thinking about whether this process could be automated and came up with a solution: using pytest, which is likely the most popular test framework for Python.

    It works, and a sample HTML report is shown below:

    There are some glitches in the images, and they appear a bit blurred. Since I don't have reference images to verify how things should look, the test only displays the images; it doesn't determine pass or fail.

    Here is what the test looks like for the report above:

    import time
    import logging

    # Simulation, SimulationKey, and push_key are imported from the project's
    # simulation/test helpers (not shown here).
    logger = logging.getLogger(__name__)

    def test_iterate_list(assert_display_content):
        """Test to validate iteration through a list using display content."""
        logger.info("Starting test: test_iterate_list")
    
        dut: Simulation = Simulation()
        dut.start_firmware()
        logger.info("Firmware started for the simulation.")
    
        logger.info("Capturing initial display state.")
        assert_display_content(dut, "List Iteration Test 1", "test_iterate_list_1.png")
    
        logger.info("Simulating DOWN key press and release.")
        push_key(dut, SimulationKey.DOWN)
        time.sleep(0.1)
    
        logger.info("Capturing updated display state.")
        assert_display_content(dut, "List Iteration Test 2", "test_iterate_list_2.png")
    
        logger.info("Simulating DOWN key press and release.")
        push_key(dut, SimulationKey.DOWN)
        time.sleep(0.1)
    
        logger.info("Capturing updated display state.")
        assert_display_content(dut, "List Iteration Test 3", "test_iterate_list_3.png")
    
        logger.info("Simulating DOWN key press and release.")
        push_key(dut, SimulationKey.DOWN)
        time.sleep(0.1)
    
        logger.info("Capturing updated display state.")
        assert_display_content(dut, "List Iteration Test 4", "test_iterate_list_4.png")
    
        dut.stop_firmware()
        logger.info("Firmware stopped for the simulation.")

    This feature was surprisingly easy to implement. I used the Simulation class from simulation.py (which I developed for the simulator) to wrap the .so library in Python, allowing seamless communication with the firmware, and then connected it to pytest using its fixtures.

    One tricky aspect was discovering that pytest auto-discovers shared fixtures only in conftest.py; a fixture tucked away in an ordinary helper module won't be picked up. It took me a while to figure this out.
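
    For reference, the fixture in conftest.py could look roughly like the sketch below. The Simulation method name and the commented-out reference comparison are my assumptions; as noted above, the real test only saves the screenshots:

    # conftest.py - pytest only auto-discovers shared fixtures from this file.
    import pytest

    @pytest.fixture
    def assert_display_content():
        def _check(dut, title, filename):
            # 'title' labels the step in the HTML report.
            image = dut.capture_display()  # hypothetical Simulation method
            image.save(filename)           # screenshot attached to the report
            # Once reference images exist, a pixel diff could decide pass/fail:
            #   from PIL import Image, ImageChops
            #   reference = Image.open("references/" + filename)
            #   assert ImageChops.difference(image, reference).getbbox() is None
        return _check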

    Now, I can run this lengthy one-liner in the command line (inside Docker) to compile and test the firmware:

    cd /workspace/build/ && cmake .. && make -j24 && cd /workspace/Test/SystemTests && pytest test_display.py -s --html=report.html

    Then I refresh the page with the test results in the browser, and I have all the results displayed in an easy and predictable way.

    Link to commit.

  • Integrating MUIManual and U8G2: Testing in Simulation OK

    Robert Gawron, 12/12/2024 at 16:10

    I have integrated two libraries into the project:

    • MUIManual (a high-level GUI library for dialog boxes, lists, etc.)
    • U8G2 (a low-level GUI library for rendering, including non-fixed-width fonts).

    The code is messy, but it works (kind of).

    These libraries offer many useful features, which will save a lot of development time.

    Understanding how these libraries work was quite challenging! They are memory-optimized and use the concept of "tiles." This means they don’t maintain a buffer for the entire screen. Instead, they calculate updates for an 8x8 pixel tile of the display and then move to the next tile until the entire screen is updated. To make things more complex, the tiles may be updated in either horizontal or vertical order. It took considerable effort to grasp how this works and how to decode it to display content on the screen.
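
    To make the idea concrete, here is a small Python sketch of reassembling 8x8 tiles into a full framebuffer. The callback and the display geometry are illustrative; u8g2's real tile API is in C:

    WIDTH, HEIGHT, TILE = 160, 128, 8
    framebuffer = [[0] * WIDTH for _ in range(HEIGHT)]

    def on_tile_update(tile_x, tile_y, pixels):
        """Copy one 8x8 tile (64 pixel values, row-major) into the framebuffer."""
        for row in range(TILE):
            for col in range(TILE):
                y = tile_y * TILE + row
                x = tile_x * TILE + col
                framebuffer[y][x] = pixels[row * TILE + col]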

    Testing was conducted on the PC simulator for the device (the simulator was presented in previous posts), and it works! This allows me to develop all layout-related code without requiring the actual device (which is great because, in the actual HW version, the display doesn't work). Below is an example:

    This is a "Hello World" example from the MUIManual tutorial, adapted to the project. The "select me" text is more than just a rectangle; it represents a button (a button on the screen, of course :)). The button is interactive and can be "pressed" because it can be linked to a real hardware button.

    Additionally, I created a sequence diagram to visualize how this process works.

    This is a working version that is subject to change. With MUIManual, I think I might be able to remove the "Controller" and "View" classes since these features are already provided by the libraries. That would be great because it would simplify the project (less code to manage). However, it could also be problematic, as it would tie the project strictly to these libraries, making it harder to replace them in the future.

    PS: This project currently has 120 followers, which feels like a lot and makes me very happy. Thank you all for your interest!

  • High-Level Embedded GUI with u8g2 and muimanual libs

    Robert Gawron, 12/01/2024 at 10:38

    In the previous post, I covered the low-level LCD driver and PC simulation. Now it's time for the last big piece: the high-level GUI.

    I was searching for an open-source library for small embedded projects that would:

    • Provide GUI elements like buttons, menus, and selection lists.
    • Support proportional fonts (not fixed-width like fonts in PC terminals). They look much better on an LCD.

    I could write the code for this myself, but it would be limited (because it's a lot of work) and not as aesthetic or powerful for the user. Also, layout rendering isn't something that really interests me, so a library is a much better solution. And I found one!

    The library is muimanual, which is part of the popular u8g2 library. I’ve never used it before, but from what I see, it provides great support for various displays and boards. If you have an Arduino or another project with a display and buttons, you should definitely give it a try!

    Here’s an example of the GUI (the image is from its Git page):

    One disadvantage is that it’s a monochrome library. However, I think I can find some workarounds. Even if not, it’s still better than writing something half-baked myself. I’d prefer a good monochrome GUI over a poor one with color support.

    As usual, I’ve made a diagram to visualize how it works (or rather, how it will work once everything is done):

    • St7735DisplayDriver will glue together all low-level aspects (HAL usage and the LCD display library).
    • Display will act as a bridge, providing muimanual and u8g2 with the LCD API they need to draw the GUI.
    • "User Defined Layouts" will be a high-level collection of classes for displaying various layouts.

    I'm working on gluing this all together; hope it works well.

  • PC-Based Firmware Simulation for High-Level Logic Testing

    Robert Gawron, 11/17/2024 at 18:33

    I've created a PC simulation of the firmware to test high-level logic (mainly how data is presented on the LCD) without having to flash the device each time. The idea was to organize the code into layers, taking the high-level layer as is, without any modification, while providing mocks for all low-level code (mainly drivers using peripherals like SPI, UART, etc.).

    I've implemented it in such a way that CMake takes all non-hardware-related code along with the mocks and generates a .so file (a shared library). A Python application drives this library the way a real microcontroller would and gathers data about what the library does (e.g., what it displays on the screen). All is in Docker, of course :)
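
    The piloting loop itself can stay very small. A hedged sketch, where the library name and the exported functions are placeholders rather than the project's real API:

    import ctypes
    import time

    firmware = ctypes.CDLL("./libfirmware_simulation.so")  # hypothetical name

    firmware.LibWrapper_Init()      # placeholder: run the firmware's start-up code
    for _ in range(1000):           # the GUI would loop until its window closes
        firmware.LibWrapper_Tick()  # placeholder: one iteration of the main loop
        time.sleep(0.01)            # pace the loop roughly like the real device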

    Here's the simulation in action: in the center of the GUI, the LCD content is visible, showing what the firmware renders on the LCD. The content in this example (the firmware is not done) is not impressive, just a green line in the corner, but it works!

    The GUI was built using PyQt6. In the original version, I used some graphic tool (I forgot its name, and Qt has many graphic tools), but it was a disaster due to how complicated it was. In this version, I took a picture of the old GUI and asked ChatGPT to create something similar without using any fancy tools. It looks good enough for me.

    To ensure I don’t forget how it works, I created a small component diagram to visualize the architecture of the simulation. It was done using PlantUML, an open-source UML tool. It’s great because it’s text-based, but one downside I found is that it’s not easy (and sometimes impossible) to organize blocks in an aesthetically pleasing way.

  • Thoughts on Display Libraries and Text Rendering

    Robert Gawron, 11/13/2024 at 20:14

    When I searched for a library to handle the ST7735 display in the project, every link in Google pointed to stm32-st7735. The library itself is simple and easy to use, and it has this nice function in its API (slightly modified example from the author’s blog):

    ST7735_WriteString(0, 0, "Hello World", Font_7x10, ST7735_RED, ST7735_BLACK);

    This method sets the initial position, text to display, font, and colors, making it easy to display some data and get things started. However, what seems easy at first turns out to have real drawbacks:

    • Fonts are hardcoded in the library: Changing the font requires modifying the library itself. I'm not a big fan of this because it usually means that either I’m doing something wrong, or the library has limitations.
    • PC simulation problems: I plan to have various layouts to display data and allow the user to configure the device. Constantly flashing the device to test layout changes takes time, so a PC simulation would be great. However, the binding of high-level logic and SPI data-sending (the chip driving the display is quite complicated IMHO) makes it tough. One option would be writing an SPI HAL driver (the one generated from CubeMX) mock to extract data, but that’s a complex task.

    Fortunately, the library also includes this method:

    void ST7735_DrawImage(uint16_t x, uint16_t y, uint16_t w, uint16_t h, const uint16_t *data);

    With this method, font rendering can be managed directly in the user code and passed through ST7735_DrawImage, so the library doesn't need to know about it; its only job is to send the buffer to the display's chip.

    There is no high-level logic unrelated to being a driver library. For a PC simulation, the entire library can be mocked, making the extraction of display data easy: the mock receives *data with the pixel content and x and y with the position, so I can simply store this and then present it in my app (see the screenshot in the project description).
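
    In the simulation, a stand-in for this function only needs to record its arguments. A Python-flavored sketch of the idea (the project's actual mock sits on the C++ side of the .so):

    captured_regions = []  # each entry: one rectangle the firmware drew

    def st7735_draw_image_mock(x, y, w, h, data):
        """Record the draw call instead of sending the buffer over SPI."""
        captured_regions.append({"x": x, "y": y, "w": w, "h": h,
                                 "pixels": list(data)})

    # The GUI later iterates over captured_regions and paints each rectangle,
    # reproducing exactly what the firmware rendered.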

    Even more, yesterday I found another library (ironically, with the same name) for handling this display. It also has a similar method but is written in a much cleaner way. I'll use it instead, but the idea stays the same.
