While traveling alone and feeling in danger, one of the most crucial steps to prevent a potential crime is to let your emergency contacts know your whereabouts, with a brief explanation of your situation. Although smartphones provide various location-tracking and wireless-communication features, they may still fall short, since reaching and operating a smartphone in a moment of crisis can be time-consuming and arduous, especially for people with mobility impairments. In light of recent developments in machine learning and IoT, there is a surge in devices that extend smartphone features with automated notifications and additional sensors, e.g., smartwatches and fitness wearables. Thus, in this project, I focused on developing an AIoT assistive device that builds on smartphone features to inform emergency contacts of the user's situation automatically and instantly.

According to WHO reports[2], approximately 15% of the world's population lives with some form of disability, and this number keeps rising due to aging populations and the spread of chronic diseases. In this regard, budget-friendly and accessible AIoT assistive devices should be versatile and cover a broad spectrum of features, considering people with disabilities and special needs.

After inspecting recent research papers on assistive devices, I noticed that hardly any wearable appliances focus on detecting personalized items to execute predefined functions covertly, such as automated notifications, by utilizing smartphone features. Therefore, I decided to build a user-friendly and accessible assistive device that detects customized keychains (tokens) with object detection and informs emergency contacts of the user's situation automatically.

Since the XIAO ESP32S3 Sense is an ultra-small IoT development board providing a built-in OV2640 camera and a microSD card module on its expansion board, I decided to utilize it in this project. Thanks to the integrated modules on the expansion board, I was able to capture images and save them to the SD card as samples without any additional hardware. Furthermore, the XIAO ESP32S3 comes with integrated Wi-Fi/BLE connectivity and 8 MB of PSRAM, so I was also able to run my object detection model continuously on the device. Then, I connected the XIAO Round Display to the XIAO ESP32S3 in order to notify the user of the number of samples currently saved on the SD card and of the ongoing operation by showing assigned icons.

Since I wanted to capitalize on smartphone features (e.g., GPS, GPRS, BLE) to build a capable assistive device, I decided to develop an Android application from scratch with MIT App Inventor. As the user interface of the assistive device, the Android application utilizes the cellular network connection to transfer data packets to a web application via GPRS, obtains precise location data via GPS, and communicates with the XIAO ESP32S3 via BLE so as to get model detection results and transmit commands for data collection.
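To illustrate the data flow, here is a minimal sketch of how such a data packet could be assembled into an HTTP GET request. The endpoint path and query parameter names below are assumptions for illustration, not the project's actual ones:

```python
from urllib.parse import urlencode

# Hypothetical endpoint; the real web application path differs.
WEB_APP_URL = "https://example.com/assistant/update.php"

def build_data_packet(detected_class, latitude, longitude, altitude, date):
    """Assemble the query string the Android application would append
    to its HTTP GET request over the cellular network."""
    params = {
        "class": detected_class,   # model detection result
        "lat": latitude,           # GPS latitude
        "long": longitude,         # GPS longitude
        "alt": altitude,           # GPS altitude
        "date": date,              # current date
    }
    return f"{WEB_APP_URL}?{urlencode(params)}"

url = build_data_packet("assault", 41.015137, 28.979530, 40.0, "2023-11-05")
print(url)
```

In MIT App Inventor, the equivalent logic is built with the Web component's blocks rather than code, but the resulting request has the same shape.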

After developing my Android application, I designed various keychains (tokens) denoting different emergencies and printed them with my 3D printer. Then, I utilized the Android application to send commands to the XIAO ESP32S3 via BLE to capture images of these customized keychains so as to construct a representative data set.

After completing my data set, I built my object detection model with Edge Impulse to detect the customized keychains (tokens) denoting different emergencies. I utilized the Edge Impulse FOMO (Faster Objects, More Objects) algorithm to train my model; FOMO is a novel machine learning algorithm that brings object detection to highly constrained devices. Since Edge Impulse is compatible with nearly all microcontrollers and development boards, I did not encounter any issues while uploading and running my model on the XIAO ESP32S3. As labels, I utilized the names of the emergency situations represented by the customized keychains:

After training and testing my object detection (FOMO) model, I deployed it as an Arduino library and uploaded it to the XIAO ESP32S3. Therefore, this assistive device is capable of detecting keychains (tokens) by running the model on-device, without any additional procedures or network latency.
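FOMO reports detections as per-label centroids with confidence scores, which the firmware then filters against a threshold. On the device this post-processing lives in the Arduino (C++) sketch; the sketch below mirrors the idea in Python for clarity, and the field names and threshold are assumptions:

```python
# Assumed confidence cutoff; the real sketch may use a different value.
CONFIDENCE_THRESHOLD = 0.60

def filter_detections(results, threshold=CONFIDENCE_THRESHOLD):
    """Keep only the detected emergency classes whose confidence
    meets the threshold, returning unique labels in order."""
    detected = []
    for box in results:
        if box["value"] >= threshold and box["label"] not in detected:
            detected.append(box["label"])
    return detected

# Example FOMO-style output: centroids with confidence values.
sample = [
    {"label": "assault", "value": 0.91, "x": 8, "y": 16},
    {"label": "fine",    "value": 0.42, "x": 24, "y": 32},
]
print(filter_detections(sample))  # only high-confidence labels survive
```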

After running the object detection model successfully, I employed the XIAO ESP32S3 to transfer the model detection results to the Android application via BLE. Since I focused on building a full-fledged AIoT assistive device, supporting only BLE data transmission to an Android application was not sufficient. Therefore, I decided to develop a versatile web application from scratch and utilize the Android application to transmit the model detection results, the current location parameters (GPS data), and the current date to the web application via an HTTP GET request over the cellular network. After receiving a data packet from the Android application, the web application saves the model detection results, location parameters (latitude, longitude, and altitude), and the current date to a MySQL database for further usage.
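The server-side step amounts to validating the incoming GET parameters and binding them to a parameterized INSERT. The sketch below illustrates that shape; the table name, column names, and parameter keys are assumptions, and a real deployment would execute the query with a MySQL driver instead of just preparing it:

```python
# Hypothetical table/column names for illustration only.
INSERT_SQL = (
    "INSERT INTO detections (class, latitude, longitude, altitude, date) "
    "VALUES (%s, %s, %s, %s, %s)"
)

def prepare_insert(query_params):
    """Validate the incoming GET parameters and build the bound values
    for a parameterized query (placeholders avoid SQL injection)."""
    required = ("class", "lat", "long", "alt", "date")
    if not all(k in query_params for k in required):
        raise ValueError("missing data packet field")
    return INSERT_SQL, (
        query_params["class"],
        float(query_params["lat"]),
        float(query_params["long"]),
        float(query_params["alt"]),
        query_params["date"],
    )

sql, values = prepare_insert(
    {"class": "stolen", "lat": "41.0", "long": "29.0",
     "alt": "35.5", "date": "2023-11-05"}
)
```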

Then, I utilized the web application to inform the emergency contacts selected by the user of detected emergency classes via WhatsApp or SMS immediately. In this regard, I utilized Twilio's WhatsApp and SMS APIs simultaneously. I also employed the web application to receive inquiries from emergency contacts over WhatsApp and reply with detailed location reports generated with Google Maps from the location information stored in the database table.
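Composing such a notification boils down to formatting the stored class, date, and coordinates into a message with a Google Maps pin link. The sketch below shows one way to do it; the wording is my own, the Maps pin URL format is the documented one, and the commented-out Twilio call uses placeholder numbers:

```python
def compose_alert(detected_class, latitude, longitude, date):
    """Format a notification message with a Google Maps pin link."""
    maps_pin = f"https://www.google.com/maps?q={latitude},{longitude}"
    return (
        f"⚠️ Emergency detected: {detected_class.upper()}\n"
        f"📅 Date: {date}\n"
        f"📍 Location: {maps_pin}"
    )

message = compose_alert("fall", 41.015137, 28.979530, "2023-11-05")
print(message)

# Sending it via Twilio would look roughly like this (not executed here;
# numbers are placeholders):
# from twilio.rest import Client
# client = Client(account_sid, auth_token)
# client.messages.create(
#     from_="whatsapp:+14155238886",  # Twilio sandbox number
#     to="whatsapp:+10000000000",     # emergency contact
#     body=message,
# )
```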

As shown below, the assistive device allows the user to apply the Fine emergency class to save location parameters as breadcrumbs in the database table. Therefore, the web application can generate travel itineraries, with different modes of travel, covering previously visited destinations, depending on the inquiries (commands) sent by emergency contacts through WhatsApp.
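Turning saved breadcrumbs into an itinerary can be done with the documented Google Maps URLs API, chaining the points as origin, waypoints, and destination. This is a minimal sketch under that assumption; the breadcrumb ordering convention and function names are mine:

```python
def build_itinerary(breadcrumbs, travelmode="walking"):
    """breadcrumbs: list of (lat, long) tuples ordered oldest -> newest.
    The first point becomes the origin, the last the destination, and
    the points in between are passed as waypoints (pipe-separated,
    URL-encoded as %7C per the Maps URLs API)."""
    if len(breadcrumbs) < 2:
        raise ValueError("need at least an origin and a destination")
    fmt = lambda p: f"{p[0]},{p[1]}"
    origin, *middle, destination = breadcrumbs
    url = (
        "https://www.google.com/maps/dir/?api=1"
        f"&origin={fmt(origin)}&destination={fmt(destination)}"
        f"&travelmode={travelmode}"  # driving, walking, bicycling, transit
    )
    if middle:
        url += "&waypoints=" + "%7C".join(fmt(p) for p in middle)
    return url

link = build_itinerary([(41.0, 29.0), (41.01, 29.01), (41.02, 29.02)])
```

Swapping the `travelmode` argument is what lets the web application offer different methods of travel for the same breadcrumb trail.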

Considering harsh operating conditions, I designed a smartwatch-inspired case with a modular (3D-printable) part that allows the user to attach the assistive device to various mobility aids, such as wheelchairs, scooters, walkers, and canes. The modular part is specifically designed to house the XIAO ESP32S3, its expansion board, and the XIAO Round Display.

So, this is my project in a nutshell 😃

In the following steps, you can find more detailed information on coding, capturing customized keychain images, building an object detection model with Edge Impulse, running the model on XIAO ESP32S3, and developing full-fledged Android and web applications to inform emergency contacts via WhatsApp or SMS.

🎁🎨 Huge thanks to Seeed Studio for sponsoring these products:

⭐ XIAO ESP32S3 Sense | Inspect

⭐ XIAO Round Display | Inspect

🎁🎨 Also, huge thanks to Anycubic for sponsoring a brand-new Anycubic Kobra 2.

Videos and Conclusion


For more information, you can inspect the project tutorial:

https://www.hackster.io/kutluhan-aktar/ai-driven-ble-travel-emergency-assistant-w-twilio-a948b0