Ever wanted to build a robot assistant that not only responds to voice commands and navigates autonomously, but also serves as a remote file server accessible from anywhere on the internet, all for less than $400? I created a module for the Waveshare Wavego, a Raspberry Pi-powered quadruped robot, that evolved from a basic voice-controlled bot into a sophisticated autonomous system with a dual-WiFi walking file server.
Apologies for the simple formatting; I'm really struggling with the options Hackaday gives me, and it keeps removing my HTML every time I need to make a small change.

The three panels require some superglue; I found this to be the easiest way to connect them, and it lets everything print cleanly with minimal supports. The ultrasonic part of the module simply stacks into its slot.
It reacts to commands such as "jump" or "hello" using VOSK speech recognition and understands other movement-related inputs, but it can also answer queries through the integration of smolLM, an amazing model that runs well on the modest Pi 4B.
Why not use a Pi 5 for better performance? You could, but battery life would effectively be cut in half, from about two hours to one.
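I won't paste the whole listener here, but the VOSK side of it boils down to something like the sketch below. The model path and the hand-off at the end are placeholders, and the command set mirrors the list further down:

```python
# Rough sketch of the keyword spotting (not the release code).
# Assumes a small English VOSK model has been unpacked to ./model.
import json
import pyaudio
from vosk import Model, KaldiRecognizer

COMMANDS = {"forward", "backward", "back", "left", "right", "stop", "halt",
            "jump", "wave", "hello", "hi", "steady", "balance",
            "auto", "autonomous", "question", "chat"}

model = Model("model")
recognizer = KaldiRecognizer(model, 16000)

audio = pyaudio.PyAudio()
stream = audio.open(format=pyaudio.paInt16, channels=1, rate=16000,
                    input=True, frames_per_buffer=8000)
stream.start_stream()

while True:
    data = stream.read(4000, exception_on_overflow=False)
    if recognizer.AcceptWaveform(data):
        text = json.loads(recognizer.Result()).get("text", "")
        for word in text.split():
            if word in COMMANDS:
                print("heard:", word)
                # hand the word off to the movement / chat handler here
```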
I have to admit that I vibecoded most of this, but it works surprisingly well. It probably is a security nightmare; luckily the release is set to be accessible only locally, but if you like you can create a tunnel to it to make it reachable from anywhere, turning it into your own walking file-sharing server.

Files are served over a simple webpage; you will be forwarded to it directly after connecting to the WalkingServer hotspot. If it doesn't forward you, open 192.168.4.1 in your browser.
The delete option is a bit risky but I just left it in as a social experiment for the next hacking convention.
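The page itself is nothing fancy. If you want to see the idea without downloading the image, a file server of this kind can be sketched with Flask roughly like this; the shared directory is an assumption and the risky delete route is left out on purpose:

```python
# Minimal sketch of the hotspot file server (not the release code).
# Assumes the shared files live in /home/pi/shared.
import os
from flask import Flask, send_from_directory

SHARE_DIR = "/home/pi/shared"
app = Flask(__name__)

@app.route("/")
def index():
    # Plain HTML listing with a download link for every shared file
    links = "".join(
        f'<li><a href="/files/{name}">{name}</a></li>'
        for name in sorted(os.listdir(SHARE_DIR))
    )
    return f"<h1>WalkingServer</h1><ul>{links}</ul>"

@app.route("/files/<path:name>")
def download(name):
    return send_from_directory(SHARE_DIR, name, as_attachment=True)

if __name__ == "__main__":
    # Bind to all interfaces so clients on the 192.168.4.x hotspot can reach it
    # (port 80 needs root; pick a higher port otherwise)
    app.run(host="0.0.0.0", port=80)
```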
Current release works out of the box and is hosted on archive.org here:
https://archive.org/details/wavego-robot
It will require you to flash an SD card of at least 16 GB with Etcher or plain dd.
The SD card and most parts, like the required spacers, should come with the robot; the rest comes with the additionally purchased components, such as the RaspiAudio speaker and mic module.
The additional USB WiFi stick provides regular WiFi, while the Raspberry Pi's onboard WiFi chip broadcasts the hotspot for the piratebox.
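To double-check which radio ended up with which job, a quick look at the interface addresses is enough. Here's a small check using psutil (the interface names and roles may be swapped on your setup):

```python
# Print the IPv4 address of every network interface.
# Expectation: the hotspot interface holds 192.168.4.1, while the USB
# stick gets a normal DHCP lease from your home router.
import socket
import psutil

for iface, addrs in psutil.net_if_addrs().items():
    for addr in addrs:
        if addr.family == socket.AF_INET:
            print(f"{iface}: {addr.address}")
```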
To connect the ultrasonic sensor you'll either need to splice the ground and 5V cables together or get a split cable connector; I'm sure there are other ways, it's fairly straightforward:
- TRIG -> GPIO 23
- ECHO -> GPIO 24
- VCC -> 5V
- GND -> GND
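For reference, reading the sensor on those pins (BCM numbering, standard HC-SR04 trigger/echo timing) looks something like the snippet below; the release presumably wraps something similar into its navigation loop:

```python
# Bare-bones distance reading on the pins listed above (BCM numbering).
import time
import RPi.GPIO as GPIO

TRIG, ECHO = 23, 24

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def distance_cm():
    # A 10 microsecond pulse on TRIG starts a measurement
    GPIO.output(TRIG, True)
    time.sleep(0.00001)
    GPIO.output(TRIG, False)

    start = end = time.time()
    while GPIO.input(ECHO) == 0:   # wait for the echo pulse to start
        start = time.time()
    while GPIO.input(ECHO) == 1:   # ...and measure how long it stays high
        end = time.time()

    # Sound travels ~34300 cm/s and the pulse covers the distance twice
    return (end - start) * 34300 / 2

print(f"obstacle at {distance_cm():.1f} cm")
GPIO.cleanup()
```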
The ultrasonic module helps it to navigate in dark environments where the camera is not sufficient and also assists with regular navigation.
In autonomous mode the robot pauses about three seconds between movements to run through an obstacle check, among other things, as well as a temporal image analysis that keeps it from getting stuck.
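In rough pseudo-Python the cycle looks like this; the helper names (capture_frame, step_forward and friends) are placeholders rather than the actual functions in the release, and the thresholds are guesses:

```python
# Rough outline of the ~3 s sense-decide-move cycle (placeholder helpers).
import time
import cv2

SAFE_DISTANCE_CM = 25      # assumed threshold
STUCK_DIFF_THRESHOLD = 4   # mean pixel change below this = scene not changing

previous = None
while True:
    frame = cv2.cvtColor(capture_frame(), cv2.COLOR_BGR2GRAY)

    # Temporal image analysis: if consecutive frames barely differ even
    # though a move was commanded, assume we're stuck and back off.
    if previous is not None:
        if cv2.absdiff(frame, previous).mean() < STUCK_DIFF_THRESHOLD:
            back_off_and_turn()
    previous = frame

    # The ultrasonic check takes priority, especially in the dark.
    if distance_cm() < SAFE_DISTANCE_CM:
        turn_toward_clearer_side()
    else:
        step_forward()

    time.sleep(3)  # roughly three seconds between movements
```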
Current voice commands are as follows:
**Movement:**
- forward, backward/back, left, right
- stop/halt
**Actions:**
- jump, wave/hello/hi
- steady/balance (stabilization mode: put him on an uneven surface and he'll try to balance himself)
**Special:**
- auto/autonomous (starts sensor-based navigation, can be stopped with "stop" command)
- question/chat (AI assistant)
### AI Question Flow:
1. Say: **"question"**
2. Robot: **"yes"**
3. Ask your question: **"What is gravity?"**
4. Robot: **"processing"** (~20 seconds on average)
5. Robot speaks 2-3 sentence answer
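I can't promise this matches the vibecoded release line for line, but assuming SmolLM runs as a GGUF model through llama-cpp-python and espeak handles the speech output, the flow boils down to:

```python
# Sketch of the question flow (model path and prompt wording are placeholders).
import subprocess
from llama_cpp import Llama

llm = Llama(model_path="smollm.gguf", n_ctx=512)

def speak(text):
    subprocess.run(["espeak", text])

def answer_question(question):
    speak("processing")  # the ~20 s of thinking happens in the next call
    result = llm(
        f"Answer in two or three short sentences: {question}",
        max_tokens=120,
    )
    speak(result["choices"][0]["text"].strip())

answer_question("What is gravity?")
```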
Here's a video showing the navigation capabilities. Right now he's using a system that estimates several obstacle probabilities (brightness, contrast, lines, etc.) and then chooses the approach most likely to avoid a collision. On top of that, temporal image analysis keeps him from getting stuck, and the ultrasonic sensor usually takes the lead...