Mixed Reality Drone Racing

PC game to build virtual 3D racetracks in your house then race actual physical drones on them

Hey everyone!

I wanted to share a project I'm working on to make an FPV drone racing game using Unity and ROS. The goal I'm working towards is a PC game (maybe VR) that lets you build custom virtual racetracks in an indoor environment and then race actual physical (and eventually autonomous) drones on them.

Check out my website for more about me and other things I like to do: vinayrajur.com

Why I'm building this

I like working on robotics projects in my spare time, and one project I've wanted to do for a while has been building my own autonomous drone. I've worked on some systems like that in the past and they've been really cool to see in person. Along the way, I also started getting into flying FPV drones and realized that flying them manually is just as fun as watching them fly themselves, so I wanted to see if I could combine the two by making a game out of it. Btw, definitely check out the work done at the University of Zurich if you're interested in high-speed autonomous drones.

  • Project Update 3

    Vinay • 01/21/2023 at 01:07 • 1 comment

    What's New
    1. Mixed Reality FPV Display
    2. Track Builder
    3. Updated GUI displays

    I am super excited to get this update out, because this is the first time I have a proper mixed reality FPV display. If you remember from last time, I had 2 separate feeds for the gameview and the fpv livestream - but it was pretty difficult to fly this way and led to a lot of crashes (just watch the end of the last video lol). So for this update, I spent most of my time trying to unify those two video feeds by overlaying the virtual objects onto the livestream. You should definitely watch the demo video to see the proper effect - I have to say it looks pretty cool in my extremely biased opinion :)
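
    The core of an overlay like this is standard camera projection: take a virtual object's 3D position, transform it into the camera frame using the drone's pose estimate (plus the camera's mounting offset), and project it into the image with the camera intrinsics. My actual overlay happens inside Unity, but here's the same idea sketched with OpenCV, where the intrinsics are made-up placeholder values rather than a real calibration:

```python
import cv2
import numpy as np

# Hypothetical intrinsics for a 640x480 FPV feed (placeholder values,
# not a real calibration).
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # pretend lens distortion is negligible for the sketch

def draw_gate(frame, gate_corners_world, rvec, tvec):
    """Project a gate's 3D corners into the frame and outline it.

    rvec/tvec: world-to-camera rotation (Rodrigues vector) and translation,
    derived from the drone pose estimate + camera mounting offset.
    """
    pts, _ = cv2.projectPoints(gate_corners_world, rvec, tvec, K, dist)
    pts = pts.reshape(-1, 2).astype(np.int32)
    cv2.polylines(frame, [pts], isClosed=True, color=(0, 255, 0), thickness=2)
    return frame
```

    Getting rvec/tvec right is exactly the calibration problem from Update 1: the world-to-camera transform chains the drone's pose with the camera's mounting offset, so any origin misalignment shows up directly as overlay drift.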

    Another big new feature with this update was a new Track Builder tool to make it easy to actually design custom race tracks in your room. The main idea is to use the drone and its position estimation system to place virtual 3D markers in your space, which you can then use as a reference to place your gates and other virtual objects. This made it much easier for me to place gates in the environment and especially helped me make sure I wasn't placing them in physically unreachable spaces (like partially through a wall/bed/table/etc).
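
    For a flavor of how simple the marker-dropping loop can be, here's a toy ROS sketch: fly the drone somewhere, press a gamepad button, and record the current pose estimate as a marker. The topic names are illustrative guesses, not the ones from my project:

```python
import rospy
from geometry_msgs.msg import PoseStamped
from sensor_msgs.msg import Joy

markers = []          # recorded (x, y, z) marker positions
latest_pose = None    # most recent drone pose estimate

def pose_cb(msg):
    global latest_pose
    latest_pose = msg

def joy_cb(msg):
    # Drop a marker at the drone's current position on button 0.
    if msg.buttons[0] and latest_pose is not None:
        p = latest_pose.pose.position
        markers.append((p.x, p.y, p.z))
        rospy.loginfo("marker %d at (%.2f, %.2f, %.2f)",
                      len(markers), p.x, p.y, p.z)

rospy.init_node("track_marker_recorder")
rospy.Subscriber("/crazyflie/pose", PoseStamped, pose_cb)  # assumed topic
rospy.Subscriber("/joy", Joy, joy_cb)
rospy.spin()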

    Apart from that, I also added a few more GUI displays for relevant information like battery charge and remaining runtime. These were primarily to help with some of the range anxiety feelings I wrote about in my previous update post.
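
    For reference, cflib exposes the battery voltage as the pm.vbat log variable, so wiring up a display looks something like this (the percent mapping is my own rough single-cell LiPo heuristic, just for illustration):

```python
from cflib.crazyflie.log import LogConfig

def start_battery_logging(cf, on_update):
    """Stream battery voltage from the crazyflie at 2 Hz."""
    lg = LogConfig(name="battery", period_in_ms=500)
    lg.add_variable("pm.vbat", "float")
    cf.log.add_config(lg)
    lg.data_received_cb.add_callback(
        lambda ts, data, cfg: on_update(data["pm.vbat"]))
    lg.start()

def rough_charge_percent(vbat):
    # Crude single-cell LiPo mapping: ~3.0 V empty, ~4.2 V full.
    return 100.0 * max(0.0, min(1.0, (vbat - 3.0) / 1.2))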

    Challenges

    1. Getting the video feed into Unity    
      1. Also fixing the memory leak I introduced in my implementation
    2. Overlaying the game objects onto the FPV feed
    3. Figuring out how to build interesting race tracks

    The first big challenge I faced this time was around getting the video feed into Unity and rendering it. If you remember, this is also where I got stuck last time (see my previous update for the details). The quick tl;dr though is that Unity doesn't support the MJPG video format (which is how the video frames were being encoded by my video receiver). So instead of using Unity's WebCamTexture class to get the image data in, I decided to do the simplest thing I could think of, which was to decode the video frames outside Unity and then pipe them in over the ROS network. Now, if I were trying to build an optimized low-latency, real-time system this would not be a great choice, because I'm potentially introducing a whole slew of non-deterministic buffering delays and such...but for now my goal was just to get something workable. I'm also ignoring the fact that ROS1 doesn't support shared memory unless you're using nodelets (which I'm not), so I'm potentially using a ton of unnecessary RAM, since every image gets decoded, stored in memory, then copied again into a buffer to be sent over the ROS network (and potentially copied back out when it's received, if you're not careful). But, for the sake of getting something working, I decided to take on a large amount of technical debt just to see if it would even work. My implementation was then to set up a very simple video server node that connected to the webcam device, read each frame and sent it over the ROS network in a (compressed) image message. From Unity, I then set up a listener to watch for incoming image messages and render them on a simple plane texture.
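
    For the curious, the video server side of that is only a handful of lines. Roughly (simplified; the device index and topic name are placeholders):

```python
import cv2
import rospy
from sensor_msgs.msg import CompressedImage

# Grab frames from the receiver (which shows up as a webcam) and republish
# them as JPEG-compressed ROS images.
rospy.init_node("fpv_video_server")
pub = rospy.Publisher("/fpv/image/compressed", CompressedImage, queue_size=1)
cap = cv2.VideoCapture(0)  # placeholder device index

rate = rospy.Rate(30)
while not rospy.is_shutdown():
    ok, frame = cap.read()
    if not ok:
        continue
    msg = CompressedImage()
    msg.header.stamp = rospy.Time.now()
    msg.format = "jpeg"
    msg.data = cv2.imencode(".jpg", frame)[1].tobytes()
    pub.publish(msg)
    rate.sleep()
```

    Using CompressedImage at least keeps the per-frame payload at JPEG size rather than raw 640x480x3 bytes, which matters a bit when every message is getting copied through the TCP endpoint anyway.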

    So how bad was this implementation? Well, it actually worked decently well, which surprised me. I was successfully able to access and render the image data in Unity. And even though I didn't do any actual latency testing this time, it didn't seem to be introducing a significant amount of lag either. Unfortunately, what I did introduce was a memory leak. It took me a while to notice it, but the game was consuming an increasing amount of memory over time, leading to my computer freezing if I played long enough. Luckily, I was able to track it down to the way I was rendering the images: I was creating a...


  • Project Update 2

    Vinay • 12/20/2022 at 13:57 • 0 comments

    What's New
    Picking up from last time, my goal was to get the game to a playable state - if you remember, I had no flight controls and no camera feed. So for this update, my focus was on implementing those 2 components for the game.

    Adding the flight controls was relatively straightforward. To make things easy, I wanted to use a generic PS3-type controller with some basic inputs set up for forward/lateral/vertical velocities, yaw rate, takeoff/landing and motor kill/reset. Luckily, Unity makes it really easy to support gamepad controllers, so I just set up a few callbacks listening for inputs, which I then mapped to velocity commands and published over ROS. I then updated the crazyflie ROS node to subscribe to these command messages, which are executed on the drone using the python cflib library. This works pretty seamlessly and feels responsive enough to fly with (after some tuning).
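
    The drone side of that pipeline boils down to forwarding each ROS message into cflib's velocity setpoint call. A minimal sketch (topic name and radio URI are placeholders):

```python
import rospy
from geometry_msgs.msg import Twist

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = "radio://0/80/2M/E7E7E7E7E7"  # placeholder radio address

def main():
    cflib.crtp.init_drivers()
    with SyncCrazyflie(URI, cf=Crazyflie()) as scf:
        cf = scf.cf

        def cmd_cb(msg):
            # vx/vy/vz in m/s, yaw rate in deg/s.
            cf.commander.send_velocity_world_setpoint(
                msg.linear.x, msg.linear.y, msg.linear.z, msg.angular.z)

        rospy.init_node("crazyflie_cmd_vel")
        rospy.Subscriber("/crazyflie/cmd_vel", Twist, cmd_cb)  # assumed topic
        rospy.spin()

if __name__ == "__main__":
    main()
```

    One thing to watch with this approach: the crazyflie firmware times out if setpoints stop arriving, so the node needs a steady stream of commands rather than one-off messages.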

    By contrast, adding the FPV video feed was a bit more involved and came with a bunch of questions:

    • How to send video from the drone
    • How to receive video from the drone
    • How to render this video in the game

    To record video, I decided to use the smallest, cheapest off-the-shelf camera I could find, which ended up being this WolfWhoop WT03 FPV camera + transmitter on Amazon. It weighs about 5 grams, transmits at around 25 mW (at its lowest power setting) and works off of 3-5V input (drawing around 500mA of current). It seemed like a good option because it was pretty light and low enough on power consumption that it can be powered by the crazyflie's onboard lipo. Additionally, being an analog video transmitter meant it should be relatively low latency.

    To receive the video, I needed an analog video receiver. I found the Skydroid 5.8G OTG Receiver on Amazon for around $30, which could receive an analog video stream and output it as a standard webcam feed on my linux pc. The output webcam feed produced by this receiver is a sequence of 640x480 frames encoded as MJPEG (which is basically a sequence of frames rendered as JPEG images, without any temporal/multi-frame compression).
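
    As an aside, if you want to check what your own receiver actually outputs, a quick probe of the capture device works (the device index here is a guess; v4l2-ctl --list-formats-ext gives the same info on linux):

```python
import cv2

cap = cv2.VideoCapture(0)  # placeholder device index
fourcc = int(cap.get(cv2.CAP_PROP_FOURCC))
code = "".join(chr((fourcc >> (8 * i)) & 0xFF) for i in range(4))
print(code,                                  # e.g. 'MJPG'
      cap.get(cv2.CAP_PROP_FRAME_WIDTH),     # e.g. 640.0
      cap.get(cv2.CAP_PROP_FRAME_HEIGHT))    # e.g. 480.0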

    To render this video feed into the game, I was looking for a quick solution, and my main approach was to try to capture and render the video feed entirely in Unity using the WebCamTexture class. I ran into a fair bit of trouble with this approach, so for this demo I chose to just render the video feed outside of Unity using VLC player. I'm not very happy about this solution, but it worked well enough to give me a feel for the game's playability.

    Challenges and Concerns I had
    There were a bunch of challenges in getting to this point:

    • Sourcing the right components (camera, receiver, etc...)
    • Adding the camera to the drone
    • Getting the video feed into Unity
    • Understanding the camera's impact on battery life
    • Understanding the latency of the camera feed
    • Concerns around playability

    Let's go through them:
    Sourcing components - this was not very complicated, but did require doing a bit of homework to make sure that the camera met the power/weight constraints for the drone and that the receiver would be compatible with it (as well as my pc). This all seemed to work out though.

    Adding the camera to the drone - this was again not super complicated but required some homework (I am also less confident in my soldering/hardware skills than I am in my software skills). I used the crazyflie's prototyping deck to solder leads to the power supply and used a JST pin header to connect to the camera. To physically attach the camera to the drone, I just used electrical tape as a quick-and-dirty solution (but a 3D printed mount is on my todo list).

    Getting the video feed into Unity - I feel like this was much more complicated than I was expecting (or than it should have been). As I mentioned in the sections above, I was primarily trying to use Unity to directly capture and render the video feed since...


  • Update 1

    Vinay • 08/01/2022 at 13:34 • 0 comments

    How does it work / What I've done so far

    I put together a quick demo video (linked at the top of the post) just to document the current state of my prototype.

    I'm very early in the process, and honestly, I've kind of cheated a bunch just to get something up and running and feel out the concept. Most of what I've done has just been connecting pieces together using off-the-shelf hardware/software. Right now, the prototype basically just proves out the concept of rendering the realtime position of a drone inside of a Unity game and getting all the "piping" set up to get data into the right place. Currently, the information flow is all one-directional from the drone to the PC.

    On the hardware side, I'm using Bitcraze's crazyflie drone with its lighthouse positioning deck and SteamVR's base stations for estimating the drone's 3D position. State estimation is pretty hard, but thanks to all the hard work done by the crazyflie open source community, this just kind of works out of the box and in realtime (i.e. one of the big reasons why it kind of feels like cheating lol). Communication between the crazyflie and the PC is done using the crazyflie radio dongle.

    On the software side, I'm using ROS to handle all the intermediate messaging, and obviously Unity for the user interface, game logic and visualization.

    Challenges I've run into so far

    Getting the state estimate data from the crazyflie into Unity was somewhat interesting to figure out. Basically, the crazyflie computes its 6DoF pose (position and orientation) onboard, then transmits this telemetry over radio to the PC. On the PC, I wrote a simple ROS publisher node that listens for these messages and publishes them onto a ROS network. To get the data into Unity, I'm using Unity's ROS-TCP-Connector package (and ROS-TCP-Endpoint), which essentially just forwards the messages from the ROS network into Unity. Inside Unity, I wrote a simple script tied to a GameObject representing the drone that takes the data, transforms it into Unity's coordinate frame and uses it to set the GameObject's position. Overall, it's just a lot of forwarding of information (with some annoying coordinate frame transforms along the way).
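
    The publisher node is basically a thin wrapper around cflib's logging framework: subscribe to the stateEstimate log group and re-emit each sample as a ROS message. A trimmed-down sketch (the topic name is a placeholder, and I'm skipping orientation for brevity):

```python
import rospy
from geometry_msgs.msg import PoseStamped
from cflib.crazyflie.log import LogConfig

def start_pose_stream(cf):
    pub = rospy.Publisher("/crazyflie/pose", PoseStamped, queue_size=10)
    lg = LogConfig(name="pose", period_in_ms=20)  # 50 Hz
    for var in ("stateEstimate.x", "stateEstimate.y", "stateEstimate.z"):
        lg.add_variable(var, "float")
    cf.log.add_config(lg)

    def cb(timestamp, data, logconf):
        msg = PoseStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = "world"
        msg.pose.position.x = data["stateEstimate.x"]
        msg.pose.position.y = data["stateEstimate.y"]
        msg.pose.position.z = data["stateEstimate.z"]
        pub.publish(msg)

    lg.data_received_cb.add_callback(cb)
    lg.start()
```

    The coordinate frame transform on the Unity side is the usual right-handed-to-left-handed shuffle: ROS uses x forward, y left, z up, while Unity is left-handed with y up and z forward, so a ROS point (x, y, z) lands at roughly (-y, z, x) in Unity.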

    Another important piece of the puzzle (as far as rendering the drone inside a 3D virtual replica of my room) was building the room model and calibrating it to my actual room. I can go into more detail for sure, but at a high level I basically just picked a point in my room to be the origin of both the physical and virtual room, put the crazyflie there (aligned with the axes I picked for the origin) and used the crazyflie cfclient tool to center the base station position estimates there. My process was pretty rough as a first pass, and it will very likely have to improve, especially as I move in the mixed reality direction and start rendering virtual objects on a live camera feed.

    What's next?

    Tactically, the next few steps would be to add the FPV view into the game (streaming video data from the drone and rendering it in Unity), which involves more data forwarding (and calibration). In addition, I need to add input controls so you can actually fly the drone. The bigger goals in store would be around building out proper gameplay, integrating autonomy (and figuring out where it makes sense), and maybe exploring what VR functionality might look like as opposed to just using a flat display on a PC monitor.

    Thanks for reading through this whole update! If you made it this far, I would really love to hear any feedback or questions on this or anything else. Most likely, it would help me figure out what some additional next steps would be, and I'd be super interested to learn if there are other cool directions I could take this project!

