Open Source Air Defense

Convert FPV racing drone into an interceptor using Raspberry Pi

A true 'HACK' with a very basic approach to demonstrate the idea of air defense against incoming threats, be they other quads, rockets, or otherwise. This project converts a first person view (FPV) quad-copter into a drone that intercepts incoming threats using a Raspberry Pi and a Logitech web cam. A Raspberry Pi 3 B handles basic manual target acquisition, target tracking, and flight guidance.

Why?  Demonstrate basic concepts, assessments, and design trades for stopping incoming threats using physical-impact lethality.  Also, build a low cost test platform from which to develop a highly capable system.

  • Provides a path to low cost commercial defense systems
    • Deters attacks and aggressive surveillance
    • Could help poor and underrepresented communities defend themselves
  • Evolve this project to a world class capability
    • High compute rate and agility to hit high speed incoming threats
    • High clutter environment targeting (ie: ground, sea, brush, forest, mountain, etc.)
    • Rain, fog, and night targeting capable camera sensor (ie: long wave infrared cameras)
    • Other targeting sensors (ie: miniature radar and other)
    • Go beyond quadcopter airframe (ie: boosted interceptor and guided munitions)
  • Inspire other open source defense capabilities
    • Low cost open source RF Radar systems
    • Low cost long range encrypted mesh RF networks

What? Add a Raspberry Pi and web cam into a First-Person-View quad-copter system to demonstrate intercept of airborne targets.

  • User assisted (manual) targeting via first person camera
    • RC controller switched modes:
      • position on-screen target box
      • enter autopilot intercept mode
  • Python3 OpenCV3.4 based tracker and guidance
    • 20Hz update rate (60Hz maybe possible with this setup)
      • needs future improvement
    • ~640 x ~480 resolution range
      • needs future improvement
    • Autopilot without RF telemetry (countermeasure resistant)

How?  Insert the Raspberry Pi between the human user and the on-board quad-copter flight controller.  Use off-the-shelf products available nearly anywhere (via shipping): a basic circuit to add the Raspberry Pi into the quad-copter system, plus some Python code and setup on the Raspberry Pi.
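
The "Pi in the middle" idea can be sketched as a channel mixer: in pass-through mode the Pi forwards the pilot's SBUS channels untouched, and in autopilot mode it injects pitch/roll/yaw while throttle stays with the human (as described in the logs below). A minimal sketch; the channel indices are assumptions, since the actual channel order depends on the radio setup:

```python
# Assumed channel order (the real order depends on your radio/Betaflight setup)
ROLL, PITCH, THROTTLE, YAW = 0, 1, 2, 3

def mix_channels(rc_channels, autopilot_cmd, autopilot_on):
    """Pass-through forwards the pilot's channels unchanged; autopilot
    overwrites roll/pitch/yaw but leaves throttle under human control."""
    out = list(rc_channels)
    if autopilot_on:
        out[ROLL], out[PITCH], out[YAW] = autopilot_cmd
    return out
```

In pass-through mode the function is a no-op copy, so a failed tracker never blocks the pilot's stick inputs.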

Products:

  • $349 ARRIS X220 V2 220MM 5" FPV Racing Drone with EV800 FPV Goggle
    • Omnibus F4 Flight Controller (Betaflight)
    • Arris VT5804 VTX video transmitter
    • RadioLink AT9S radio w/ R6DSM Receiver
      • SBUS Futaba Protocol
      • 4 Flight Control Channels  (Throttle/Yaw/Pitch/Roll)
      • 6 Aux Channels (Rotor Lock/Beeper/Flight Mode/ 3 User Available)
        • 2 Switches User Available ( This project uses for 'user targeting' and 'auto-pilot' modes.)
        • 1 Knob ( This project uses for targeting box size during 'user targeting' mode.)
  • $35 Raspberry Pi 3 B w/ 1GB RAM
  • $8 16GB SD Card (nice high speed flavor)
  • $0.58 CD74HC14E Schmitt Trigger Inverter (Hex or x6 channel)
  • $0.45 NCP11177ST33T3G  3.3V 1amp voltage regulator
  • $65 Logitech C920 USB WebCam, 60FPS @ 640x360 or 854x480, normal color streaming camera

Python Code on Github: https://github.com/rayzrocket/openairdefense

NX1117C-3-3v.pdf

3.3v regulator that I used

Adobe Portable Document Format - 1.13 MB - 05/15/2019 at 01:39


cd74hc14-bitInverter.pdf

UART inverter that I used

Adobe Portable Document Format - 1006.73 kB - 05/15/2019 at 01:39


  • March 2021 Update, >75Hz Frame Rate with Tracking WORKS!

    Ray Patel, 03/31/2021 at 03:54

    The project is back on track after losing a year to Covid19 & other distractions life throws at us.

    Bottom line up front: Achieved 75Hz full processing including SBUS data read, frame grab, convert to numpy array, process image to track, generate PID control quad command, and finally SBUS write to the quad's flight controller.

    Details:

    For now, I've abandoned the use of USB cameras due to cost and other reasons.

    Switched to Pi 4 model b.

    Currently switched to the Pi Camera Module V2 noIR via CSI ribbon cable.  Finally achieved high frame rates at 640x480: 90fps in mode 7.  To perform manual image processing I don't need the regular jpeg-compressed frames; I need frames as numpy arrays, which was also achieved with very low latency.

    The SBUS update rate on the remote control system I'm using is 75Hz (Radiolink AT9S transmitter with R6DSM receiver on the quad).

    Remember folks, the quad's flight controller (Omnibus F4 running Betaflight) would normally be commanded via SBUS from radio receiver.  But in this project we run the SBUS through the Pi.  The Pi can be in pass through mode or autopilot mode where the Pi can inject pitch, yaw, roll commands.  I have not implemented 'throttle' yet, currently the Human controls throttle at all times. 

    Testing was conducted using my ceiling fan with a hair scrunchy strung between the blades on fishing line.  To get the best loop rate, I shut off all (cv2.imshow) screen updates of the image.  But I still need to see what the camera and processing were doing, so I store the images (numpy arrays) and the X & Y tracker (box around target) output in local memory (just a python list).  For test purposes, I can set the script to run for some number of frames (I used 700); upon completion it writes a .jpg of each image and plots the determined X & Y track positions using matplotlib pyplot.
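
The buffered-logging approach above can be sketched like this (the class and names are hypothetical; the real script writes the .jpg files and matplotlib plots after the loop ends):

```python
class RunLogger:
    """Buffer frames and tracker (x, y) output in memory during the run,
    so no screen update or disk write slows the control loop; everything
    is dumped to disk only after the run completes."""
    def __init__(self, max_frames=700):
        self.max_frames = max_frames
        self.frames = []   # numpy arrays in the real script
        self.track = []    # (x, y) tracker box positions
    def log(self, frame, x, y):
        """Record one loop iteration; returns True while the run should continue."""
        self.frames.append(frame)
        self.track.append((x, y))
        return len(self.frames) < self.max_frames
```

After the loop, the real script would iterate `self.frames` calling cv2.imwrite and plot `self.track` with matplotlib pyplot.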

    The PID flight control of pitch and roll is implemented and checked out by monitoring the Betaflight GUI for the values being sent to the flight controller.
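
The pitch/roll PID loop can be sketched as below. This is a generic textbook PID, not the exact implementation on the quad; the gains and output clamp are placeholders:

```python
class PID:
    """PID on tracker error (pixels from boresight center); output is
    clamped to a safe command range before being written over SBUS."""
    def __init__(self, kp, ki, kd, limit=500.0):
        self.kp, self.ki, self.kd, self.limit = kp, ki, kd, limit
        self.integral = 0.0
        self.prev_err = None
    def update(self, error, dt):
        self.integral += error * dt
        deriv = 0.0 if self.prev_err is None else (error - self.prev_err) / dt
        self.prev_err = error
        out = self.kp * error + self.ki * self.integral + self.kd * deriv
        return max(-self.limit, min(self.limit, out))
```

One PID instance per axis (pitch from the vertical pixel error, roll from the horizontal) with dt set by the 75Hz loop rate.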

    Outdoor tests with very limited flight while tracking another quad (DJI Mavic mini) have also been conducted a few times.

    The target drone was set up to fly around with a hard acceleration in one direction at first and then some minor movement in other direction as you can see in the plots of tracker position.

  • x220 Quad Loaded Weight Flight Tests

    Ray Patel, 05/21/2019 at 03:54

    The stock Arris x220 with battery is 524g.  RasPi 3 B + C920 web cam + USB cable + chips-n-wires = 220g.  This test added 232g to the 524g quad for a total of 756g.   No CG balancing was conducted; I just heaped the parts onto the quad and attached them with electrical tape.  ( The C920 web cam housing, with its built-in stand and USB cable, is very heavy, ripe for weight loss! )

    See picture and videos below:

  • Basic Setup Tests, Learning, and Debugging

    Ray Patel, 05/17/2019 at 03:06

    Basic SBUS, webcam, remote control target box, switched modes (start > manual targeting > auto tracking), OpenCV tracker, and casting to FPV screen working.  Found a slight flicker/noise problem on SBUS read in Rasp Pi every now and then.  I've debugged it down to the obtrusive OpenCV webcam frame grab .read(); so I plan to use threading.

    My apologies, I need to increase my 'webcam' budget for taking these videos.  (consider them 'retro' at NTSC 30Hz)

    You may see the 'User Targeting' or 'Tracking' label on the screen.  I'm using switch 'F' on achannel(9) to control whether the user can move and resize the box (VrB knob on achannel(7)).  When switching out of 'User Targeting', the code is set up to automatically start the tracker with the current box defining the ROI (region of interest).
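
The switch/knob decoding can be sketched as follows. The 1000-2000 range, the 1500 midpoint threshold, and the pixel limits are assumptions about typical RC channel scaling, not values from the project code:

```python
def decode_mode(sw_f):
    """Switch 'F' on achannel(9): below an assumed 1500 midpoint means
    'User Targeting', above means 'Tracking'."""
    return 'User Targeting' if sw_f < 1500 else 'Tracking'

def decode_box_size(vrb, lo=1000, hi=2000, min_px=20, max_px=200):
    """Map the VrB knob (achannel(7)) linearly onto an ROI box size in pixels."""
    frac = (min(max(vrb, lo), hi) - lo) / (hi - lo)
    return int(min_px + frac * (max_px - min_px))
```

On the 'User Targeting' to 'Tracking' transition, the current box (position plus decoded size) becomes the ROI handed to the OpenCV tracker's init call.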

  • Conducting the Development

    Ray Patel, 05/14/2019 at 01:44

    Setup testbench to trade & develop:

    • Image processing selection
      • My custom algos need help...so let's hold off on those
      • Let's use OpenCV built-in flavors
    • Select a camera (limited to what's around the house)
      • didn't get PiCamera V2 on ribbon cable to work (maybe more reasons than this to abandon PiCamera V2...framerates too low?)
      • OpenCV captures webcam easily
    • Rasp Pi receive & transmit SBUS serial from Quad's RF receiver and back to Flightcontroller
      • need inverter since SBUS is inverted UART
        • need 3.3v supply for inverter to keep SBUS thru inverter at 3.3v
        • 5v not good for Flightcontroller board (Omnibus F4)
    • Rasp Pi to output analog video (composite) to Quad's VTX (video transmitter) so that the image can be seen on FPV goggle screen.
    • Rasp Pi code to allow user to select region of interest square box on-screen.
      • Pass this to OpenCV tracking algo
    • Rasp Pi code to pursuit guide Quad based on control algo
      • Based on error from boresight center
    • Try to get at least 20Hz total loop time
      • C920 cam happens to go 30Hz in bright environment
      • Need to figure out SBUS repetition rate
      • Afraid of camera frame grab vs. incoming SBUS serial collisions causing time issues
      • Likely need 'threading' 
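
The threading fix mentioned above can be sketched like this: a background thread owns the blocking .read() call, so the SBUS loop just grabs the latest frame and never stalls on camera I/O. `FrameGrabber` is a hypothetical helper, where `source` is anything with a cv2.VideoCapture-style read():

```python
import threading

class FrameGrabber:
    """Continuously grab frames on a background thread; the control loop
    polls latest() and never blocks on the camera."""
    def __init__(self, source):
        self.source = source
        self._lock = threading.Lock()
        self._frame = None
        self._running = True
        self._thread = threading.Thread(target=self._loop, daemon=True)
        self._thread.start()
    def _loop(self):
        while self._running:
            ok, frame = self.source.read()  # blocking grab stays on this thread
            if ok:
                with self._lock:
                    self._frame = frame
    def latest(self):
        """Return the most recent frame (None until the first grab lands)."""
        with self._lock:
            return self._frame
    def stop(self):
        self._running = False
        self._thread.join()
```

This decouples the camera's frame period from the SBUS serial timing, which is exactly the collision the log worries about.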


  • 1
    Raspberry Pi Setup

    What I did to Raspberry Pi 3 B (for now)...

    Do not use NOOBS (I don't think it allows analog composite video output as we need it)

    • Install just RASPBIAN, use 'stretch' update only (I think?).
    • perform: pip3 install --upgrade pip  (update pip3)
    • perform: sudo apt install apt-file
    • perform: sudo apt-file update ; this allows apt-file searches such as apt-file search libhdf5_serial.so.100 (for one of the "not found" libraries). You can then install each "not found" file with, for example, sudo apt install libhdf5-100. Do this for all missing dependencies, if any exist.
    • Sometimes I use matplotlib pyplot for plotting data, so install it by: sudo pip3 install matplotlib ; there are likely better ways to do this.
    • Install 'Thonny'   (I'm a newbie, so I use Thonny ! )
    • Install Numpy  pip3 install numpy  (I'm somehow using Numpy 1.12.1)
    • Install OpenCV contrib headless:  
      • sudo pip3 install opencv-contrib-python-headless
      • sudo pip3 install opencv2
      • You will have to work out all the dependencies of the opencv install.
      • 'ldd' is the Unix command to list dynamic dependencies, so we can use: ldd cv2.cpython... | grep "not found"

    Analog Composite Video Output at Fullscreen:

    In config.txt (I think you can use sudo nano /boot/config.txt):

    • comment out hdmi_safe=1 (#hdmi_safe=1) ; not sure if we must do this
    • comment out hdmi_hotplug=1 (#hdmi_hotplug=1)
    • add sdtv_mode=0

    When running RaspPi operating system, you can open Configuration Settings and change resolution to 720x480, which is what the output must be for the Quad's VTX to transmit.

    Connect the Quad's VTX connector to the Rasp Pi 3 B at physical pin pp24 for composite video out and pp6 for ground.

    Quad's VTX connector White wire is signal  and Black wire is ground.

    To get fullscreen, I currently just make the cv2 window FULLSCREEN:

    cv2.namedWindow("Show", cv2.WINDOW_NORMAL)#Declare the image window
    #cv2.setWindowProperty("Show",cv2.WND_PROP_FULLSCREEN,cv2.WINDOW_FULLSCREEN)#fullscreen for FPV use
    #I typically rem out FULLSCREEN, since it's hard to develop the code on Pi when entire screen is the camera view.

    UART for SBUS: 

    Open the /boot/config.txt and do following:

    1. add the device tree overlay pi3-miniuart-bt to switch Bluetooth to the miniUART and the PL011 to serial pins #8 and #10.  https://www.raspberrypi.org/documentation/configuration/uart.md

    It will set:

    • Bluetooth to Miniuart to /dev/ttyS0 that uses VPU clock bla bla, who cares? we are not using this.
    • PL011 to /dev/ttyAMA0  (this is what we use)
      • UART Tx from Pi is Pin 8 or GPIO14
      • UART Rx to Pi is Pin 10 or GPIO 15

    2. add enable_uart=1

    3. add dtoverlay=pi3-disable-bt
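
Taken together, steps 1-3 leave /boot/config.txt with lines like the following (a sketch of the end state; as I understand it, pi3-disable-bt alone also frees the PL011, so the two overlays overlap):

```
# /boot/config.txt additions for SBUS on the PL011 UART (/dev/ttyAMA0)
enable_uart=1
dtoverlay=pi3-miniuart-bt
dtoverlay=pi3-disable-bt
```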

    *Disable 'Serial Console' with raspi-config, or manually by removing console=serial0,115200 (the serial0 console and its 115200 baud setting) from /boot/cmdline.txt

    *Install 'pyserial' by using pip3 install pyserial (or upgrade it with pip3 install --upgrade pyserial).

    #use this in python3 for serial port setup:

    import serial  # serial port, based on pyserial install

    ser = serial.Serial(  # init serial
        port='/dev/ttyAMA0',
        baudrate=100000,
        bytesize=8,
        parity='E',
        stopbits=2,
        timeout=None,
    )

    readin = ser.read(25)  # read in 25 bytes, or time out
    # do stuff in here...
    ser.write([sendout[0], sendout[1], sendout[2], sendout[3], ...])  # bla bla, see the code on Github https://github.com/rayzrocket/openairdefense/

    Time testing showed 3ms read, <1ms buffer reset, <1ms to parse SBUS into each channel (see Github code).

    A 25-byte frame at 8E2 UART (12 bits per byte: start + 8 data + parity + 2 stop) at 100000 baud takes 3.0ms on the wire, so the measured 3ms read time is reasonable.
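
Behind those 25 bytes is the standard SBUS frame layout: a 0x0F header, 22 bytes carrying sixteen 11-bit channels packed LSB-first, a flags byte, and a footer. The project's own parser is in the Github code; this is a standalone sketch of the same bit layout, with a packer included purely for illustration:

```python
def pack_sbus(channels):
    """Pack sixteen 11-bit channel values into a 25-byte SBUS frame
    (0x0F header, 22 data bytes LSB-first, flags byte, footer)."""
    assert len(channels) == 16
    bits = 0
    for i, ch in enumerate(channels):
        bits |= (ch & 0x7FF) << (11 * i)
    frame = bytearray([0x0F])
    for _ in range(22):          # 16 channels x 11 bits = 176 bits = 22 bytes
        frame.append(bits & 0xFF)
        bits >>= 8
    frame += bytes([0x00, 0x00])  # flags, footer
    return bytes(frame)

def parse_sbus(frame):
    """Unpack a 25-byte SBUS frame back into its 16 channel values."""
    assert len(frame) == 25 and frame[0] == 0x0F
    bits = 0
    for i, b in enumerate(frame[1:23]):
        bits |= b << (8 * i)
    return [(bits >> (11 * i)) & 0x7FF for i in range(16)]
```

Packing all 22 data bytes into one integer keeps the 11-bit extraction simple, at the cost of being less cache-friendly than the usual shift-and-mask table; at 75Hz either approach is far from the bottleneck.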


Discussions

Doug R wrote 09/09/2021 at 02:05

Love the project!

I sincerely hate these rules but I want to give you a heads up before you get in trouble. This project is treading very close to violating ITAR rules.

https://en.wikipedia.org/wiki/International_Traffic_in_Arms_Regulations#Classification_of_Defense_Articles

Specifically watch out for articles 4, 8, 12 and 21 of these rules.

High Power Rocketry is one of my hobbies and we have to be super careful about what is posted on the internet. They actively prosecute anything the government considers an ITAR violation, even if it is an amateur project. Just be careful.


maxgb6 wrote 06/11/2021 at 11:37

Hey! The project looks awesome, I was wondering if you could go into a little more detail on how you send serial commands from the Pi to the flight controller to command the drone?


Ray Patel wrote 05/17/2019 at 01:13

That is a good observation and idea.  This project attempts to keep an option of being fully independent of a ground uplink signal that could be spoofed/jammed by the incoming threat or surroundings.  You are correct about cam 'field of view' limits, where high agility, filtering, target state estimating, etc...may help with that challenge.


Daemon informatica wrote 05/16/2019 at 08:05

Fun little project. :) Nice autonomous drone(-swarm) practice, if nothing else. I have a question / suggestion though: 

Wouldn't it be easier to use a stationary webcam 'on the ground' for target acquisition and control? The camera can track the missile and the drone both. (Mark the drone with specific colors / patterns to recognize it better as such.)

It'd be slightly easier because, at this point, you don't run the risk of losing the target the moment your drone swerves its target out of view of the webcam, and you have a more absolute set of coordinates for both target and drone, instead of having to constantly re-calculate relative coordinates between drone and missile.

Take it a step further and you can combine multiple webcam feeds for more accurate targeting.

