
Controlling a JJRC H37 Elfie quad from a PC

The JJRC Elfie Quadcopter comes with an Android/iOS app to control it from the phone. Can we control it from our own software?

Last week I bought this £40 wifi-controlled FPV quadcopter. The toy comes with an application to control it from a phone, but I thought it would be nice to find out what kind of protocol the app uses and write my own software to control it from a PC/interwebs/you-name-it. I used airodump-ng to sniff the traffic between the mobile phone and the quadcopter and found out that the app uses UDP to send control commands to the quadcopter every 50 ms. So far I have only identified the bare minimum needed to put together a little Python script that turns the propellers of the quadcopter on and off and demonstrates how this could work. I'm sharing the info here to encourage others to join the quest of decoding the protocol.

ABOUT THE JJRC H37 ELFIE

For those who don't know the toy, the JJRC H37 Elfie is a cheap wifi-controlled FPV quadcopter (https://www.google.co.uk/search?q=jjrc+h37+elfie). The toy comes with an app for Android and iOS. The way you normally start playing with the quadcopter is the following: you connect the battery to the quadcopter, press the On button on the quadcopter, connect to the quadcopter's wifi network from your phone (it is called "JJRC-something") and open JJRC's app. Once the app is on, you get to see what the camera of the quadcopter is seeing; you can then enable the on-screen controls, turn on the propellers and start flying the quadcopter. If you want to see this for yourself you may check one of the 1,000 reviews available on YouTube (https://www.youtube.com/results?search_query=jjrc+elfie).


PLEASE NOTE: THE TEXT BELOW IS OBSOLETE NOW THAT I'VE MANAGED TO MAKE THIS WORK. I'LL UPDATE THE TEXT SOON! (check the project log for more details)


ABOUT THE PROJECT

I was curious to find out how the application communicated with the quadcopter because I thought perhaps I'd be able to control the quadcopter from the internets or use a wifi repeater with a better antenna than my phone in order to extend its range. In the end, I used airodump-ng to find out which channel the quadcopter's wifi used and started monitoring that channel to dump all the traffic between the quadcopter and my mobile phone using Wireshark.

What I found out is that as soon as the JJRC application is opened, a continuous TCP connection is established between the quadcopter and the phone. This is how the video stream is sent from the quadcopter to the phone. Surprisingly enough, this communication persists even if the video is not being shown on the screen or the JJRC application is running in the background (Android).

Another detail I found out is that as soon as one presses the button to enable the "altitude hold" or the "on-screen controllers" in the mobile app, a UDP connection is established between the mobile phone and the quadcopter on port 8895. As soon as this connection is established, and while it is on, the blinking pattern of the quadcopter's lights changes. The connection remains active as long as the controllers are visible on the screen of the app and the altitude hold hasn't been disabled. Looking at the traffic on this UDP connection, the mobile phone sends an 8-byte packet to the quadcopter every 50 milliseconds.

STATE OF THE PROJECT

A detailed explanation of each of the following things can be found in my project logs. So far I've managed to do the following:

- Wi-Fi repeater: use a computer running Debian with two wifi cards as a man-in-the-middle with a strong wifi antenna that interfaces the phone to the quadcopter at greater distances. This adds a bit of latency but the response is still good enough to fly the quadcopter around at relatively mild speeds with no wind.

- Control the quadcopter from a PC with a USB gamepad, without the phone (no video): this can be done by connecting the PC to the quadcopter via wifi and sending properly formatted UDP packets. This was very easy to implement and works like a charm.

FURTHER STEPS / TO-DO

- Watch the video stream from a PC: the next thing I want to do is to be able to view the camera stream from the PC. The phone app seems to open multiple connections to the quadcopter and sends multiple 106-byte-long messages whose meaning nobody seems to know. On one of the connections the quadcopter replies with the video stream, and the 106-byte message that initiates this process seems to be the same every time. I haven't tested it yet, but it should be possible to open a TCP socket to the quadcopter, send the magic word and direct the quadcopter's response to mplayer or vlc.

watch_video.sh

This shell script calls the pull_video.py script to dump the video stream from the Elfie into a Gstreamer pipe that plays the video in a window. The exact Gstreamer pipeline looks like this: python ./pull_video.py | gst-launch-1.0 fdsrc fd=0 ! h264parse ! avdec_h264 ! xvimagesink sync=false

x-shellscript - 111.00 bytes - 07/09/2017 at 20:55


send_joystick.py

This application controls the quadcopter from a USB gamepad (it works with a Logitech Dual Action but should work with other gamepads as well). It requires the pygame library to be installed. You must connect your computer to the wifi network of the quadcopter before running the app with "python ./send_joystick.py".

x-python - 1.17 kB - 07/09/2017 at 20:42


pull_video.py

This script connects to the JJRC Elfie and starts dumping the raw h264 stream to stdout. This can be piped into a .h264 file for offline playback or piped to Gstreamer for playback (the exact Gstreamer pipeline is provided as a separate shell script). You must connect your computer to the wifi network of the quadcopter before running the app with "python ./pull_video.py".

x-python - 537.00 bytes - 07/09/2017 at 20:41


demo.py

[THIS SCRIPT WAS ONE OF MY FIRST TESTS AND IT IS POINTLESS AT THIS STAGE] Demo that turns the propellers on and off (takes about 30 seconds to complete the cycle). You must connect your computer to the wifi network of the quadcopter before running the app with "python ./demo.py".

x-python - 1.74 kB - 02/01/2017 at 00:08


  • A first look at the DHD D5

    adria.junyent-ferre 09/23/2017 at 11:43 0 comments

    I recently discovered the DHD D5, which looks like a clone of the JJRC Elfie but sells for a slightly lower price (£15 each, which is a serious bargain). I got a pair of them and was hoping I'd be able to control them from my scripts straightaway, but found out that, at least from the software point of view, they are significantly different.

    What I've learned so far:

    - The IP of the quad is 192.168.1.1 and it assigns the tablet 192.168.1.100 (I haven't checked whether there's a DHCP server that keeps a list of devices with different IPs)

    - The control signal from the app is sent by UDP to port 8080 every 100 ms but the message format seems to be different. While in standby, most messages are a single byte 0x25 ("%"), with a longer 5-byte message 0x08c0a80164 appearing every second or so. When the on-screen controls are on, the pattern is similar but you see 11-byte-long messages from time to time. I haven't done a tcpdump while flying yet.

    - Video is sent by UDP through a connection started by the app on ports 55934 or 577-something. The quad sends a chain of messages containing the video every 46 milliseconds. Assuming each chain is a single frame, that gives some 21 fps, which is more or less the same framerate as the video from the JJRC Elfie.

    - The video link is negotiated through a brief TCP conversation over port 7070 that lasts for some 200 ms and contains the following (a quick connection sketch is included at the end of this log):

    >> (from App to Quad)
    OPTIONS rtsp://192.168.1.1:7070/webcam RTSP/1.0
    CSeq: 1
    User-Agent: Lavf56.40.101
    << (from Quad to App)
    RTSP/1.0 200 OK
    CSeq: 1
    Public: DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE
    >>
    DESCRIBE rtsp://192.168.1.1:7070/webcam RTSP/1.0
    Accept: application/sdp
    CSeq: 2
    User-Agent: Lavf56.40.101
    <<
    RTSP/1.0 200 OK
    CSeq: 2
    Content-Base: rtsp://192.168.1.1:7070/webcam/
    Content-Type: application/sdp
    Content-Length: 122
    <<
    v=0
    o=- 1 1 IN IP4 127.0.0.1
    s=Test
    a=type:broadcast
    t=0 0
    c=IN IP4 0.0.0.0
    m=video 0 RTP/AVP 26
    a=control:track0
    >>
    SETUP rtsp://192.168.1.1:7070/webcam/track0 RTSP/1.0
    Transport: RTP/AVP/UDP;unicast;client_port=19398-19399
    CSeq: 3
    User-Agent: Lavf56.40.101
    <<
    RTSP/1.0 200 OK
    CSeq: 3
    Transport: RTP/AVP;unicast;client_port=19398-19399;server_port=55934-55935
    Session: 82838485868788898A8B8C8D8E8F90
    >>
    PLAY rtsp://192.168.1.1:7070/webcam/ RTSP/1.0
    Range: npt=0.000-
    CSeq: 4
    User-Agent: Lavf56.40.101
    Session: 82838485868788898A8B8C8D8E8F90
    <<
    RTSP/1.0 200 OK
    CSeq: 4
    Session: 82838485868788898A8B8C8D8E8F90

    - Some kind of "heartbeat" is sent to another UDP port (55935, which is the video port + 1). A short message is sent there every second. I assume this is a way for the quad to know that there's an application reading the video stream. Given that the TCP exchange above is plain RTSP, this is probably part of a standard streaming setup (RTCP conventionally runs on the RTP port plus one).

    I'm a bit too lazy to go through the process of reverse-engineering all this stuff, but this quad seems to be more advanced and cheaper than the JJRC, which is an incentive to give it a go. I may start a new project if I have the strength.
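
    Since the exchange above is plain RTSP (the Lavf User-Agent suggests the app relies on ffmpeg's RTSP client), a standard client may be able to negotiate the video directly. As an untested sketch, the snippet below simply replays the first OPTIONS request from the capture and prints the quad's reply; the IP, port and path come from the dump above, everything else is an assumption:

    import socket

    # Untested sketch: poke the DHD D5's RTSP endpoint seen in the capture above.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect(('192.168.1.1', 7070))
    request = ('OPTIONS rtsp://192.168.1.1:7070/webcam RTSP/1.0\r\n'
               'CSeq: 1\r\n'
               '\r\n')
    s.send(request.encode('ascii'))  # works on both Python 2 and 3
    print(s.recv(1024))              # expect "RTSP/1.0 200 OK" as in the dump
    s.close()

    If this returns the same 200 OK as in the dump, a player such as ffplay or VLC pointed at rtsp://192.168.1.1:7070/webcam might handle the whole negotiation and playback on its own (again, untested).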

  • Recording the video while watching it

    adria.junyent-ferre 07/29/2017 at 17:05 0 comments

    Quick update. Dumping the raw h264 video from the quadcopter into a file is something I managed to do very soon after I got the "magic word" the quadcopter was expecting. The raw file has a .h264 extension, which most video software can play. Still, it seems ffmpeg, vlc or gstreamer will fail to play it at the right framerate. One way I found to convert the raw .h264 into mp4 without re-encoding (the quality of the video is bad enough without any re-encoding) is by specifying the frame rate in ffmpeg:

    $ ffmpeg -r 24 -f h264 -i dump1.h264 -c copy dump1.mp4

    I'm not 100% sure the actual frame rate is 24 frames/second [EDIT: I first thought it was 15] but the video length this gives more or less matches the real duration of the recording that I logged.

    The video stream weighs in at about 63 kB/s.

  • Eureka! It works

    adria.junyent-ferre 07/09/2017 at 21:16 0 comments

    I've been quite busy lately and hadn't touched this project since March. This weekend I finally found the time to give a few ideas a try and I managed to get everything working! weee!

    My original idea was to modify my early Python script that dumps the video stream to stdout so that it would direct the video to a UDP or TCP socket for ffplay or VLC instead. This didn't work very well and made the aforementioned software crash or not play video at all (the video stream from the quadcopter is raw h264, it isn't packed in RTP or anything). Following these tests, I tried all sorts of things (mplayer, ffplay, gstreamer) and after a thousand random tries and lots of frustration I finally found a solution that works.

    The solution is to dump the video stream to the stdin using my original script and to use Gstreamer to decode and play the stream. The magic Gstreamer pipeline looks like this:

    python ./pull_video.py | gst-launch-1.0 fdsrc fd=0 ! h264parse ! avdec_h264 ! xvimagesink sync=false
    This works like a charm, and the solution is so simple that I feel a bit stupid for not having tried it earlier.

    I did a "feedback" test in order to measure the time delay between REALITY and the video output and this is what I got:

    As you can see from the picture, the lag is somewhere between 100 and 300 ms with a frame rate of about 5-7 frames per second. Not perfect but way way better than any video stream I've ever managed to set up :)

    I believe I've achieved what I wanted to achieve when I started this project and therefore I will soon update the text in the description of the project and mark it as completed. I will perhaps make another "final words" post and point at other interesting "research" ideas for us JJRC Elfie nerds.

  • Will I ever be able to play the video stream?

    adria.junyent-ferre 03/26/2017 at 22:36 2 comments

    I decided to write a bit more about my attempts to decode the video stream from the quadcopter. So far, I haven't achieved much, but I've learnt a bit about h264 and captured some video for experimenting. Writing a little program that reads the stream from the quadcopter and plays it in real time isn't rocket science; it should be easy for anyone with the patience to learn about libavcodec, but it isn't working for me yet.

    I used the following code to dump the video from the quadcopter:

    import socket
    import sys
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect(('172.16.10.1', 8888))
    
    magicword='495464000000580000009bf89049c926884d4f922b3b33ba7eceacef63f77157ab2f53e3f768ecd9e18547b8c22e21d01bfb6b3de325a27b8fb3acef63f77157ab2f53e3f768ecd9e185eb20be383aab05a8c2a71f2c906d93f72a85e7356effe1b8f5af097f9147f87e'.decode('hex')
    
    s.send(magicword)
    data = s.recv(106) 
    n=0
    while n<10000: #replace with "while 1" if you want this to run forever
        data = s.recv(1024)
        sys.stdout.write(data)
        n=n+1
    s.close()

    This operation takes about 107 seconds to complete. The generated file weighs 5.9 MB and contains about 1,364 frames (according to VLC). This means that the video stream of the quadcopter is about 54.35 KiB/s with an approximate frame rate of 12.76 fps. I recorded a video of the quadcopter facing a timer running on my tablet in order to measure the time. The video contains raw h264, which is made of a series of so-called NAL units. It can be played using VLC by telling VLC to use the h264 demuxer:

    $ vlc video.bin --demux h264

    The next thing I wanted to do was to see how many NAL units the video contained and what different types of units I would find in it. I ran the following script to search the recorded video for NAL units and list the headers of those units:

    f=open('recording.bin')
    dump=f.read()
    f.close()
    
    p1=dump.find('000001'.decode('hex'))
    while(p1!=-1):
      print(dump[p1:(p1+5)].encode('hex'))
      p1=dump.find('000001'.decode('hex'),p1+1)
    The output of this program shows the following: there are 3039 NAL units in my video, that's about 28.43 NAL units per second, which is about twice the number of frames per second. The next question was how many different types of NAL units were in the video (because I know close to nothing about h264 and therefore I wonder about these kinds of things). I counted the different NAL unit types using the following command:
    $ cat log.txt |sort |uniq -c |sort -nr

    This gives the following output:

    1286 000001a000
    1286 000001419a
     116 000001a100
     116 00000168ee
     116 000001674d
     116 0000016588
       1 0000011600
       1 0000011200
       1 0000010600

    All relevant information about how h264 works can be found in this document: https://www.itu.int/rec/T-REC-H.264

    In brief, the first byte after the 01 contains the basic information about what type of NAL unit it is. The most significant bit of the byte is the forbidden bit and it should be 0; otherwise something is wrong with the encoder or the NAL unit is expected to be ignored. The next 2 bits are the nal_ref_idc and they have different meanings depending on the type of NAL unit. The least significant 5 bits are the nal_unit_type and they show what type of NAL unit we are facing (a small parsing helper is sketched after the list below). Going back to our list above, we get the following:

    1286 000001a000 --> a0 = 1010 0000 meaning forbidden=1, ref_idc=1 and unit_type=0
    1286 000001419a --> 41 = 0100 0001 meaning forbidden=0, ref_idc=2 and unit_type=1
     116 000001a100 --> a1 = 1010 0001 meaning forbidden=1, ref_idc=1 and unit_type=1
     116 00000168ee --> 68 = 0110 1000 meaning forbidden=0, ref_idc=3 and unit_type=8
     116 000001674d --> 67 = 0110 0111 meaning forbidden=0, ref_idc=3 and unit_type=7
     116 0000016588 --> 65 = 0110 0101 meaning forbidden=0, ref_idc=3 and unit_type=5
       1 0000011600 --> 16 = 0001 0110 meaning forbidden=0, ref_idc=0 and unit_type=22
       1 0000011200 --> 12 = 0001 0010 meaning forbidden=0, ref_idc=0 and unit_type=18
       1 0000010600 --> 06 = 0000 0110 meaning forbidden=0, ref_idc=0 and unit_type=6
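
    For reference, here is a minimal Python helper that performs the same bit-slicing as the list above (labelling unit_type 7 as an SPS is standard h264 terminology, not something taken from the capture):

    def parse_nal_header(byte):
        # forbidden_zero_bit (1 bit), nal_ref_idc (2 bits), nal_unit_type (5 bits)
        forbidden = (byte >> 7) & 0x01
        ref_idc = (byte >> 5) & 0x03
        unit_type = byte & 0x1f
        return forbidden, ref_idc, unit_type

    print(parse_nal_header(0x67))  # -> (0, 3, 7), i.e. an SPS unit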
    
    I gave a quick... Read more »

  • First attempts at reading the video stream

    adria.junyent-ferre 03/02/2017 at 23:33 0 comments

    Today I played a bit more with the quadcopter. My idea was to dump the video stream somehow and find out how the video was encoded. The plan was to create a TCP socket to port 8888, write the 106-byte magic message that I saw the mobile app send before the quadcopter starts spitting out what looks like a video stream, and then read whatever comes from the quadcopter. The following Python script can do this:

    import socket
    import sys
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect(('172.16.10.1', 8888))
    
    magicword='495464000000580000009bf89049c926884d4f922b3b33ba7eceacef63f77157ab2f53e3f768ecd9e18547b8c22e21d01bfb6b3de325a27b8fb3acef63f77157ab2f53e3f768ecd9e185eb20be383aab05a8c2a71f2c906d93f72a85e7356effe1b8f5af097f9147f87e'.decode('hex')
    
    s.send(magicword)
    data = s.recv(106) 
    n=0
    while n<1000: #replace with "while 1" if you want this to run forever
        data = s.recv(1024)
        sys.stdout.write(data)
        n=n+1
    s.close()
    The result -almost- worked straightaway. You will notice in the code above that I discard the first 106 bytes that the quadcopter returns; these seem to correspond to one of those magic messages that the app and the quadcopter exchange and I have no idea what they mean. The rest of the data that comes from the quadcopter seems to be raw h264 video stored in NAL format. I dumped the output and managed to play it in VLC using the following commands:
    $ python test.py > video.h264
    $ vlc video.h264 --demux h264
    VLC successfully plays the video but complains about some errors in the headers.

    My next step is to find a proper way to read the TCP stream and make some video player play it with the lowest possible lag. I'll keep you posted.

  • [OLD REPOST] First attempt at "desciphering" the UDP comms

    adria.junyent-ferre 02/20/2017 at 11:19 0 comments

    NOTE: this text was originally included in the description of the project and perhaps should have been a log update instead. I reposted it here even though it is outdated and the problems described here were solved later on.

    I've only played with this for one or two hours and I haven't been able to decode the 8-byte message; however, a few bits seem to change when one enables things such as the "altitude hold" or the "headless mode", and a few of the bytes also change whenever the control levers on the screen (thrust, yaw, etc) move.

    I tried following my normal startup sequence on the quadcopter, where I first enable the altitude hold, then enable the on-screen controllers and then "arm" the propellers by pressing the up arrow button on the screen. This leaves the quadcopter ready to take off. This sequence generates the following messages:

    66 80 80 01 80 00 81 99 "idle state" (on-screen controllers are enabled but the quad is idle and the altitude hold is disabled)

    66 80 80 80 80 00 00 99 "idle state with altitude hold" (the previous message changes to this message as soon as the altitude hold is enabled)

    66 80 80 80 80 01 01 99 "arm propellers" (the previous message becomes this when the altitude hold is on, the on-screen controllers are on and the "up arrow" button is pressed in order to arm the motors. The message is sent some 20-22 times, for about a second).

    66 80 80 80 80 04 04 99 "emergency stop" (this is the message that is sent, starting from the idle state with altitude hold, as soon as the emergency stop button is pressed. The message is sent some 20-22 times, for about a second).

    Using the information described above, I put together a small Python script that opens a UDP connection to port 8895 on the quadcopter and writes the series of commands described above. The sequence sends a message every 50 milliseconds and keeps each message going for a certain amount of time, replicating more or less the sequence I follow when I want to get the quadcopter ready to fly.

    If you want to give the demo a try: take your laptop, make sure you don't have any browser/software running that is internet-hungry and will flood the quadcopter with useless requests, connect your laptop to the wifi of the quadcopter and run the script with "python demo.py". What you should see is the quadcopter arming the motors and speeding up the propellers some 10 seconds after the script starts. The propellers will spin for a few seconds and then the program will turn them off and disconnect.

    THE DEMO CODE

    import socket
    from time import sleep
    
    # The IP of the quadcopter plus the UDP port it listens to for control commands
    IPADDR = '172.16.10.1'
    PORTNUM = 8895
     
    # "idle" message, this is the first message the app sends to the quadcopter when you enable the controls
    PIDLE = '6680800180008199'.decode('hex') 
    # "altitude hold" this is what the app starts sending to the quadcopter once you enable the altitude hold
    PHOLD = '6680808080000099'.decode('hex')
    # "arm" this is what the app sends to the quadcopter when you press the "up" arrow that make the motors start to spin before taking off
    PSPIN = '6680808080010199'.decode('hex')
    # this is the message the app sends to the quadcopter when you press the "emergency stop" button
    PSTOP = '6680808080040499'.decode('hex')
     
    # initialize a socket
    # SOCK_DGRAM specifies that this is UDP
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, 0)
     
    # connect the socket
    s.connect((IPADDR, PORTNUM))
     
    # send a series of commands
    for n in range(0,100): # send the idle command for some 5 seconds
        s.send(PIDLE)
        sleep(0.05) # the app sends a packet of 8 bytes every 50 milliseconds
    
    for n in range(0,100): # send the idle + altitude hold command for some 5 seconds
        s.send(PHOLD)
        sleep(0.05)
    
    for n in range(0,21): # send the "arm" command for 1 second (this is what the app seems to do when you press the up arrow to arm the quadcopter)
        s.send(PSPIN)
        sleep(0.05)
    
    for n in range(0,100): # send the idle + altitude...
    Read more »

  • The first working app that enables joystick control from the PC

    adria.junyent-ferre 02/19/2017 at 21:55 0 comments

    I've finally put together the first working app that reads the input from a USB joystick and sends the right commands to control the quadcopter. The application requires the pygame library (plus other basic Python dependencies) to be installed, and I've configured it to use the axes and buttons that were convenient for my USB gamepad (a Logitech Dual Action). Basically, it uses the two analogue sticks for throttle+yaw and pitch+roll respectively, plus two buttons (7 and 8) for the "arm" and "emergency stop" commands.

    The procedure to make this work is quite simple: connect the computer to the wifi network of the quadcopter, connect the USB joystick and run the app (running with regular user permissions is ok).

    import socket
    import pygame
    import time
    from math import floor
    
    def get():
        #pygame gives the analogue joystick position as a float that goes from -1 to 1, the code below takes the readings and
        #scales them so that they fit in the 0-255 range the quadcopter expects
        #remove the *0.3 below to get 100% control gain rather than 30%
        a=int(floor((j.get_axis(2)*0.3+1)*127.5)) 
        b=int(floor((-j.get_axis(3)*0.3+1)*127.5))
        c=int(floor((-j.get_axis(1)+1)*127.5))
        d=int(floor((j.get_axis(0)+1)*127.5)) 
        commands=(j.get_button(6)<<2)|(j.get_button(8)) #only two buttons are used so far: "arm" and emergency stop
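        # resulting 8-byte packet: 0x66 header, the four axis bytes (a,b,c,d), the command bits, an XOR checksum of those five bytes and a 0x99 trailer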
        out=(102<<56)|a<<48|b<<40|c<<32|d<<24|commands<<16|(a^b^c^d^commands)<<8|153
        pygame.event.pump()
        return out
    
    # The IP of the quadcopter plus the UDP port it listens to for control commands
    IPADDR = '172.16.10.1'
    PORTNUM = 8895
    
    pygame.init()
    j = pygame.joystick.Joystick(0)
    j.init()
     
    # initialize a socket
    # SOCK_DGRAM specifies that this is UDP
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, 0)
     
    # connect the socket
    s.connect((IPADDR, PORTNUM))
    
    while True:
        s.send(format(get(),'x').decode('hex')) #ugly hack, I guess there's a better way of doing this
        time.sleep(0.05)
    
    # close the socket
    s.close()

  • A basic script to monitor the controller input sent to the quadcopter by UDP

    adria.junyent-ferre 02/19/2017 at 20:03 0 comments

    I'm in the process of making a small program to read the control input from a USB gamepad and send the commands to the quadcopter, but first I wanted to make sure I got the UDP protocol right. Therefore, I wrote a small Python script that can be combined with airodump-ng and tcpdump in order to read what the phone app is sending to the quadcopter by UDP and translate the controller axis readings into something a human can read. The source code is attached below:

    import sys
    
    state=0
    message=""
    yaw=0
    roll=0
    pitch=0
    throttle=0
    commands=0
    err=0
    n=0
    
    for line in sys.stdin:
    	if state==0:
    		pos=line.find("0x0010:")
    		if pos!=-1:
    			message=line[(pos+39):(pos+43)]+line[(pos+44):(pos+48)]
    			roll=int(line[(pos+41):(pos+43)],16)
    			pitch=int(line[(pos+44):(pos+46)],16)
    			throttle=int(line[(pos+46):(pos+48)],16)
    			state=1
    	else:
    		state=0
    		pos=line.find("0x0020:")
    		if pos!=-1:
    			message=message+line[(pos+9):(pos+13)]+line[(pos+14):(pos+18)]
    			yaw=int(line[(pos+9):(pos+11)],16)
    			commands=int(line[(pos+11):(pos+13)],16)
    			err=int(line[(pos+13):(pos+15)],16)
    			n=n+1
    			if n==10: #this is a little trick that only displays 1 out of 10 messages
    				sys.stdout.write(message+" roll:"+str(roll)+" pitch:"+str(pitch)+" throttle:"+str(throttle)+" yaw:"+str(yaw)+" commands:"+format(commands,"08b")+" err:"+format(roll^pitch^throttle^yaw^commands,'x')+'\n')
    				n=0
    In order for this to work, the following commands have to be executed first:
    # airmon-ng start wlan0
    # airodump-ng -c 2 mon0
    The first command puts the wifi adapter into monitor mode and creates the monitor interface. I'm not sure whether this can work while the network-manager is running, so I stopped the network-manager beforehand using /etc/init.d/network-manager stop (under Debian). The second command starts sniffing the traffic on wifi channel #2. At this point I would turn on the quadcopter, connect my phone to the quadcopter's wifi network and start the JJRC app, and the BSSID of the quadcopter should appear in the terminal where airodump is running. It will look like this:
     CH  2 ][ Elapsed: 1 min ][ 2017-02-19 18:58 ][ WPA handshake: 42:D1:A4:C1:E5:A2                                      
                                                                                                                          
     BSSID              PWR RXQ  Beacons    #Data, #/s  CH  MB   ENC  CIPHER AUTH ESSID
                                                                                                                          
     01:C0:06:40:B2:32  -37   0      554     4901    0   2  54e. OPN              JJRC-A21494F                             
     A4:E7:22:14:5B:CD  -48  26      610        2    0   1  54e  WPA2 CCMP   PSK  BTHub6-RFPM
    The next thing to do is to start tcpdump and pass the UDP traffic between the phone and the quadcopter to my python script:
    # tcpdump -i 2 udp dst port 8895 -x -l | python ./my_script.py
    This should produce an output that looks like this:
    66807f0180007e99 roll:128 pitch:127 throttle:1 yaw:128 commands:00000000 err:7e
    6667640180008299 roll:103 pitch:100 throttle:1 yaw:128 commands:00000000 err:82
    6665630180008799 roll:101 pitch:99 throttle:1 yaw:128 commands:00000000 err:87
    6665630180008799 roll:101 pitch:99 throttle:1 yaw:128 commands:00000000 err:87
    66807f0080007f99 roll:128 pitch:127 throttle:0 yaw:128 commands:00000000 err:7f
    66807f0080007f99 roll:128 pitch:127 throttle:0 yaw:128 commands:00000000 err:7f
    66807f0080007f99 roll:128 pitch:127 throttle:0 yaw:128 commands:00000000 err:7f
    66a6770080005199 roll:166 pitch:119 throttle:0 yaw:128 commands:00000000 err:51
    66a6770080005199 roll:166 pitch:119 throttle:0 yaw:128 commands:00000000 err:51
    6671620080009399 roll:113 pitch:98 throttle:0 yaw:128 commands:00000000 err:93
    666b840080006f99 roll:107 pitch:132 throttle:0 yaw:128 commands:00000000 err:6f
    66978e0080009999 roll:151 pitch:142 throttle:0 yaw:128 commands:00000000 err:99
    66807f0080007f99 roll:128 pitch:127 throttle:0 yaw:128 commands:00000000 err:7f
    66807f0080007f99 roll:128 pitch:127 throttle:0 yaw:128 commands:00000000 err:7f
    
    By the way, I found out that the difference between the 30%, 60% and 100% settings in the app is the range of the roll and pitch control inputs one gets: they go from 1 to 255 in 100% mode, from 68 to 187 in 60% mode and from 88 to 167 in 30% mode. The throttle and yaw controllers are unaffected by the 30/60/100... Read more »

  • A Debian-based Wi-Fi repeater

    adria.junyent-ferre 02/13/2017 at 23:07 0 comments

    One of my first concerns as soon as I played with the Elfie for the first time was the limited range one could achieve due to the weak Wi-Fi connection between the quadcopter and the phone. I read somewhere that people had successfully used Wi-Fi repeaters to extend the range of the Parrot drones and I assumed there wasn't any reason why it wouldn't be possible to do the same with the Elfie (SPOILER: I was right).

    I bought a random-branded USB wifi adapter + antenna combo with an alleged gain of 14 dBi (I haven't really thought about whether the specification is realistic) and I thought I'd use an old laptop running Debian as a Wi-Fi repeater between the toy and my phone.

    After a lot of tries and plenty of frustration trying to use NetworkManager to create a Wi-Fi network my phone could see and connect to ("Access Point" or "AP" mode rather than "Ad-Hoc"), I found the following procedure, which is more or less easy to set up without breaking my normal configuration:

    [EDIT] the credit for the procedure goes to this excellent thread: http://askubuntu.com/questions/180733/how-to-setup-an-access-point-mode-wi-fi-hotspot

      1. First I shut down the network manager:
        # /etc/init.d/network-manager stop
      2. Then I replaced /etc/network/interfaces with the following
        source /etc/network/interfaces.d/*
        
        auto lo
        iface lo inet loopback
        
        auto wlan1
        iface wlan1 inet dhcp
               wireless-essid JJRC-A3194E #SSID of the Wi-Fi of the quadcopter
      3. Then, I can make my laptop connect to the quadcopter using ifup:
        #ifup wlan1
      4. The next step is to create an access point using the second Wi-Fi card in the laptop. This can be done using hostapd. This wasn't installed on my computer by default, so I had to apt-get install it. Afterwards, I created a config file in /etc/hostapd/hostapd.conf containing the following self-explanatory definitions:
        interface=wlan0
        ssid=ElfieRepeater
        hw_mode=g
        channel=1
        macaddr_acl=0
        auth_algs=1
        ignore_broadcast_ssid=0
        wpa=3
        wpa_passphrase=1234567890
        wpa_key_mgmt=WPA-PSK
        wpa_pairwise=TKIP
        rsn_pairwise=CCMP
      5. The hostapd.conf file didn't exist at all, and I also had to modify another config file in order to make hostapd load my configuration. This was done by editing the file /etc/default/hostapd to include the following line:
        DAEMON_CONF="/etc/hostapd/hostapd.conf"
      6. Once that was done, it was possible to start the "repeater" Wi-Fi network by starting the hostapd daemon the Debian way:
        #/etc/init.d/hostapd start
      7. The status of hostapd can be checked by using the command "status" rather than "start" in init.d. The expected output should say that the service is active with no errors. Once this is active, the Wi-Fi network "ElfieRepeater" becomes visible to the mobile phone and one can connect to it. However, no DHCP server has been set up, so the phone won't get any IP address assigned and an error will be shown on the phone. This can be fixed by installing a DHCP server (I've read some people use isc-dhcp-server), but I didn't want to go through the process of setting that up, so I simply configured the IP of the Wi-Fi adapter on the laptop and manually configured the IP of the mobile phone from the phone itself. The first step, from the laptop, is:
        #ifconfig wlan0 10.10.0.1
        Then on the mobile phone I chose "static" rather than "dhcp" in the advanced settings when connecting to the Wi-Fi and set 10.10.0.2 as the IP of the phone and 10.10.0.1 as the gateway IP.
      8. The last step is to configure iptables on the laptop to forward the traffic it receives on wlan0 so that the phone can connect to the quadcopter through the repeater. This can be done using the following commands:
        #echo 1 | tee /proc/sys/net/ipv4/ip_forward
        #iptables -t nat -A POSTROUTING -s 10.10.0.0/16 -o wlan1 -j MASQUERADE
      9. Voilà! Now the JJRC application can be run from the phone and it should be able to connect to the Elfie as if the phone had been connected straight to the Elfie.

      If the aforementioned procedure doesn't work because the forwarding doesn't work, the whole setup can be debugged by having...

      Read more »

    1. First attempts at understanding how the video link works

      adria.junyent-ferre 02/04/2017 at 22:37 0 comments

      I haven't done any further tests with the UDP controls, but I captured some more traffic in order to try to figure out how the video is transmitted between the quadcopter and the mobile phone. The communication seems to be established through port 8888 of the quadcopter; however, getting the video is not as simple as opening a stream with VLC. The traffic in Wireshark shows multiple connections being established. These always start with the mobile phone sending a message of 106 bytes to the quadcopter, and these messages seem to have a common header:

      0000   49 54 64 00 00 00 52 00 00 00 7a 34 b1 04 99 c6  ITd...R...z4....
      0010   b1 36 81 b7 be 83 83 4d de 4e 57 aa be 10 36 eb  .6.....M.NW...6.
      0020   d8 94 9b a8 5e 59 54 92 1f e9 3e 37 fa 70 d6 52  ....^YT...>7.p.R
      0030   e2 ae e0 91 10 9c 34 cf ea ff 57 aa be 10 36 eb  ......4...W...6.
      0040   d8 94 9b a8 5e 59 54 92 1f e9 ad ce 18 b0 c3 10  ....^YT.........
      0050   67 a8 36 f3 0e 06 d3 4b 60 bf 0a 78 6b bc b0 82  g.6....K`..xk...
      0060   cf 8e a2 0b 26 f8 31 52 8f ab                    ....&.1R..
      
      0000   49 54 64 00 00 00 52 00 00 00 05 a7 a9 0f b3 6e  ITd...R........n
      0010   cd 3f a2 ca 7e c4 8c a3 60 04 ac ef 63 f7 71 57  .?..~...`...c.qW
      0020   ab 2f 53 e3 f7 68 ec d9 e1 85 47 b8 c2 2e 21 d0  ./S..h....G...!.
      0030   1b fb 6b 3d e3 25 a2 7b 8f b3 ac ef 63 f7 71 57  ..k=.%.{....c.qW
      0040   ab 2f 53 e3 f7 68 ec d9 e1 85 b7 33 0f b7 c9 57  ./S..h.....3...W
      0050   82 fc 3d 67 e7 c3 a6 67 28 da d8 b5 98 48 c7 67  ..=g...g(....H.g
      0060   0c 94 b2 9b 54 d2 37 9e 2e 7a                    ....T.7..z
      
      0000   49 54 64 00 00 00 52 00 00 00 72 98 c0 38 9b c3  ITd...R...r..8..
      0010   72 a7 1a 17 4b d1 b5 14 b3 ad ac ef 63 f7 71 57  r...K.......c.qW
      0020   ab 2f 53 e3 f7 68 ec d9 e1 85 47 b8 c2 2e 21 d0  ./S..h....G...!.
      0030   1b fb 6b 3d e3 25 a2 7b 8f b3 ac ef 63 f7 71 57  ..k=.%.{....c.qW
      0040   ab 2f 53 e3 f7 68 ec d9 e1 85 b7 33 0f b7 c9 57  ./S..h.....3...W
      0050   82 fc 3d 67 e7 c3 a6 67 28 da d8 b5 98 48 c7 67  ..=g...g(....H.g
      0060   0c 94 b2 9b 54 d2 37 9e 2e 7a                    ....T.7..z
      
      0000   49 54 64 00 00 00 52 00 00 00 0c 76 da 94 66 a3  ITd...R....v..f.
      0010   54 68 65 15 92 8c fd 4d 70 c7 ac ef 63 f7 71 57  The....Mp...c.qW
      0020   ab 2f 53 e3 f7 68 ec d9 e1 85 47 b8 c2 2e 21 d0  ./S..h....G...!.
      0030   1b fb 6b 3d e3 25 a2 7b 8f b3 ac ef 63 f7 71 57  ..k=.%.{....c.qW
      0040   ab 2f 53 e3 f7 68 ec d9 e1 85 b7 33 0f b7 c9 57  ./S..h.....3...W
      0050   82 fc 3d 67 e7 c3 a6 67 28 da d8 b5 98 48 c7 67  ..=g...g(....H.g
      0060   0c 94 b2 9b 54 d2 37 9e 2e 7a                    ....T.7..z
      
      0000   49 54 64 00 00 00 58 00 00 00 9b f8 90 49 c9 26  ITd...X......I.&
      0010   88 4d 4f 92 2b 3b 33 ba 7e ce ac ef 63 f7 71 57  .MO.+;3.~...c.qW
      0020   ab 2f 53 e3 f7 68 ec d9 e1 85 47 b8 c2 2e 21 d0  ./S..h....G...!.
      0030   1b fb 6b 3d e3 25 a2 7b 8f b3 ac ef 63 f7 71 57  ..k=.%.{....c.qW
      0040   ab 2f 53 e3 f7 68 ec d9 e1 85 eb 20 be 38 3a ab  ./S..h..... .8:.
      0050   05 a8 c2 a7 1f 2c 90 6d 93 f7 2a 85 e7 35 6e ff  .....,.m..*..5n.
      0060   e1 b8 f5 af 09 7f 91 47 f8 7e                    .......G.~
      Searching online, I found a Reddit thread from 10 months ago where some people discussed how to control a (different) quadcopter from the PC. The model of the quadcopter was different but it also used port 8888 and 106-byte-long messages. One of the redditors in the thread eventually published a Chrome app that was able to control the quadcopter.

      I've given the source code of the Chrome app a look; it doesn't have any comments, but they seem to blindly send the messages they've seen the mobile app send to the quadcopter in order to establish the connection and get the video stream. The messages their application sends seem to be different from the ones the JJRC Elfie expects.

      I may try to identify all messages the JJRC app sends and see if I can get the video to open in VLC somehow but this is certainly not going to happen today. I will keep you posted.




    Discussions

    Alexander R Pruss wrote 06/15/2023 at 17:09 point

    I've been looking at my JJRC H68 drone, or more precisely its detachable camera and WiFi control module (the drone works fine without it, using the included controller; I've been using the camera with an RC car). 

    The IP address is 172.16.1.1. Video is TCP from port 8888.

    I can capture with:

       cat vidstream-trigger - | nc 172.16.1.1 8888 > video.out

    where vidstream-trigger contains the magic hex bytes 00 01 02 03 04 05 06 07 08 09 28 28. And your ffmpeg conversion works fine on the output.

    I think UDP commands are different--if memory serves, they are 11 bytes long. Port is still 8080.

    The camera runs linux. You can also telnet 172.16.1.1. Username: root, no password. I haven't found anything particularly interesting to do with the root shell.


    adria.junyent-ferre wrote 06/15/2023 at 17:18 point

    That's great! Thanks for writing. I'm tempted to buy one of those now!


    Alexander R Pruss wrote 06/16/2023 at 14:40 point

    Here is what I have figured out about the 11 bytes of UDP command (I haven't tried sending these to the drone yet):

    0xff
    0x08
    left stick vertical (bigger = up, center = 0x7E)
    left stick horizontal (bigger = right, center = 0x3f, max = 0x7d)
    right stick vertical (bigger = down, center = 0x40)
    right stick horizontal (bigger = right, center = 0x3f)
    xyyyyyyy: 
        x = altitude hold (1=on)
        yyyyyyy = left stick horizontal adjust (center = 0x10, bigger = right, min = 0x00, max = 0x1f)
    right stick vertical adjust (center = 0x10, bigger = up, min = 0x00, max = 0x1f)
    right stick horizontal adjust (center = 0x10, bigger = right, min = 0x00, max = 0x1f)
    dush00pp:
        pp = speed (00 = 30%, 01 = 60%, 10 = 100%)
        h = headless
        s = emergency stop (1 = active)
        u = one key start (up button; 1 = active)
        d = one key landing (down button; 1 = active)
    checkbyte: FE minus sum of above ten bytes

    I suspect that these get sent over an i2c bus to the drone.
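
    A quick, untested sketch of a packet builder for the layout above (the modulo-256 wrap of the checkbyte and the helper name are assumptions on my part):

    def h68_packet(lv, lh, rv, rh, alt_hold=0, lh_trim=0x10, rv_trim=0x10, rh_trim=0x10, flags=0x00):
        # 11-byte command as described: ff 08, four stick bytes, alt-hold bit + left trim, two more trims, dush00pp byte, checkbyte
        payload = [0xff, 0x08, lv, lh, rv, rh,
                   ((alt_hold & 0x01) << 7) | (lh_trim & 0x7f),
                   rv_trim & 0x1f, rh_trim & 0x1f, flags & 0xff]
        payload.append((0xfe - sum(payload)) & 0xff)  # "FE minus sum of above ten bytes"
        return bytearray(payload)

    # e.g. all sticks centred, everything else default: h68_packet(0x7e, 0x3f, 0x40, 0x3f)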


    Kynan wrote 09/18/2021 at 08:20 point

    Hey Mate.

    I have started hacking into a similar drone using some code I have found from your project here.

    Feel free to look at the information I have gathered : 

    https://hackaday.io/project/181753-obtaining-control-of-jxd523-the-20-drone


    João Victor wrote 08/19/2020 at 16:39 point

    Hello. I am doing a project for my signal processing class and I think your work may help. I want to use my drone to do video analysis with PCA using its camera. However, I don't know how you got its "magic word". My drone is the JJRC X6.


    ozzy69 wrote 09/20/2020 at 05:54 point

    First you connect the drone to the application and leave your computer in monitoring mode. Then you start recording through the app, go into Wireshark, do the "man-in-the-middle" capture and get the magic word. I left some images to help you:

    https://sites.google.com/site/negocindosica/img1.jpg

    https://sites.google.com/site/negocindosica/img2.JPG

    Note: I'm Brazilian too hahaha


    Alexander R Pruss wrote 06/15/2023 at 17:13 point

    Or if your WiFi adapter doesn't support monitoring, you can just use PCAPdroid on your Android phone while running the jjrc app. Doesn't even need root. Send the pcap file to your computer and analyze with Wireshark or some other tool.

    Then look at the last TCP message sent from the phone to the drone before the drone starts sending back video (you can tell it's sending video when the data lengths get large). In my case that message was 00 01 02 03 04 05 06 07 08 09 28 28 (hex).


    ozzy69 wrote 07/22/2020 at 13:00 point

    Hello dude, thanks for your project!

    Let me ask you something: did you ever try to use the camera's wireless link to control another device? Like an Arduino, for example?


    adria.junyent-ferre wrote 07/23/2020 at 16:10 point

    I don't know if I got your question right. You can use the drone as a wireless camera indeed and once you get the stream on your computer you can use OpenCV to do some computer vision stuff with the image. Back in the day when I worked on this project my main goal was to get low latency video on the laptop in order to control the drone from a good wifi antenna and try to get the longest transmission distance and stuff, I didn't attempt to do any CV control with the camera. This would make a good project.


    ozzy69 wrote 07/23/2020 at 16:21 point

    Sorry, I will try to explain again. I want to use the camera's wireless link to control an Arduino. I want to send commands over wifi to control the Arduino, without an ethernet cable.


    ozzy69 wrote 07/23/2020 at 18:21 point

    Because in your project you use the wifi from the camera to send data to the microcontroller of the drone, right? Now I want to replace the drone's microcontroller with an Arduino and send data to it, to turn on an LED for example.


    pauver wrote 02/10/2019 at 08:20 point

    Hello! I have the JJRC H37 (Baby Elfie version) and I'm trying to communicate with it using your script, but it doesn't work. Could you help me out??? thx


    ozzy69 wrote 07/22/2020 at 13:02 point

    Just connect to the wifi of your drone and run the script


    tom.erez8 wrote 10/26/2018 at 16:13 point

    First, thank you very much! This blog was very helpful in my own little project. I bought the JJRC H47 ELFIE+ drone and tried to do as you did: controlling it from the computer and receiving its image. However, it seems that the H37 and the H47 versions are quite different. This is my github page for the project if you would like to take a look:

    github.com/TomE8/drones 

    Thank you again! 

    Tom


    Hermann Stamm-Wilbrandt wrote 08/10/2018 at 22:07 point

    Thank you so much for your project! Based on what I learned here I travelled a long way to capture the traffic between an Eachine E52 drone and the Android UFO app. Currently I have pull_video.c only, but that works fine. Instead of a single magic word on one TCP connection I had to replay the packet capture: create 6 TCP connections (2 against stealth ports), send 19 packets over 3 of the TCP connections and then receive the h264 video. I published the repo today; I still need to provide an equivalent of send_joystick.py for the E52 as well: https://github.com/Hermann-SW/wireless-control-Eachine-E52-drone


    zhamalbek.kz wrote 12/21/2017 at 07:12 point

    hello :) how do I find the IP 172.16.10.1?


    Hermann Stamm-Wilbrandt wrote 08/14/2018 at 20:33 point

    Connect your Android to the drone AP, then click on the connection in Android. It will show the Android IP as 172.16.10.2; the AP is then 172.16.10.1 (most likely).


    jackerfire wrote 10/14/2017 at 09:15 point

    May I ask something about Gstreamer? I don't really understand how to use it and I get a problem with the pipeline ---> "WARNING: erroneous pipeline: no element "fdsrc"". I really don't understand this pipeline ---> ./pull_video.py | gst-launch-1.0 fdsrc fd=0 ! h264parse ! avdec_h264 ! xvimagesink sync=false

    Thank you.


    Hermann Stamm-Wilbrandt wrote 08/14/2018 at 20:23 point

    fdsrc is a gstreamer 1.0 core element (run "gst-inspect-1.0 fdsrc"); your message sounds more like you are running gstreamer 0.10. Which gstreamer version do you run?


    Turo Heikkinen wrote 05/22/2017 at 12:34 point

    Tried telnetting the drone? Lots to investigate there:) http://yesuserserviceableparts.blogspot.fi/2017/05/logging-into-eachine-e50s-drone.html


    adria.junyent-ferre wrote 05/22/2017 at 16:10 point

    Excellent stuff, many thanks for the information. I actually ran an nmap scan of the device but I didn't find anything useful. I did try SSH but I didn't try telnet. Will do!


    Dan Julio wrote 02/02/2017 at 02:56 point

    Nice work. I also reverse engineered another toy drone and found a similar communication between the app and the WiFi interface. I don't know if any of this will help you, but here's the link.

    https://hackaday.io/project/19356-reverse-engineering-a-promark-vr-toy-drone


    adria.junyent-ferre wrote 02/02/2017 at 11:19 point

    I should have done my homework and searched hackaday.io properly before doing this myself. Clearly the protocol isn't the same but it is very similar indeed. Thanks for sharing.


    adria.junyent-ferre wrote 02/04/2017 at 22:02 point

    I've just found this thread in Reddit from a guy who was doing similar stuff with yet another quadcopter: https://www.reddit.com/r/HowToHack/comments/4512il/how_to_hack_ip_camera_in_toy_drone/

    Apparently, someone made a Chrome app to control the drone and watch the video in the browser: https://github.com/FREEZX/fq777-954-chrome-app


    Dan Julio wrote 02/05/2017 at 17:31 point

    Interesting.  Cute little quad.

