This is an experimental project built on Ubuntu that also works on Raspbian. The goal is a low-cost way to surveil an area. Through the software we can control the drone's movement, stream and capture video, create missions (autonomous flight), and bind missions to sensors (placed in the area, communicating via GPIO (not added to the project yet) or IOIO Android hardware). There is also a simple Android application that arms and disarms the area. When something violates a sensor, the drone starts the mission bound to that sensor, flies to the sensor's location, and records what is happening. The software uses MongoDB as the database and Node.js as the back-end. It uses OpenCV for face detection, and face recognition with Microsoft Cognitive Services was under development.
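As a minimal sketch of the sensor-to-mission binding described above (the field names and waypoint shape are my assumptions, not the project's actual MongoDB schema; an in-memory Map stands in for the database):

```javascript
// Sketch of binding a mission (a list of waypoints) to a sensor id.
// The document shape is an assumption; the real server would persist
// these documents in a MongoDB collection instead of this Map.

const missionsBySensor = new Map();

// Bind a mission to a sensor id, overwriting any previous binding.
function bindMission(sensorId, waypoints) {
  missionsBySensor.set(sensorId, { sensorId, waypoints });
}

// Look up the mission the drone should fly when a sensor is violated.
function missionFor(sensorId) {
  return missionsBySensor.get(sensorId) || null;
}

// Example: violating sensor "door-1" sends the drone to two waypoints.
bindMission('door-1', [
  { lat: 37.9838, lon: 23.7275, alt: 3 },
  { lat: 37.9840, lon: 23.7278, alt: 3 },
]);

module.exports = { bindMission, missionFor };
```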

Things that need research and development:

  • Only one drone can connect to the server software, because the drone runs a DHCP server and our server joins it as a client. The Parrot runs BusyBox Linux, so we can disable its DHCP and connect the drone to the server's network instead. I have all the resources to do that but not much time at the moment :) This would also make the connection more secure, because right now anyone can connect to the drone since it has no password.
  • The drone's battery lasts 15 minutes on standby. A solution is wireless charging, but the Parrot firmware blocks this, so we would have to bypass that piece of code. I know where it is but haven't tried to do it. Another solution is to wire the wireless-charger adapter directly, bypassing the USB port. Currently not much time to invest :)
  • An obstacle-avoidance algorithm needs to be developed so the drone can avoid obstacles during autonomous flight.
  • OpenCV has some problems on Raspbian; the cause may be the installation I did.
  • Currently the app gets signals from the sensors via an IOIO-OTG microcontroller. I have written a piece of Node.js code to work with the Raspberry Pi GPIO instead, but it has not been tested yet.
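The untested Raspberry Pi GPIO path could look roughly like the sketch below. The edge-detection helper is plain JavaScript; the comment shows where a GPIO library such as `onoff` would feed it on an actual Pi (the pin number and library choice are assumptions, not the project's tested code):

```javascript
// Sketch of reading a sensor on Raspberry Pi GPIO instead of the IOIO-OTG.
// On a Pi this would be fed by a GPIO library, e.g. `onoff`:
//   new Gpio(17, 'in', 'rising').watch(...)
// That usage is an assumption about the library, not tested project code.

// Report a sensor "opened" event only on a 0 -> 1 transition,
// so a sensor that stays open does not flood the server with signals.
function makeEdgeDetector(onOpen) {
  let last = 0;
  return function sample(value) {
    if (last === 0 && value === 1) onOpen();
    last = value;
  };
}

module.exports = { makeEdgeDetector };
```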

The project's code needs cleanup and optimization, but it was written very quickly for a competition.

So the project's workflow is: an Android app that arms and disarms the area; server software that listens for signals from the sensors and for arm/disarm commands from the Android app; and an Android service connected to the IOIO-OTG microcontroller that reads digital inputs from the sensors and sends them to the server software. While listening to the sensors, if the area is armed and a sensor has "opened", the server software drives the drone to that sensor.
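The server side of this workflow can be sketched as a small state machine (the function and field names are illustrative, not the project's actual API; talking to the drone and to MongoDB is omitted):

```javascript
// Sketch of the server-side workflow: the Android app arms/disarms the
// area, and a sensor signal launches the bound mission only while armed.

function createServer(missionsBySensor, flyMission) {
  let armed = false;
  return {
    arm() { armed = true; },      // called when the Android app arms the area
    disarm() { armed = false; },  // called when the Android app disarms it
    // Called for every digital input forwarded by the IOIO/GPIO service.
    onSensorSignal(sensorId) {
      if (!armed) return false;                  // ignore signals while disarmed
      const mission = missionsBySensor.get(sensorId);
      if (!mission) return false;                // no mission bound to this sensor
      flyMission(mission);                       // drive the drone to the sensor
      return true;
    },
  };
}

module.exports = { createServer };
```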