Hello,
For the upcoming Synopsys 2025 championship, our team is planning to build an unmanned aerial vehicle swarm system for locating survivors in disaster relief. We are drawing much of our inspiration from your drones and will use some of your components in our build.
Background:
Disasters, whether natural or man-made, can have devastating impacts on communities, resulting in loss of life, injuries, and widespread damage. In the aftermath of such events, rapid response and effective search and rescue operations are crucial to saving lives and providing necessary relief to affected individuals. Traditional methods of locating survivors often involve extensive ground searches, which can be time-consuming and inefficient, particularly in large or difficult-to-access areas.
In recent years, advancements in unmanned aerial vehicle (UAV) technology have opened up new possibilities for enhancing disaster response efforts. UAVs equipped with advanced cameras, sensors, machine learning systems, real-time communication capabilities, and autonomous navigation systems can significantly improve the speed and accuracy of locating individuals in distress. By leveraging aerial views, these UAVs can cover vast areas quickly, identifying survivors, assessing damage, and guiding emergency response teams to critical locations.
However, the effective deployment of UAVs in disaster scenarios requires careful consideration of various factors, including environmental challenges, regulatory compliance, and integration with existing emergency response frameworks. As such, there is a pressing need for a dedicated project focused on creating a UAV specifically designed to address these challenges.
Purpose:
This project is designed to streamline the process of finding individuals in a disaster-stricken area. This will allow responders to administer care faster, and the drone itself will be able to administer a small amount of care. Our project incorporates machine learning in the form of audio classification and infrared object detection. Incorporating these more advanced technologies will help save more lives and raise awareness of how technology can be used to save lives.
Our drone uses an infrared camera to detect humans at night, allowing detection even in pitch-black conditions. On top of this, our project innovates by applying machine learning, specifically object detection, to the camera feed. The feed is sent back to the ground-station computer running the model, allowing for fast and accurate detection. Our project also applies machine learning to audio: the audio feed is transmitted over radio to the computer, where a second model running in tandem detects human voices. This allows for an even tighter sweep for signs of survivors. The project uses GPS to accurately pinpoint the location of any humans found, and it will incorporate a bracelet that broadcasts a GPS signal to help the drones locate its wearer.
Thus, our engineering goal for this project is to create an unmanned aerial vehicle network, manageable by a single person, that streamlines the entire disaster relief and survivor location process for hurricanes.
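To give a concrete sense of the ground-station side, here is a minimal sketch (our own assumption of how the script could look, not final code) of the video half of the detection. It assumes the VRX shows up as an ordinary capture device; "aerial_people.pt", the "person" class name, the device index, and the confidence threshold are all placeholders.

import cv2
from ultralytics import YOLO

# Sketch: person detection on the infrared video feed at the ground station.
model = YOLO("aerial_people.pt")   # placeholder: transfer-learned YOLOv8 weights
cap = cv2.VideoCapture(0)          # placeholder: VRX capture card seen as a webcam

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame, verbose=False)[0]
    for box in results.boxes:
        if model.names[int(box.cls)] == "person" and float(box.conf) > 0.6:
            x1, y1, x2, y2 = map(int, box.xyxy[0])
            cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 0, 255), 2)
            print("Possible survivor detected in frame")  # later: ping the Streamlit page
    cv2.imshow("IR feed", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()

The audio model would run in a parallel loop in the same script, and a detection from either stream would trigger the ping and GPS logging described above.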
Now that you have an overview of our project, we will get into the materials and procedures we plan to use. Specifically, we plan on integrating your DJI Goggles Integra as well as the RunCam Night Eagle HD with your RunCam Link VTX.
Here are the materials:
FC ESC flight stack Link (x1) ($60)
SourceOne Frame Link (x1) ($30)
GPS+Compass Module Link (x1) ($20)
Thermal Camera Link (x1) ($150)
RadioMaster RP4 Receiver Link (x1) ($24)
RadioMaster Pocket TX ELRS Link (x1) ($65)
AKK Race VTX Link (x1) (Already Owned)
Foxeer Micro (x1) (Already Owned)
Battery Link (x2) ($46)
RemoteID compliant module Link ($35)
Rush AGC Microphone Link ($5)
DJI Integra Link ($350)
SG90 Microservo (x1) (Already Owned/In FabLab)
Racer Star br2207s Link (x4) (Already owned)
Props (Already owned)
Total Price: $815
As for our build procedures and protocols, we have them planned out below:
Build Procedures:
Assemble the frame using the provided screws and standoffs, following the instruction manual in the kit.
Attach the provided standoffs to the four underside corner screw holes of the flight controller using the screws provided in the kit.
Mount the electronic speed controller (ESC) under the flight controller standoffs using the screws provided in the kit (same screw size and shape).
Connect the ESC wire given in the kit to the ESC port on the flight controller as shown in the diagram.
Connect the battery to check for smoke.
Solder the analog VTX module to the FC as shown in the diagram.
Flash ArduPilot Copter firmware onto the FC using the ArduPilot flasher as described in the docs.
Download, install, and run the Mission Planner configurator as described in the docs.
Configure the VTX on the correct baud rate and channel as described in the docs.
Solder the thermal camera to the FC as described in the diagram above.
Make sure the camera and VTX work by testing the OSD on the VRX as described in the docs.
Solder the ELRS receiver to the drone as shown in the diagram above.
Configure ELRS through the ELRS configurator with a binding phrase over WiFi as described in the docs.
Bind the ELRS receiver to the RadioMaster Boxer with the binding phrase and configurator as described in the docs.
Solder the GPS module to the FC as shown in the diagram above.
Set up the GPS module using the Mission Planner software as described in the docs.
Configure the ELRS receiver and transmitter to use MAVLink as described in the docs.
Set up the ELRS Backpack to communicate with Mission Planner as described in the docs.
Find a dataset with aerial views of people, preferably over 15k images, on Roboflow.
Train a YOLOv8 transfer-learning model on that data on Roboflow.
Find a dataset with audio of people screaming on Roboflow.
Train a model on Roboflow that segments audio of people screaming out of background noise (propwash, rushing wind).
Set up the camera feed on the VRX to become a camera input to the computer running Mission Planner as described in the docs.
Set up the MAVProxy Python module for Mission Planner as described in the docs.
Write a Python script that uses the YOLOv8 PyTorch model and the audio detection model to send a ping to a Streamlit webpage when either detects a human, as described in the docs. It should also be able to control waypoints through MAVLink telemetry and dronekit (see the sketch after this list).
Set up channels for the RC controller through the ArduPilot configurator as described in the docs.
Configure motor direction and index (props out) as described in the docs.
Install dronekit-SITL to simulate drone swarming as described in the docs.
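As a rough illustration of the waypoint-control step referenced in the list above, the ground-station script could hand a detection's GPS coordinates to the drone over the MAVLink telemetry link using dronekit, roughly as sketched below. The connection string and coordinates are placeholders, and during bench testing the dronekit-SITL endpoint would be used instead of the ELRS Backpack link.

from dronekit import connect, VehicleMode, LocationGlobalRelative

# Sketch: steering the drone toward a detection over MAVLink with dronekit.
vehicle = connect("udp:127.0.0.1:14550", wait_ready=True)  # placeholder MAVLink endpoint

def fly_to_detection(lat, lon, alt=20):
    """Send the drone to loiter above a suspected survivor location."""
    vehicle.mode = VehicleMode("GUIDED")
    vehicle.simple_goto(LocationGlobalRelative(lat, lon, alt))

# Example: coordinates handed over by the ML script after a detection ping
fly_to_detection(37.4137, -121.9983)  # placeholder coordinates

vehicle.close()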
In addition, our project will be thoroughly tested.
Testing Procedures:
Test that the drone properly arms and responds to radio control.
Test that the propeller direction is correct (IMPORTANT: TEST WITHOUT PROPELLERS)
Test that the drone can fly with radio control (from the RadioMaster Boxer TX).
Test that the drone can send its camera feed to the video receiver.
Test the drone GPS using Mission Planner and a simple flight pattern.
Test the range of the VTX and RX modules.
Test the ML camera stream for successful video transfer.
Test the ML audio stream for successful audio transfer.
Test the drone across multiple runs with a variety of people. All tests will follow a predefined path, with 20 runs on each of 3 days, giving 60 results across varying weather conditions.
Test the drone's remote arming.
We need to make sure our project will meet the following criteria and constraints.
Criteria:
Drone must be able to reach speeds of 50 km/h
Drone must be able to transmit video
Drone must be able to detect 99% of humans across all tests within the designated search area
Audio and Video ML models must have over 95% accuracy when tested
Machine learning inference time must be under 100 milliseconds for both video and audio (a rough timing sketch follows this list)
The project must run in real-time
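As mentioned in the inference-time criterion above, the 100 millisecond budget could be verified on the ground-station computer with a rough timing sketch like the one below; the weights file name and frame size are placeholders, and the audio model would be timed the same way.

import time
import numpy as np
from ultralytics import YOLO

# Sketch: timing the video model to check the <100 ms inference criterion.
model = YOLO("aerial_people.pt")                 # placeholder weights
frame = np.zeros((480, 640, 3), dtype=np.uint8)  # dummy frame at the analog feed size

model(frame, verbose=False)                      # warm-up run (not timed)
times = []
for _ in range(50):
    start = time.perf_counter()
    model(frame, verbose=False)
    times.append((time.perf_counter() - start) * 1000)

print(f"mean inference: {sum(times)/len(times):.1f} ms, worst: {max(times):.1f} ms")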
Constraints:
Budget: $1,000 (For affordability)
Project must be finished by January 21, 2025
Abide by Synopsys Safety Protocols as well as ours:
Safety Protocols (IMPORTANT):
Propellers will be kept OFF the motors whenever a person is handling the drone or is within 5 meters of running motors.
Lithium Ion Batteries will be kept away from any heat over 50 degrees Celsius and will be 5 meters away from any water.
Drone will only be flown in areas that comply with FAA regulations, specifically Sunnyvale Baylands Park. It will not be flown anywhere near major airways or airports.
VTX power will be kept under 25 milliwatts
An FAA-compliant Remote ID module will ALWAYS be on board any time the drone is flying.
The drone will only fly within the specific area of Baylands Park that is designated for UAV flight.
(If bibliography information is needed, please reply to us and we will send it.)
We as researchers are also qualified to build this project. Each of us achieved first place at the Synopsys competition last year, and two of us, Noah and I, went on to the California Science and Engineering Fair. The two of us then advanced to the national level, the ThermoFisher JIC, where we placed among the top 300 projects in the nation. Both of our projects last year were also built around machine learning, so we have experience in this area. If you would like more information, we can send you our past research as well as details about the competitions we participated in last year.
As you can see, we have a fully fleshed-out research plan for this coming season. The Goggles Integra and the RunCam Link are among the most vital components of our project, and we are formally requesting sponsorship for just the Goggles Integra. In return, we can advertise these goggles, specifically showing how well DJI can be integrated into a first-response system. We can showcase how DJI is innovating in disaster relief and search and rescue at an affordable price: compared to other corporate search and rescue drones, our build will cost only $815. Sponsoring our project gives you a way into the budget market, especially in low-income areas that do not have access to large, extremely expensive drones. By entering this new market, your company can gain the upper hand in low-cost search and rescue, an area few others have entered yet.
In conclusion, sponsoring our research project is a profitable decision for both parties. As a company, you gain access to a new market, showing that your high-end products can be used at a considerably lower price point than other search and rescue drones. If more information is needed, please reply to this email with your requests; we will be happy to comply.
Best Regards,
Pranay Bokde, Vanessa Dinh, Noah Song
STEM Leadership Institute
3000 Benton Street, Santa Clara, California 95051
916 - 513 - 3747