The first fully autonomous AI search and rescue drone
On a visit to Japan to consider working there for a while, we were confronted with an earthquake alert on our mobile phones the very next morning. There were 30 seconds between the warning and the actual quake.
When a truly major disaster like the anticipated Nankai quake hits, hundreds of thousands of buildings will collapse or suffer major damage and over 50 million people will be affected.
To search for survivors, human teams or drones requiring one operator per unit are used. Search K9s are limited in number or must be flown in, while survivors need to be found within 24-48 hours. With the possibilities AI offers, this must change.
After completing our MSc in Robotics and Mechatronics at the University of Twente, Netherlands,
we're designing LifeRaptor: a small, fully autonomous, AI-navigated drone that can be launched solo or in 'swarms' of 4, 8 or even 16. Think the Minority Report spyders in drone form.
Each unit is packed with sensors and custom hardware (RGB camera, thermal camera, LiDAR/proximity sensors, LED floodlight and more) that provide rich data for the drone's AI.
Requiring a gap of just 32 cm to navigate, LifeRaptors locate hazards like fire, smoke or gas faster and more safely than humans, and more efficiently than large tethered drones. A single operator can monitor a whole pack's data as it maps dangers, and an all-clear lets responders move on to the next structure. Searches become 5-10x faster, which is critical after quakes and other disasters.
We believe this is the search and rescue future. And the time to develop is now, not after the next disaster.
One of us is going to be at the Xponential Houston event May 20-22, where innovators building autonomous drones show off their stuff.
While we're keeping parts of our project proprietary, it's always a good idea to get a feel for what's in the pipeline for drone builders.
On a special note, Sony is rumoured to be giving the AS-DT1 its first showing there, which is of great interest to us because it might be the smallest LiDAR on the market in the coming year.
April 12, 2025
Object Detection: If humans had LiDAR
A collapsed structure holds many more obstacles than outdoor drone navigation. Steering through them is our AI's primary goal: if the drone can access nearly anywhere, it can find its target nearly everywhere.
Our current LifeRaptor, with LiDAR, a monocular camera and our March 2025 AI, can detect a 2 mm obstacle.
Imagine a steel rebar rod sticking out directly into the drone's flight path: does the LiDAR detect it, does machine vision when it blends into a dark background, or does the depth resolution suffice? We score the drone's possible responses:
+10: detect and navigate around it.
+3: detect, turn back and find another way.
-10: collide.
And of course the AI can also handle risk:
+5: detect, slow down, touch, and if it is non-blocking (a hanging cable), turn around and return later if another path cannot be found, since the cable may be live or entangle the drone's rotors.
This is why our AI needs such an array of different sensors: camera machine vision requires more AI, LiDAR less.
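As a minimal sketch, this response scoring could look something like the Python below. All sensor fields, names and thresholds here are illustrative assumptions for the post, not our actual navigation code:

```python
# Sketch of the obstacle-response scoring described above.
# Fields and thresholds are hypothetical, not production values.
from dataclasses import dataclass

# Reward scores from the post: avoid > probe > retreat > collide.
SCORES = {"avoid": 10, "probe": 5, "retreat": 3, "collide": -10}

@dataclass
class Obstacle:
    lidar_hit: bool           # LiDAR returned a point on the flight path
    vision_confidence: float  # 0..1 machine-vision detection confidence
    clearance_cm: float       # free space around the obstacle
    maybe_cable: bool         # thin hanging object (possible live wire)

def choose_action(obs: Obstacle, min_gap_cm: float = 32.0) -> str:
    """Pick the highest-scoring safe response to a detected obstacle."""
    detected = obs.lidar_hit or obs.vision_confidence > 0.5
    if not detected:
        return "collide"      # worst case: nothing saw it
    if obs.clearance_cm >= min_gap_cm:
        return "avoid"        # enough room to navigate around it
    if obs.maybe_cable:
        return "probe"        # slow down, touch, return later
    return "retreat"          # turn back and find another way
```

For example, `choose_action(Obstacle(True, 0.9, 40.0, False))` returns `"avoid"`, while a tight passage blocked by a possible cable yields `"probe"`.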
Even though humans navigate and detect through vision, small obstacles can still trip us up. If we had LiDAR as well, that would be close to what we're aiming for. A LightWare SF45 provides the 58-gram LiDAR, a FLIR Lepton the thermal camera, a monocular camera the machine vision, and an NVIDIA Jetson runs the AI.
April 4, 2025
AI Noise Filtering
As the LifeRaptor navigates through collapsed buildings, it is essential that sounds coming from nearby survivors are picked up, for example from behind a blocked door.
We're working on separate AI noise filtering that removes the drone's motor and prop sounds in real time, allowing it to listen for human or animal sound emitters, functioning similarly to a noise-cancelling headphone. Additionally, the AI navigation should be able to set the drone down, stop the rotors, listen, and alert if any sound emitters are detected. The drone alerts a human operator when the sound amplitude exceeds a manually set threshold.
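As a rough illustration of that land-and-listen step: once the rotors are stopped, even a plain RMS amplitude check against the operator's threshold can decide whether to alert. The frame length and threshold below are illustrative assumptions, not our tuned parameters:

```python
# Sketch of the "land, stop rotors, listen, alert" check described above.
# Frame length and threshold are illustrative, not tuned values.
import math

def rms(frame):
    """Root-mean-square amplitude of one audio frame."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def detect_emitter(samples, frame_len=512, threshold=0.05):
    """Return True if any frame's amplitude exceeds the operator threshold."""
    for i in range(0, len(samples) - frame_len + 1, frame_len):
        if rms(samples[i:i + frame_len]) > threshold:
            return True
    return False
```

Silence stays below the threshold, while a knock or voice burst in any single frame triggers the alert; in flight, the AI noise filter would first strip the motor and prop spectrum before a check like this runs.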
March 8, 2025
Unreal Engine Simulator
We are planning to move from our current C++/bgfx-based simulator to Unreal 5. This isn't a typical operator drone sim: it flies from the drone's AI perspective, through confined flight paths inside collapsed structures.
Here are some key reasons for the changeover:
The primary reason is that at some point we need to add environmental hazards to our simulation so the AI can detect, map and avoid smoke, fire and water. Writing the SFX shaders (GPU code) ourselves would unnecessarily slow development when these may already be available for Unreal.
It will also allow faster level setup and model transformation. An example below: the slanted structure on the left is very disconcerting and difficult for a human search, versus the building on the right. This should pose no such problem for drones.
The secondary reason is realism: Unreal 5 can render highly realistic models, and realism is key when you cannot always train or field-test in real-life collapsed structures.
Finally, since we're currently 100% self-funded, the move will allow us to apply for an Epic grant, which would push our development along by extending personnel hire time and funding more advanced AI tech and new LiDARs: we'd love to get our hands on a small batch of Sony AS-DT1s when they become available.
January 19, 2025
LifeRaptor rev 2
This week we're busy building an updated version of our LifeRaptor. Among the changes are a move to the NVIDIA Jetson Nano, and the use of LightWare LiDAR instead of ultrasonic proximity sensors for more precision in low-light and featureless areas.
We haven't come across any drone of this size packed with such rich data collectors. Because of the LiDAR, we will run RTAB-Map instead of ORB-SLAM.
Remember, the drone's aim is not fancy 4K or 8K YouTube videos, but fast data streams that maximize AI absorption.
Sub-50-decibel noise emission with AI noise filtering
November 10, 2024
Our latest AI trainer
Today we built our AI workstation: an Intel Xeon W7 with 256 GB RAM and an NVIDIA RTX 6000 48 GB, built on an ASUS W790 Sage.
Even though this single machine is not yet optimal for our image/video-based AI model training, it still allows us to run solid but lighter AI models such as YOLO plus ORB-SLAM/RTAB-Map.
July 5, 2024
In Japan we're asked: Explain the name LifeRaptor
Life, because our drone searches for exactly that: it locates living organisms, whether human disaster survivors or animals. Raptor, because it homes in on its target directly and without hesitation.
And like raptors, they can hunt in packs (or 'swarms' in drone talk).
We aim to fundamentally change the way search teams operate in case of disasters such as earthquakes, explosions or any emergency which requires a structure to be scanned for lifeforms.
Our LifeRaptor drones are the first fully autonomous, AI-trained search drones: they find their targets, do not get stuck even in collapsed buildings, and carry a complete range of cameras and sensors that map out the entire structure for the operator to review. They operate faster, more safely and far more cheaply than human or K9 search teams, scanning buildings for points of interest more efficiently.
We know this is another area where AI will change almost everything.
June 2, 2024
Japan
We're in Japan for another week surveying buildings damaged in the Noto quake.
Some of the damaged buildings have already been demolished, so we learned that timing is crucial. Apart from gathering lots of images to feed our AI, seeing what type of structural damage quakes cause allows us to make our 3D simulator scenes as lifelike as possible.