HMArnold.msn
I have been getting a lot of email questions and forum posts lately about using the PC Ground Station for photogrammetric work, so I thought I would start a thread that summarizes what I have discovered so far.
The good news is that I have been able to get the entire S1000+/A2/Lightbridge/IOSD Mk II/Datalink 900 MHz/Zenmuse gimbal/dual Futaba 14SG system to fly reliable autonomous missions with rational grid distances using the Photogrammetry tool, with the flight information exported afterwards using the IOSD DataViewer application.
That flight info, combined with the images from a Sony A6000 camera, can then be used in the Agisoft software package to produce a mosaic image of the entire grid area.
Although I don't have any way to test it, I assume that these same results can be obtained with any other combination of hardware as long as it flies, takes pictures in response to a standard servo signal, and uses an A2 controller coupled with an IOSD Mk II unit.
The absolute minimal system for flying blind that I can think of would not need the Lightbridge, the camera gimbal, or the Futabas. You can see most of the required info on the PC Ground Station screen as you're flying, basic throttle/pitch/roll commands are available on the PC keyboard, and the "RETURN HOME" button brings the platform back to the home point with remarkable accuracy and grace without any operator intervention.
All that being said, the bad news is that there are quite a few little operational issues that you have to learn to work around, but nothing that prevents you from bringing home the goods.
I have now flown 50 or so autonomous missions, all with happy endings equipment-wise.
More than half of those missions have been attempts to characterize the mission planning in terms of altitude, distance between the grid rows, distance between the image points along each leg, and the speed of the platform during the mission, such that the resulting images and flight info can be used in Agisoft to create a seamless orthomosaic of the target area.
These are the two main things I've learned:
1) The limiting factor to make a mosaic image commercially is the battery. I fly agricultural fields in southern Texas, and some of them are upwards of 250 acres. For an autonomous mission at 75 meters, I can cover about 35 acres with a 15,000 mAh battery before I'm looking for a soft spot to crash in. The images you get at 75 meters with a 50mm lens show individual leaves on the plants, weeds with enough definition to identify them, areas of standing water, etc. If I fly at 200 meters I can cover almost 70 acres with one battery because of the wider grid spacing, but the resolution at that altitude is limited to standing water and area color changes, which are both important to agricultural imaging.
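To put rough numbers on that resolution difference, here is a back-of-the-envelope ground sample distance (GSD) calculation. The sensor specs are my assumptions for an A6000 (APS-C sensor about 23.5 mm wide, 6000 pixels across) with the 50mm lens; it's a sketch, not a calibration:

```python
# Rough ground-sample-distance (GSD) estimate for a Sony A6000 + 50 mm lens.
# Assumed specs: APS-C sensor ~23.5 mm wide, ~6000 px across.
SENSOR_WIDTH_MM = 23.5
IMAGE_WIDTH_PX = 6000
FOCAL_LENGTH_MM = 50.0

def gsd_cm_per_px(altitude_m):
    """Ground distance covered by one pixel, in centimeters."""
    pixel_pitch_mm = SENSOR_WIDTH_MM / IMAGE_WIDTH_PX
    return altitude_m * 100 * pixel_pitch_mm / FOCAL_LENGTH_MM

for alt in (75, 200):
    print(f"{alt} m altitude -> ~{gsd_cm_per_px(alt):.2f} cm/pixel")
```

That works out to roughly 0.6 cm/pixel at 75 meters (leaf-level detail) versus about 1.6 cm/pixel at 200 meters, which lines up with what I see in the images.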
2) The Agisoft software package is great stuff. I have been thinking of trying the Pix4D product, but haven't gotten around to it because I haven't seen anything written down that says it's better than Agisoft, and so far the Agisoft package has been reliable and repeatable.
The ability of the software to reconstitute a mosaic image is controlled by a lot of things like image quality, availability of GPS coordinates, etc - but the most important for me, even when you have good flight parameters, is the amount that the images overlap.
There are two different overlap values, and they have completely different risks and benefits.
The most important overlap value is the horizontal overlap, which comes down to the distance between each row of your autonomous mission grid. The greater the distance between rows, the fewer rows you have to fly, so the more acreage you can cover on a single battery. The best of all possible worlds would be a row spacing where each image covers an area on the ground that just touches the area covered by the adjacent picture on the previous leg, and extends just to the point where the adjacent image on the next leg will start. That would be 0% overlap, and if you sat down and positioned the images yourself by hand, you could re-create the mosaic image.
I have been able to make mosaic images using Agisoft with overlaps of as little as 35%, which when you really look at a single spot in two adjacent images, isn't a lot - but when you consider that 35% overlap means 35% wasted imagery, it hurts.
If Pix4D or any other software could work reliably with less than 35% horizontal overlap I would be very interested in hearing about it.
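The row spacing for a given horizontal overlap is just the image's ground footprint scaled by (1 - overlap). Here is a minimal planning sketch, again assuming the A6000's ~23.5 mm sensor width and the 50mm lens in landscape orientation:

```python
# Grid row spacing for a target horizontal overlap.
# Assumed: A6000 APS-C sensor ~23.5 mm wide, 50 mm lens, landscape orientation.
SENSOR_WIDTH_MM = 23.5
FOCAL_LENGTH_MM = 50.0

def footprint_width_m(altitude_m):
    """Ground width covered by one image, by similar triangles."""
    return altitude_m * SENSOR_WIDTH_MM / FOCAL_LENGTH_MM

def row_spacing_m(altitude_m, overlap):
    """Row spacing that yields the given horizontal overlap (0.0 to 1.0)."""
    return footprint_width_m(altitude_m) * (1.0 - overlap)

print(footprint_width_m(75))    # footprint at 75 m, roughly 35 m wide
print(row_spacing_m(75, 0.35))  # spacing for 35% overlap, roughly 23 m
```

So at 75 meters with the 35% overlap I've gotten away with, the rows end up about 23 meters apart; every extra percent of overlap you can safely shave off widens that spacing and stretches the battery.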
The other overlap value is the vertical image overlap, which is the distance between images as the platform flies along each leg. One would think that with a camera like the Sony A6000 that can take images at 10 frames a second until the SD card fills up or the battery runs dry, this wouldn't be a problem, but with an A2 controller, it is.
The only way I know of to trigger the camera shutter on an A2/PC Ground Station setup is by using the "GP SERVO ACTION" definition, which, as I understand it, presses a button that shoots an infrared signal to the camera and clicks the shutter.
The problem with an A2/PC Ground Station system is that the minimum time you can set for a "GP SERVO ACTION" point is one full second.
If your autonomous mission is defined in such a way that the image points are closer together than the platform can travel in one second, the A2 records the GPS location, but the "GP SERVO ACTION" output has not recovered from its one-second cycle, so the camera doesn't take that image.
If you are right at the edge of usability with the vertical overlap, losing a few images messes up the whole show: the GPS coordinates get offset to the wrong pictures, Agisoft gets confused, and the entire mission is basically wasted.
The only solution to the one second restriction that I know of is to slow the platform down along each leg so that one second will still give you the vertical overlap you need to make the mosaic.
The lower the altitude, the faster the images need to be taken to maintain overlap; for a 30-meter mission I have to fly at 2 meters per second to have any hope of stitching things back together.
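You can back the maximum ground speed out of the one-second limit: the platform must not travel more than (1 - forward overlap) times the along-track footprint per shutter cycle. A sketch assuming the A6000's ~15.6 mm sensor height, the 50mm lens, and the A2's one-second minimum:

```python
# Max ground speed for a target forward (vertical) overlap, given the A2's
# one-second minimum "GP SERVO ACTION" cycle.
# Assumed: A6000 APS-C sensor ~15.6 mm tall (along-track in landscape), 50 mm lens.
SENSOR_HEIGHT_MM = 15.6
FOCAL_LENGTH_MM = 50.0
MIN_TRIGGER_S = 1.0  # A2 "GP SERVO ACTION" minimum cycle time

def max_speed_m_s(altitude_m, forward_overlap):
    """Fastest speed that still achieves the given forward overlap (0.0 to 1.0)."""
    footprint_m = altitude_m * SENSOR_HEIGHT_MM / FOCAL_LENGTH_MM
    return footprint_m * (1.0 - forward_overlap) / MIN_TRIGGER_S

print(max_speed_m_s(30, 0.80))  # 30 m altitude, 80% forward overlap
print(max_speed_m_s(75, 0.80))  # 75 m altitude, 80% forward overlap
```

Under those assumptions, 30 meters and roughly 80% forward overlap caps you near 1.9 m/s, which matches the ~2 m/s figure above; 75 meters allows nearly 4.7 m/s at the same overlap.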
Taking pictures at walking speed is never going to be economically viable.
I don't fault Agisoft for this; I fault the DJI one-second minimum, which leaves a camera that can take images 10 times faster than that limping along while the A2 finishes its one-second cycle.
At higher altitudes, say 75 meters and above, it doesn't hurt so badly because the fastest I can get the A2 to go is about 8 meters per second, and at that speed there is more than one second between image points.
There are 3 different places you can enter a platform speed on an autonomous mission, but no matter what speed you enter, after each turn it flies at about 3 meters per second and then slowly builds to a maximum of 8 meters per second.
If anyone knows of a way to get the platform to fly faster, please let me know - at 100 meters altitude you could still get great image resolution, stay ahead of the 1-second "GP SERVO ACTION" restriction, and finish a mission in less time if you could fly faster than 8 meters a second.
Also, with Agisoft, every time I send them a question or a problem by email, I get an intelligent, thoughtful response within a few hours that either tells me what I'm doing wrong or asks me to send them what they need in order to see what I'm doing wrong - and after they see what I sent, I get another intelligent, thoughtful response.
Agisoft support and DJI support are as different as night and day.
When I figure something out and post what I've learned, I have been posting my email address and offering to do what I can to help. It's been a great opportunity to interact with people all over the planet, and we all have the same hopes and dreams.
HMArnold@msn.com