Augmented Reality Demo - Tutorial
JohanH
lvl.3
Flight distance : 30479 ft

Netherlands
Offline

DJI UX SDK iOS Augmented Reality Tutorial

Introduction

Looking for ways to combine two of my favorite topics, drones and 3D graphics, I came up with an AR (Augmented Reality) solution for DJI drones. The original idea was to have a 3D particle trail in the video, rendered to a separate video file with a green background on the mobile device, which could then be merged with the drone’s video using most video editors. Due to the latency of updates between the drone and the mobile device (in my case an ol' Spark and an iPad), I had to abandon the 3D trail idea for now. To still demonstrate the concept, I uploaded some sample code and wrote this tutorial.

One of the main challenges with AR is object persistence, which refers to virtual objects having a stable position and orientation in the real world. In some cases, however, the virtual objects can be attached to the virtual camera, following its movements and rotation. For example, a fire extinguisher attached to the drone:




The sample code includes the code for the fire extinguisher, but this tutorial will focus on a different application: measuring the real world using a grid overlay.

grid1.jpg grid2.jpg

I chose the UX SDK as the basis for this sample because it’s very convenient, but the same effect can be achieved by placing a 3D overlay on top of your video feed in the regular SDK.

This tutorial assumes basic knowledge of Objective-C and Xcode. The deployment target is set to iOS 11.4 instead of 10 because I’m using simd and a few properties available only in iOS 11+. The complete sample code can be downloaded from GitHub at:
https://github.com/Xartec/ARDrone
Note that it only includes the files you need to add or change in the UX SDK sample code project (v4.13) provided by DJI.


Adding the 3D Overlay

Open the UX SDK sample code workspace and open the DefaultLayout.storyboard file.

Add a SceneKit View (SCNView) to the main view in the storyboard. Make sure it’s the first child of the main View by moving it up in the hierarchy as depicted below:


Screen Shot 2020-07-29 at 01.53.50.png

Set its background to Clear Color and make sure Allow camera control and User Interaction Enabled are both disabled.

Create an IBOutlet for the SceneKit View in the DefaultLayoutController.h file, call it arView, and connect the SceneKit View in the storyboard to it.

    @property (strong, nonatomic) IBOutlet SCNView *arView;


Import the SceneKit framework by adding the following line at the top of the file:
#import <SceneKit/SceneKit.h>

Also import the GridNode.h file. The GridNode files contain the code to create a wireframe grid as an SCNNode.
#import "GridNode.h"

Drag the GridNode.h, GridNode.m, and linesShader.metal files (in the GitHub repository) to the DefaultLayoutGroup in Xcode:
Screen Shot 2020-07-31 at 16.47.15.png

In the DefaultLayoutController.m file, above viewDidLoad, declare the following variables for the SCNNodes that will represent the virtual drone and gimbal/camera in the 3D scene:

SCNNode *droneNode;
SCNNode *gimbalNode;
GridNode *theGridNode;


In the viewDidLoad method, add the following code to set up and configure the scene with the nodes that represent the drone and its camera.

    SCNScene *newScene = [SCNScene new];
    self.arView.scene = newScene;
    self.arView.preferredFramesPerSecond = 30;
    self.arView.autoenablesDefaultLighting = YES;

    //create a node to represent the drone and add it to the 3D scene:
    droneNode = [SCNNode node];
    [self.arView.scene.rootNode addChildNode:droneNode];

    //create a gimbal node and attach it to the drone node
    gimbalNode = [SCNNode node];
    [droneNode addChildNode:gimbalNode];

    //create a camera and assign it to the gimbal node
    SCNCamera* gimbalCamera = [SCNCamera new];
    gimbalNode.camera = gimbalCamera;

    //set up the camera
    gimbalNode.camera.zNear = 1.0;
    gimbalNode.camera.zFar = 1000;
    gimbalNode.camera.projectionDirection = SCNCameraProjectionDirectionHorizontal;
    gimbalNode.camera.focalLength = 25;
    gimbalNode.camera.sensorHeight = 4.71;

    //set the main camera of the 3D scene to be the gimbal node
    self.arView.pointOfView = gimbalNode;



After some trial and error I came up with the focalLength and sensorHeight values above, which I’ve tested on a Spark only. Combined, the two values determine the field of view (FoV). Other drones, screen sizes, and aspect ratios will require a different FoV. The FoV is also calibrated for the video feed in video mode only, on my iPad Pro (10.5"). There’s probably a formula that can be used to calculate the FoV for different aspect ratios and drone cameras, but I will discuss the calibration process later on.

We want to make sure the SceneKit View lines up exactly with the view of the live video feed. Because this differs per mobile device and depends on whether the drone is in photo or video capture mode, we won’t set the frame of arView until we actually start rendering 3D objects in the scene, at which time the video feed view (a DJIMovieGLView) is actually available.


Camera and SCNNode Logic

To have the 3D scene match the real world, the world origin coordinates of the 3D scene (0,0,0) will be mapped to the home point of the aircraft. In a final application this would mean we wouldn’t allow starting the AR mode until the RTH point has been recorded.

Add the following boolean (below the SCNNode vars we created above) to keep track of whether we are in measure mode:

BOOL measureMode;

Additionally, we need some variables to keep track of the home location, the gimbal’s pitch angle, and another SCNNode, which we’ll use for the calibration process later on:

CLLocationCoordinate2D homeLocation;
DJIGimbalAttitude gimbalAttinDegrees;
SCNNode *calBallNode;


In viewDidLoad, set the homeLocation to invalid so we can check later on whether it has been set to valid coordinates:
homeLocation = kCLLocationCoordinate2DInvalid;

In SceneKit, a single unit represents 1 meter. Many of the values provided by the flight controller are in meters as well, so we can use them directly for virtual objects in our 3D scene. The exception is, of course, the GPS coordinates. I added some math to convert the distance between two GPS coordinates to meters, although we won’t be using it for the measure demo.

To keep it simple, we’re going to use the real drone’s heading for the heading of the virtual drone (only in calibration mode) and then apply only the pitch of the gimbal. The drone’s heading is provided by the flight controller state updates, so we need to implement two state-update delegate methods: one for flight controller updates and one for gimbal updates.

First, change the @interface line in DefaultLayoutViewController.m to the following:

@interface DefaultLayoutViewController () <DJIFlightControllerDelegate, DJIGimbalDelegate>

Then add the following code:

- (void)flightController:(DJIFlightController *)fc didUpdateState:(DJIFlightControllerState *)state {
  
}

- (void)gimbal:(DJIGimbal *)gimbal didUpdateState:(DJIGimbalState *)state {
     
}


To receive these state updates, we need to fetch the gimbal and flight controller from the SDK and set their delegate property to self. The methods to fetch them are available in the DemoComponentHelper files included in the regular DJI iOS SDK, but I’ll include them here for completeness:

//The following two methods are copied from DemoComponentHelper.m in the regular SDK
-(DJIFlightController*) fetchFlightController {
    if (![DJISDKManager product]) {
        return nil;
    }
    
    if ([[DJISDKManager product] isKindOfClass:[DJIAircraft class]]) {
        return ((DJIAircraft*)[DJISDKManager product]).flightController;
    }
    
    return nil;
}

-(DJIGimbal*) fetchGimbal {
    if (![DJISDKManager product]) {
        return nil;
    }
    
    if ([[DJISDKManager product] isKindOfClass:[DJIAircraft class]]) {
        return ((DJIAircraft*)[DJISDKManager product]).gimbal;
    }
    else if ([[DJISDKManager product] isKindOfClass:[DJIHandheld class]]) {
        return ((DJIHandheld*)[DJISDKManager product]).gimbal;
    }
    
    return nil;
}



Add a method called resetDelegates:

- (void)resetDelegates {

    DJIFlightController* fc = [self fetchFlightController];
    if (fc) {
        fc.delegate = self;
    }
   
    DJIGimbal *gb = [self fetchGimbal];
    if (gb) {
        gb.delegate = self;
    }
   
}

We’re going to call this method from viewDidAppear, so the delegates are reset every time the DefaultLayoutController appears (useful when you have switched to DJI GO and back, for example):

-(void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    //reset the delegates when the view appears.
    [self resetDelegates];
}

Now that we set up the delegates and can receive state updates from the flight controller and the gimbal, we’re going to use that information to update the virtual drone and camera nodes in the 3D scene.

Add the following code:

- (void) updateARDrone: (DJIGimbalAttitude)gimbalAtt altitude:(double)altitude {

    //update gimbal
    gimbalNode.simdRotation = simd_make_float4(1, 0, 0, RADIAN(gimbalAtt.pitch));
   
    //apply the altitude to the drone node
    droneNode.position = SCNVector3Make(0, altitude, 0);
   
}

- (void) updateDroneHeading:(double)heading {
    droneNode.simdRotation = simd_make_float4(0, -1, 0, RADIAN(heading));
}


The drone’s yaw angle and the gimbal’s pitch angle provided by the state updates are in degrees, which we need to convert to radians for SceneKit. The following macro makes that easy. It is also available in the DemoComponentHelper files, but I added it to the top of DefaultLayoutViewController.m for completeness:

#define RADIAN(x) ((x)*M_PI/180.0)

Next, we’re going to update the didUpdateState methods for the flight controller and gimbal to call the two methods above:

-(void)flightController:(DJIFlightController *)fc didUpdateState:(DJIFlightControllerState *)state {
   
    if (measureMode) {
        [self updateARDrone:gimbalAttinDegrees altitude:state.altitude];
        //[self updateGrid:state.altitude];
        
    }
   
    if (CALMODE) {
        [self updateDroneHeading:state.attitude.yaw];
        gimbalNode.simdRotation = simd_make_float4(1, 0, 0, RADIAN(gimbalAttinDegrees.pitch));
    }
   
    if (state.isHomeLocationSet) { //if real drone's RTH point has been set
        if (!CLLocationCoordinate2DIsValid(homeLocation)) { //and our stored homelocation is still invalid
            homeLocation = state.homeLocation.coordinate; //store the homelocation
        }
    }
   
}

- (void)gimbal:(DJIGimbal *)gimbal didUpdateState:(DJIGimbalState *)state {
    gimbalAttinDegrees = state.attitudeInDegrees; //store the gimbal attitude
}



Next, we need a way to enable/disable measure mode to show/hide the grid. Add a segmented control and a label to the DefaultLayout controller. Create an IBOutlet property for the label in DefaultLayoutController.h, and an IBAction for the segmented control in DefaultLayoutController.m.

Use the following code for the IBAction:

- (IBAction)modeSegmentControlChanged:(UISegmentedControl*)sender {
   
    if (sender.selectedSegmentIndex==0) {
        measureMode = NO;
        
        //remove the grid:
        [theGridNode removeFromParentNode];
        self.gridSizeLabel.hidden = YES;
        
        if (CALMODE) {
            [calBallNode removeFromParentNode];
        }
        
    } else  if (sender.selectedSegmentIndex==1) {
        measureMode = YES;
        self.gridSizeLabel.hidden = NO;
        
        //add the grid:
        theGridNode = [[GridNode alloc] initWithInterval:2];
        [self.arView.scene.rootNode addChildNode:theGridNode];
        
        DUXFPVViewController* theContentView = (DUXFPVViewController*)self.contentViewController;
        DJIMovieGLView* theGLView;
        for (int i = 0; i< theContentView.fpvView.subviews.count; i++) {
            NSObject* viewTest = [theContentView.fpvView.subviews objectAtIndex:i];
            if ([viewTest isKindOfClass:[DJIMovieGLView class]]) {
                theGLView = (DJIMovieGLView*)viewTest;
                if ((DJIVideoPreviewer *)theGLView.delegate) {
                    //use this to set arview
                    [self.arView setFrame: CGRectMake(theGLView.frame.origin.x, theGLView.frame.origin.y+40, theGLView.frame.size.width, theGLView.frame.size.height)];

                }
            }
        }
         
        if (CALMODE) {
            SCNGeometry *sphereGeo = [SCNSphere sphereWithRadius:0.13625]; //radius of the circle on the wall
            calBallNode = [SCNNode nodeWithGeometry:sphereGeo];
            calBallNode.geometry.firstMaterial.diffuse.contents = [UIColor blueColor];
            
            //put sphere at drone pos
            calBallNode.worldPosition = SCNVector3Make(droneNode.worldPosition.x, droneNode.worldPosition.y, droneNode.worldPosition.z);
            
            //move the sphere 3 meters in front of the drone
            SCNMatrix4 infrontMat = SCNMatrix4MakeTranslation(droneNode.simdWorldFront.x*3.0, droneNode.simdWorldFront.y*3.0, droneNode.simdWorldFront.z*3.0);
            calBallNode.transform = SCNMatrix4Mult(calBallNode.transform, infrontMat);
            
            [self.arView.scene.rootNode addChildNode:calBallNode];
        }
        
    }
   
}


You can ignore the CALMODE code for now, but add the following line below the RADIAN macro:

#define CALMODE NO

In addition to adding/removing the grid and setting the measureMode boolean, we adjust the SceneKit View’s frame to match the video feed. Using DJIMovieGLView requires importing DJIVideoPreviewer.h (in DefaultLayoutController.h):

#import <DJIWidget/DJIVideoPreviewer.h>

You can now run the app (provided you changed the signing options and added your DJI app key to the Info.plist file) and test it.


Wait for the drone to connect to enough satellites so the RTH point is recorded and the drone is in GPS mode. Remember that the FoV settings (sensorHeight and focalLength) are calibrated for video mode on an iPad Pro 10.5" only (I hope to extend this tutorial with more info on adapting it to other models and mobile devices sooner rather than later). Take off, tap Grid On, and fly. Rotate the gimbal down.




FoV Calibration

A SceneKit camera (SCNCamera) has properties to set the sensor height and the focal length, which are then used to calculate the fieldOfView value. There is probably a formula to calculate these values from the real drone camera’s specifications in combination with the aspect ratio and possibly the screen size, but I haven’t figured that out yet. Using the values from the real drone (Spark) didn’t give me the desired results, so I needed to MacGyver a way to calibrate the field of view to match the real camera.

I drew a circle with a diameter of 27.25 cm (which happens to be the diameter of my plates) on a piece of paper and put it on a wall. I then placed the real drone at a 3-meter distance from the wall, pointing towards the circle. When CALMODE is set to YES, the yaw of the real drone is used to rotate the virtual drone, and a sphere SCNNode with a diameter of 27.25 cm is placed 3 meters in front of the virtual drone. When the sensorHeight and focalLength values are correct, the sphere should match the size of the circle on the wall, and when yawing the real drone, the ball should stay in place (apart from the delay described below). Additionally, the ball should stay in place when pitching the gimbal up/down.

For the most accurate results, the calibration process should be done outside, as indoors there’s usually too much interference for the compass to be accurate.

Challenges

In addition to accommodating different drone cameras and mobile devices, a major challenge is object persistence for 3D objects that do not follow the drone’s position and heading, especially in combination with the movement speed of the drone and the latency between the drone, the remote controller, and the mobile device. For example, when you shoot a virtual projectile and then yaw or move the drone/gimbal, there will be a delay between the heading and position of the camera view in the 3D scene and the live video update. Possible ways to overcome this obstacle include a faster connection (OcuSync?) and/or buffering and delaying the video feed X frames. Perhaps the smoothest results can be achieved by reading the DJIStick info from remote controller state updates when the controller is connected to the mobile device with a cable.

Application Ideas

- Drone fire extinguisher training sim

- Race/Obstacle Course  - Fly through virtual donuts (torus)

- Dogfight Mode - communicate (via Bluetooth, or Wi-Fi when using an OTG cable) the aircraft location from another drone pilot’s mobile device to yours and create a virtual (empty) node for it in your 3D scene. This would allow you to shoot virtual projectiles at another drone and perform hit tests in the 3D scene.


EXTRA

Instead of keeping the virtual drone fixed at x, z coordinates (y is altitude/up in SceneKit), we can update its position based on flight controller state updates. If we then place a 3D object at the drone’s current position, it will stay there while the drone can move away from it.


- (void) updateDronePosition: (CLLocationCoordinate2D)newDroneCoord altitude:(double)altitude {

    //set the position of the virtual drone node
    //the center of the 3D world is mapped to the RTH point of the drone
    //to determine the desired location in the 3D world, we calculate the offset between the
    //drone’s current GPS location and the RTH point, and then convert it to meters.

    double latMid, m_per_deg_lat, m_per_deg_lon, deltaLat, deltaLon;
    latMid = RADIAN((homeLocation.latitude+newDroneCoord.latitude)/2.0);
    
    //the following two lines calculate a conversion factor (meters per degree) for latitude and longitude.
    //note that cos() expects radians, hence the RADIAN conversion of latMid above.
    m_per_deg_lat = 111132.954 - 559.822 * cos( 2.0 * latMid ) + 1.175 * cos( 4.0 * latMid );
    m_per_deg_lon = (M_PI/180) * 6367449 * cos( latMid );
    
    //apply the offset between the drone and the home point to the virtual drone
    deltaLat = (newDroneCoord.latitude - homeLocation.latitude) * m_per_deg_lat;
    deltaLon = (newDroneCoord.longitude - homeLocation.longitude) * m_per_deg_lon;
    
    droneNode.position = SCNVector3Make(deltaLat, altitude, deltaLon);
   
    self.arDroneAltitude.text = [NSString stringWithFormat:@"Alt: %0.0fm", droneNode.position.y];
    self.arDronePos.text = [NSString stringWithFormat:@"Pos: %0.0f, %0.0f", droneNode.position.x,droneNode.position.z];

    //update gimbal
    //gimbalNode.rotation = SCNVector4Make(1, 0, 0, RADIAN(gimbalAttinDegrees.pitch));
      
}


In combination with placing an SCNNode (in this case a plane with a texture, in billboard mode so it always faces the camera) every X seconds/meters, I managed to create the trail I mentioned in the intro. This is a screenshot after flying backwards for a while. It truly gives a whole ‘nother dimension to flying. To prevent a virtual object close to the camera from blocking the view, the SceneKit camera’s zNear property can be increased.

trail.jpg


The sample at https://github.com/Xartec/ARDrone also contains the fire extinguisher code as demonstrated in this YouTube video:



8-3 07:31
DJI_Lisa
Second Officer
South Africa
Offline

This is a great teaser and preview!  I'm excited to see your tutorial. Please DM me afterwards so we can talk and I can find out more about you.  

~Lisa, DJI Developer Support
8-3 11:28
JohanH
lvl.3
Flight distance : 30479 ft

Netherlands
Offline

DJI_Lisa Posted at 8-3 11:28
This is a great teaser and preview!  I'm excited to see your tutorial. Please DM me afterwards so we can talk and I can find out more about you.  

~Lisa, DJI Developer Support

Thanks Lisa, I edited the post above to include the full tutorial and a link to the sample code on GitHub. Nice to see I can edit an older post, because it may need some work still.

If anyone has any questions, just let me know. I realize it's not a complete solution, especially given the FoV stuff, but I hope it'll be interesting and maybe even lead to some fun and original apps.
8-6 07:08
z37soft
lvl.4
Flight distance : 789570 ft
Japan
Offline

Hi.

I've tried it too.
It still vibrates, but I'm hoping to improve.
Thank you for your article.

8-30 23:37
JohanH
lvl.3
Flight distance : 30479 ft

Netherlands
Offline

z37soft Posted at 8-30 23:37
Hi.

I've tried it too.

Nice job

It appears the vibration is caused by the input from the gimbal and not the updates from the drone location/heading. Perhaps a damping function on the gimbal attitude can stabilize things a bit.
8-31 06:46
z37soft
lvl.4
Flight distance : 789570 ft
Japan
Offline

Yes. You're right, the problem is that the Yaw value of the gimbal is moving unnecessarily.
8-31 08:35
djiuser_D0sgAm6xL5jn
lvl.4
Flight distance : 339951 ft
United States
Online

That's really impressive!
8-31 15:46