Mavic 3 Multispectral usability questions
5883 17 2023-2-5
Phenomics
lvl.2
Flight distance : 18307 ft
Australia
Offline

Hi all,

We're looking at the Mavic 3 Multispectral, but given what a miserable horrorshow the user experience for the Phantom 4 Multispectral is, we had some questions about how the M3M is going to work.
Has anyone on here tried the new M3M yet or can any DJI folks chime in with answers?

(1) Will the M3M support any third-party flight control apps? The Phantom 4 Multispectral only supports the annoyingly bad DJI GS Pro app, which is iOS-only, so buying a P4M requires users to also buy an iPad. Will the M3M be the same way, or will it support control by other apps?

(2) What sort of sun calibration is required, or is this all integrated into the images on capture so that a calibration panel is not needed? With the P4M it wasn't clear whether additional post-processing and a calibration panel were required, since there was mixed information on the web and very little documentation provided by DJI. For example, Agisoft says you need to calibrate the images, some DJI forum posts say you don't have to but can if you want, yet DJI's own webinar says you do, and so on. DJI doesn't sell a calibration panel and, last I checked, doesn't provide much useful guidance on which third-party panel to buy either, so you're basically on your own.

(3) Will DJI be providing code or workflows to support easier processing of multispectral images, or are they just going to publish a 6-page PDF of equations like they did with the P4M?

(4) Image Naming: The P4M names all bands with the same naming convention, which contains no useful band information. This leads to a data-management nightmare, since there is no way to know which bands were captured, or whether all bands were recorded for each capture, without writing your own code (here's a code snippet that addresses that problem). Will the M3M images be named more helpfully than the P4M's, or will they also use the standard DJI_####.TIF format without any useful band or capture information in the name?

Thanks!


2023-2-5
Use props
Phenomics
lvl.2
Flight distance : 18307 ft
Australia
Offline

Embedding URLs in that text doesn't seem to work either, so here's the code snippet: https://gitlab.com/-/snippets/2493855
2023-2-6
Use props
Phenomics
lvl.2
Flight distance : 18307 ft
Australia
Offline

And the P4M 6-page PDF: https://dl.djicdn.com/downloads/p4-multispectral/20200717/P4_Multispectral_Image_Processing_Guide_EN.pdf
2023-2-6
Use props
LV_Forestry
Second Officer
Flight distance : 4726654 ft
Latvia
Offline

Phenomics Posted at 2-6 00:16
and the P4M 6-page pdf https://dl.djicdn.com/downloads/p4-multispectral/20200717/P4_Multispectral_Image_Processing_Guide_EN.pdf

The P4M supports Android devices via UGCS.

A calibration target is not necessary for this kind of low-end sensor; the sunlight sensor does its job very well. Agisoft Metashape offers a workflow dedicated to DJI products that takes into account the calibration data of the camera and the sun sensor.

The file naming is quite simple: 0/RGB, 1/Blue, 2/Green, 3/Red, 4/RedEdge, 5/NIR.

I invite you to see my review of the defects in the P4M RedEdge band. I asked DJI whether the M3M also has a problem with RedEdge. As usual, no response.

https://forum.dji.com/forum.php?mod=viewthread&tid=279882&mobile=2

2023-2-6
Use props
Phenomics
lvl.2
Flight distance : 18307 ft
Australia
Offline

LV_Forestry Posted at 2-6 01:08
P4M support android device with UGCS.  

The calibration target is not necessary for this kind of low-end sensor.  The solar collector does its job very well.  Agisoft Metashape offers a workflow dedicated to DJI products which takes into account the calibration data of the camera and the solar sensor.  

Thanks for the info. So for the naming, the images are named in groups of five?
i.e.
DJI_0141 = Blue
DJI_0142 = Green
DJI_0143 = Red
etc. ... is that how it works?

re UGCS... I've seen mixed reviews and comments that all the camera setup has to be done in GS Pro first. Is this the case, or can one just use UGCS for the whole flight without having to interact with DJI apps at all?
2023-2-6
Use props
LV_Forestry
Second Officer
Flight distance : 4726654 ft
Latvia
Offline

Phenomics Posted at 2-6 15:55
Thanks for the info. So for the naming, the images are named in groups of 5?
ie
DJI_0141 = Blue

Yes, it's a question of memorizing that B1 is blue, B2 is green...

Usually we write the name RGB; the engineer who designed the camera probably used a language that reads from right to left, which is the only valid explanation I can see. Just remember that it's B, G, R, RE, N for 1, 2, 3, 4, 5. Easy!
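Based on the grouping described above, here is a rough Python sketch of how one might label a capture's files by their position in the group. The assumption that each capture produces six consecutively numbered files (an RGB JPG plus five band TIFs) comes from this thread, not from DJI documentation, so verify it against your own SD card before relying on it.

```python
# Hypothetical sketch: group P4M files into captures of six and label
# bands by position in the group, per the 0/RGB, 1/Blue ... convention
# described above. The consecutive-numbering assumption is from this
# thread, not from DJI docs -- check it against your own data.

BANDS = ["RGB", "Blue", "Green", "Red", "RedEdge", "NIR"]

def label_capture(filenames):
    """Map a group of six filenames from one capture to band names."""
    if len(filenames) != len(BANDS):
        raise ValueError("expected one file per band")
    return dict(zip(BANDS, sorted(filenames)))

capture = ["DJI_0140.JPG", "DJI_0141.TIF", "DJI_0142.TIF",
           "DJI_0143.TIF", "DJI_0144.TIF", "DJI_0145.TIF"]
labels = label_capture(capture)
print(labels["Blue"])   # DJI_0141.TIF
```

The `ValueError` guard is the useful part in practice: a group with fewer than six files tells you a band is missing from that capture.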

For UGCS, the only step that must go through GS Pro is activating the 5 bands plus JPG, which in theory is already configured that way when you receive the drone from DJI.

Once this is done, unless you wish to disable certain bands, there is no need to use GS Pro again; the setting will not change.

So yes, you can go to work without taking an iPad; an Android phone that can do connection sharing will do the job (to download the flight plans you have made on the PC).

What we can blame UGCS for is the GUI; it really isn't great, a bit of a DIY open-source solution. But the license is not very expensive and it does a good job, so we forgive them.
The online support (by email) is very efficient if you have a question about using the software.
2023-2-6
Use props
LV_Forestry
Second Officer
Flight distance : 4726654 ft
Latvia
Offline

And I forgot to specify: it is not necessary to carry a PC with the UGCS client to fly the drone.
You can control the drone from the PC, with a joystick or with the interface, but once the flight plan is saved on the phone, you can disconnect from the PC. Please note that when the application updates, some phones erase flight plans.

I always keep a PC with the UGCS client in the car, in case I need to modify a flight plan on the spot; it is not possible to modify trajectories in the Android application.
The P4M works really well. For crop monitoring there is no need for huge resolution; on the contrary, lower resolution reduces processing times, and that's good.

I absolutely do not recommend buying an M3M right away. For the moment there is no feedback from users. I asked to try one so I could compare its results with those of a hyperspectral camera; nobody answered. That suggests that on paper it's the ultimate solution, but in reality I don't expect anything better than what the P4M already does.

The real added value of the M3M is its RGB camera, which is clearly superior to the P4M's. It's a kind of P4M and P4R combined.

2023-2-6
Use props
Phenomics
lvl.2
Flight distance : 18307 ft
Australia
Offline

LV_Forestry Posted at 2-6 23:17
And I forget to specify, it is not necessary to carry the PC with the UGCS client to fly the drone.
You can control the drone from the PC, with a joystick or with the interface. But once the flight plan is saved in the phone, you can disconnect from the PC. Please note that when updating the application, some phones erase flight plans.

Thanks for all the helpful info!

re M3M... yeah, I'm assuming it's roughly the same multispec camera but with the higher-res RGB camera, and that's exactly what I need, so it's probably worth the upgrade. But it would be nice to get feedback from actual users so we can tell whether there are any new dramas with the new system.
2023-3-19
Use props
LV_Forestry
Second Officer
Flight distance : 4726654 ft
Latvia
Offline

Phenomics Posted at 3-19 17:59
Thanks for all the helpful info!

re M3M... yeah I'm assuming it is roughly the same multispec camera but with the higher res RGB camera, but that's exactly what I need so it is probably worth the upgrade. But it would be nice to get some feedback from actual users so we can tell if there are any additional new dramas with the new system

Be sure you don't need an exact RedEdge band.  

Other than that everything seems fine.  

I posted a link to a PIX4D dataset in the thread that talks about the famous RedEdge band if you want to try.
2023-3-19
Use props
Imagency
lvl.1

France
Offline

@Phenomics
M3M filenames are:
.JPG for RGB
_MS_R.TIFF for Red
_MS_NIR.TIFF for NIR
_MS_RE.TIFF for RedEdge
_MS_G.TIFF for Green
Unfortunately, RGB is not split out into a separate Blue band.
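Given the suffixes listed above, here is a small sketch of how one might recover the band from an M3M filename. The suffix list is taken from this post, not from official DJI documentation, so treat it as an assumption.

```python
# Sketch: identify the band of an M3M file from its suffix, using the
# naming reported in this thread (.JPG for RGB, _MS_G / _MS_R / _MS_RE /
# _MS_NIR TIFFs). The suffix list is an assumption from this post.
import os

SUFFIX_TO_BAND = {
    "_MS_G": "Green",
    "_MS_R": "Red",
    "_MS_RE": "RedEdge",
    "_MS_NIR": "NIR",
}

def band_of(filename):
    """Return the band name for an M3M image file, or None if unknown."""
    stem, ext = os.path.splitext(filename)
    if ext.upper() == ".JPG":
        return "RGB"
    # Check longer suffixes first so "_MS_RE" is not matched as "_MS_R".
    for suffix in sorted(SUFFIX_TO_BAND, key=len, reverse=True):
        if stem.endswith(suffix):
            return SUFFIX_TO_BAND[suffix]
    return None

print(band_of("DJI_20230326120000_0001_MS_RE.TIFF"))  # RedEdge
```

Matching on the stem rather than the full name means it works whether the extension is written .TIF or .TIFF.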
2023-3-26
Use props
djiuser_Jd1xWcj954Xy
lvl.1
Flight distance : 29665 ft

United States
Offline

Have any of y'all been able to merge the _MS_(insert band here) files into a mosaic? I haven't had much luck on that end. It would be nice to process them and then do some of my own raster calculator work.
(Sorry, new account; I've updated my info.)
2023-5-10
Use props
LV_Forestry
Second Officer
Flight distance : 4726654 ft
Latvia
Offline

djiuser_Jd1xWcj954Xy Posted at 5-10 15:12
Have any of yall able to Merge the _MS_(Insert Band here) files into a mosaic? I haven't had much luck on that end. Would be nice to process them and then do some of my own Raster Calculator.
(Sorry new account I've updated my info)

Yes, with Agisoft Metashape.  
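Once each band has been mosaicked and exported as aligned rasters, an index like NDVI is plain array arithmetic. Here is a minimal NumPy sketch, assuming the band mosaics share the same extent and resolution; GeoTIFF reading and writing (e.g. via rasterio) is left out for brevity.

```python
# Sketch: NDVI from two already-aligned reflectance arrays (NIR, Red).
# Assumes the band mosaics were exported at the same extent/resolution;
# loading them from GeoTIFF (e.g. with rasterio) is omitted for brevity.
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red); eps avoids division by zero."""
    nir = nir.astype("float64")
    red = red.astype("float64")
    return (nir - red) / (nir + red + eps)

# Tiny synthetic reflectance tiles standing in for exported mosaics.
nir = np.array([[0.60, 0.55], [0.50, 0.10]])
red = np.array([[0.10, 0.15], [0.20, 0.10]])
print(ndvi(nir, red).round(3))
```

The same pattern extends to any band-ratio index; the only real work is making sure the exported band rasters are co-registered before doing the arithmetic.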
2023-5-10
Use props
djiuser_bsTrkDdZb1wr
lvl.2

Australia
Offline

I'm wondering, with Agisoft processing: I have been processing the RGB images WITH the MS images. Are the RGB bands from the RGB images suitable, i.e. can I calculate an index using the G band from the normal "RGB" camera, or should I not be doing that and just stick to the bands of the MS camera for indices, etc.?

As for the comments above about not needing a calibration target with the M3M: I wonder whether comparing repeat datasets captured at different times is reliable without one?
2023-8-19
Use props
Phenomics
lvl.2
Flight distance : 18307 ft
Australia
Offline

djiuser_bsTrkDdZb1wr Posted at 8-19 16:03
I'm wondering, with Agisoft processing, I have been processing the RGB images WITH the MS images;  are the RGB bands from the RGB images suitable - i.e. can I calculate an index using the G band from the "RGB" normal camera; or should I not be doing that, and just sticking to the bands of the MS Camera for indices etc....

... as for comments above - not needing a calibration target with the M3M, I wonder if comparison between repeat datasets at different times is suitable without them?....

The issue with pulling the blue band from the RGB is that it won't be calibrated or have the same spectral sensitivity as the dedicated bands, and you'd need to filter the blue down to a response range matching the width of the multispec bands. So it would take a fair bit of testing to assure yourself you were getting reasonably accurate results when including the blue band.
Finding spectral response curves for the M3M RGB or multispec cameras would be a good starting point (or generating them yourself).

See the figure below for an example of how regular RGB cameras have a much wider response range in each colour.

Response ranges for the multispec narrow-band filters:
Green (G): 560±16 nm
Red (R): 650±16 nm
Red edge (RE): 730±16 nm
Near-infrared (NIR): 860±26 nm
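Those figures can be written down as a small lookup, e.g. to check which multispectral band, if any, covers a given wavelength. The centre/half-width values are the ones quoted above.

```python
# The M3M narrow-band filters quoted above, as (centre, half-width) in nm.
M3M_BANDS = {
    "G": (560, 16),
    "R": (650, 16),
    "RE": (730, 16),
    "NIR": (860, 26),
}

def covering_band(wavelength_nm):
    """Return the band whose passband contains the wavelength, else None."""
    for name, (centre, half_width) in M3M_BANDS.items():
        if abs(wavelength_nm - centre) <= half_width:
            return name
    return None

print(covering_band(730))   # RE
print(covering_band(600))   # None: falls between the Green and Red bands
```

The gaps this exposes (e.g. nothing at 600 nm) are exactly why a broad RGB channel is not interchangeable with a narrow multispec band.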


(Image source: 10.1016/j.jqsrt.2020.107162)


2023-8-23
Use props
VBio
lvl.2
Flight distance : 569577 ft

Indonesia
Offline

Hi all, newbie user here.
I recently started using the M3M for my work in agriculture, having previously used (and still using) the Sequoia multispectral sensor, so I'm still on the learning curve with the M3M multispectral sensor. I am processing the images using P4D, and I have found that the M3M multispectral sensor gives strange reflectance values: for example, NIR maxes out at only 0.08, whereas the Sequoia shows a maximum of 0.82 over an oil palm plantation. I still don't know whether the values should be multiplied by 1,000 rather than the usual 100, and I'm still waiting for an answer from the P4D team about this.

Thanks
2023-10-19
Use props
LV_Forestry
Second Officer
Flight distance : 4726654 ft
Latvia
Offline

VBio Posted at 10-19 00:40
Hi all, newbie user here
I recently use M3M for my work in agriculture, previously (and still) using Sequoia sensor multispectral. Currently in learning curve to understand this M3M multispectral sensor. I am processing the images using P4D. What I have found is the spectral reflectance result coming from M3M msp sensor which give strange reflectance value, i.e for NIR which only maximum 0.08 compare with Sequoia which maximum showing 0.82 in oil palm plantation. I still do not know if the value should be times 1,000 not 100 as usual and at the moment still waiting answer also from P4D team about my question.


https://dl.djicdn.com/downloads/DJI_Mavic_3_Enterprise/20230829/Mavic_3M_Image_Processing_Guide_EN.pdf
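As a quick sanity check alongside the guide, it can help to inspect the raw value range of a band to spot an unexpected integer scale factor. Here is a sketch on a synthetic array standing in for a band loaded from a TIFF; the candidate divisors are illustrative guesses, not values from DJI documentation.

```python
# Sketch: diagnose an unexpected reflectance scale by inspecting value
# ranges. The uint16 array below stands in for a band read from a TIFF
# (e.g. with tifffile); the candidate divisors are illustrative guesses,
# not values from DJI documentation.
import numpy as np

def plausible_scale(band, candidates=(1, 100, 1000, 10000, 32768, 65535)):
    """Return the smallest candidate divisor that maps the band into [0, 1]."""
    peak = float(band.max())
    for divisor in candidates:
        if peak / divisor <= 1.0:
            return divisor
    return None

band = np.array([[120, 4500], [8200, 8000]], dtype=np.uint16)
scale = plausible_scale(band)
print(scale)                 # smallest divisor bringing the max into [0, 1]
print((band / scale).max())
```

This only narrows down the likely scale; the guide linked above is the authoritative source for the actual DN-to-reflectance conversion.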
2023-10-19
Use props
mapyx2
lvl.2
Flight distance : 17730 ft
Italy
Offline

Hi everyone,
I'm using my M3M, but it seems to me that the multispectral sensors have a very different focal length from the main RGB sensor, which has a much wider-angle FOV. I wonder how the software can process a multilayer product if the images from the layer selected for the SfM and the other layers do not match. Is this a problem only with my Mavic M3M?
2-2 12:07
Use props
mapyx2
lvl.2
Flight distance : 17730 ft
Italy
Offline

The optics are absolutely misaligned.
This causes the multilayer processing of a DTM orthophoto to fail.
To make the FOV match, it almost seems that a 1.3x zoom (a fake zoom, because it is not optical) is necessary on the main camera.
Very bad!
Furthermore, video seems to have a big problem, due more to the mechanical shutter than to SD storage speed; I tried a microSD V90 card, which has triple the speed required for storing 4K video.
It is also not possible to change the FPS: only 29.9 fps, whether recording in 4K or 2K.
I also noticed that videos recorded in Full HD really suck; they look like heavily compressed VHS quality (and here too there are no settings to adjust anything). Incredibly, the Mini 3 Pro is far superior in quality.
This doesn't make sense.
2-15 09:06
Use props