What is better than the RoboMaster?
o(^-^)o
AntDX316
First Officer
Flight distance : 3394731 ft
Offline


2019-9-19
Use props
rhoude57 - YUL
lvl.4
Offline

One needs to define "better" to answer your question... You'll be hard-pressed to find "better" than a RoboMaster S1 if you are looking for a $500-$1000 STEM programmable educational robot packed with advanced technology features.
2019-9-19
Use props
AntDX316
First Officer
Flight distance : 3394731 ft
Offline

rhoude57 - YUL Posted at 9-19 21:57
One needs to define "better" to answer your question... You'll be hard-pressed to find "better" than a RoboMaster S1 if you are looking for a $500-$1000 STEM programmable educational robot packed with advanced technology features.

How many hours does it take to make a decent script?
2019-9-19
Use props
Duane Degn
Second Officer
Flight distance : 622234 ft
Offline

AntDX316 Posted at 9-19 22:56
How many hours does it take to make a decent script?

It depends what you mean by "decent script."
It's really easy to make a program in either Scratch or Python to make the robot move based on input from the hit detectors. It's also easy to write a program to make the robot move in a predetermined pattern. Unfortunately, there isn't much support for sensors besides the hit detectors.
If DJI open-sourced the software for the S1, it would make the S1 an amazing platform. As it is, the S1 is a lot of fun. It just could be so much more with the freedom to really control it.
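A hit-detector-driven program like the one described boils down to a lookup from "which armor panel was hit" to "which evasive move to make". The sketch below is plain, standard Python with hypothetical panel and move names (the S1's actual event API and identifiers differ):

```python
# Map each armor panel (hypothetical names) to an evasive move.
REACTIONS = {
    "front": "back_up",
    "back": "dash_forward",
    "left": "strafe_right",
    "right": "strafe_left",
}

def on_hit(panel):
    """Return the evasive move for the panel that was hit; spin if unknown."""
    return REACTIONS.get(panel, "spin")

print(on_hit("front"))  # back_up
print(on_hit("top"))    # spin
```

On the real robot, the move names would be replaced with chassis commands issued from inside the S1's hit-detection event handler.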
2019-9-20
Use props
BGA
Second Officer
Offline

Wait. Your argument is that military-grade robots (which might or might not be available for end-user purchase, but which either way would cost at least several times the price of a RoboMaster S1) are "better" than the RoboMaster S1? Well, I guess we will have to agree to disagree.
2019-9-20
Use props
BGA
Second Officer
Offline

AntDX316 Posted at 9-19 22:56
How many hours does it take to make a decent script?

What do you mean by hours? To me it takes minutes. Obviously the definition of a decent script is relative. Care to give an example of what a "decent script" would be in your opinion?

Programming for the S1 is basically programming in Python so I am not sure what your argument is exactly.
2019-9-20
Use props
BGA
Second Officer
Offline

Duane Degn Posted at 9-20 05:33
It depends what you mean by "decent script."
It's really easy to make a program in either Scratch or Python to make the robot move based on input from the hit detectors. It's also easy to write a program to make the robot move in a predetermined pattern. Unfortunately, there isn't much support for sensors besides the hit detectors.
If DJI open-sourced the software for the S1, it would make the S1 an amazing platform. As it is, the S1 is a lot of fun. It just could be so much more with the freedom to really control it.

Wait, but this is also not strictly true. The camera is a sensor and you can do a lot of things with it using the AI module.

But I do agree that it has the potential to be way better than it is now. But as this is mostly just software updates, it is something DJI can achieve if they really want to.

2019-9-20
Use props
Duane Degn
Second Officer
Flight distance : 622234 ft
Offline

BGA Posted at 9-20 07:58
Wait, but this is also not strictly true. The camera is a sensor and you can do a lot of things with it using the AI module.

But I do agree that it has the potential to be way better than it is now. But as this is mostly just software updates, it is something DJI can achieve if they really want to.

"do a lot of things with it using the AI module"
Good point. I hadn't played with that feature yet.
There may be ways of getting input from the optional remote as well. I see Scratch blocks labelled "Mobile Device." Hopefully there's a way to use these blocks as inputs. I haven't figured out how to use them yet.
2019-9-20
Use props
rhoude57 - YUL
lvl.4
Offline

BGA Posted at 9-20 07:58
Wait, but this is also not strictly true. The camera is a sensor and you can do a lot of things with it using the AI module.

But I do agree that it has the potential to be way better than it is now. But as this is mostly just software updates, it is something DJI can achieve if they really want to.

The IMUs built into both the Chassis and the Turret are also sensors.
The microphone is also a sensor, capable of recognizing hand claps. It would be nice if it provided voice recognition, allowing the robot to be voice-commanded.

2019-9-20
Use props
BGA
Second Officer
Offline

rhoude57 - YUL Posted at 9-20 10:07
The IMUs built into both the Chassis and the Turret are also sensors.
The microphone is also a sensor, capable of recognizing hand claps. It would be nice if it provided voice recognition, allowing the robot to be voice-commanded.

Yep. I would *REALLY* love direct access to the video and audio streams in the S1. That would allow some neat projects.
2019-9-20
Use props
AntDX316
First Officer
Flight distance : 3394731 ft
Offline

Hours as in, making a decent program that goes beyond just basic movements.
2019-9-20
Use props
BGA
Second Officer
Offline

AntDX316 Posted at 9-20 12:29
Hours as in, making a decent program that goes beyond just basic movements.

Is recognizing and following a specific target "a decent program"? That is also minutes.

In fact, considering the available API and assuming you know how to program in Python (or even Scratch), virtually anything that you can do with the robot, you can do in minutes (or, say, half an hour).

2019-9-20
Use props
BGA
Second Officer
Offline

AntDX316 Posted at 9-20 12:29
Hours as in, making a decent program that goes beyond just basic movements.

To try to answer your question in a clearer way, here is an example of a program that can locate, center on, and fire upon a recognized vision marker. It is a fully working example, and it took more time to comment it than to actually write it (although this is a bit of cheating, as it is mostly a simplified version of the Seek & Destroy example; still, it is a very small program that does something interesting).

def start():
    # Create PID controllers for yaw and pitch angles.
    pid_Pitch = rm_ctrl.PIDCtrl()
    pid_Yaw = rm_ctrl.PIDCtrl()

    # Marker position in the S1 field of view.
    variable_X = 0
    variable_Y = 0

    # Maximum error that results in firing.
    variable_Post = 0.01

    # Move chassis and gimbal independently.
    robot_ctrl.set_mode(rm_define.robot_mode_free)

    # Enable vision marker detection.
    vision_ctrl.enable_detection(rm_define.vision_detection_marker)

    # Set PID parameters. Our controller uses mostly P and a bit of D.
    pid_Yaw.set_ctrl_params(115, 0, 5)
    pid_Pitch.set_ctrl_params(85, 0, 3)

    # Set fire count to 1 bead per firing.
    gun_ctrl.set_fire_count(1)

    while True:
        # Try to find a marker.
        list_MarkerList = RmList(vision_ctrl.get_marker_detection_info())
        if list_MarkerList[1] == 1:
            # Found exactly one marker. Get its X and Y positions.
            variable_X = list_MarkerList[3]
            variable_Y = list_MarkerList[4]

            # Compute error (offset) from center of view for yaw and pitch axes.
            pid_Yaw.set_error(variable_X - 0.5)
            pid_Pitch.set_error(0.5 - variable_Y)

            # Rotate the gimbal to point to the center of the identified marker.
            gimbal_ctrl.rotate_with_speed(pid_Yaw.get_output(), pid_Pitch.get_output())

            # Wait a bit to stabilize.
            time.sleep(0.05)

            if abs(variable_X - 0.5) <= variable_Post and abs(0.5 - variable_Y) <= variable_Post:
                # Error is smaller than or equal to our limit. Fire!
                gun_ctrl.fire_once()

                # Make sure we do not fire more than once per second.
                time.sleep(1)
        else:
            # Marker not found or multiple markers found. Make sure we stop rotating.
            gimbal_ctrl.rotate_with_speed(0, 0)
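For readers wondering what `rm_ctrl.PIDCtrl` is doing under the hood, here is a minimal generic PID controller in plain, standard Python. It mirrors the `set_ctrl_params` / `set_error` / `get_output` shape of the S1 API, but it is a textbook sketch, not DJI's implementation:

```python
class SimplePID:
    """Textbook PID: output = Kp*error + Ki*sum(errors) + Kd*(error - prev_error)."""

    def __init__(self, kp=0.0, ki=0.0, kd=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None
        self.output = 0.0

    def set_ctrl_params(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd

    def set_error(self, error):
        self.integral += error
        delta = 0.0 if self.prev_error is None else error - self.prev_error
        self.prev_error = error
        self.output = self.kp * error + self.ki * self.integral + self.kd * delta

    def get_output(self):
        return self.output

# With the yaw gains from the example (P=115, I=0, D=5), a marker sitting
# 0.5 to the right of center produces a strong corrective output.
pid_yaw = SimplePID()
pid_yaw.set_ctrl_params(115, 0, 5)
pid_yaw.set_error(0.5)
print(pid_yaw.get_output())  # 57.5
```

As the error shrinks toward zero, so does the output, which is why the gimbal slows down smoothly as it centers on the marker instead of overshooting.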

2019-9-20
Use props
AntDX316
First Officer
Flight distance : 3394731 ft
Offline

BGA Posted at 9-20 13:11
To try to answer your question in a clearer way, here is an example of a program that can locate, center on, and fire upon a recognized vision marker. It is a fully working example, and it took more time to comment it than to actually write it (although this is a bit of cheating, as it is mostly a simplified version of the Seek & Destroy example; still, it is a very small program that does something interesting).

def start():

A difficult code would be to code for lead and lag shots?
2019-9-20
Use props
BGA
Second Officer
Offline

AntDX316 Posted at 9-20 13:17
A difficult code would be to code for lead and lag shots?

Who said anything about difficult code? I said it is a small program that does something interesting. No idea where you read difficult anywhere.

But again, this whole discussion is moot until you give an example of what you would consider a decent program (and, BTW, there is no correlation between decent and complex/difficult). But I suspect nothing I show you will satisfy you (it will always degenerate to "no true scotsman").
2019-9-20
Use props
AntDX316
First Officer
Flight distance : 3394731 ft
Offline

BGA Posted at 9-20 13:34
Who said anything about difficult code? I said it is a small program that does something interesting. No idea where you read difficult anywhere.

But again, this whole discussion is moot until you give an example of what you would consider a decent program (and, BTW, there is no correlation between decent and complex/difficult). But I suspect nothing I show you will satisfy you (it will always degenerate to "no true scotsman").

So you basically decide the output depending upon what the input senses are?
2019-9-20
Use props
BGA
Second Officer
Offline

AntDX316 Posted at 9-20 13:50
So you basically decide the output depending upon what the input senses are?

This is a very simplistic way of putting it, but yes. In general, any computer program can be reduced to "generate output based on input" so this does not really mean much (if anything).
2019-9-20
Use props
rhoude57 - YUL
lvl.4
Offline

AntDX316 Posted at 9-20 13:17
A difficult code would be to code for lead and lag shots?

Lead/Lag shooting is not that difficult. Once you have computed your aiming point, you apply a two-dimensional correction to that point's coordinates.
To make this a little more complex, you introduce a circular Lead/Lag, such as what would be required to hit the 2019 RoboMaster Bonus Rune.
A little tougher, you say? The target box will be at the end of one of five arms and it will light up randomly. If it is hit, then the target box is taken out of the list, and you move on...

Another way of making the Lead/Lag more complex is chasing after a Sentry that randomly changes direction and speed... sometimes...

I like watching actual RoboMaster tournament matches. They are rife with excellent examples of programming challenges.
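The two-dimensional correction described above can be sketched in plain Python: estimate the bead's time of flight from the range, then shift the aim point by the target's velocity over that time. The muzzle velocity here is an assumed placeholder, not a measured S1 value:

```python
def lead_aim_point(target_x, target_y, vx, vy, range_m, bead_speed=26.0):
    """Offset the aim point by target velocity * projectile time of flight.

    bead_speed is an assumed muzzle velocity in m/s (placeholder, not a
    measured S1 value). Positions and velocities are in arbitrary but
    consistent units.
    """
    time_of_flight = range_m / bead_speed
    return (target_x + vx * time_of_flight,
            target_y + vy * time_of_flight)

# Target 13 m away, moving 2 m/s to the right: lead half a second ahead.
print(lead_aim_point(0.0, 0.0, 2.0, 0.0, 13.0))  # (1.0, 0.0)
```

A circular lead (like the Bonus Rune) works the same way, except the predicted position comes from advancing the target's angle around the hub instead of extrapolating along a straight line.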
2019-9-20
Use props
MarkusXL
lvl.4
Offline

On my list of future projects is auto-sniper, but I won't bother until DJI unlocks autonomous program firing.

But the skeleton of it would be, detect the enemy S1, which loads a Python list object.  From that, read the x,y coords of the center of the target, and then use the width / height numbers to calculate a range.  Then look up a table to adjust x up a bit to account for bullet drop.  Then use PID to lock on to the result and fire a burst.  Re-lock and repeat.

If I ever get that working, I'll combine it with some movement commands and try to mimic the "Sentry" role of the actual RoboMaster competitions. Which is kinda what the S1 is telling us to try to do - just by its design.

And yeah - those competition Sentry bots are wicked - very good at aiming and evading. And they cost a couple grand or more... With some good software and features unlocked, we can mimic that pretty well for $499.

Edit: Oh, and instead of a fancy rotating target, we could mimic that with our electronic drop targets that pop back up. Easy peasy for $15.
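The range-from-width and drop-table steps of that auto-sniper skeleton might look like this in plain Python. The target width, focal constant, and drop table are invented placeholders, not calibrated S1 values:

```python
def estimate_range(apparent_width, real_width=0.32, focal_const=1.0):
    """Pinhole-camera estimate: range scales with real_width / apparent_width.

    apparent_width is the normalized width (0..1) reported by detection;
    real_width (m) and focal_const are placeholder calibration values.
    """
    return focal_const * real_width / apparent_width

# Hypothetical drop table: (max range in m, upward aim offset to add to y).
DROP_TABLE = [(2.0, 0.00), (4.0, 0.02), (8.0, 0.05)]

def drop_offset(range_m):
    """Look up the bullet-drop correction for the estimated range."""
    for max_range, offset in DROP_TABLE:
        if range_m <= max_range:
            return offset
    return 0.10  # beyond the table: assume the largest correction

r = estimate_range(0.08)   # 0.32 / 0.08 = 4.0 m
print(r, drop_offset(r))   # 4.0 0.02
```

In a real build, the two calibration constants and the table entries would come from measuring detections of an S1 at known distances, and the offset would be added to the y error fed into the pitch PID.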

2019-9-20
Use props
AntDX316
First Officer
Flight distance : 3394731 ft
Offline

rhoude57 - YUL Posted at 9-20 17:00
Lead/Lag shooting is not that difficult. Once you have computed your aiming point, you apply a two-dimensional correction to that point's coordinates.
To make this a little more complex, you introduce a circular Lead/Lag, such as what would be required to hit the 2019 RoboMaster Bonus Rune.
A little tougher, you say? The target box will be at the end of one of five arms and it will light up randomly. If it is hit, then the target box is taken out of the list, and you move on...

If someone could make a script that people could tweak, it would be easier.

The AI can only do certain things like cover to cover, chase, fall back, etc. Once the best tactics are implemented, doing it any other way would just be worse. Eventually, people would be using only the best scripts and nothing else for combat.
2019-9-20
Use props
DJI Stephen
DJI team
Offline

Hello and good day AntDX316. Thank you for sharing this video and information with us. Thank you for your support.
2019-9-21
Use props
Montfrooij
Captain
Flight distance : 2560453 ft
Offline

Have not owned one (yet), so I would not know.
2019-9-22
Use props