Robobuilder Controlled by Kinect - Updated Version

Robobuilder is a Korean maker of robot kits and servos designed for articulated robots; a re-incarnation of Megarobotics.
31 posts • Page 1 of 3

Robobuilder Controlled by Kinect - Updated Version

Post by MarcoP » Fri Jun 22, 2012 11:35 am


Hello

To those who did not know this, some time ago we did a small demo with a Robobuilder being controlled with a Kinect sensor.

That demo only read the position of the hand and moved the robot arm up and down to try to mimic the hand movement.

We have now improved on that to include full Upper Body Tracking.
This means the Robobuilder should be able to mimic the position of the arms much more closely by tracking the shoulder, elbow and wrist (within its own DoF limitations):

phpBB [media]


phpBB [media]


[Edit by Pedro, July 9th 2012] The source code and executable for the new version with Full upper Body Tracking can be downloaded here http://robosavvy.com/RoboSavvyPages/Rob ... _SDKv1.zip

- If you just want to run the software (without editing any source code), follow the same instructions used for the previous version: http://robosavvy.com/forum/viewtopic.php?p=33879#33943

- If you want to edit the code you'll need some more dependencies. Check here http://robosavvy.com/forum/viewtopic.php?p=33879#33892

An important heads-up is that this code is designed for Kinect SDK v1.0.



How we've implemented it:

WARNING! Math contents ahead! :)

A human shoulder has 3 degrees of freedom, meaning it can move in 3 different ways: pitch, roll and yaw. A human elbow has one degree of freedom: pitch. More info here.

Our objective was to copy those movements onto the Robobuilder arm. However, the Robobuilder arm does not have shoulder roll, which meant we had to sacrifice one degree of freedom.

To those more familiar with these subjects, what we are doing here is not inverse kinematics. We do not want to move the robot hand to a specific position in space, but rather move the robot arm in a similar way to the human movement. This means we only need to work with angles.

The Kinect outputs joint positions in the coordinate system described at the end of this page.
Since we need two angles for the two shoulder servos, using a spherical coordinate system seems adequate.

The steps performed for each side of the body are:

- The 3D position of the elbow is subtracted from the 3D position of the shoulder. This gives a 3D vector describing the upper arm's orientation relative to the body.
- That vector is transformed into a coordinate system where the Z axis points outward from the shoulder joint.
- This lets us obtain the azimuth and inclination angles using the standard spherical-coordinate formulas.

(All of this assumes the person is facing the Kinect, so some strange results may occur if this is not the case.)
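The actual code is VB.NET; as a rough Python sketch of the steps above (the shoulder-frame axis mapping here is an assumed convention for a person facing the sensor, not necessarily the exact one used in our code):

```python
import math

def shoulder_angles(shoulder, elbow):
    # Upper-arm vector: elbow position minus shoulder position (step 1).
    vx = elbow[0] - shoulder[0]
    vy = elbow[1] - shoulder[1]
    vz = elbow[2] - shoulder[2]

    # Re-express in a shoulder frame whose Z axis points outward from
    # the shoulder joint (step 2). This axis mapping is illustrative.
    x, y, z = vz, vy, vx

    # Spherical coordinates (step 3): inclination measured from the
    # Z axis, azimuth measured in the X-Y plane.
    r = math.sqrt(x * x + y * y + z * z)
    inclination = math.degrees(math.acos(z / r))
    azimuth = math.degrees(math.atan2(y, x))
    return azimuth, inclination
```

With this convention, an arm held straight out to the side gives 0° inclination, and an arm hanging down gives 90°.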

For the elbow angle a different approach is used:
By subtracting the wrist position from the elbow position, we get the vector corresponding to the forearm.
The angle between the forearm vector and the upper-arm vector is then used for the elbow.
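In Python terms (again a sketch, with one possible choice of vector directions), the elbow angle falls out of the dot product between the two limb vectors:

```python
import math

def elbow_angle(shoulder, elbow, wrist):
    # Upper arm: shoulder -> elbow; forearm: elbow -> wrist.
    u = [e - s for s, e in zip(shoulder, elbow)]
    f = [w - e for e, w in zip(elbow, wrist)]

    # cos(theta) = (u . f) / (|u| |f|); with these directions a
    # straight arm gives 0 degrees of elbow bend.
    dot = sum(a * b for a, b in zip(u, f))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_f = math.sqrt(sum(a * a for a in f))
    return math.degrees(math.acos(dot / (norm_u * norm_f)))
```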

Because of the limitation on the degrees of freedom, only the elbow angle is copied, not the forearm direction. This means that in some cases the movement is not correct.

Still, we think this works out nicely.

This is also visible in the video of another demo we prepared, where the Kinect rotates to track a person.

I did the math for this, so if you have any questions let me know.
Pedro did most of the programming, so he will follow up with details on that.

Regards
Last edited by MarcoP on Fri Jun 22, 2012 11:51 am, edited 1 time in total.

Post by Kondo » Fri Jun 22, 2012 11:45 am


What a great update :shock: ! Will you be publishing all the programs you used to achieve this, or are you still working with the same versions as in the previous Kinect work?

Post by MarcoP » Fri Jun 22, 2012 2:16 pm


Not sure if that is what you are asking, but all of this was done with the same software as the previous example.

We just wrote a lot more code.

Regards

Post by l3v3rz » Sat Jun 23, 2012 10:50 am


Nice piece of work!

I think he means: are you going to publish your new code? And did you use the same version of the Kinect SDK or the latest?

Post by MarcoP » Sat Jun 23, 2012 2:23 pm


Tks l3v3rz

As usual some of the work was facilitated thanks to your libraries for Robobuilder.

It is the standard Kinect, and we will share the code as before. Since it was Pedro who wrote it, I'd better leave that to him. I suppose it will be done in a few days.

Regards

Post by PedroR » Mon Jul 09, 2012 10:44 am


Hi guys

Sorry for the time it took to get back to you with the source code.

Time runs fast over here! (Marco had to give me a heads up to remind me we hadn't published the code)


Before diving into the code, it is important to understand that the major improvement in this version is full Upper Body Tracking.

We're now tracking a lot more joints (Shoulder, Elbow and Hand) and calculating the vectors between each pair of joints in 3D space.

We then map the angles to actual servo positions (within some pre-established boundaries, as the Robobuilder's DoFs are much more limited than ours). Marco explains this in greater detail in his first post of this thread.

The source code and executable for the new version can be downloaded here http://robosavvy.com/RoboSavvyPages/Rob ... _SDKv1.zip

- If you just want to run the software (without editing any source code), follow the same instructions used for the previous version: http://robosavvy.com/forum/viewtopic.php?p=33879#33943

- If you want to edit the code you'll need some more dependencies. Check here http://robosavvy.com/forum/viewtopic.php?p=33879#33892

An important heads-up is that this code is designed for Kinect SDK v1.0.

I've also re-posted this information at the top of the thread to make sure people find it easily.

We haven't implemented the cool Kinect Speech Recognition feature that l3v3rz uses, as these demos are for use at trade shows where the ambient noise is too loud. Having said this, if anyone wants to add it to the code, we'd be more than happy to re-post it.


More interesting new features:

On the GUI you'll notice an additional row with new fields at the bottom of the Window:

[Image: kinematics by RoboSavvy, on Flickr]

This is a new feature we've added to track the person in real time by rotating the Kinect on a motorized platform
(you can see this happening in the videos in the first post).

If you supply a "Stepper COMM" port and click Connect, the software will regularly send serial commands over that COM port with the actual angle of the person relative to the Kinect (displayed in the "Kinect Angle" field).

This is done by calculating the angle between the center of the Kinect space and the person's head.

In our setup, the Kinect was on a rotating platform controlled by an Arduino, and we kept sending angles to the platform to continuously track the person.
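A minimal sketch of that angle calculation, assuming the Kinect's z axis points straight ahead and x to the side (the function name is illustrative; the VB code may differ):

```python
import math

def person_angle(head_x, head_z):
    # Bearing of the head relative to the Kinect's optical axis,
    # in degrees; the sign tells the platform which way to turn.
    return math.degrees(math.atan2(head_x, head_z))
```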

If you want to implement your own solution, the protocol is quite simple. Each message is 2 bytes:
- the first byte is an "R"
- the second byte is a signed byte with the actual angle (a signed byte is still 8 bits, but ranges from -127 to +127).
The sign gives you the direction (L/R).

If the person goes out of range, you'll get an angle of 127, meaning the person was lost.
You can use this information to keep looking for people or just stop.

To prevent shaking and overshooting, there is also a hard-coded dead zone; I believe nothing is sent if the person is within ±5° of the center.
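A sketch of a sender for this protocol (Python for brevity; the function name and the exact dead-zone handling are illustrative, with the ±5° value taken from the description above):

```python
import struct

def stepper_message(angle_deg, dead_zone=5):
    # 127 is the "person lost" sentinel and is always forwarded.
    if angle_deg == 127:
        return struct.pack("cb", b"R", 127)
    # Hard-coded dead zone: send nothing near the center to avoid
    # shaking and overshooting.
    if abs(angle_deg) <= dead_zone:
        return None
    # Clamp to the signed-byte range used by the protocol.
    clamped = max(-127, min(127, int(angle_deg)))
    return struct.pack("cb", b"R", clamped)
```

The resulting two bytes would then be written to the serial port (e.g. with pyserial's `Serial.write`).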


We hope you give it a try and let us know your thoughts on the improvements ;)

Regards
Pedro.

Post by Kondo » Mon Jul 09, 2012 11:33 am


Hi PedroR

If we already have the resources above installed, do we just have to install the new version of the skeletal tracking software?

Post by PedroR » Mon Jul 09, 2012 11:42 am


Hi Kondo

Yes that is correct.
If you already have all the dependencies installed (i.e. if you already ran the previous version) you shouldn't need to install anything else.
Just download the new code and run it.

We haven't changed or added any dependencies on this new version.

Regards
Pedro.

Post by Kondo » Mon Jul 09, 2012 11:44 am


Thanks PedroR

I'll try it this week and, if I have time, shoot another video with the results. Thanks for sharing these things, it's great :D

Post by PedroR » Mon Jul 09, 2012 11:52 am


You're welcome. :)

We have a lot of fun doing these things and they make for great demos. We only wish we had some more time to work on this type of project.

Rgds
Pedro.

Post by Kondo » Mon Jul 09, 2012 12:26 pm


I must be doing something wrong because I downloaded the new version but I cannot make it work; the program is not responding.

The application I am trying to run is in the folder C:\Robobuilder_V2_SkeletalTrackingVB_SDKv1\Robobuilder_V2_SkeletalTrackingVB_SDKv1\SkeletalTracking\bin\Release :?

Post by PedroR » Mon Jul 09, 2012 1:03 pm


Can you be more specific?

What do you mean by "not responding"?

- Does the Application Window Open?
- Are there any errors when you open the application?

- If you place yourself in front of the Kinect (without connecting to the Robobuilder), can you see the balls move on the screen?

- When you type the "Robobuilder COMM" port number and click Connect what happens?


Ideally, when you click "Connect" to the Robobuilder, the Kinect should not be tracking you, to avoid having the USB bus bombarded with messages.

Finally, the "Stepper COM" fields should be left untouched unless you're actually trying to use that feature.

Pedro.

Post by Kondo » Mon Jul 09, 2012 1:12 pm


Forgive my English; I am using Google Translate.

I open the program and the balls move. I enter the Kinect port number and the program says it is "connected", but when I connect the corresponding Robobuilder port, the program freezes and I cannot close it or do anything.

Post by Kondo » Mon Jul 09, 2012 1:30 pm


I think I'm confused; which device does the Stepper port belong to?

Post by MarcoP » Mon Jul 09, 2012 1:34 pm


Hi Kondo.

What Pedro said was for you not to use the Stepper COM port.

It belongs to another part of the demo and is not needed here.


Regards

Next

31 posts • Page 1 of 3