Roboard Experience...

Based on DMP's Vortex processor / SoC, this board is a full computer capable of running a standard Windows or Linux installation on the backpack of your robot.
95 posts • Page 4 of 7

Post by matt.stevenson » Sun Aug 08, 2010 2:22 am


Hey Paul,

Sorry to hear about all the troubles. I've been stuck up in the Northeast doing research (not roboard related). This weekend I have to go out to San Fran. I'm not getting home until the 20th.

On an upbeat, before I left I was working with a small mic array using an ADC. I have it all working great on my desktop using a Tascam USB ADC. My next step is to test it out using the Roboard's ADC. The software I wrote is in C and runs on Windows and Linux. Unfortunately, it will still be a couple of weeks before I can test it.

The code I have is a simple OpenGL GUI that displays multiple mic channels and allows continuous recording/processing. If you private message me your email, I can send you the code within the next couple days.
matt.stevenson
Savvy Roboteer
Posts: 37
Joined: Thu Apr 29, 2010 9:29 pm

Post by PaulL » Fri Aug 13, 2010 3:49 am


Hey Matt,

Sorry to hear your job took you from your bot, mine sent me to Cali not so long ago. At least I took my code with me to work on in the hotel. :) I'm feeling a bit better about my 'bot, got replacement gears for my HRS-5498HB servos on the way (a few more to order tomorrow, but I CAN order them!).

If you want me to test your code, I can possibly get to it this weekend, I'll send you my email address. Sounds interesting, I hope it works on Roboard! :) I will know more about my scenario some time next week when the Mini PCI to PCI adapter arrives.

Quick note- I'm not too familiar with OpenGL- will OpenGL code work on Roboard? I thought the graphics chipset needs to support OpenGL for OpenGL to work. What am I missing?

Take Care,
Paul
PaulL
Savvy Roboteer
Posts: 423
Joined: Sat Sep 15, 2007 12:52 am

Post by matt.stevenson » Fri Aug 13, 2010 5:06 am


OpenGL can work on the Roboard, but you wouldn't want to waste the processing on it. So I've been thinking about the code, and I think it will take a little work to get it going on the Roboard, but in theory it should run fine.

Let me explain it a little. I wrote this for someone who is working with a Tascam US-1641 16 channel ADC.
Other than OpenGL, I am using PortAudio, an open-source audio I/O library with ASIO support. I chose that because the Tascam unit came with some easy to use ASIO drivers. The Roboard lacks ASIO drivers for the ADC, so this is the first thing that will need to go. However, I'm hoping that won't be a big issue. PortAudio provides audio input as a stream of floats from -1.0 to 1.0 at whatever sample rate I choose. I just need to get that from the Roboard ADC.

The last library I am using is libsndfile, a library for reading/writing sound files. When I record from the Tascam, I am just appending the samples I get to a wav file using their library.

So, my thought is that I should be able to get audio from the ADC and save it to a wav, completely bypassing the sound card.

Using the incoming sound for something else will require a little more work, but you have to do these things one step at a time.

I should mention that it is a C# project, and PortAudio detected all of my sound inputs in Windows, not just the Tascam unit.

Expect an email momentarily.
matt.stevenson
Savvy Roboteer
Posts: 37
Joined: Thu Apr 29, 2010 9:29 pm

Acceleration and Deceleration...

Post by PaulL » Sat Sep 11, 2010 4:15 pm


I received a PM with questions about this, so I figured I'd take a moment and write up a bit more about movement and what the code I posted in this thread does, how, and why.

In working with RoboIO, I realized early on that the code (servo moves, in particular) isn't really built for multitasking / multithreading. The problem I sought to get away from in using the MRC-3024 in my Robonova (interrupting a move, responding to things in the environment) persisted with using the RoboIO code with my Roboard. You can do fancy things in C on an MRC-3024 to get past the limitations of RoboBasic, but that wasn't my ideal approach, and I wanted much more processing power.

Physically, the Roboard is capable of exactly what I want and more, but not when using the RoboIO RCServo code as intended. I could manipulate the PWM module and abandon the RCSERVO lib to get similar results, but I found it easier for me to rewrite my own "RoboIO" code. So far, I am glad that I did; it's given me significant knowledge of how the board functions.

I have rewritten most of the capability found in RoboIO, for a few different reasons; here are the main ones:

* To understand Roboard better and know what the hardware can actually do.
* To simplify register access and control.
* To have a multithreading-friendly means of control.

This included writing my own servo-move code, which I am still working on (I have almost all of the work done, but am currently side-tracked with the sound card issue as in my other thread). Specific to the servo functions, I wanted to be able to control the servos better than just start here / stop there.

What is a movement in a servo, exactly? A move is just changes in position over time.

Time is one of the more problematic aspects of programming under Windows. Windows was built for people, not machines. A user won't notice a 100 ms lag, but your 'bot will! You can make Windows work with machines, but it takes some creativity to do so. Bear in mind, I'm not talking about Windows CE (predictive code execution and more), I'm talking about Windows XP and the rest. The timers in Windows are horrible for controlling machines. The Windows Forms timer is the worst: it may fire, or it might wait a while. At best, it's got a 15 ms interval. A much better timer is the Multimedia timer, but it also is not perfect and can be ignored by the operating system for some period of time (not as bad as other timers, though). Timers in Windows, in short, are not absolutely periodic, and you absolutely cannot use them as a time base in and of themselves. Even retrieving system time is problematic.

If you can't count on a Timer for keeping time, what can you do? I opted for the most accurate method, accessing a CPU instruction called RDTSC. RDTSC returns the number of CPU clock ticks since power-up. Time won't get more accurate than that on a desktop PC.

RDTSC doesn't give you a Timer event, but with a usable timer like the Multimedia timer coupled with RDTSC to tell how much time has truly elapsed, you can get very respectable results. You can even tune those results with the MM Timer's interval to back off timer resolution to an acceptable level (you wouldn't want to update the timer so fast that the value doesn't really change from one update to the next!).

Regarding servo movements being changes in position over time, this view can be boiled down to its key components: Time means duration, as a move will start at some time, and end at a later time. Duration = End Time - Start Time. Time is pulled from RDTSC, with each increment being a tick of the CPU's clock.

Some might think, "I am just moving a servo, why do I care so much about time, why is it so critical?"- Well, because timing is everything. :) Seriously, Time is a defining characteristic of a physical movement. If it wasn't, we'd just say "Set servo to position X", and the servo would get there as fast as it could. Try to make your bot walk with THAT logic. :) Instead, we have to massage motion out of the servo to get TO some position at some point in time. If we don't do that accurately enough, there will be jerkiness and fluid motion will be ruined, likely with random interruptions of the move as it occurs.

So, in my code, the Timer just ticks off at pseudo-random intervals (due to Windows), and in the Timer event, I call a function to get RDTSC to calculate FractionOfMove for my move code. When I say "Ticks", I'm referring to the value of RDTSC here. The resulting "FractionOfMove" calculation is (CurrentTicks - StartTicks) / (EndTicks - StartTicks), resulting in a value representing the fraction of the move. If CurrentTicks >= EndTicks, then the move is over, and we just set the current position to the end position and end the move. FractionOfMove transitions from 0 to 1 fractionally over the course of the move. Zero at start, 1 at end of move.

All of that gives us the ability to know time (in CPU clock ticks) and an event in which to update servo positions (Multimedia Timer), now we just need to set the position of the move itself for a given fraction of the move. For a point-to-point, no acceleration move, this would mean something like CurrentPosition = ((EndPosition-StartPosition) * FractionOfMove) + StartPosition. This would give you the same result as RCServo, and same as stock MRC-3024 moves, and many others.

Doing what I often do, I'm going to put in a few notes: I played with the math Sine function for acceleration / deceleration, but found it to be problematic when interrupting a move. It was also more CPU-intensive. I also went after the true calculations for the physics of trajectories and acceleration, which, although an interesting dive into calculus (not my strongest area), also resulted in CPU-intensive code. The calculus-oriented physics calculations resulted in wide-swinging waves, often overshooting position and reversing when things (duration or end position) changed during the move.

I thought about it, a lot. I wanted simple calculations without all the CPU use and fuss. I started just looking at Acceleration. I ended up with Acceleration as FractionOfMove * Distance * FractionOfMove (note, "linear" movement is simply FractionOfMove * Distance). This gives a nice acceleration curve. At the start of the move, distance will be zero (0 * Distance * 0). At the end of the move, the result is the distance (1 * Distance * 1). The result in between is a nice acceleration curve. I used this same principle for decel, accelerating to 0.5 of the move as the end point, and reversing the values for decel as in the code. This worked nicely.

Now, on to interrupting a move:

Because by nature a timer is threaded, it can also be interrupted, and things can change that affect what happens in the timer external of the timer's code. This is bad for movement, but we've already dealt with that. It is GOOD for being able to alter what the timer does.

We still have a problem, if anyone's paying attention. :) We can't directly change the end time or the end position without grossly affecting motion. The servo will suddenly feel the need to be somewhere else if end time or end position is changed during the move, and that's bad.

What to do, what to do? We change our destination GRADUALLY. This is speaking of code I haven't yet written (mainly due to focusing on the sound card problem as per my other thread here), so I'm going to throw out some ideas in my head that I haven't coded yet:

* Calculate the difference in the new end position / end time versus current, and use my 0,1,2 theory to apply the change. This theory basically means that for a change's net value to be a factor of 1 (itself), it must transition from factors 0 to 2 fractionally over time so as not to be abrupt. In code, this would mean from the time the change is made to a new end time, the end time would be altered very gradually, then more and more as the end time nears. This could mean ending sooner or later. Imagine drawing out a deceleration, or shortening up a deceleration. Your position at time of change would be offset by 0 * change, at the midpoint from start of change to end of move by 1 * change, and at the end, offset by 2 * change. Same is true for a position being changed. The problem I see here is the possibility for abrupt changes towards the end of the move if the intended change is drastic- but then, if a change is drastic, wouldn't the resulting movement be equally drastic? :)

* Do something creative with the last move's delta (position over time) and the already written accel / decel code. Not sure how much trouble this will cause, I will end up experimenting a bit with this one.

So there we go, my $.02 on motion and servos.

*** General Update ***

I have been working on board design for the sound card for Roboard, and am getting close to something that should work. With the costs involved, I'm very tempted NOT to implement too much on this board. Right now, I'm thinking of adding a microphone preamp, a 1 to 2 watt amplifier, and maybe a "steering" circuit to accept 2 microphone inputs and generate an analog signal relating to left-to-right panning of signal source (this would support "turning of head to listen", and would take load off the CPU to do the same in code). As much as I'd like to integrate a power distribution system consisting of DC-to-DC converters with digitally tunable power output, the cost would be ridiculous in low volume, likely approaching the cost of Roboard itself (and the board would be a bit larger, and heavier, etc). That part of my project will probably have to be separate, which may work better in distributing the components around the 'bot a bit. PCBs will be a bit steep on price, and I need to buy a hot-air reflow station. Further, the board's got to be right from the start. Anyone betting on a disaster here? I almost am. :)

I've also repaired a few "broken tooth" HSR-5498SG servos. One little "karbonite" gear in the middle of the steel ones, and no surprise, IT is the one that fails.

Regarding the e-chain based hands, I have a couple micro servos sitting on the bench to repair, the pot leads are stressed, and they only work when magic finger pressure is applied just so. I have fine-gauge silicone wire I'm using to isolate the board from the stressed pot leads- tedious work on the micro servos. As for the hands, I have some drawings for some areas I know will not change, and I have what may be a final concept for hand bracketry sketched on paper. I need to fab some parts in polycarb, test the fit, and finalize the bracket dimensions. Then, I can CNC some brackets and build the hands.

I KNOW the hands will work out fine, the only problem I have left is to add wrist rotation in what is almost zero space. I'm not sure what I'm going to do there. I'm just going to build the hands as tight as I can and work from there. Worst case scenario, I hang a big servo out of the way and rotate the hand at a bearing joint.

Hips. I might get below the 4mm height I was thinking they were going to add. I'm going to experiment with some "Nylatron" for the thrust part of the bearing. If it doesn't stick, that's what I'm going to use. If it DOES stick, I'm going to try nylatron as a ball bearing surface, order some bearing balls, and make a ball cage out of UHMW or teflon for the thrust bearing. The problem I'm thinking I'm going to have is that the 2 x 1mm thick thrust washers and 2mm thick roller bearing assembly are just too heavy. It's a lot of weight, and they're very over-rated for this use (something like >1000 pounds thrust). They slap right together with the center bearing, but I want something lighter weight. Besides, if I cut the thrust washers myself, I can integrate a pulley on the circumference for rotating the joint (otherwise, I'd have to add it separately).

That's all for now...

Take Care,
Paul
PaulL
Savvy Roboteer
Posts: 423
Joined: Sat Sep 15, 2007 12:52 am

Post by Sazabi » Mon Sep 13, 2010 5:36 pm


Hi, Paul!
Great post! Thank you!
One question (yet :)) still here - how do you know the EndTick value? Does it come from a known servo speed and a known travel distance? I.e., with a speed of 17 deg per second and travel from 1400 to 1800, you could count the tick quantity needed to move through this distance, so the elapsed time for the movement will not be more than with linear speed. That is how I understand it. What's your way of doing that?

That's all I got for the moment:)

Regards,
Artem
Sazabi
Savvy Roboteer
Posts: 73
Joined: Mon Jan 07, 2008 8:57 am

Change in Servo Move Methodology I Forgot to Mention...

Post by PaulL » Tue Sep 14, 2010 11:53 am


Hi Artem,

You're right, I completely forgot to mention my view on that- EndTick is a result of duration, starting from a known point in time. Typically, when a servo move is specified, you say "move to x position at a speed of y". Instead, I say "move to x position in y amount of time". The speed is handled by the accel / decel code to arrive at the destination at a given point in time, much more like a human reaction. People don't move their hands and say "speed of x, position of y", and we sure don't worry about the actual speed value or the acceleration and deceleration we use, we say "I need my hand to be at position x by a point in time of y".

Time, Speed, and Position are all dependent upon each other, I just changed the way I control the actual motion in my code, using position and duration to control movement instead of speed and position. Speed takes care of itself with acceleration and deceleration. :)

I didn't like dealing with speed and position alone because that complicates acceleration and deceleration and makes handling end time much more difficult. It's much easier to build code that calculates an accel / decel curve, with controlling values of duration and end position.

This also helps move away from the notion of frame-based motions (poses) where all servos move at various speeds to end at a certain point in time, though you can still do that with this approach in setting all servo durations to the same value for a move. It also helps in setting up more complicated move sequences where some servos may stop, but others keep moving, then others may start, etc.

Further, I have functions for accel, linear, and decel separately, so you can accelerate to some speed, then maintain, then decel, etc. Also, you can use functions like accel alone for a punch, and decel alone to perhaps catch your bot when it falls (that I plan to try triggered by feedback from an accelerometer).
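To make the duration-driven approach concrete, here is a minimal sketch of a position function that eases in and out so the servo arrives at the target exactly when the duration expires. This is illustrative Python only (Paul's actual code is .NET), and the function and variable names are mine, not his:

```python
def servo_position(start_us, end_us, duration_s, elapsed_s):
    """Position (in PWM microseconds) at a point in time during a move.

    Smoothstep ease-in/ease-out: the servo accelerates from rest,
    cruises, and decelerates so that it lands exactly on end_us when
    elapsed_s == duration_s.
    """
    u = min(max(elapsed_s / duration_s, 0.0), 1.0)  # normalized time, clamped to 0..1
    s = 3 * u * u - 2 * u * u * u                   # smoothstep: zero slope at both ends
    return start_us + (end_us - start_us) * s

# Sample a move from 1400 us to 1800 us over 2 seconds:
for t in (0.0, 0.5, 1.0, 2.0):
    print(round(servo_position(1400, 1800, 2.0, t)))
```

The caller only ever specifies position and duration; speed falls out of the curve, which is the point Paul is making. His real code uses separate accel, linear, and decel phases rather than a single blended curve.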

Hope this helps!

Take Care,
Paul
PaulL
Savvy Roboteer
Posts: 423
Joined: Sat Sep 15, 2007 12:52 am

Post by Noobius » Fri Sep 17, 2010 1:27 pm

Hi Paul,

An excellent thread that has certainly whetted my appetite for one of the RoBoards. I am new to robotics (I haven't even bought one yet), but I have done quite a bit of programming over the years. Entirely desktop applications, though, no microcontroller or embedded systems like this; it's very interesting, I have to say.

At the moment I am keeping the cash I have towards a Bioloid Premium but Limor from Robosavvy suggested I look at the KHR1 with a RoBoard and this thread makes me question my choice so far. Dammit, I was so sure it was a Bioloid! I might yet take his advice...

Thanks for taking the time to share your knowledge and experience it is very much appreciated.
Noobius
Robot Builder
Posts: 11
Joined: Thu Sep 16, 2010 12:13 pm

Post by PaulL » Sun Oct 10, 2010 4:05 pm

Noobius,

Thanks, I'm mostly rambling, I type fast, so I tend to type a lot. :)

It's a good bit different to program for embedded solutions under Windows, but not as different as some might think. ;) The most difficult part in using Windows as an embedded platform is in working with timers to get to the actual capabilities of the hardware. Microsoft didn't intend for Win XP to run hardware this way, but with some tricks, you can make Windows do some pretty fantastic work quite reliably. Some balk at the thought of putting Windows on a Roboard and insist on Linux, but personally, I'm happy to have all the features and "extras" (like SAPI and .Net) available to use from within my application. I've been impressed at what .Net can actually do performance-wise.

In programming Windows Forms applications, I (and most others) typically don't worry a whole lot about CPU and resources. However, for an embedded solution with limited hardware (a 1 GHz Vortex86DX with 256 MB RAM and a MicroSD hard drive is quite a difference from a desktop PC with a Core2Duo, 2 GB RAM, and SATA drives), you have to do things much more cautiously to not adversely impact performance. Just connecting to an MS Access MDB file with ADO from .Net really hits Roboard pretty hard.

I use plain ole' arrays when dealing with collections of objects, as the performance is better. I tried a few "fancier" collection types, but they just aren't as fast as arrays.

The biggest "trick" I use with my software is to do as much work up front as possible, and reduce or eliminate object creation during execution. I have a C# coworker who frowns on global variables, but initializing instances of objects application-wide, up front, reduces the overhead of creating and disposing of those same objects while running.

Regarding performance, the biggest favor you can do for yourself in writing code for Roboard is to make it as lean and fast as possible, time the results and try different approaches as you go. The more efficient your code is WHILE running, the more you can do when you ARE running.
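That measure-as-you-go habit can be as simple as a stopwatch wrapper. The snippet below is an illustrative Python sketch (not Paul's .NET code) that times an allocate-every-call loop against filling a buffer allocated once up front, the pre-allocation trick described above:

```python
import time

N = 100_000

def time_it(label, fn, repeats=5):
    """Run fn several times and report the best wall-clock time,
    which filters out most scheduler noise."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - start)
    print(f"{label}: {best * 1000:.2f} ms")
    return best

def build_by_append():
    # Allocates and grows a fresh list on every call.
    out = []
    for i in range(N):
        out.append(i * 0.5)
    return out

pre = [0.0] * N  # allocated once, up front, then reused

def fill_preallocated():
    # Writes into the existing buffer; no allocation per call.
    for i in range(N):
        pre[i] = i * 0.5
    return pre

time_it("append each call  ", build_by_append)
time_it("reuse preallocated", fill_preallocated)
```

The exact numbers matter less than the habit: time both variants on the target hardware and keep whichever wins there, not on your desktop.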

Conceptually, the process of building embedded apps on Roboard under WinXP is not that different from building a Windows Forms app for users. The biggest difference is that you have to think of your UI in different terms (sensor inputs and servo outputs instead of labels and buttons). For User apps, the main thread handles the UI, and Microsoft recommends things like progress bars and coding practices that keep the application "lively", and not appearing in a "locked" state to the user. With embedded applications, you're better off forgetting about the form being run by a user (unless it is used for configuration and NOT even UPDATED while running), and focusing on the aspects of UI as specific capabilities of your robot (sensors, servos). You will want to give up a few bells and whistles in code-intensive libraries for the sake of performance, but it's worth it.

General Update:

I've been working on my "hands" project, and am holding off on the sound card build for the moment (price is steep, still working on sourcing more sound chips, might have another vendor for the chips).

I have the brackets drawn out in CAD, and they are nearly complete. The parts are in 3 pieces that use some creative bending to get the required shapes. The dimensions are pretty tight, and bending the .040" aluminum can change dimensions enough to cause problems. I need to cut some aluminum sheet and test some scoring methods for bending to get the bends as accurate and square on the inner corners as possible. So, score and cut alu strips on my mill, measure, bend, measure. I'm going to try various scoring with a 90 degree "chamfer" bit, and scoring with a 1/8" ball end mill at different depths to see how it goes.

I don't have a bending brake, but I'm not sure that a brake would be of much use due to the odd bends required. The tolerance for some areas is + / - .007", which is pretty tight for bending sheet metal. I will definitely have to score the metal at the bends, but will prove out a few methods first, then adjust the brackets in CAD as needed to account for dimensional changes in the bends.

The MKS DS-450 servos for the fingers are 3.10 kg/cm, but my radius for the finger pull will be closer to .5 cm. At any rate, his hands will have 15.5 kg/cm total for each hand, and I'm pretty sure 3.10 kg/cm is enough to break something in the plastic fingers / teflon. It's a good thing the actual parts are cheap, I just don't want the strength to be enough to bend the .040" aluminum. :) I have left myself a few options for strengthening the aluminum hand frame where it may be needed, but this will add complexity and weight. I will just have to test it all out to see how well it works and tweak the design.
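A quick sanity check on those torque numbers, assuming the pull really does act at the stated 0.5 cm radius (this is just arithmetic, not anything from Paul's build):

```python
torque_kg_cm = 3.10   # MKS DS-450 rated torque per finger servo
radius_cm = 0.5       # approximate finger-pull radius

# Force at the cable = torque / radius, so each finger sees far
# more pull than the rated kg.cm figure alone suggests:
pull_kg = torque_kg_cm / radius_cm
print(pull_kg)            # 6.2 kg of pull per finger

# Five finger servos per hand give the 15.5 kg.cm total quoted above:
print(5 * torque_kg_cm)   # 15.5
```

At 6.2 kg of cable pull per finger, the worry about snapping plastic fingers or bending .040" aluminum looks well founded.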

I will have to rework the "cartwheel" move proportions, these hands will be fairly heavy w/ 5 micro servos and brackets and such. I may try to use these micro servos for wrist rotation, but I'm not sure they'll be strong enough to handle that.

Regarding code, I don't know if I've mentioned, but I have changed my approach to servos due to the fact that I will be using Pololu serial servo controller boards via Roboard's TTL serial, in addition to the built in PWM. I have created a generic servo object that only does what any servo does, with a "manager" class that performs the required actions on the hardware for a given servo (the manager class incorporates the move update timer as well as the accel / decel code, as this applies to all servos). This will let me set up the move sequencer across all servos, regardless of whether they're attached to Roboard's PWM, or to the Pololu serial board through Roboard's com port.
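The generic-servo / manager split might look something like the following Python sketch. The class and method names here are hypothetical (Paul's code is .NET), and the real hardware writes are stubbed out with prints:

```python
from abc import ABC, abstractmethod

class ServoBackend(ABC):
    """Hardware-specific half: one implementation per output type."""
    @abstractmethod
    def write_position(self, channel: int, pulse_us: int) -> None: ...

class PwmBackend(ServoBackend):
    def write_position(self, channel, pulse_us):
        # Would poke the Roboard's built-in PWM hardware here.
        print(f"PWM ch{channel} -> {pulse_us}us")

class SerialBackend(ServoBackend):
    def write_position(self, channel, pulse_us):
        # Would send a Pololu serial command over the COM port here.
        print(f"serial ch{channel} -> {pulse_us}us")

class Servo:
    """Generic servo: only what *any* servo has, nothing hardware-specific."""
    def __init__(self, backend, channel, position_us=1500):
        self.backend = backend
        self.channel = channel
        self.position_us = position_us

class ServoManager:
    """Owns the update tick (and, in the real design, the accel/decel
    code) and pushes positions through each servo's backend, so the
    sequencer never cares which bus a servo hangs off."""
    def __init__(self, servos):
        self.servos = servos

    def tick(self):
        for s in self.servos:
            s.backend.write_position(s.channel, s.position_us)

mgr = ServoManager([Servo(PwmBackend(), 0), Servo(SerialBackend(), 3)])
mgr.tick()
```

The payoff is exactly what the paragraph describes: one move sequencer drives PWM-attached and serial-attached servos identically.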

Curious question- has anyone tried neural networks to create kinematics engines? I'm not interested in neural networks for AI, but just for handling the kinematics. With feedback from an accelerometer and starting from a standing position, one could create some routines to exercise servos to train the network. Hmmm...

Take Care,
Paul
PaulL
Savvy Roboteer
Posts: 423
Joined: Sat Sep 15, 2007 12:52 am

Roboard Experience Updates...

Post by PaulL » Sat Oct 16, 2010 3:25 pm

Well, I have posted a good bit under this thread and elsewhere, but I have uploaded ScrewTurn Wiki to my web space at http://www.mytrackwork.com, and have been adding documentation there instead.

This is one of my more recent entries:

http://www.mytrackwork.com/My-Roboard-Application-Architecture.ashx

It's a big project, and as I like to ramble in text and some don't like to see that in a forum, I'm going to post "bulk" content over there. :) And of course, I may put who knows what else there. :)

If anyone has any problems or comments, for now, reply to this thread or PM me here.

Take Care,
Paul
PaulL
Savvy Roboteer
Posts: 423
Joined: Sat Sep 15, 2007 12:52 am

Post by PaulL » Sat Nov 27, 2010 6:36 pm

Thought I'd post an update...

Well, I happened upon the most grotesque misuse of a system ever while searching for something I was trying to do in the UI for the configuration aspect of the system I'm building: the US Patent and Trademark Office, along with the vast quantity of "patented" software methods that should be considered "obvious" and never should have been granted.

My intention was to use graphical objects on-screen to represent actions in sequences, but it seems there's one particular company that has managed to acquire a number of patents that affect me in a negative way.

Some company has patented the basic notion of graphical programming via a UI with moveable objects. This is a very broad scope for a patent, and I'm amazed they were able to do it. Further, they patented the notion of indicating a code loop with a box. That one doesn't affect me as much, but it's still amazing. If I had a team of lawyers, my method is different enough to be arguable, but just barely so. On my own, they'd have me for lunch.

I had already spent a decent amount of work in generating objects I could render on-screen graphically. I had made quite a bit of progress, and it would have been ready to use in setting up test scripts in a few more days of programming. If released, this type of method would leave me liable for patent infringement. Nice, huh?

I ended up spending a few days investigating software patents and arrived at some very sad and disappointing understandings and conclusions. The intent of the patent system is to "encourage innovation", but it does seem quite obvious that it does exactly the opposite (it seems to encourage patenting itself in the software industry). If I had continued and released the graphical configuration tool of great use to my system, even as freeware or open-source, I could have been held liable for loss of revenue due to patent infringement. If you buy software from a vendor, and that software infringes on some seemingly trivial aspect, you can be held liable for their losses, not just the guy who built / sold it to you. I think this is all "patently" absurd.

I have further learned that software patents are used as bargaining chips between big companies like Microsoft, NI, Oracle, Sun, IBM, you name it. In essence, they "trade" rights to patents so they all can keep making software. The more patents you have, the better your position. Patents aren't cheap unless you do it yourself, and even then, they're not cheap enough to build up a portfolio of several dozen or more, and you're up the creek if you don't word it just so.

If you're a lone programmer, you don't stand a chance against those guys if you "infringe" on what a capable programmer would consider "obvious" methods that have been patented. Just being threatened with a lawsuit is enough to make one "cease and desist". What gets me is that you can be sued not for the money you made or didn't make, but for the money THEY, allegedly, could have made. This is typically more than a small business would have made, and it appears they can go after our personal assets if they so desire.

I get the idea of wanting to protect intellectual property per se, but I don't get that patents can be issued so frequently for such minimal things for a technology that is presently in a constant state of evolution. Everything is "new", meaning EVERYTHING is patentable every step of the way as software solutions evolve. Software is explosively evolving, and every day someone does something different, and sooner or later, two guys will come up with similar ideas for obvious reasons (but the one working for the big company that gets the patent first wins). I am starting to understand why some software doesn't look or behave in more obvious ways - it's likely a result of some patent infringement case they're trying to avoid.

Sure, most of the "obvious" kinds of patents can be argued, but a one-man shop just will not have the proper resources, time OR money to combat any kind of patent attack from a big company.

There are even companies out there that don't write software at all, but merely obtain software patents and sue others for patent infringement. It's all legal, amazingly.

It is quite discouraging to take time and look into the current state of affairs in this area, and to find things I have done or created without knowledge of existing patents that would infringe on such patents. If I can create something on my own in software, without knowledge of someone else's efforts, regarding software patents, I personally feel that I should not be infringed upon for my own ideas (by being sued for patent infringement) that happen to be similar to those of others. If I can do a particular thing, by myself, in my home office, without knowing about their stuff, wouldn't that be considered "obvious"???

Common sense would say yes, but the patent system isn't used that way, and big business takes advantage of that. At the end of the day, if you patent it first, you own it. That's the way it goes, no matter how trivial the claim actually is.

If I had my way, there'd be a moratorium on software patents while some "common sense" is injected, and an expansion of copyrights to protect software for what it is, a complete package to solve more specific types of problems. Patenting simple and obvious concepts with gross claims as to their impact at such an early stage of a technology's evolution only serves to discourage others from even trying.

I saw one patent infringement case that involved settling out of court - that company is now owned by the company that sued them (and it wasn't Microsoft in this particular case).

As disappointing and discouraging as this all is, I'm working on determining other "new" methods of accomplishing on-screen configuration without infringing on such patents. This requires more effort on my part, and significant effort I've made has to be tossed out completely. The result will be a less-intuitive interface, as the "best" solution already has been patented. I also have a few other things I need to change internally. Further, I will have to get a few patents on my own software to have even the slightest chance at protecting my own efforts.

I will likely have to spend yet more time researching ideas I've incorporated into the system and examine them for infringement potential, which will also be very time consuming. I will also need to apply for patents as I go, which will be costly.

I'm not giving up, I just can't use "obvious" methods others have patented, and to make sure that isn't the case, I have to continue searching hundreds upon hundreds of patents at each notable (or potentially "new") turn.

For kicks, here's a screenshot of the functionality I built but now have to destroy:

Image

That's a "what could have been" picture, "what shall be" is yet to be determined.

So, if you're writing software you want to sell OR post, check out the patents and patent laws in your country before you start writing code, particularly if you're in the US, but also in your own country. ;) This isn't limited to Roboard, this is relevant for any software effort, on any hardware platform (yes, even embedded controller software).

Take Care,
Paul
Thought I'd post an update...

Well, I happened upon the most grotesque misuse of a system ever when searching for something I was trying to do in the UI for the configuration aspect of the system I'm building: The US Patent and Trade Office, along with the vast quantity of "patented" software methods that should be considered "obvious", and never should have been granted.

My intention was to use graphical objects on-screen to represent actions in sequences, but it seems there's one particular company that has managed to acquire a number of patents that affect me in a negative way.

Some company has patented the basic notion of graphical programming via a UI with moveable objects. This is a very broad scope for a patent, and I'm amazed they were able to do it. Further, they patented the notion of indicating a code loop with a box. That one doesn't affect me as much, but it's still amazing. If I had a team of lawyers, my method is different enough to be arguable, but just barely so. On my own, they'd have me for lunch.

I had already spent a decent amount of work in generating objects I could render on-screen graphically. I had made quite a bit of progress, and it would have been ready to use in setting up test scripts in a few more days of programming. This type of method, if released, would be liable for patent infringement. Nice, huh?

I ended up spending a few days investigating software patents and arrived at some very sad and disappointing understandings and conclusions. The intent of the patent system is to "encourage innovation", but it does seem quite obvious that it does exactly the opposite (it seems to encourage patenting itself in the software industry). If I had continued and released the graphical configuration tool of great use to my system, even as freeware or open-source, I could have been held liable for loss of revenue due to patent infringement. If you buy software from a vendor, and that software infringes on some seemingly trivial aspect, you can be held liable for their losses, not just the guy who built / sold it to you. I think this is all "patently" absurd.

I have further learned that software patents are used as bargaining chips between big companies like Microsoft, NI, Oracle, Sun, IBM, you name it. In essence, they "trade" rights to patents so they all can keep making software. The more patents you have, the better your position. Patents aren't cheap unless you do it yourself, and even then, they're not cheap enough to build up a portfolio of several dozen or more, and you're up the creek if you don't word it just so.

If you're a lone programmer, you don't stand a chance against those guys if you "infringe" on what a capable programmer would consider "obvious" methods that have been patented. Just being threatened with a lawsuit is enough to make one "cease and decist". What gets me is that you can be sued not for the money you made or didn't make, but for the money THEY, allegedly, could have made. This is typically more than a small business would have made, and it appears they can go after our personal assets if they so desire.

I get the idea of wanting to protect intellectual property per se, but I don't get that patents can be issued so frequently for such minimal things for a technology that is presently in a constant state of evolution. Everything is "new", meaning EVERYTHING is patentable every step of the way as software solutions evolve. Software is explosively evolving, and every day someone does something different, and sooner or later, two guys will come up with similar ideas for obvious reasons (but the one working for the big company that gets the patent first wins). I am starting to understand why some software doesn't look or behave in more obvious ways - it's likely a result of some patent infringement case they're trying to avoid.

Sure, most of the "obvious" kinds of patents can be argued, but a one-man shop just will not have the proper resources, time OR money to combat any kind of patent attack from a big company.

There are even companies out there that don't write software at all, but merely obtain software patents and sue others for patent infringement. It's all legal, amazingly.

It is quite discouraging to take time and look into the current state of affairs in this area, and to find things I have done or created without knowledge of existing patents that would infringe on such patents. If I can create something on my own in software, without knowledge of someone else's efforts, regarding software patents, I personally feel that I should not be infringed upon for my own ideas (by being sued for patent infringement) that happen to be similar to those of others. If I can do a particular thing, by myself, in my home office, without knowing about their stuff, wouldn't that be considered "obvious"???

Common sense would say yes, but the patent system isn't used that way, and big business takes advantage of that. At the end of the day, if you patent it first, you own it. That's the way it goes, no matter how trivial the claim actually is.

If I had my way, there'd be a moratorium on software patents while some "common sense" is injected, and an expansion of copyrights to protect software for what it is, a complete package to solve more specific types of problems. Patenting simple and obvious concepts with gross claims as to their impact at such an early stage of a technology's evolution only serves to discourage others from even trying.

I saw one patent infringement case that involved settling out of court - that company is now owned by the company that sued them (and it wasn't Microsoft in this particular case).

As disappointing and discouraging as this all is, I'm working on determining other "new" methods of accomplishing on-screen configuration without infringing on such patents. This requires more effort on my part, and significant effort I've made has to be tossed out completely. The result will be a less-intuitive interface, as the "best" solution already has been patented. I also have a few other things I need to change internally. Further, I will have to get a few patents on my own software to have even the slightest chance at protecting my own efforts.

I will likely have to spend yet more time researching ideas I've incorporated into the system and examine them for infringement potential, which will also be very time consuming. I will also need to apply for patents as I go, which will be costly.

I'm not giving up; I just can't use "obvious" methods others have patented, and to make sure that isn't the case, I have to continue searching hundreds upon hundreds of patents at each notable (or potentially "new") turn.

For kicks, here's a screenshot of the functionality I built but now have to destroy:

[screenshot]

That's a "what could have been" picture, "what shall be" is yet to be determined.

So, if you're writing software you want to sell OR post, check out patents and patent law before you start writing code, particularly if you're in the US, but also in your own country. ;) This isn't limited to Roboard; this is relevant for any software effort, on any hardware platform (yes, even embedded controller software).

Take Care,
Paul
PaulL
Savvy Roboteer
Posts: 423
Joined: Sat Sep 15, 2007 12:52 am

Post by veltrop » Sun Nov 28, 2010 4:12 pm

I'm sorry to hear about your patent woes.

But I try not to let it discourage a small fry like myself. I don't think they tend to go after infringement suits until it's financially advantageous for them to. If one were big enough to be targeted by them, it would likely be at a stage when one was fiscally ready for it. Having to eventually pay the fees on the patent out of your profits is better than no profits at all.

I would go as far as to say this small niche market would fly right under their radar.

And what patent is it? Did you find it with a general web search? Are you certain its fees haven't expired? It doesn't sound new enough to worry about, considering that it basically describes LabView...

But yeah, if it hits the fan and they want you to go to court, you can't stand up to a corporation.

In my case I'm releasing my stuff for free, so all that above about affording patents and becoming their target if I had a product wouldn't really affect me. But it would be a shame if I stepped on toes accidentally and got cease and desist orders.
... Maybe the EFF could help ;)


Anyway, I hope you can find a way to work around this, good luck!
veltrop
Savvy Roboteer
Posts: 59
Joined: Wed Jul 22, 2009 8:04 am
Location: Japan

Post by PaulL » Mon Nov 29, 2010 10:41 am

Hey Veltrop,

My intention is to produce a product that has the capability of competing with other products in the process control and MES markets (definitely radar material!).

I see Roboard as being no less capable than a high-powered industrial control device, and definitely more so in most respects. It is very much like a PLC in that it has specialized IO, but it is also very much a PC, because it is one. :) And, being a PC, it can run the same kinds of software one would use in industrial process control. If a PC-based solution is built for robotic control (such as for our toys) in a highly configurable, high-performance, dependable, robust fashion (all desirable in our application), nothing differentiates the application from mainstream ones! We use sensors, we use motors, servos, etc. If you write a plugin for such a highly configurable robotic control system that can talk to a PLC via Ethernet as if it were a local IO device (easily done), you have the equivalent of a PLC in a PC. If you can talk to multiple PLCs that run their own programs (easily done), you have a SCADA system. If you add the capability to create user interfaces (easily done), you have an HMI. If you add database capabilities (easily done), you have a Historian. These are all (in a patenting context) legally obvious conclusions; there are more that are less obvious, but just as easy.

The particular patent I'm speaking of in regard to on-screen graphical configuration is exactly what you guessed: a patent from NI, reflecting Labview. Their patents are current, and they are unrelenting in pursuing infringement opportunities.

Have a look. Scroll down to the one about Measurement Computing:
http://www.bostonpatentlaw.com/PracticeAreas/Representative-Matters.asp

They bullied, then bought them out. Softwire was an interesting product, now in the hands of NI, further strengthening NI's resolve in prosecuting graphical configuration of software. The only thing, IMHO, that got Measurement Computing as much money in the company sale as they did is that they found the two Fluke patents they could use as leverage. Now NI owns those patents, the "offending" company, and whatever other patents it might have had, as well as having settled yet another case in their favor over this approach. The www.softwiretechnology.com website has a copyright of 2008, and there doesn't appear to be any way to purchase the software (the site has that familiar "abandoned" look). Bought, and left to rot on the vine. NI's big and bad.

I have a few ideas that their patents won't apply to, things that steer clear of their approach. "Designing around" patents - that is, designing alternative solutions - is intentionally encouraged by the patent system, and that's what I have to do. I just find it ironic that I, one guy, can come up with similar ideas on my own, essentially building a competing system, with a "spare time" level of effort. With that in mind, I think the USPTO never should have granted NI the patents in the first place. Maybe NI should hire me. @NI: you can buy my ideas, but the price will be a bit steep. :) After all, I may have the one last viable solution to help you completely corner the market. I wouldn't mind licensing patents from you if that's affordable. If not, I'll find a different way. :)

Be wary of even giving things away for free - you can, at least in the US, be sued for loss of revenue in a patent infringement lawsuit. I also thought it was OK to "give" software away, that you aren't affected if you don't make a profit, but as it turns out, it's not OK - they can sue you here in the US not for what you did or didn't make, but for what the patentee allegedly COULD have made. As an example, open source projects have been shut down due to the threat of infringement lawsuits.

I hear you loud and clear that this is a minor market in and of itself, but even low-cost industrial control systems are not cheap, and free is enticing to anyone looking at steep automation system prices. If a company takes on a free solution to solve a similar problem, word will spread fast, and attention will arrive quickly. The only way to stay off their radar is to "lock down" the system's capabilities, keeping it a "toy robot-centric" solution. Even then, they might point at Mindstorms (which uses NI software) and sue you, who knows.

Thanks and Take Care,
Paul
PaulL
Savvy Roboteer
Posts: 423
Joined: Sat Sep 15, 2007 12:52 am

Post by veltrop » Mon Nov 29, 2010 2:33 pm

Wow, thanks for the added info Paul.
veltrop
Savvy Roboteer
Posts: 59
Joined: Wed Jul 22, 2009 8:04 am
Location: Japan

Post by Spiked3 » Tue Dec 14, 2010 10:16 pm

Hey Paul,
One way to play the patent game is to patent something that uses another's patented technology. That in itself buys you nothing, but it does prevent them from doing it, and it can be very valuable if it comes up - so while they may have patented manipulating objects in a UI, they probably have not patented using objects in a UI to manipulate robots (or any other thing they may do or have done). And if they were to violate your patent... like I said, it leaves you in a decent place.
The patent system does suck, but it's just a game. Do not be afraid to play. They will not bother you unless you have something worth bothering with to begin with.
Spiked3
Savvy Roboteer
Posts: 41
Joined: Sun Feb 22, 2009 8:31 pm

Post by PaulL » Sun Dec 19, 2010 3:46 pm

Veltrop, no problem. :)

Spiked3, yup, I know exactly what you mean. NI does sell a robotics "package" for something like $15k, which I find to be absurd. I've done some more research, and Microsoft has done significant work in two areas that I find beneficial: Workflows in .NET Framework 4, and their MSRDS product. The way I see it is this: if I build a product based loosely on Microsoft products, using Microsoft's Visual Studio as a customer of theirs, I don't think I will find any trouble. Their look 'n' feel, my engine.

I plan to do something it seems Labview isn't going to do: I plan to provide information on the interfaces to my plugin system, allowing users to extend the system's capabilities. My thought is, why should I limit the system's potential due to my own time and resource constraints? :)

To be clear, Labview can do a lot of low-level work I frankly am not interested in, such as developing for microcontrollers. My intended platform is Windows / x86 based systems. Labview uses a data flow paradigm. My approach is significantly different, but is based on Microsoft technologies.

I have a concept for patenting that may help keep me in the clear, but I need to talk to a lawyer before I finish / distribute my app.

To put some more info out there, I look at all interaction with my engine as asynchronous activities. Certain code does not execute until some asynchronous activity has completed - typing text, clicking a button, some pin going high, some threaded function completing. ALL of these things are asynchronous, meaning that within my engine, I have no awareness of when they will complete. Operations within the engine should not be held up unless they actually depend on completion of some action. Action completion invokes some other action, and so on.
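The completion-driven idea above can be sketched in a few lines. This is an illustrative Python toy, not Paul's VB.net engine: the `AsyncAction` and `Handler` names are invented here, and a real engine would dispatch on threads or OS events rather than direct calls.

```python
# Sketch of a completion-driven engine: nothing runs until the
# asynchronous activity it depends on finishes, and each completion
# invokes whatever actions were registered as depending on it.

class AsyncAction:
    def __init__(self, name):
        self.name = name
        self._on_complete = []   # actions waiting on this one

    def then(self, action):
        """Chain: run `action` only after this one completes."""
        self._on_complete.append(action)
        return action            # allows fluent chaining

    def complete(self, result=None):
        # The engine never blocks waiting; it just reacts when told "done".
        for nxt in self._on_complete:
            nxt.run(result)

class Handler(AsyncAction):
    def __init__(self, name, fn):
        super().__init__(name)
        self.fn = fn

    def run(self, value):
        # Do the work, then signal our own completion to dependents.
        self.complete(self.fn(value))

log = []
pin_high = AsyncAction("gpio pin high")
pin_high.then(Handler("read adc", lambda _: log.append("sample"))) \
        .then(Handler("update ui", lambda _: log.append("redraw")))

pin_high.complete()   # simulate the pin-change event arriving
# log is now ["sample", "redraw"]
```

The point is the shape: operations are never held up unless they actually depend on a completion, and each completion simply triggers the next action.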

There is a way of looking at computer hardware and software from an evolutionary perspective that reveals the obviousness of what has been done since the beginning, slowly, over time: abstraction. Abstraction from what? Machine code. As abstraction has increased, productivity and efficiency in developing systems have also increased. From everything up to MS-DOS, to Windows, to Win9x, to Windows XP and beyond, everything has been about revealing technology in ways that are easier for users to understand.

A user who knows NOTHING about CPU registers, paged memory, or even hard drives or RAM can effectively use a computer to accomplish work. That would not be possible without abstraction, without presenting the machine as a system they can relate to, with pictures and buttons and windows, word processors and document viewers that are represented graphically as pieces of paper, etc. People not knowing a thing about programming can set up and run a website complete with ordering, invoicing, shopping carts, etc. It's all about abstraction, about presenting the user with concepts they can relate to more easily.

What's unique about my approach is in varying LEVELS of abstraction within the system, utilized in the same fashion regardless of level. If you want to play with pins on a device, you can. If you want to say "walk over there", you can. THAT is, to me, an empowering system. And to provide advanced users the ability to EXTEND those capabilities is, to me, a recipe for one heck of a system. ;)

To further clarify what I mean by abstraction, let me explain: for any job, there is an inherent language that facilitates the job. Plumbers, doctors, lawyers, programmers, they all speak in their own specialized versions of languages relating to the work they do. A plumber could use Visio or MS Paint to draw out the plumbing for a house, but this is less intuitive than perhaps a plumbing-centric CAD program. Electronics engineers could work with some drawing program to create schematics, but it makes more sense to use schematic drawing / CAD programs that are based on the disciplines of the technology (symbology, etc).

The whole trick is to provide capabilities to the user in the most appropriate means possible. Even for advanced users that CAN write programs in assembler, advanced tools can help get the job done much faster.

For example, one could take a PC, start with assembler, design an operating system, design a means for accessing devices (drivers), and build an entire graphical operating system focused around robotics for ease of setup and interaction, but why would you do that when you can use more advanced tools already created to do the same thing? One can argue performance, and there's some merit there, but with a 1 GHz CPU such as the one in Roboard, where does it become important to work in the nanosecond range? If you need to work in nanoseconds, perhaps an external device controlled by a more advanced system is a better answer! As a prime example, the PWM generator on Roboard is a specialized device. You could manipulate GPIO in software to generate PWM, but the PWM hardware uses much less CPU to do so. Changing PWM settings frequently enough to create smooth motion is entirely doable with minimal CPU. Updating a servo pulse every 2 microseconds in software is a waste! There are better devices for that, and they've already been built.
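The hardware-versus-software PWM point can be made with back-of-envelope arithmetic. The channel count and resolution below are illustrative assumptions, not Roboard specifications:

```python
# Why bit-banging servo PWM in software is wasteful compared to a
# hardware PWM block. All numbers are illustrative assumptions.

frame_hz = 50              # standard RC servo frame rate
resolution_us = 1          # desired pulse-width resolution, microseconds
frame_us = 1_000_000 // frame_hz   # 20,000 us per frame
channels = 16              # assumed channel count

# Software bit-banging: the CPU must make a timing decision every
# resolution_us, for every channel, every frame.
sw_events_per_sec = channels * frame_hz * (frame_us // resolution_us)

# Hardware PWM: one register write per channel per setpoint change;
# assume we retarget every frame for smooth motion.
hw_writes_per_sec = channels * frame_hz

# sw_events_per_sec is 16,000,000 timing decisions per second;
# hw_writes_per_sec is 800 register writes per second.
```

Under these assumptions the software approach costs four orders of magnitude more CPU attention than simply updating the hardware PWM setpoints each frame.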

My advice, and approach, is to take advantage of Roboard for being what it is, a PC, capable of a lot of high-level tasks using software already written. I could program in assembler, but that's too tedious. I could program in C, but that's not my language of choice. I do VB.net for work, so I stick with that. :) I don't need to worry about networking, that's been written. I don't need to worry about accessing USB devices, that's been done. The operating system gives me a level of abstraction not possible without a monumental amount of work on my part, perhaps more than I could do in my lifetime.

All this said, yes, I've been rewriting device access code for Roboard, and the main reasons for doing so is to make device access more threading-friendly, and to make things easier, to create my own level of abstraction.

Personally, I generally am not fond of details and tend to think instead in terms of concepts. I can work with details when needed, but I like to get them "off my plate", to work over some aspect to the point where some sub, function, or class is intuitive and not detail-oriented. I like to pull details out into configuration, something that can be changed after the detailed code is written. For example, I can't remember the Pololu serial protocol involved with the class I wrote for the Pololu boards, but the class itself is functionally easy to use. The protocol doesn't matter once the code is written to expose the desired capabilities.
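As a sketch of that "wrap the protocol once, then forget it" idea, here is a hypothetical Python wrapper in the spirit Paul describes (his actual class is VB.net). The framing used is the Pololu Mini SSC command (a 0xFF start byte, then channel and position); the class and method names are invented, and the byte sink is injected so nothing here needs real hardware:

```python
# The protocol details live in one place; callers only see an
# intuitive set_position() and never touch raw bytes again.

class PololuServo:
    MINI_SSC_CMD = 0xFF   # Mini SSC start byte

    def __init__(self, write_bytes):
        # write_bytes is any callable accepting bytes,
        # e.g. an open serial port's write method.
        self._write = write_bytes

    def set_position(self, channel, position):
        """Move `channel` to `position` (0..254 per the Mini SSC command)."""
        if not 0 <= position <= 254:
            raise ValueError("position out of range")
        self._write(bytes([self.MINI_SSC_CMD, channel, position]))

sent = bytearray()
servo = PololuServo(sent.extend)   # capture bytes instead of a real port
servo.set_position(3, 127)         # caller never sees protocol framing
```

Injecting the writer also makes the class testable on a desk, with the serial port swapped in only when the robot is connected.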

Wow, I can ramble a lot... :)

Take Care,
Paul
PaulL
Savvy Roboteer
Posts: 423
Joined: Sat Sep 15, 2007 12:52 am
