First movement of the HSRDP project arm, linked to ROS through an Arduino that controls the motors, code here. The test setup only had 2 DOF plus the gripper, but now that the code works it's just a matter of hooking it all up!
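To give a flavour of how the ROS side talks to the Arduino (this is a minimal sketch, not the actual HSRDP code linked above), imagine the Arduino running a rosserial subscriber on a made-up /arm_joint_command topic; a rospy node can then stream joint angles to it like so:

```python
#!/usr/bin/env python
# Minimal sketch of the ROS side of the arm link (not the actual HSRDP code).
# Assumes the Arduino runs a rosserial subscriber on a hypothetical
# /arm_joint_command topic carrying two joint angles plus a gripper value.
import rospy
from std_msgs.msg import Float32MultiArray

def main():
    rospy.init_node('arm_command_publisher')
    pub = rospy.Publisher('/arm_joint_command', Float32MultiArray)
    rate = rospy.Rate(10)  # 10 Hz command stream
    angle = 0.0
    while not rospy.is_shutdown():
        msg = Float32MultiArray()
        # [shoulder_rad, elbow_rad, gripper_open_fraction] -- placeholder values
        msg.data = [angle, -angle, 0.5]
        pub.publish(msg)
        angle = min(angle + 0.01, 1.0)  # slowly sweep the joints
        rate.sleep()

if __name__ == '__main__':
    main()
```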
A big problem with “intelligent” robotics today is that hackers put so much time into making the hardware chassis for their robots that by the time they’ve gone through all that effort they never even get to the interesting software stuff.
Now, by “intelligent” I mean robots that have some higher-level purpose, as opposed to following a line, avoiding light or sweeping the floor (badly): things like giving a tour of a community center, serving drinks during parties or even guarding the place.
As we were donated a UMI RTX 100 arm last year, I thought: why not put together a relatively sophisticated robotics platform that will allow more software-inclined hackers (myself included) to write interesting applications for it? Unfortunately the arm was free for a good reason: it had been sitting in someone’s garden shed for the past decade and has seen some corrosion, so I spent a few months (on and off) restoring it. Sadly the circa-1986 control board is next to useless, as the UV-erasable ROM chips have lost their contents. The good news is that mechanically the arm is now in top condition, and as for the control board, today it can be replaced with an Arduino Mega and a few motor drivers (on their way from China!), and what better way to learn how these things are made?
The plan was to fix it up and mount it on a mobile platform. Originally I thought of sourcing a tracked platform with differential drive, but it turns out one of the other Hackspacers (Hipster) got hold of a mobility scooter: no differential drive, just rear wheel drive with passive front steering. What was simplified in hardware just got that much more complicated to handle in software (no more on-the-spot turns!)
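However the steering ends up being actuated, the planner now has to respect a car-like constraint: no turning without moving forward. Below is a rough bicycle-model sketch of that constraint; the wheelbase and steering limit are made-up numbers, not measurements of the scooter.

```python
import math

WHEELBASE_M = 0.7          # assumed distance between front and rear axles
MAX_STEER_RAD = 0.6        # assumed mechanical steering limit (~34 degrees)

def twist_to_steering(linear_x, angular_z):
    """Convert a (v, w) velocity request into (v, steering angle) for a
    car-like base using a simple bicycle model. Returns None for requests
    the platform physically can't do (e.g. turning on the spot)."""
    if abs(linear_x) < 1e-3:
        # No forward motion: a car-like base can't rotate in place.
        return None
    steer = math.atan2(WHEELBASE_M * angular_z, linear_x)
    steer = max(-MAX_STEER_RAD, min(MAX_STEER_RAD, steer))
    return linear_x, steer

# 0.5 m/s forward with a 0.3 rad/s turn request
print(twist_to_steering(0.5, 0.3))   # -> (0.5, ~0.4 rad of steering)
# Pure rotation request, which a differential drive base could do
print(twist_to_steering(0.0, 0.5))   # -> None
```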
So hardware considerations aside, I’ve chosen ROS, the Robot Operating System, as the software glue that keeps the system together; unfortunately it isn’t used as much in the hacker community as it ought to be. IMO that is down to its complexity, but then this project may help change that! ROS has many very useful features: it’s distributed, and it handles robot modelling, particle cloud mapping, kinematics, motion planning, visualization and simulation in a very modular way.
What will it run on? Well, last year myself and some other guys on the Raspberry Pi forum managed to get most of ROS to compile on the Pi, and although it’s underpowered it can deal with the mid-layer stuff very easily, just don’t make it run RViz (the visualisation package)! ROS is a meta-operating system and can run across multiple machines, so the Pi is a good place to start, with a view to expanding the computational oomph once needed. I’ve had previous exposure to ROS, and one of my previous projects, the Autonomous Rover Project, used a similar setup but was MUUUUCH simpler compared to this beast.
The main sensor in the system is of course the Kinect, which is the de facto standard for affordable 3D mapping of the environment. Realistically the system will also need bumper sensors and probably ultrasound/IR rangers to get a more complete and reliable perception of its surroundings.
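Reading the Kinect in ROS is just a matter of subscribing to the point cloud that the OpenNI driver publishes. A minimal sketch (the topic name is the one openni_launch uses by default; it may differ with other drivers):

```python
#!/usr/bin/env python
# Minimal sketch: listen to the Kinect point cloud published by the OpenNI
# driver and report how many points arrived. Topic name may differ slightly
# depending on the driver/launch file in use.
import rospy
from sensor_msgs.msg import PointCloud2

def cloud_callback(cloud):
    rospy.loginfo("Got a cloud: %d x %d points", cloud.width, cloud.height)

def main():
    rospy.init_node('kinect_listener')
    rospy.Subscriber('/camera/depth/points', PointCloud2, cloud_callback)
    rospy.spin()

if __name__ == '__main__':
    main()
```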
So a simplified system diagram would look something like this:
As it stands we’ve got a basic 3D robot model up (URDF) and are experimenting with planning as part of ROS’s new (you gotta… move it!) MoveIt! software that munges the model into workable kinematic models and planning groups. Here’s a quick snapshot of the rather naked robot arm joints.
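For a taste of what MoveIt! buys you once the URDF and planning groups are in place, here’s roughly what asking for a plan looks like from Python; the “arm” group name and the target pose are placeholders, not the real HSRDP config.

```python
#!/usr/bin/env python
# Rough sketch of driving the arm through MoveIt! from Python.
# "arm" is a placeholder for whatever the planning group is called in the
# HSRDP MoveIt! config; the target pose values are made up.
import sys
import rospy
import moveit_commander

def main():
    moveit_commander.roscpp_initialize(sys.argv)
    rospy.init_node('hsrdp_moveit_demo')

    group = moveit_commander.MoveGroupCommander('arm')
    rospy.loginfo("Planning frame: %s", group.get_planning_frame())

    # Ask for a pose 30 cm in front of and 40 cm above the base
    # (x, y, z, roll, pitch, yaw) and let MoveIt! work out the trajectory.
    group.set_pose_target([0.3, 0.0, 0.4, 0.0, 0.0, 0.0])
    group.go(wait=True)
    group.clear_pose_targets()

    moveit_commander.roscpp_shutdown()

if __name__ == '__main__':
    main()
```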
I’ll be sure to update this page as more progress is made, but till then you can refer to the London HackSpace wiki page for the HSRDP.
If you want to see the bot being worked on, it may be worth popping in to one of the HackSpace’s open evenings, which run every Tuesday.
A few months ago I got involved in an access control project for tools at HackSpace. The project is interesting because it alleviates a few of the problems of a community workshop, mainly misuse of tools. Up till now we’ve been using plain old keys that only people who’ve been through the training are supposed to know the location of (yes, yes, a very foolproof method, I know).
So we decided to use the Oyster cards that we already use to get physical access to the space and expand on that, as most Londoners have an Oyster card anyway.
Here’s a brief overview of the system. Of course the ACnodes themselves are more complex, but as I was concentrating on writing the back-end, I put a tad more detail there.
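As a rough illustration of what the back-end boils down to (a toy sketch, not the actual ACnode server code or its API): a node reads a card ID, asks the server whether that card may use that tool, and the server answers from its permissions table.

```python
# Toy sketch of the permission check at the heart of the back-end
# (illustrative only -- not the actual ACnode server code or its API).

# card UID -> set of tool IDs the card holder has been trained on
PERMISSIONS = {
    "04a1b2c3d4": {"lathe", "laser_cutter"},
    "04ffeeddcc": {"laser_cutter"},
}

def is_allowed(card_uid, tool_id):
    """Return True if the card's owner may switch on the given tool."""
    return tool_id in PERMISSIONS.get(card_uid, set())

# What an ACnode on the lathe would effectively ask the server:
print(is_allowed("04a1b2c3d4", "lathe"))   # True  -> unlock the tool
print(is_allowed("04ffeeddcc", "lathe"))   # False -> keep it locked
```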
You can find more details at the project link.
And lastly, Sol, who’s working on the ACnodes themselves, has recorded a quick video of the node connected to our 3-in-1 lathe (the more dangerous kind), showing it reading a card and checking with the central server whether the card’s owner is allowed to use it.