First movement of the HSRDP project arm, linked to ROS through an Arduino that controls the motors, code here. The test setup only had 2 DOF plus the gripper, but now that the code works it's just a matter of hooking it all up!
A big problem with "intelligent" robotics today is that hackers put so much time into building the hardware chassis for their robots that, by the time they've gone through all that effort, they never get to the interesting software stuff.
By "intelligent" I mean robots that have some higher-level purpose, as opposed to following a line, avoiding light or sweeping the floor (badly): things like giving a tour of a community centre, serving drinks during parties or even guarding the place.
Since we were donated the UMI RTX 100 arm last year, I thought: why not put together a relatively sophisticated robotics platform that will allow more software-inclined hackers (myself included) to write interesting applications for it? Unfortunately the arm was free for a good reason: it had been sitting in someone's garden shed for the past decade and has seen some corrosion, so I spent a few months (on and off) restoring it. Worse, the circa-1986 control board is next to useless, as its UV-erasable ROM chips have lost their contents. The good news is that mechanically the arm is now in top condition, and as for the control board, today it can be replaced with an Arduino Mega and a few motor drivers (on their way from China!). What better way to learn how these things are made?
The plan was to fix it up and mount it on a mobile platform. I originally thought to source a tracked platform with differential drive, but it turns out one of the other Hackspacers (Hipster) got hold of a mobility scooter: no differential drive, rear-wheel drive, passive front steering. What was simplified in hardware just got that much more complicated to handle in software (no more on-the-spot turns!)
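To see why the scooter base is awkward in software, it helps to write down the kinematics. A rear-driven base with front steering behaves roughly like a bicycle model: the turn rate depends on forward speed, so at zero speed you get zero turn rate, unlike a differential drive. A minimal sketch (the wheelbase value and function names are mine, not from the actual platform):

```python
import math

def bicycle_model_twist(v, steer_angle, wheelbase):
    """Map forward speed v (m/s) and front steering angle (rad) to (v, omega).

    omega = v * tan(steer) / wheelbase. Note omega -> 0 as v -> 0,
    which is exactly why this base can't turn on the spot the way a
    differential-drive base can.
    """
    omega = v * math.tan(steer_angle) / wheelbase
    return v, omega

def turning_radius(steer_angle, wheelbase):
    """Instantaneous turning radius of the rear axle midpoint."""
    return wheelbase / math.tan(steer_angle)
```

Any path planner for the platform therefore has to respect a minimum turning radius of `wheelbase / tan(max_steer)` instead of assuming in-place rotation.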
So hardware considerations aside, I've chosen ROS, the Robot Operating System, as the software glue that keeps the system together. Unfortunately it isn't used as much in the hacker community as it ought to be; IMO that is due to its complexity, but maybe this project may help change that! ROS has many very useful features: it's distributed, and it handles modelling, particle cloud mapping, kinematics, motion planning, visualization and simulation in a very modular way.
What will it run on? Well, last year some other guys on the Raspberry Pi forum and I managed to get most of ROS to compile on the Pi, and although it's underpowered it can deal with mid-layer stuff very easily, just don't make it run RViz (the visualisation package)! ROS is a meta-operating system and can run across multiple machines, so the Pi is a good place to start, with a view to expanding the computational oomph once needed. I've had previous exposure to ROS: one of my previous projects, the Autonomous Rover Project, used a similar setup but was MUUUUCH simpler compared to this beast.
The main sensor in the system is of course the Kinect, which is the de facto standard for affordable 3D mapping of the environment. Realistically the system will also need bumper sensors and probably ultrasound/IR rangers to get a more complete and reliable perception of its surrounding environment.
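One simple way to combine the Kinect with the ultrasound/IR rangers is a conservative per-sector fusion: whichever sensor reports the closest valid obstacle wins, since the Kinect has blind spots (glass, sunlight, very close objects) that sonar can cover. This is just an illustrative sketch, not the actual HSRDP code; the function name and the 4 m cap are made-up assumptions:

```python
def fuse_ranges(kinect_ranges, sonar_ranges, max_range=4.0):
    """Conservative fusion of two range scans, sector by sector.

    A reading of None (no return / out of range) is treated as max_range,
    and the closest valid reading in each sector is trusted.
    """
    fused = []
    for k, s in zip(kinect_ranges, sonar_ranges):
        k = max_range if k is None else k
        s = max_range if s is None else s
        fused.append(min(k, s, max_range))
    return fused
```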
So a simplified system diagram would look something like this:
As it stands we've got a basic 3D robot model up (URDF) and are experimenting with planning as part of ROS's new (you gotta… ?) MoveIt! software, which munges the model into workable kinematic models and planning groups. Here's a quick snapshot of the rather naked robot arm joints.
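For anyone who hasn't met URDF before, it's just an XML description of links and joints that MoveIt! builds its kinematic model from. A toy fragment along these lines (link and joint names are invented for illustration, not the actual HSRDP model):

```xml
<robot name="hsrdp_arm">
  <link name="base_link"/>
  <link name="shoulder_link"/>
  <joint name="shoulder_joint" type="revolute">
    <parent link="base_link"/>
    <child link="shoulder_link"/>
    <axis xyz="0 0 1"/>
    <limit lower="-1.57" upper="1.57" effort="10" velocity="1.0"/>
  </joint>
</robot>
```

Each revolute joint gets its axis and limits here, and MoveIt! groups chains of these joints into planning groups.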
I'll be sure to update this page as more progress is made, but till then you can refer to the London HackSpace wiki page for the HSRDP.
If you want to see the bot being worked on then it may be worth popping in to one of the HackSpace’s open evenings that run every Tuesday.
A few months ago I got involved in an access control project for tools at the HackSpace. The project is interesting because it alleviates a few of the problems at a community workshop, mainly misuse of tools. Up till now we've been using plain old keys whose location only people who've been through the training are supposed to know (yes, yes, a very foolproof method, I know).
So we decided to use the Oyster cards that we already use to get physical access to the space and expand on that, as most Londoners have an Oyster card anyway.
Here's a brief overview of the system. Of course the ACnodes themselves are more complex, but as I was concentrating on writing the back-end, I put a tad more detail there.
You can find more details on the project link.
And lastly, Sol, who's working on the ACnodes themselves, has recorded a quick video showing a node connected to our 3-in-1 lathe (the more dangerous kind), which shows it reading the card and checking with the central server whether the card owner is allowed to use it.
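The heart of that card-to-tool check is a simple lookup: card UID in, "is this person trained on this tool" out. A toy sketch of the idea (the data, UIDs and function name are all made up; the real ACnode back-end and its API are different, see the project page):

```python
# Hypothetical trained-user table: card UID -> set of tools the owner
# has been trained on. The real system queries a central server instead.
PERMISSIONS = {
    "04A1B2C3": {"lathe", "laser_cutter"},
    "04D4E5F6": {"laser_cutter"},
}

def card_allowed(card_uid, tool):
    """Return True if the card's owner may power up the given tool."""
    return tool in PERMISSIONS.get(card_uid, set())
```

Unknown cards simply fall through to "not allowed", which is the safe default for dangerous tools.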
For about 3 months now I’ve been toying around with the idea of developing a club that would give youngsters the option to get stuck in with coding and robotics.
But as usually happens, my main line of work got in the way. So I was pleasantly surprised when a friend forwarded me a news article about an initiative to get young children coding, appropriately named Code Club! They aim to register as a charity, so they are asking for donations to help them reach the £5,000 needed to do so. They then plan to develop a 12-week course introducing programming to youngsters through a very visual coding environment called Scratch.
I did my (unscientific) research on the topic and from what I’ve been able to gather it’s just not that exciting for most kids to get stuff moving on the screen. They need to be able to touch and play with what they are making.
But the idea of using a visual environment is a great one, so being a robotics enthusiast I realised that a basic robot that can move about, make some sounds and possibly draw with a pen would have a significant effect on kids' interest in the activity. Doing some more searching, I found out about S4A (Scratch for Arduino), and as the Arduino is a very simple tool for this, it makes sense to build a robot platform around it! Some of the work on a hardware platform has already been done within the Scratch-IO project; all that's missing is a ready-made bot platform to plug it into.
The key thing here is that the platform needs to be simple enough for teachers and parents to set up for their kids to play with. The exciting thing is that with the launch of the Raspberry Pi, the whole set-up is significantly cheaper and can potentially be done in a living room in front of the TV.
I now have a new project on my hands, so if anyone has any ideas about what the bot should look like or what features it should have, please add your thoughts in the comments below.