Wednesday, 14 December 2011

Palpating Gripper Video

This is a video explaining the palpating gripper I have been developing as part of my work on augmenting robotic minimally invasive (keyhole) surgery with haptic (touch) feedback.



The basic premise of the work is that more tactile properties may be determined from an object if it can be actively explored using one or more fingers. To facilitate this sort of exploration inside the body I have been designing a robotic gripper that is simple enough to be manufactured for keyhole surgery but has enough dexterity to replicate some of the exploratory hand motions of a surgeon.

This work has been accepted for publication at IEEE Haptics Symposium 2012.

Wednesday, 30 November 2011

PhD Thesis Online

I've finally put my PhD thesis online with open access. The title of the work is: "Robust and Intelligent Control Approaches for Biologically Inspired Motion Generation with an Anthropomorphic Robot Arm".

The work attempts to find ways of modelling human motion patterns mathematically and then getting a robot arm to obey those mathematical rules in a safe and efficient manner. A good chunk of these rules dealt with producing optimal motion (in terms of 'effort') that could also cope with the robot having an imperfect model of its own body (for example, we didn't model the friction in the gearboxes). All the control is done dynamically (in terms of forces and torques) rather than kinematically (in terms of positions and angles). This is more difficult but gives more control over the robot.

The figure below shows one of the early results in the thesis, where the robot's reaching motion (to a target height) is based largely on effort minimisation and ends up appearing very human. The person in the image is only included for comparison; the robot isn't copying him.



You can view or download the full pdf of the thesis here.

Wednesday, 23 November 2011

Paper Accepted for Haptics Symposium

My research on a novel miniature robot gripper for tele-operated palpation has been accepted for presentation at Haptics Symposium 2012 in Vancouver.

Haptics Symposium Logo

Palpation is defined in the Oxford English Dictionary as "[to] examine (a part of the body) by touch, especially for medical purposes".

My submitted paper involved a bio-inspired approach to electromechanical device design, where the finger motions of surgeons were studied during exploratory scenarios in order to set a number of functional objectives for the device. The other design requirements were set by the proposed application of minimally invasive surgery.

Friday, 28 October 2011

Dada Robot - First Drawing

Many years ago I salvaged an ex-industrial robot that was being thrown out of the University research lab I was working in. A crime for sure but I could see why they were getting rid of it. The robot was a collection of metal and motors but little more. No electronics, no power supply, no software. A mere shell of a machine.

Over the last few years I've been nurturing it back to health, building and coding replacement hardware and software. I had to take a break from working on dada (as the robot is known) for quite some time, as I got very stuck on a tricky stepper motor control problem. Recently though I discovered an excellent Arduino library (AccelStepper) that more or less solved the problem for me. Enough so, at least, that a few weeks ago I got the robot to draw its first picture, which is rather abstract. Here's the sped-up video:



More on dada, including a scanned image of the final picture and an explanation of what is going on, can be found on dada's own site: http://dada.soulwire.co.uk/

The dada project is a collaboration with Justin Windle.

Tuesday, 4 October 2011

Puppet Place Open Day Video

Here's a video from a recent 'open doors' day in Puppet Place, the studio I rent with other members of the creative technology collective RustySquid.



A bunch of the stuff on show was from people who work in the studio, though we had a good number of guest pieces too. You can find out more about each piece by clicking on it in the video.

Keen eyes will spot the Haptic Lotus at 0:42, which was present for hands-on demos. Because I couldn't blindfold people or black out the lights as usual (far too much stuff to trip over), I had people follow me around while holding the Lotus. It fully blossomed when they approached the tea and cake stand. Pretty much everyone, robots included, likes tea and cake.

Wednesday, 28 September 2011

Thursday, 22 September 2011

UbiComp 2011 Best Paper Prize

This week a paper on the Haptic Lotus was awarded Best Paper prize at the prestigious Ubiquitous Computing conference (UbiComp) in China. There were over 300 papers presented at the conference so we (the authors) are very happy about this.


The paper "Haptic Reassurance in the Pitch Black for an Immersive Theatre Experience" was presented by my collaborator Janet van der Linden of the Open University. It may be downloaded here.

Here is a video of the talk by Janet.


Tuesday, 9 August 2011

Haptic Lotus at Secret Garden Party

A couple of weekends ago I took the mobile Haptic Lotus rig (and a couple of assistants) to the Secret Garden Party festival, at the request of Guerilla Science, an organisation that brings science to music festivals.

Over the course of 4 or so hours we created an interactive installation where festival goers could use the haptic device and sound effects to be guided to target canisters containing sweets. We had about 40 people pass through the installation, many in fancy dress:

 
I've demoed the Lotus at conferences before, but a music festival, where people are just passing through looking for a good time or something unusual, is quite a different experience. One thing we tried was getting people to race each other in the dark. That was fun!
Here's the blurb for my Thursday evening slot (which featured in the festival program):

 

THURSDAY: ASCENT

17:00-22:00 Blind Robot’s Bluff: Adam Spiers Navigate through darkness with the Haptic Lotus, a cybernetic instrument that manoeuvres through space with engineered grace. Use the unfolding petals to guide you to your destination, and ponder the meaning of light in the dark.

More information on the event can be found on the Question website with some extra pictures in the Guerilla Science Flickr album.

Designing in the Wild Talk

I've only just discovered this online talk by Haptic Lotus collaborator Prof. Yvonne Rogers, despite it being given at Stanford University in February 2011. The talk discusses interface design in unstructured settings, as opposed to typical laboratory settings.



"Abstract: (February 25, 2011) Yvonne Rogers discusses how "designing in the wild" is causing a new rise in discoveries and a new direction in computer science. She illustrates how these discoveries can be achieved along without the tensions and challenges that can arise when giving up control."

Yvonne's description of The Question (which featured the Haptic Lotus) is presented at around 42:00

http://www.talkminer.com/viewtalk.jsp?videoid=0t3gYfZMpKg&q=

Much of my past and current research has been based on concepts that originated at Stanford, so it is nice to know that my own work is making its way back there in some form.

Friday, 5 August 2011

Fractal Gallery

Nothing makes me feel more productive than creating an experiment / process that can run in the background while I do other stuff. TBH, I usually end up spending so long tuning the background process to work perfectly that I lose any time I might have gained. Or else I just watch the background process and lose the point of the whole thing.

That aside, I've been leaving my fractal code running in the background while writing emails etc. and this has led to some nice images. Rather than creating a new post for each of these, I've started a static Fractal Gallery page, where I'm going to be adding new images as necessary (either when finding cool stuff, or developing the code / maths further).

Wednesday, 3 August 2011

Initial Mandelbrot renderings

I recently picked up a copy of the excellent book 'The Beauty of Fractals' for just 20p at a library sale. Inspired, I've had another go at rendering a fractal function, this time the classic Mandelbrot set.

Here's a small portion near the upper complex limit, which rendered slowly in Matlab while I went for lunch:

I grabbed the basic pseudocode from Wikipedia, though I noticed some bugs in it during implementation, and I've updated the Wikipedia code to get rid of them. The wiki code doesn't shade inside the set (the green parts), but that shading is just based on the squared magnitude of the complex coordinate.

If people want the Matlab code then leave a comment and I'll post it.
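In the meantime, here's a minimal escape-time sketch in Python rather than Matlab (my own illustration of the standard algorithm, not the code from the post): each pixel maps to a point c in the complex plane, and we count how many iterations of z = z² + c it takes for |z| to exceed 2.

```python
def mandelbrot(width=80, height=40, max_iter=50):
    """Escape-time iteration over a grid of the complex plane.

    Returns a 2D list of iteration counts; points that never escape
    (i.e. points in the set) get the value max_iter.
    """
    rows = []
    for j in range(height):
        row = []
        for i in range(width):
            # Map pixel -> complex plane: re in [-2, 1], im in [-1.5, 1.5]
            c = complex(-2 + 3 * i / width, -1.5 + 3 * j / height)
            z = 0
            n = 0
            while abs(z) <= 2 and n < max_iter:
                z = z * z + c
                n += 1
            row.append(n)
        rows.append(row)
    return rows

img = mandelbrot()
# Quick ASCII preview: '#' for points in the set, '.' outside
for row in img[::2]:
    print(''.join('#' if n == 50 else '.' for n in row))
```

The iteration counts are what produce the familiar coloured bands outside the set.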

PhD finished - delayed post

This is a very delayed post, but I've now graduated from my PhD. The viva was actually on 27th May and I graduated (pic below) on 18th July. This means I'm now officially a doctor of robotics. My R2-D2 obsessed childhood self would be proud :-)

My examiners were Prof. David Stoten (University of Bristol) and Prof. Giorgio Metta (University of Genoa). The viva lasted under 2 hours and I passed with minor corrections.


No, I didn't wear the box for the ceremony...

Tuesday, 17 May 2011

Very Simple Fractal

On my way to geek martyrdom (PhD viva approaching) I'm finding myself becoming increasingly interested in fractal functions.

Here's my first attempt at coding a fractal in Matlab, which was done on my lunchbreak today.



It's based on a tiny part of the Mandelbrot set, which I've still not fully read up on yet (I'm impatient!).

Here's the code, I told you it was simple! So simple in fact that it..soon...becomes.....pretty..........slow.............

clc         % clear command window
clf         % clear figure

z_old = 0 - 0i;     % initialise z_old before the first point exists

for seed = 0.01:0.005:0.2   % For all these values

    disp(seed)      % Display the current seed

    z0 = -seed - 0.54i; % The juicy bit
    z = 0;              % Start here

    rndclr = [rand, rand, rand];    % random line colour

    for i = 1:50        % number of iterations per seed
        z = z^2 + z0;   % iteration for the nation
        %disp(z);

        figure(1)
        hold on             % don't delete old lines
        title(num2str(i))   % display iteration number
        plot(z, '.', 'color', [0, 0.1, 0]);  % draw a point

        if i > 1    % if the point has a predecessor, connect them together
            line([real(z), real(z_old)], [imag(z), imag(z_old)], ...
                'LineSmoothing', 'on', 'color', rndclr)
        end

        z_old = z;  % Store last point

    end

end
   

Friday, 6 May 2011

3D Graph, meet 3D Printer

So a while back I ran a stand on robotics and my PhD research at Changing Perspectives, a Bristol University public engagement event.

One of the most visually assertive parts of my research in robotic movement has been the three-dimensional plots I use to represent effort considerations in reaching motions. The idea is that the workspace of the robot makes up two axes of the graph, while the third axis represents the effort (a combination of gravitational and muscular considerations). When planning movements to vertical targets, the robot can aim to keep its trajectory in the lower parts of the graph, thus reducing effort during movement, as in casual human motion.
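As a rough illustration of what such a surface encodes (a toy sketch, not the thesis model; the arm parameters below are invented), one can compute a gravity-based 'effort' value for every pose of a two-link arm and map it onto the hand's workspace position:

```python
import numpy as np

# Toy 2-link planar arm in a vertical plane (made-up parameters,
# NOT the values used in the thesis).
L1, L2 = 0.3, 0.25      # link lengths (m)
m1, m2 = 2.0, 1.5       # link masses (kg)
g = 9.81

def gravity_effort(th1, th2):
    """Sum of squared static gravity torques at the two joints.

    Standard textbook model with point masses at the link midpoints;
    the thesis effort measure also included muscular terms.
    """
    t1 = (m1 * g * (L1 / 2) * np.cos(th1)
          + m2 * g * (L1 * np.cos(th1) + (L2 / 2) * np.cos(th1 + th2)))
    t2 = m2 * g * (L2 / 2) * np.cos(th1 + th2)
    return t1**2 + t2**2

# Evaluate effort over a grid of joint angles, then map each pose to
# the hand position (x, y). The resulting (x, y, effort) surface is
# the kind of 3D graph that was 3D printed.
th1s = np.linspace(-np.pi, np.pi, 60)
th2s = np.linspace(-np.pi, np.pi, 60)
TH1, TH2 = np.meshgrid(th1s, th2s)
E = gravity_effort(TH1, TH2)
X = L1 * np.cos(TH1) + L2 * np.cos(TH1 + TH2)
Y = L1 * np.sin(TH1) + L2 * np.sin(TH1 + TH2)
```

Note that effort drops to near zero when the arm points straight up or down (gravity torques vanish), which is exactly the kind of 'valley' a planner can exploit.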

Some people find the concept of a 3D graph quite hard to grasp, which is fair enough, so I decided that a nice way to engage the public would be to make the graph tangible. Hence I had it 3D printed.


To make it more intuitive I then created a contour 'heat' map of the simulation, which I projected onto the surface. This meant that points of high effort were highlighted with 'hot' colours, while low regions were coloured with colder shades of blue. Really this could have done with some projection mapping, but I didn't have much time.


Finally, I simulated the robot moving across the surface in real-time to give the whole thing some motion and context. Unfortunately I was too busy chatting to the visitors to get a video of the projection on the graph, so here's the raw video:

The Question in Disability Now

Despite many things happening in the last few months (such as handing in my thesis), I've been very poor at updating this blog.

Here's a link to an article on The Question I just discovered in the magazine Disability Now. The Question was a collaborative piece I did with Extant last summer, where we developed the Haptic Lotus.

Here's a snippet from the full story:

Arguably one of Extant’s most breathtaking (and currently on-going) works to date, The Question (2009/2010) utilises the Live Art concept with open arms. It has everything: the tactile audience experience, the walk-through installation, the use of audio and live performers.

Monday, 14 March 2011

Generative Triangle Patterns

Inspired by the creations of my friend Soulwire, I spent a couple of hours last night coding up a simple algorithm that generates patterns from joined equilateral triangles. A random variable acts on the hypotenuse of the triangles in 10% of cases, making the triangles occasionally shrink and lean.

I've written it in Matlab for now (great for prototyping algorithms) but will probably switch to something more visually elegant (e.g. Processing / Open Frameworks) later.
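For anyone curious about the general idea, here's a rough Python reconstruction (my own guess at the scheme, not the original Matlab): a strip of alternating up/down triangles whose apex is perturbed in roughly 10% of cases.

```python
import random

def triangle_strip(n=20, size=1.0, seed=None):
    """Generate vertex lists for a horizontal strip of joined triangles.

    Triangles alternate pointing up and down along the x-axis; in ~10%
    of cases a random factor shrinks the apex height and shifts it
    sideways, producing the occasional 'lean'. This is a hypothetical
    reconstruction of the algorithm described in the post.
    """
    rng = random.Random(seed)
    h = size * (3 ** 0.5) / 2            # height of an equilateral triangle
    tris = []
    for k in range(n):
        x0 = k * size / 2                # neighbours share edges: step half a base
        up = (k % 2 == 0)
        base_y = 0.0 if up else h        # base on bottom for 'up' triangles
        apex_y = h if up else 0.0
        apex_x = x0 + size / 2
        if rng.random() < 0.1:           # the 10% perturbation
            f = 0.3 + 0.7 * rng.random()
            apex_y = base_y + (apex_y - base_y) * f      # shrink towards base
            apex_x += size * (rng.random() - 0.5) * 0.5  # lean sideways
        tris.append([(x0, base_y), (x0 + size, base_y), (apex_x, apex_y)])
    return tris

tris = triangle_strip(seed=1)
```

Feeding these vertex lists to any polygon-drawing routine (Matlab's fill, Processing's triangle(), etc.) gives the strip pattern.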

It needs work but here are some initial images. Click to view larger.


Friday, 11 March 2011

Dynamixel Master / Slave network with Matlab

Following my last post, I've now got two Dynamixels working on the same network with Matlab. The flexibility of these servos means that one can be used purely as a position sensor. This lets you do some cool stuff, like this simple tracking controller, where one servo follows the other. This is known as a master / slave network; the moving one is the slave, as it obeys the master.



To make this work you need to give each Dynamixel a different ID. This is obvious, but without the Dynamixel Wizard from Robotis (which didn't work on my machine) I had a hard time figuring out how to do it. In the end I discovered that by using the following command with an ID of 254 (the broadcast address) you can push a new ID to every servo on the network.

calllib('dynamixel', 'dxl_write_byte', 254, 3, new_ID)   % address 3 is the ID register

So, just connect one servo to the network and broadcast it a new ID!

The Matlab source code for the master slave demo can be downloaded here.

IMPORTANT! - You won't feel the forces from the slave servo on the master. Do not stick your finger or anything else in the way to check!

Monday, 7 March 2011

Simple Dynamixel / Matlab Example Code

Recently I started working with some Dynamixel RX-28 servos from Robotis. Unlike standard hobby servos, these actuators can provide position, current and velocity feedback, and also have various modes of control, including compliance. They also work on an RS-485 network, meaning that they can be chained together. These features make the servos a popular choice for building small humanoids, such as the Bioloid (also from Robotis).

Though a MATLAB API is available, I found the online examples somewhat lacking, especially when all I wanted to do was drive the servo and log its position. The following code does just that, moving the servo 90 degrees clockwise and anti-clockwise, then displaying a log of the 'present position' and 'moving' registers from the servo's control table.

The 'moving' register is set to '1' when the servo is in motion. This is a bit buggy though, as you need a delay in the code for it to read anything other than '1', and it doesn't stop exactly on time. Matlab isn't hard real-time, so this pause seems to affect the data logging somewhat (the motion is actually really smooth, unlike what is shown above). Take the pause(0.01) out of the code below and you'll see what I mean.
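One thing that helps when reading the position log: the RX-28 reports position as a 10-bit value (0-1023) spanning roughly a 300 degree arc. A quick conversion sketch (double-check the range against your servo's datasheet):

```python
def deg_to_ticks(deg):
    """Convert an angle in degrees (0-300) to an RX-28 position value."""
    return round(deg * 1023 / 300)

def ticks_to_deg(ticks):
    """Convert an RX-28 position value (0-1023) back to degrees."""
    return ticks * 300 / 1023

print(deg_to_ticks(150))   # centre of the travel range
```

So a '90 degree' move corresponds to a change of about 307 in the goal position register.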

Note that you'll need to download and register the dynamixel API with Matlab before using this code.
Here are the instructions from ROBOTIS:
http://support.robotis.com/en/software/dynamixel_sdk/usb2dynamixel/windows/matlab.htm
These instructions from Agave Robotics are also useful:
http://www.agaverobotics.com/products/servos/robotis/ax12/docs/Dynamixel-AX12-Matlab.pdf
Note that I had to change the 'open device' code from the examples given above. They just wouldn't work!

I hope this is useful to someone; I spent some time searching for an example like this but couldn't find one. Of course the Dynamixel servos can do much more than this, but it's a nice place to start.

The source code can be downloaded from Google Docs here.

Thursday, 24 February 2011

Changing Perspectives

My PhD research (on natural reaching in humanoid robots) is going to be presented at a University of Bristol public engagement event in late March featuring live acrobats from Circomedia!

Image from http://www.circomedia.com/

Here's the blurb:

Come along to a free, interactive exhibition for families. Watch aerial circus performers, learn about how they accomplish their extraordinary feats and discover how your body works with TV presenter Alice Roberts. Take part in body painting and talk to local researchers from the University of Bristol about cutting edge research – in anatomy and physiology, neuroscience, robotics and biomechanics – happening in your city.

More information to follow. Here is the website of the event:

http://www.bris.ac.uk/changingperspectives/projects/experience/

Wednesday, 23 February 2011

Falcon Grip Delivery

Here is a shot of the extra Novint Falcon grips I received this morning from the Novint Advanced Products Group.


Spare ball grips for the Novint Falcon

The detachable grip on the Novint Falcon cries out for customisation, but the active electronics in the grip mean that the Falcon (in the background) won't function unless a grip is attached. I'm planning on hacking the electronics from these spare grips so that I can use my Falcon with grips that are more specific to my research.

Watch this space!

Saturday, 5 February 2011

Wabi Sabi DJ set

Nothing to do with robots in this post, but last night I DJ'd at this night in Bristol...


and had a great time :-)

Friday, 28 January 2011

10 minute robot update

Moving house and thesis writing mean I haven't had any time to work on my 3D scanning robot lately. Anyway, here's a video I put on YouTube almost 2 weeks ago of some 1-DOF movement with 3D visualisation in Processing. The communication is done via serial.

Soapbox Seminar

I've been invited to give a talk at the Soap Box seminar series in University of Bristol's School of Geographical Sciences on 3rd Feb.

Here's the blurb:

Non-industrial robotics is a highly multidisciplinary branch of research which has the ability to catch public attention, mainly due to science fiction examples of advanced, thinking machines. But how do real robots compare to the popular concept? In this talk I shall be introducing the field of robotics by presenting some examples of the current state of the art. I'm also hoping to provide some food for thought, via the possible implications of robots in future society and the perceptions of robots by humans.

Though the talk is an introduction to robots in general, due to some email miscommunication my talk has ended up being titled 'Why robots probably won't take over the world!'

Everyone is welcome to attend.

Wednesday, 5 January 2011

Haptics Demo Video

Haptics is quite a difficult thing to describe to people without actually getting them to try out a system. This video was created to aid a talk on telehaptics, with the intention of showing the relationship between a haptic device (in the real world) and a simulated tactile and graphic object (in a virtual world).

The device is a Novint Falcon and the simulation is a demo from Chai 3D. I filmed the falcon while running the simulation and took a screen capture of the whole thing. An artifact of this was the horizontal compression.



Of course, a video will never beat getting someone's hands on a real device, but it should help the explanation.