Thursday, 9 December 2010

30 and 40 minute robot

I'm posting two stages at once because the photo upload tool on Blogger was broken the other day. Anyway, here is the progress on the '10 minute robot'.

He now has a bracket mount for the tilt servo. I actually wasted 30 minutes trying to make this out of steel before realising that my lab's workshop hand drill wasn't up to it. After giving up on that, it only took 10 minutes to make this aluminium bracket out of scrap.


And here's a pan servo. Both degrees of freedom are now aligned with the sensor's center of rotation. Of course, mechanical offsets can be dealt with mathematically via trigonometry, but this time I wanted to solve the problem mechanically rather than in code.
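For the curious, the code version of that fix would have looked something like this. A minimal sketch in plain C++, where the lateral offset d and the frame conventions are hypothetical rather than measurements of this robot:

```cpp
#include <cmath>

// Suppose the IR sensor sat a lateral distance d (metres) to one side of
// the pan axis, still looking along the pan direction. A range reading r
// at pan angle 'pan' (radians) then needs converting into the target's
// true range and bearing relative to the pan axis itself.
void correctForOffset(double r, double pan, double d,
                      double &trueRange, double &trueBearing)
{
    // Sensor position in the pan-axis frame (offset perpendicular to gaze)
    double sx = -d * std::sin(pan);
    double sy =  d * std::cos(pan);

    // Target = sensor position + reading along the sensor's gaze direction
    double tx = sx + r * std::cos(pan);
    double ty = sy + r * std::sin(pan);

    trueRange   = std::hypot(tx, ty);
    trueBearing = std::atan2(ty, tx);
}
```

With the axes aligned mechanically, d is zero and the correction disappears, which is exactly the point.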

 

Wednesday, 1 December 2010

20 minute robot

Yesterday's '10 minute robot' with another 10 minutes work.



Now he's got pitch rotation instead of yaw. I've used an angle bracket to keep the center of rotation in line with the center of the IR sensor (his eyes). It doesn't look as cool as his original neck, but it's more functional in terms of sensing.

Beats reading web comics for procrastination!

Tuesday, 30 November 2010

10 minute robot

I couldn't resist posting this little guy that I built in about 10 minutes today. A bit of creative procrastination away from the thesis.


It's almost a genuine robot, as it consists of a sensor and an actuator, though an inability to manipulate its environment does exclude him from some definitions.

I'm not wiring up or programming him until the thesis is handed in!

Thursday, 11 November 2010

The Question in 'Philosophy Now'

I just found an article on The Question in the magazine Philosophy Now.

Here's a bit about the Lotus:

"Whilst the current haptic device, designed by humanoid technology engineer Adam Spiers, is a excellent step in the direction of delivering a similar theatrical experience to both blind and sighted participants, I have a sneaking suspicion that blind participants may have gained more from the experience than I."

This is an interesting point. On one hand, the lack of light in The Question meant that sighted audience members were certainly deprived of their primary navigational sense. On the other hand, blind audience members' familiarity with a lack of sight meant that the Lotus didn't always offer them the same level of navigational benefit and encouragement as it did sighted audience members.

The article was written by Sue Rolfe, who studied Philosophy under Martin Milligan. The plot of The Question was based on 'On Blindness', a book of letters between the philosophers Bryan Magee and Martin Milligan debating the nature of blindness with regard to one's experience and knowledge of the world.

The full Philosophy Now article is available here.

Haptic Lotus in the Guardian

The Question and the Haptic Lotus have been mentioned in today's Guardian G2, in an article titled:

'The player: videoless games can reveal skills we didn't know we had.
The Question and the iPhone game Papa Sangre can teach us things that video games can't'


In the article the journalist describes The Question (my collaboration with Extant) as her 'standout theatre experience of the year'. She also compares it with the audio-only iPhone game Papa Sangre.


You can read the full article here.

Wednesday, 10 November 2010

Surgical Tele-Haptics Job

I've been very poor at updating this blog recently, mainly on account of the rush to get a complete draft of my PhD thesis to my supervisor before starting work as a research associate at the Bristol Robotics Laboratory.

The Novint Falcon haptic interface
This new project looks at integrating the sense of touch into robot-assisted laparoscopic (keyhole) surgery, so that a surgeon remotely controlling a medical robot will be able to feel what is inside the patient they are working on, rather than relying solely on the images provided by an endoscope for diagnosis and surgery.

From now on I'll be working a great deal with haptic technologies. I've already been pleasantly surprised by the accessibility of haptics: the Novint Falcon haptic interface costs only £200 and is supported by the Chai 3D open-source C++ libraries (including dynamics engines for both rigid and deformable bodies). Using only these two elements I've already got a fully enabled haptics suite to start my research with. This is in stark contrast to the three months I spent at the start of my PhD writing drivers just to get my very expensive robot to move at all.
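To give a flavour of how little code a working haptic loop takes, here's a bare-bones sketch in the style of the Chai 3D device API, rendering a simple virtual wall with the Falcon. Treat the exact method names as assumptions (they vary between library versions); the structure, not the API detail, is the point:

```cpp
#include "chai3d.h"

int main()
{
    // Find and open the first connected haptic device (e.g. the Falcon)
    cHapticDeviceHandler handler;
    cGenericHapticDevice* device;
    handler.getDevice(device, 0);
    device->open();
    device->initialize();

    // Render a virtual wall at x = 0 with a spring-like stiffness k
    const double k = 500.0;   // N/m -- made-up value, tune to the device

    while (true)
    {
        cVector3d pos;
        device->getPosition(pos);

        cVector3d force(0.0, 0.0, 0.0);
        if (pos.x < 0.0)            // tool has penetrated the wall?
            force.x = -k * pos.x;   // push back proportionally

        device->setForce(force);    // in practice this runs at servo rate
    }
}
```

That penetration-times-stiffness idea is the core of most haptic rendering; the Chai 3D dynamics engines just compute the penetration against far more interesting geometry.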

Chai 3D screenshot
Of course, I'm still working on the PhD...

Friday, 17 September 2010

Humanoids 2010

My paper for the IEEE Humanoids 2010 conference has been accepted for publication.

The paper looks at using neural networks to observe and learn a small number of human reaching motions that have been scaled to a robot body. The network can then generalise from these examples to new requirements, so the robot can generate new human-like trajectories on the fly.
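The trained networks themselves won't condense into a snippet, but for a flavour of what 'human-like' means here: point-to-point human reaching is classically well approximated by a minimum-jerk profile (Flash & Hogan, 1985). A quick sketch of that textbook model, offered for context rather than as the paper's method:

```cpp
#include <cmath>

// Minimum-jerk position profile: smooth, bell-shaped velocity, the
// classic model of natural human point-to-point reaching.
// x0, xf: start/end positions; T: movement duration; t: current time.
double minimumJerk(double x0, double xf, double T, double t)
{
    double tau = t / T;   // normalised time in [0, 1]
    double s = 10*std::pow(tau,3) - 15*std::pow(tau,4) + 6*std::pow(tau,5);
    return x0 + (xf - x0) * s;
}
```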

Below is a word cloud of the paper, which was generated by Wordle.


This was the most recent piece of work I completed for my PhD on human motion synthesis for manipulator robots.

Friday, 20 August 2010

The Question DVD

A DVD has been produced on The Question, my collaboration with Extant, the Open University and Battersea Arts Centre. Here is a condensed/trailer version of the film:



The widescreen format of this clip seems to be struggling with my blog theme. If you are only seeing half the video, click here to watch it on YouTube.

The DVD features the planning of the installation, details of the team, the Haptic Lotus device and the audience experience. It was produced by Braunarts and is available by contacting Extant.

Thursday, 12 August 2010

Thesis progress

About a month ago I started writing my PhD thesis. I don't tend to blog much about my PhD research, so to give everyone a flavour of what I've been working on for the last few years, here is a Wordle word cloud of the 50 pages I've written so far:

Note that Wordle doesn't include equations in the word cloud. There are quite a few of those in my work...

Until the thesis is handed in I'm trying to stay away from creative projects and other distractions, so this blog is probably going to be a bit quiet for a while...

Monday, 12 July 2010

The Question conference acceptance

Our submissions on 'The Question' have been accepted to the following conferences:

* DRHA2010 (Digital Resources for Humanities and Arts)

* MobileHCI2010 (Mobile Human Computer Interaction)

We may be doing a scaled-down version of the installation (complete with Haptic Lotus) for the DRHA conference (at Brunel University). I'm sure some hands-on technology will accompany the MobileHCI presentation (in Lisbon).

Both conferences are taking place in the first week of September.

Thursday, 1 July 2010

Haptic Lotus in Wired

Technology magazine Wired have published an online article on The Question and the Haptic Lotus in their News and Culture section.


Written by one of the first attendees of the installation, the article speaks very favourably of the project and of the haptic navigation device that I designed (with Paul O'Dowd and David McGoran) for this unique project.

"The Lotus device really does work, guiding you around rooms (and into the occasional wall). You have to work harder, but it's ultimately very satisfying to construct your own reality and story, using experiences that you yourself have to uncover and assemble."

An infra-red video of the reporter navigating the pitch black space accompanies the article.

Tuesday, 15 June 2010

The Question goes live

After a week of setting up, Extant's 'The Question' is now live, with performances running throughout the week. This project has been in the pipeline for several years so it's pretty satisfying to finally see the fruits of all the work and late nights.

My contribution to this project was the design of a unique navigation system to assist people in finding their way around the pitch-black installation. Below is an image of some 'haptic lotus' devices, waiting to be fitted to participants entering the installation. These devices are one half of the navigation system that I designed, with David McGoran and Paul O'Dowd, for this project.


I don't want to give away too much information on the project until the end of the week (i.e. I don't want to ruin it for any of the participants), so expect more details on the technology then.

Sunday, 13 June 2010

Open University Front Page News

Extant's 'The Question' and the Haptic Lotus have both been featured on the main news page of the Open University.
Front-page articles only stay up for a few days and I was a bit late picking this up, but you can still read the full article in the news repository.

The project goes live on Monday 14th June 2010.

Wednesday, 19 May 2010

Distinguished Research Seminar Video

I finally got my act together and ripped the streaming video of my lecture at the University of Wales. For anyone who is interested in the details of my PhD research, or who wants another explanation of the Operational Space Formulation, this video may help.


Operational Space Control, Robotics Seminar from Ad Spiers on Vimeo.

Unfortunately the connection to the server dropped out about 10 minutes before I finished talking, but the maths gets a bit heavy after that, so maybe it was for the best.
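For anyone who just wants the headline maths, the heart of the Operational Space Formulation maps a task-space force command into joint torques (standard textbook notation here, which may differ slightly from my slides):

```latex
\tau = J^{T}\!(q)\,F,
\qquad
F = \Lambda(q)\,\ddot{x}_{d} + \mu(q,\dot{q}) + p(q),
\qquad
\Lambda(q) = \bigl(J\,M^{-1}(q)\,J^{T}\bigr)^{-1}
```

Here J is the task Jacobian, M the joint-space inertia matrix, Lambda the task-space inertia, and mu and p the Coriolis/centrifugal and gravity terms projected into task space. Everything after that in the talk is detail on computing and exploiting those terms.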

Thursday, 13 May 2010

The Question Interview Video

I neglected to mention that the website for the haptic theatre project recently went live. Renamed 'The Question', this project is something rather unique that I've been working on with Extant, a visually impaired theatre company based in London. The project aims to create an immersive theatre experience set in a pitch-black environment. More details are available on the site.

A video is being made as part of the project and during the last steering meeting most of the team was interviewed. Here I am talking about the technical side of the project and this unique application of haptic navigation technology.

The Question - Team Interviews from Alex Eisenberg on Vimeo.

As Maria says, we've all been working quite independently up until recently, and things are really starting to get interesting now. However, the final deadline approaches and there is still lots to be done. Watch this space!

Wednesday, 12 May 2010

Milk Pixel Arts Trail videos

Last weekend a much-improved Milk Pixel featured on the Southbank Bristol Arts Trail. The system was installed in the studio of the Creative Glass Guild, a brightly lit, white-walled space filled with colourful glasswork. As Milk Pixel works best in the dark, it was necessary to build an enclosure. Being on a tight (i.e. non-existent) budget can be fun sometimes: a tent was quickly cobbled together out of a £15 gazebo and a roll of cheap 'Halloween' fabric.


The system was running some new 'autonomous', semi-random code (sketched after the list below) whereby it automatically changed between different modes of responsiveness, sometimes overlaying colours onto its video/motion/sound-derived patterns, while at other times ignoring what was happening in the tent and just doing its own thing (i.e. displaying and moving pretty colours around). The idea was that:

1. The audience never actually knew what the system was doing
2. No two visits to the tent were the same
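The scheduling logic behind this is genuinely tiny. A stripped-down sketch of the idea, with hypothetical mode names and timings, in Arduino-style C++ (the real installation did its video and sound processing on a PC):

```cpp
// Hypothetical sketch of the semi-random mode scheduler.
enum Mode { REACT_VIDEO, REACT_SOUND, OVERLAY, DO_OWN_THING };

Mode mode = REACT_VIDEO;
unsigned long nextSwitch = 0;

void setup()
{
    randomSeed(analogRead(0));   // seed from an unconnected analogue pin
}

void loop()
{
    // Every 20-120 seconds, hop to a randomly chosen behaviour
    if (millis() > nextSwitch) {
        mode = (Mode)random(4);
        nextSwitch = millis() + random(20, 121) * 1000UL;
    }

    switch (mode) {
        case REACT_VIDEO:  /* map camera motion onto the pixels     */ break;
        case REACT_SOUND:  /* map microphone level onto the pixels  */ break;
        case OVERLAY:      /* blend colours over the reactive modes */ break;
        case DO_OWN_THING: /* ignore the tent, move pretty colours  */ break;
    }
}
```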

As a result we got some interesting behaviours out of both the system and the audience! Some of the most interesting comments went something like: "It seemed to be watching me walk around, then it got bored and did something else."

Here's a video of the sort of interaction you get with the system; as you can see, it's a lot of fun (ignore the flickering: my camera went weird at the weekend, and the system doesn't actually flicker in real life):



I couldn't resist making a timelapse of the disassembly:




Perhaps the most rewarding comments came from about four different people, who all compared Milk Pixel to some of Brian Eno's installation work and all said we should aim for more exposure in larger venues. Obviously, we were flattered!

Thursday, 6 May 2010

Milk Pixel on arts trail this weekend

An improved version of the Milk Pixel interactive digital art installation will be featured on the Southbank Bristol Arts Trail this Saturday and Sunday (8th-9th May) between 10am and 4pm.


Since the Arnolfini installation we've managed to improve the structure of the system, iron out the (flickering and latency) bugs in the code and automate the software (which will save me curating the piece for two exhausting days like last time). This should all improve the experience of watching or interacting with the system.



We will be installing the system in the Creative Glass Guild (No. 16 Whitehouse Street, Bedminster, Bristol BS3 4AY), where a number of other artists will also be showing paintings, drawings, ceramics and glass sculptures.

There was an embedded Google Street View map of the location here, but I removed it as it was confusing my browser. A map of the other venues on the trail can be found here.

Entry is of course free, so if you're in Bristol come and have a look and a play (don't forget it's an interactive piece). Feel free to take photos and videos. One of my biggest kicks at the first installation was seeing so many people taking photos of the system, then finding some of the images and videos online via Flickr and YouTube. So share your media (leave a comment on this blog, if you like, to tell us where it is) and make us happy.

Thursday, 22 April 2010

Distinguished Research Seminar Series

Tomorrow afternoon I'll be speaking at the University of Wales' Robotic Intelligence Laboratory as part of their invited seminar series. My talk will be on the subject of my PhD: Biologically Inspired Reaching Techniques for Humanoid Robots.


The talk will be streamed through the virtual research centre in personal robotics, though I'm afraid you have to be a member of the research centre to watch it. I'll see if there is some way I can get a recording of the stream to post online.

Friday, 12 March 2010

ACC publication

A paper I submitted to the 2010 American Control Conference (ACC) has been accepted for presentation and publication in the conference proceedings.

The paper is titled "An Optimal Sliding Mode Controller Applied to Human Motion Synthesis with Robotic Implementation" and describes a new type of nonlinear controller that I developed with my PhD supervisors Dr. Guido Herrmann and Professor Chris Melhuish.
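For readers who haven't met sliding mode control before, the classic first-order version (the textbook idea, not our optimal variant) defines a sliding surface on the tracking error e and forces the state onto it with a switching term:

```latex
s = \dot{e} + \lambda e, \qquad \lambda > 0,
\qquad
u = u_{eq} - K\,\operatorname{sign}(s), \qquad K > 0
```

Once the state reaches s = 0 it 'slides' along the surface, where the error obeys e-dot = -lambda e and decays exponentially regardless of (bounded) model errors; the switching gain K is what buys that robustness.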

Thursday, 11 March 2010

Milk Pixel in Realtime magazine


Milk Pixel has been featured in issue #95 of the Australian contemporary arts magazine Real Time. The article 'Coders, Crafters and Crooks' is a report on the Craftivism and Uncraftivism events at the Arnolfini (many videos of which are now available on the uncraftivism wiki). Here's what it says about us:

"Flickering from across the room was Milk Pixel built by the Bristol Robotics Lab, an inspired re-use project incorporating LEDs into two-litre plastic milk bottles. The 64 bottle/pixel array continuously responded to sound performances and moving image in unexpected collaborations and improvisations."


Interestingly, the picture chosen to head the article (shown above) was one I took of Jason and Paul (from Potential Indifference) debugging Milk Pixel a day or so before the event. It's nice to think that the editors felt the hands-on sorting of a tangle of ribbon cable and recycled junk represented the craftivism movement to such a degree.

The Article was written by Melinda Rackham, an Emerging Artforms Curator and Adjunct Professor at RMIT University.

Just for the record, Milk Pixel no longer flickers; we fixed that bug :-)

Haptic Theatre Update

An online update on the collaborative project I'm working on with Extant (Britain's only professional performing arts company of visually impaired people) has appeared in the News Flash section of the Extant website.

The project involves "creating an immersive dramatic environment that invited audience members will explore using a specially designed hand held Haptic device. Currently the team of project partners, artists and engineers are working on areas of script development, sound and set design, and technical construction of the device and its support system, along with sensory trigger mechanisms to connect different elements of the installation".
For more details please visit http://www.extant.org.uk/question.html

Saturday, 27 February 2010

Potential Indifference


Quite recently my application to get Milk Pixel included in the Southbank Bristol Arts Trail was accepted. Great stuff! As the application form was intended for a single person or group, it was necessary to create a name for myself and the three other guys who built Milk Pixel.

And so it is my great pleasure to officially announce our digital art collective: Potential Indifference. Those in the know will realise that this is quite similar to the official description of Voltage. The second reason for the name is that this is how our work may be viewed: we sit somewhere between engineering and art, not sure if we really impress people from either discipline...

Potential Indifference consists of:


I used Soulwire's brilliant Glitch Generator to generate the glitches in the Potential Indifference logo. Lots of fun to be had with that thing.

The Southbank Bristol Arts Trail will take place over the weekend of 8th & 9th May 2010. More details to follow about where to find Milk Pixel.

Tuesday, 16 February 2010

Jen Hui Liao's Self-Portrait Machine

A very original take on a drawing robot, discovered on we make money not art. Rather than simply drawing your portrait, the robot actually moves your own hands around so that you are guided to produce the portrait yourself.


Here's the blurb: "The project started with the observation that nearly everything that surrounds us has been created by machines. Our personal identities are represented by the products of the man-machine relationship. The Self-Portrait Machine encapsulates this man-machine relationship. By co-operating with the machine, a self-portrait is generated. It is self-drawn but from an external viewpoint through controlled movement and limited possibility. Our choice of how we are represented is limited to what the machine will allow."

I particularly like the fact that the user (or used) has a pen in each hand; it looks particularly unnatural...

CandleBot Finished!

After a solid Geekend of coding, CandleBot (a phosphorescent drawing robot) is finished. Well, my part of it is at least...
From its humble beginnings as an open-loop Perlin noise → joint angles system (which made some nice random curves), CandleBot has developed into a system capable of drawing trajectories received over any serial connection. Here is a video of a few shapes sent from Processing.



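Incidentally, the serial side is deliberately dumb so that anything (Processing, Python, whatever) can drive the robot. The real packet format is a bit more involved, but the idea is roughly this hypothetical sketch in Arduino-style C++, where moveTo() stands in for the motion code:

```cpp
// Hypothetical sketch: read "x,y\n" waypoints over serial and draw to them.

// Stand-in for the real motion code: run IK, then command the servos.
void moveTo(float x, float y)
{
    /* inverse kinematics + servo writes would go here */
}

void setup()
{
    Serial.begin(9600);
}

void loop()
{
    if (Serial.available()) {
        float x = Serial.parseFloat();   // next waypoint, robot frame
        float y = Serial.parseFloat();   // parseFloat skips the comma
        moveTo(x, y);
    }
}
```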
CandleBot was developed to its current stage as a platform for prototyping the low-level embedded motion controllers of Dada, a larger and more ambitious mini industrial robot that will also be used for drawing. CandleBot is now on its way to Justin Windle (Soulwire), who will be taking care of the high-level coding on the PC/Mac side needed to get the robot drawing something more worthwhile. In the meantime I'll be adding the extra embedded layers of code necessary to get the CandleBot drawing code working on Dada.

Monday, 15 February 2010

Many Arduinos!

Here's part of the Arduino Pro Mini order I received last week from Cool Components.


I'm often ridiculed by my techy colleagues for continually describing the benefits of Arduino for rapid prototyping (over PIC, for example). This is clearly another step in my obviously healthy obsession with this less geeky, more user-friendly platform.

I can't tell you what project all these chips are for (non-disclosure agreement) but it involves many Arduinos :-)

Monday, 1 February 2010

Return of CandleBot

About a year ago I started working with Justin Windle (aka Soulwire) on a drawing robot project, 'Dada', which aims to refurbish a defunct industrial robot for creative purposes. Progress has been slow for a number of reasons, but I recently realised that it would be much easier to develop my embedded drawing algorithms on a smaller and easier-to-control robot. Hence CandleBot (originally called CanvasBot) has been dusted off and recently equipped with inverse kinematics and a smoothing algorithm (which I developed on Sunday). It's working quite nicely, as this picture of the robot drawing a house shows:


For more details visit the Dada project website, where details of the algorithms, videos and future drawings will be posted: http://dada.soulwire.co.uk/
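Neither addition is exotic. For a planar two-link arm the inverse kinematics has a closed form, and the smoothing is just a first-order filter on the commanded target. A sketch under assumed geometry (the link lengths and gain are made up; CandleBot's real dimensions differ):

```cpp
#include <cmath>

// Closed-form IK for a planar two-link arm: given a target (x, y),
// compute shoulder/elbow angles. L1, L2 are hypothetical link lengths.
const double L1 = 0.10, L2 = 0.10;   // metres (made-up values)

bool ik2link(double x, double y, double &shoulder, double &elbow)
{
    double r2 = x*x + y*y;
    double c2 = (r2 - L1*L1 - L2*L2) / (2*L1*L2);
    if (c2 < -1.0 || c2 > 1.0) return false;        // target out of reach
    elbow    = std::acos(c2);                       // elbow-down solution
    shoulder = std::atan2(y, x)
             - std::atan2(L2*std::sin(elbow), L1 + L2*std::cos(elbow));
    return true;
}

// First-order exponential smoothing of the commanded target, so the
// pen doesn't jerk between waypoints. alpha in (0,1]: smaller = smoother.
void smoothTarget(double &tx, double &ty, double x, double y,
                  double alpha = 0.2)
{
    tx += alpha * (x - tx);
    ty += alpha * (y - ty);
}
```

Run the smoother every control tick towards the latest waypoint, feed its output through the IK, and you get the nice rounded curves in the photo above.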