Tuesday 15 December 2009

Milk Pixel at the Arnolfini

After a week of stress and late nights, Milk Pixel was up and running in the Arnolfini during Saturday and Sunday's uncurated open event unCraftivism. We kept the project more or less under wraps until the day so reactions were very interesting. Here are some pictures:


Milk Pixel is essentially a giant, interactive 8x8 red-green-blue LED array, with each LED pixel placed inside a plastic milk bottle. The textured plastic of the bottles diffuses the light from the LEDs nicely to give some interesting colour combinations.


The patterns of colours are generated from webcam and microphone input as well as a number of manual controls. This means that the system responds to motion and sound. During the weekend the array was feeding off live musicians, projected films, games of tambourelli (badminton with tambourines) and anyone who happened to walk by during quiet moments. The audio-visual driver software was written entirely in Processing. In the above photo the green parts of the array reflect the movements of the brass instrument and its player.
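For anyone curious about how the video side works, here's a stripped-down Processing sketch along the lines of what the driver does: grab a webcam frame and downsample it to an 8x8 grid of colours. Treat it as an illustrative sketch rather than the actual installation code - the real driver also folds in the microphone and the manual controls, and sends the colours out to the LED hardware instead of drawing them on screen.

import processing.video.*;

Capture cam;
int GRID = 8;  // one cell per milk bottle

void setup() {
  size(400, 400);
  cam = new Capture(this, 320, 240);
  cam.start();
  noStroke();
}

void draw() {
  if (cam.available()) {
    cam.read();
  }
  cam.loadPixels();
  float cellW = width / (float) GRID;
  float cellH = height / (float) GRID;
  for (int gy = 0; gy < GRID; gy++) {
    for (int gx = 0; gx < GRID; gx++) {
      // sample the centre of the matching region of the camera image
      int px = (gx * cam.width + cam.width / 2) / GRID;
      int py = (gy * cam.height + cam.height / 2) / GRID;
      color c = cam.pixels[py * cam.width + px];
      fill(c);
      rect(gx * cellW, gy * cellH, cellW, cellH);
    }
  }
}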

Here is the system with myself (centre), Paul O'Dowd (left) and Thomas Burton (right). Jason Welsby is another member of the team who isn't in the photo.

Oh, and here's a video of it watching a movie while listening to a band. There are still some glitches that need to be ironed out so the representation of the audio / visual input is quite abstract at the moment...



If you were at unCraftivism and have any pictures of Milk Pixel I'd love to see them; leave a comment below...

Sunday 25 October 2009

Multiple LED pulsing via a 555 timer

UPDATE - The bug in the last schematic has been corrected. R2 had been incorrectly connected to Pin 7 and GND instead of Pin 7 and Pin 2.

I noticed a while back that there were a few people on various forums asking for simple ways to make LEDs pulse. During the summer I designed a circuit that pulsed a number of LEDs as part of a fancy dress costume. I was told that it looked great :-) Anyway, here is the circuit. It uses a single 555 timer in astable configuration to create a square wave. On the high part of the wave it lights one LED, and on the low part it lights the other LED. The big capacitors create the fading, charging and discharging to smooth the square wave into something more analog than digital.
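For anyone wanting to tweak the flash rate, the standard 555 astable timing equations apply (written with the generic R1, R2 and C labels, which won't necessarily match the reference designators on my schematic):

t_{high} = 0.693 (R_1 + R_2) C
t_{low}  = 0.693 R_2 C
f = 1 / (t_{high} + t_{low}) = 1.44 / ((R_1 + 2 R_2) C)

So a bigger timing capacitor or bigger resistors gives a slower pulse.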

Note: This is Version 3 of this circuit. A lot of people sent me messages saying that they were having a few issues building the circuit, and there were a few problems with the previous schematics. The first was that the line connecting pins 2 and 6 could have been misinterpreted as also connecting to pin 7. I've now colour coded that line and put junction indicators (black dots) to show where two wires should be connected together. I recently realised that while making this correction I accidentally connected R2 to GND instead of Pin 2. This is now fixed.
The other error was that the dotted-line part had the LED connected to ground after the capacitor instead of before it. As necro_nemisis pointed out, this meant that LED1 was on a direct path from 5V to ground. I dread to think of all the dead LEDs that drawing mistake was responsible for!

Sorry it takes me so long to do corrections...

The effect is quite nice and you can add more capacitors in series to get several LEDs to light up in a chain. Here's a video where I had 4 sets of LEDs running from one 555 timer (shown at the end).



In addition, here's a photograph of how I wire up the 555 timer IC on some stripboard (with the strips running vertically from top to bottom). The only cut tracks are under the 555 chip. Note that there is a red cable, hidden from view behind the left capacitor, which connects the other red cables together. I would only use this picture as a reference for the IC wiring rather than the caps, as it is hard to see which track the capacitors are connected to.

Friday 2 October 2009

Milk Pixel

Following the last post, we (myself and some colleagues from BRL) have decided to attempt to create a low-cost, low-complexity interactive installation for the Craftivism open event at the Arnolfini.
The project is to be called Milk Pixel. Reasons for the name will become evident in time. We don't want to release too many details of what we are up to at the moment, but more information will appear on the Craftivism wiki closer to the event (11-13 December).

Wednesday 23 September 2009

Open Event @ Arnolfini

I was recently approached by Rui Guerra of the V2 Institute for the Unstable Media, who was looking for participants for an Open Event, 'Craftivism', to be held at the Arnolfini (Bristol) in December.



Rui has given everyone a huge amount of freedom in what they want to host as part of the event, and my group (consisting of myself and other researchers from BRL) is no exception. I'll keep posting as I know more.

The Enactive Torch was presented at V2 by my colleague Tom Froese in February 2008.

Rider Spoke

I recently took part in Rider Spoke, an event combining ubiquitous computing with cycling that is currently being held in Bristol (with the Arnolfini as the hub). As a cyclist and a geek, I was in.

Here is a blurb from the Blast Theory website:


'The audience can take part either on their own bike or borrow one supplied by Blast Theory. Following a short introduction and a safety briefing you head out into the streets with a handheld computer mounted on the handlebars. You are given a question and invited to look for an appropriate hiding place where you will record your answer. The screen of the device acts primarily as a positioning system, showing where you are and whether there are any hiding places nearby. The interface employs imagery drawn from Mexican votive painting, sailor tattoos and heraldry: swallows flutter across the screen to show available hiding places, prefab houses indicate places where others have hidden.'

I thought the game was enjoyable, but it could have done with some context, and I seemed to come across quite a dark set of questions and answers compared to my girlfriend, which probably affected my enjoyment of the experience. We tried the game in its first few days after opening (such things book up quickly in Bristol), but it would be nice to try it again at the end of the week when more answers have been recorded.

The novelty of having a little Nokia computer on your handlebars is very cool too (though it's only held onto its bracket with Velcro - I was convinced it would fall off or get stolen).

Thursday 17 September 2009

Paper in Springer Book


My first PhD paper, "Robotic Implementation of Realistic Reaching Motion Using a Sliding Mode/Operational Space Controller", has been published in the Springer book series 'Lecture Notes in Computer Science' and may be found on Google Scholar (though I don't know if it can be downloaded outside an institution with a Springer licence).

If you would really like to read it (the paper, not the book) drop me a line and I can email you a copy.

Tuesday 15 September 2009

Conference Tour

I recently (well, relatively recently) returned from two robotics conferences where I presented some work from my PhD on synthesising human motion for robotic applications.

At the International Conference on Social Robotics in Incheon, South Korea, I presented a paper titled 'Implementation of Realistic Reaching Motion Using a Sliding Mode/Operational Space Controller' (abstract). This work described a controller that mimics the shoulder and elbow motions of a human for 2-dimensional reaching and then uses a sliding mode controller so that the method can be applied to practical systems. This paper is to be published in the Springer proceedings 'Advances in Robotics'.

At TAROS (Towards Autonomous Robotic Systems), held in Londonderry, Northern Ireland, I presented an extension of the above paper which augmented the robot's concept of posture 'effort' to include limits of motion (abstract). This is analogous to the discomfort felt by a human when stretching muscles or holding uncomfortable poses. Again, the benefit of this work lies in the application to physical robots, where respecting the limits of motion prevents mechanical damage to the system.
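For readers who like equations: the flavour of these 'effort' formulations (roughly speaking - this isn't the exact cost function from either paper) is to penalise something like the integrated squared joint torque over a movement,

J = \int_0^T \tau(t)^T \tau(t) \, dt

with the 'discomfort' extension adding extra penalty terms that grow rapidly as each joint approaches its limit of motion.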

At both conferences my work was well received and there were some very interesting comments and suggestions. Of course it was also a pleasure to visit South Korea and Northern Ireland, both as a tourist and as a robotics researcher. The image below was taken at the drinks reception in the Derry robotics lab and shows a research platform modified to distribute party snacks.



I don't think I can host PDFs on Blogspot, otherwise I would make the papers available for download. Here are the full references if you wish to do your own search:

Spiers, A., Herrmann, G., Melhuish, C., Pipe, A. and Lenz, A. (2009) Robotic Implementation of Realistic Reaching Motion Using a Sliding Mode/Operational Space Controller. In: S. Ge, U. Witkowski, D. Kim, J. Kim, U. Rückert, D. Sandberg, M. Wahde, C. Cho, J. Cabibihan and Y. Pan, eds. Advances in Robotics - FIRA RoboWorld Congress 2009, Incheon, 16th-20th August 2009. Heidelberg: Springer, pp. 230-238.

Spiers, A., Melhuish, C. and Herrmann, G. (2009) Implementing 'Discomfort' in Operational Space: Practical Application of a Human Motion Inspired Robot Controller. In: 10th Anniversary of TAROS - Towards Autonomous Robotic Systems, Londonderry 31st August – 3rd September 2009.

Wednesday 1 July 2009

Atari Punk Console - AdLab version

I've been rubbish at blogging recently as I finished this project quite a few weeks back but hadn't got round to making a decent video of it yet. This is my version of the infamous 'Atari Punk Console', a noisy little beast that uses two simple square wave generators to produce all manner of glitchy weirdness in the world's simplest (probably) synth.
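A bit of context (with the caveat that my version has been circuit-bent well away from the original): the classic Atari Punk Console is two 555 timers (or a single 556), the first wired as an astable oscillator that retriggers the second, which is wired as a monostable. The two pitch controls then roughly follow the standard timing relations

f_{astable} = 1.44 / ((R_1 + 2 R_2) C_1)
t_{mono} = 1.1 R_3 C_2

where R_1, R_2, R_3, C_1 and C_2 are generic labels rather than the designators on my board.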


I didn't like the sound of the original circuit too much, so I probed about a bit and ended up with this circuit-bent version with a bit more flexibility and enough knobs for a rasta colour scheme.

The left toggle produces continuous or intermittent tones via the black 'punch button'. The right toggle adjusts tone and the red button changes the waveform from square to, er, weird. Two body contacts on the back add some 'wobble' to the sound. Red and yellow are the two standard dials while yellow uncovers hidden sounds. A full video and circuit diagram will be posted eventually... probably. For now here is a video of the breadboard prototype, which shows where these sounds actually come from.

Tuesday 30 June 2009

'DADA' blog launched

DADA is a collaborative project between myself and Justin Windle, a freelance interactive designer and developer whose work is often based on generative illustrations that merge hand-drawn elements with computer-generated structure.

DADA stands for 'Dynamic Autonomous Drawing Agent' and is the name given to a project that aims to construct a robot capable of drawing in a variety of media. The hardware platform is a recycled 3-DOF pick-and-place robot that was custom built for Thames Water some years ago.

Following its retirement, the robot was donated to the University of Reading, which extracted various elements of the system, leaving a body with motors but no power or control. Luckily I happened to be in the right place at the right time and intercepted the robot before it arrived at its final destination, the skip. That was over 4 years ago, and since that time the robot has sat immobile under various tables until this project came along.



While Justin works on creating the images that the robot will eventually draw, I am working on turning the currently broken and incomplete shell of a machine into something capable of drawing them, merging the robot's mechanistic form and purpose with the organic nature of the generative art that Justin creates.
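As a toy illustration of the sort of hand-off we have in mind (entirely hypothetical - the real pipeline, file format and names are not settled yet), here's a little Processing sketch that records a drawing as a list of (x, y, pen) points and dumps it to a text file that a plotter-style robot could step through:

import java.util.ArrayList;

ArrayList<PVector> path = new ArrayList<PVector>();

void setup() {
  size(500, 500);
  background(255);
}

void draw() {
  if (mousePressed) {
    line(pmouseX, pmouseY, mouseX, mouseY);
    // z stores the pen state: 1 = pen down
    path.add(new PVector(mouseX, mouseY, 1));
  }
}

void keyPressed() {
  // press 's' to write the path out as one "x y pen" line per point
  if (key == 's') {
    String[] lines = new String[path.size()];
    for (int i = 0; i < path.size(); i++) {
      PVector p = path.get(i);
      lines[i] = (int) p.x + " " + (int) p.y + " " + (int) p.z;
    }
    saveStrings("path.txt", lines);
  }
}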

The project's blog can be found at http://dada.soulwire.co.uk/about

Wednesday 18 March 2009

Music on MySpace

I've started a MySpace page for the music I've been making using Ableton Live and my custom MIDI controller. At the moment there is only one song on the site, but I'll add more as I finish them (don't expect any updates soon though).

www.myspace.com/snrpeludo

I got called 'Peludo' once in Belize, made me laugh, hence the name.

Wednesday 11 March 2009

PhD topic in 'The Engineer'

An article by my co-supervisor, Prof. Chris Melhuish, has been featured in The Engineer. The article highlights some of the challenges in humanoid robot control that we are trying to address at BRL and features a picture of the BERT robot that I work with. Unfortunately the article cannot be read online.



While searching for an online copy of the new article I discovered that last July a description of my PhD research, written by my supervisor Dr Guido Herrmann, was also published in the same magazine:

'There are two levels to the control project,' said Herrmann. 'In the first, the robot operates like a machine in a production line. You can tell a robot to move a cup from a height of 0m to 0.4m, for instance. The second, operating on top of the machine motion, is a controller to create movement that is very human-like.

'The best way for a human to move is to minimise the muscle effort, and that's what we want to implement in robots. So we must measure that in humans and model it for our robots.'

The full July article can be read here:
http://www.theengineer.co.uk/Articles/300582/A+feel+for+the+future.htm

Thursday 5 February 2009

Sensory Augmentation Workshop

I will be demonstrating the latest Enactive Torch (ET3) at the Key Issues in Sensory Augmentation Workshop at the University of Sussex (26-27th March).


The event aims to bring together researchers from various fields in order to address the following three questions:
  • Are there rigorous techniques that can characterise the subjective experience of using sensory augmentation technology?

  • How can empirical experiments with sensory augmentation devices be used to further philosophical and psychological enquiry into cognition and perception?

  • What technologies are available for building sensory augmentation devices?

I will also be helping to run a technical workshop on constructing your own sensory augmentation device using my favourite open-source platform, the Arduino.

This event is free to attend, though you must follow the participation instructions on the e-sense project page.