The API for the physical world is coming – or – how a night out in Bath ended with chickens and sword dancing

Last night I made it for the first time to one of the BathCamp evening events organised by Mike Ellis in Bath. The original BathCamp was a BarCamp held in late summer last year that I couldn’t get to, but Mike has been organising a couple of evenings since. I thought I’d go along because it sounded like fun, I knew a few of the people going would be good value, and I didn’t know any of the others.

Last night there were two talks. The first was from Dale Lane, talking about home electricity use monitoring, with a focus on what can be done with the data. He showed monitoring, mashups, competitions for who could use the least electricity, and widgets to make your phone beep when consumption jumped; the latter is apparently really irritating, as your phone goes off every time the fridge comes on. Irritating, but it makes you think. Dale also touched on the potential privacy issues, a very large grey area of definite interest to me: who owns this data, and who has the right to use it?

Dale’s talk was followed by one from Ben Tomlinson, about the Arduino platform. Now maybe everyone knows about this already, but it was new to me. Essentially Arduino is a cheap open source platform for building programmable interactive devices. The core is a £30 board with enough processing power and memory to run reasonably sensible programs. There is already quite a big code base which you can download and then flash onto the boards. The boards themselves take some basic analogue and digital inputs. But then there are all the other bits and bobs you can get, some generic parts with standard interfaces and some built to fit the boards; literally plug and play, including motion sensors, motor actuators, ethernet, wireless, even web servers!
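
To give a flavour of what programming one of these boards actually involves, here is the sort of minimal sketch you might write to read a sensor and stream timestamped readings back over the serial port. The pin, the sensor, and the one-second interval are placeholders of my own choosing, not anything from Ben’s talk:

```cpp
// Minimal Arduino-style sketch: read an analogue sensor once a second and
// stream "time,value" lines over the serial port. Pin and timing are arbitrary.
const int sensorPin = A0;   // whatever sensor happens to be wired to A0

void setup() {
  Serial.begin(9600);       // open the serial link back to the host machine
}

void loop() {
  int reading = analogRead(sensorPin);  // 0-1023 from the 10-bit ADC
  Serial.print(millis());               // milliseconds since the board started
  Serial.print(",");
  Serial.println(reading);              // one "time,value" line per reading
  delay(1000);                          // take a reading every second
}
```

That is more or less the whole job: flash it onto the board, open the serial monitor, and you have a stream of numbers to do something with.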

Now these were developed almost as toys, or for creatives, but what struck me was that this was a platform for building simple, cheap scientific equipment: things to do basic measurement jobs and then stream out the results. One thing we’ve learnt over the years is that the best way to make well-recorded and reliable measurements of something is to take humans out of the equation as far as possible. This platform basically makes it possible to knock up sensors, communication devices, and responsive systems for less than the price of a decent mp3 player. It is a commoditized instrument platform, and one that comes with open source tools to play with.

Both Dale and Ben mentioned a site I hadn’t come across before, called Pachube (I think it is supposed to be a contraction of “patch” and “YouTube”, but I’m not sure; no-one seemed to know how to pronounce it), which provides a simple platform for hooking up location-centric streamed data. Dale is streaming his electricity consumption data in; someone is streaming the flow of the Colorado River. What could we do with lab sensor data? Or even experimental data? What happens when you can knock up an instrument to do a measurement in the morning and then just leave it recording for six months? What if that instrument then starts interacting with the world?
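
Just to make the idea concrete, a board with an ethernet shield could push each reading to a Pachube-style feed over plain HTTP with something along the lines of the sketch below. I should be clear that the host name, feed ID, API key header, and CSV format here are my assumptions about how such a service might be driven, not details lifted from the Pachube documentation, so treat it as an illustration rather than a recipe:

```cpp
// Sketch of streaming a reading to a web feed, assuming the standard Arduino
// Ethernet shield and library. All network and API details are placeholders.
#include <SPI.h>
#include <Ethernet.h>

byte mac[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED };  // shield MAC address
IPAddress ip(192, 168, 1, 50);                        // static address on the LAN
const char server[] = "api.example.org";              // assumed feed API host
const char apiKey[] = "YOUR_API_KEY";                 // placeholder key
const int feedId = 12345;                             // placeholder feed ID

EthernetClient client;

void setup() {
  Ethernet.begin(mac, ip);   // bring the network interface up
  delay(1000);               // give the shield a moment to settle
}

void loop() {
  int reading = analogRead(A0);      // the sensor, whatever it is
  String body = String(reading);     // a single CSV value as the request body

  if (client.connect(server, 80)) {
    // One HTTP PUT per reading to the feed's CSV endpoint (assumed format)
    client.print("PUT /v2/feeds/");
    client.print(feedId);
    client.println(".csv HTTP/1.1");
    client.print("Host: ");
    client.println(server);
    client.print("X-ApiKey: ");
    client.println(apiKey);
    client.print("Content-Length: ");
    client.println(body.length());
    client.println("Connection: close");
    client.println();
    client.print(body);
  }
  client.stop();

  delay(60000);  // one reading a minute is plenty for most monitoring jobs
}
```

Leave something like that running and you have exactly the “set it up in the morning, let it record for six months” instrument I’m talking about.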

All of this has been possible for some time if you have the time and the technical chops to put it together. What is changing is two things. The first is the ease of being able to plug and play with these systems: download a widget, plug in a sensor, connect to the local wireless. As this becomes accessible, people start to play; it was no accident that Ben’s central example was a set of hacked toy chickens. The second is that as these systems become commoditized the prices drop until, as Dale mentioned, electricity companies are giving away the meters and an Arduino kit can be put together for less than £50. Not only will this let us do new science, it will let us involve the whole world in doing that science.

When you abstract away the complexities and details of dealing with a system into a good API, you enable far more people than just the techies to start wiring things up. When artists, children, scientists, politicians, school teachers, and journalists feel comfortable putting these things together, amazing things start to happen. These tools are very close to being the API that links the physical and online worlds.

The sword dancing? Hard to explain but probably best to see it for yourself…

Twittering labs? That is just so last year…

[Image: the Mars Phoenix Twitter stream]

The Mars Phoenix landing has got a lot of coverage around the web, particularly from some misty-eyed old blokes who remember watching landings via the Mosaic browser in an earlier, simpler age. The landing is cool, but one thing I thought was particularly clever was JPL’s use of Twitter to publicise the landing and what is happening on a minute-to-minute basis. Now my suspicion is that they haven’t actually installed Twhirl on the Phoenix Lander, and that there is a person at JPL writing the tweets. But that isn’t the point. The point is that the idea of an instrument (or in this case a spacecraft) outputting a stream of data is completely natural to people. The idea of the overexcited lander saying ‘come on rocketsssssss!!!!!!!!’ is very appealing (you can tell it’s a young spaceship, it hasn’t learnt not to shout yet; although if your backside was at 2,000 °C you might have something to say about it as well).

I’ve pointed out some cool examples of this in the past, including London Bridge, and Steve Wilson, in Jeremy Frey’s group at Southampton, has been doing some very fun stuff both logging what happens in a laboratory and blogging that out to the web, using the tools developed by the Simile team at MIT. The notion of the instrument generating a data stream and feeding that stream into an authoring tool like a laboratory notebook, or into other automated processes, is a natural one that fits well both with the way we work in the laboratory (even when your laboratory is the solar system) and with our tendency to anthropomorphise our kit. However, the day the FPLC tells me it had a hard night and doesn’t feel like working this morning is the day it gets tossed out. And the fact that it was me that fed it the 20% ethanol is neither here nor there.
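
To make that stream-into-notebook pattern concrete in the smallest possible way (and this is a toy of my own, nothing to do with the Southampton or Simile tools), a little program on the host machine could simply append each “time,value” line coming off the instrument to a log file that a notebook or other automated process picks up later:

```cpp
// Toy host-side logger: read "time,value" lines (e.g. piped in from the
// instrument's serial port) and append them to a CSV log for later use.
// The file name and line format are placeholders, not anyone's real system.
#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::ofstream log("instrument-log.csv", std::ios::app);  // append-only record
    std::string line;
    while (std::getline(std::cin, line)) {
        if (line.empty()) continue;   // ignore blank lines from the serial link
        log << line << '\n';          // keep the raw reading as it arrived
        log.flush();                  // make sure it is on disk straight away
        std::cout << "logged: " << line << '\n';
    }
    return 0;
}
```

Pipe the serial device into it (something along the lines of `cat /dev/ttyUSB0 | ./logger`, with the device name depending on your setup) and the record more or less writes itself.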

Now the question is: can I persuade JPL to include actual telemetry, command, and acknowledgement data in the Twitter stream? That would be very cool.

Friendfeed, lifestreaming, and workstreaming

As I mentioned a couple of weeks or so ago I’ve been playing around with Friendfeed. This is a ‘lifestreaming’ web service which allows you to aggregate ‘all’ of the content you are generating on the web into one place (see here for mine). This is interesting from my perspective because it maps well onto our ideas about generating multiple data streams from a research lab. This raw data then needs to be pulled together and turned into some sort of narrative description of what happened.