Hello IOT-Kit, Introduction By Example

February 12, 2018 at 01:54 AM | categories: iotkit, python, iot, C++, definitions, iotoy

IOT-Kit is a toolkit for enabling the creation of IOT devices by people who can make simple arduino based devices. Rather than waffle on about why and how, I think the best way of explaining what it does is by example.

Specifically, this post covers:

  • Suppose you can make little arduino based robots
  • Suppose you want to remote control your robots over the local network, trivially from something like python. (Really over HTTP in fact!)

What do you as a device maker need to do to make this happen?

IOT-Kit - Make your Arduino Device an IOT Device, easily

So the really short version is this: you can make a simple robot, but you want to make it usable as an IOT device too. You don't want to build the entire stack. You don't want to build everything yourself.

You want your users to be able to type something like this program and have it search for the robot on the network, and have it control the robot.

from iotoy.local import simplebot
import time
import random

simplebot.lights = 0
while True:
    choice = random.choice(("forward", "backward", "left", "right", "blink", "stop"))
    if choice == "forward":
        simplebot.forward()
    if choice == "backward":
        simplebot.backward()
    if choice == "left":
        simplebot.left()
    if choice == "right":
        simplebot.right()
    if choice == "blink":
        for i in range(3):
            simplebot.lights = 1
            time.sleep(0.5)
            simplebot.lights = 0
            time.sleep(0.5)
    if choice == "stop":
        simplebot.stop()

    time.sleep(5)

NOTE: While this is python, it actually maps to a bunch of deterministic HTTP web calls, and can be written in any language. iotoy/iotkit just provides a bunch of convenience functions to do these calls in a way that also maps cleanly to python. (It would map just as cleanly to javascript, ruby, perl, etc.)
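
To give a flavour of what that means, here's a rough sketch of the kind of HTTP calls the python above might boil down to. This is illustrative only - the URL scheme, methods and payloads below are assumptions made for this sketch, not the documented iotoy protocol:

import requests

BASE = "http://simplebot.local"           # hypothetical address, found via discovery

requests.post(BASE + "/forward")          # roughly what simplebot.forward() amounts to
requests.put(BASE + "/lights", data="1")  # roughly what simplebot.lights = 1 amounts to

lights = int(requests.get(BASE + "/lights").text)  # reading the attribute back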

How do we get to this?

Building the Robot - Hardware

This is the easy part. We could use a DAGU mini-driver. This can control a number of servos and also provides serial access over a plain old hardware serial bluetooth module.

Building the Robot - Software, No IOT

If we were just controlling the robot without any remote control, we could use Pyxie to program this. The Pyxie program you might use could look like this:

#include <Servo.h>

leftwheel = Servo()
rightwheel = Servo()

headlights_led_pin = 13
leftwheel_pin = 2
rightwheel_pin = 3

pinMode(headlights_led_pin, OUTPUT)
leftwheel.attach(leftwheel_pin)
rightwheel.attach(rightwheel_pin)

leftwheel.write(90)
rightwheel.write(90)

while True:
    leftwheel.write(180)
    rightwheel.write(180)
    delay(500)

    leftwheel.write(180)
    rightwheel.write(0)
    delay(500)

    leftwheel.write(0)
    rightwheel.write(0)
    delay(500)

    leftwheel.write(0)
    rightwheel.write(180)
    delay(500)

    leftwheel.write(90)
    rightwheel.write(90)
    delay(500)

    digitalWrite(headlights_led_pin, HIGH)
    delay(1000)
    digitalWrite(headlights_led_pin, LOW)
    delay(1000)

This program assumes two continuous rotation servos, where the centre point of 90 means stationary, 0 means full reverse, and 180 means full forward.

What this program means is "forward, right, backward, left, stop, blink headlights".

Pyxie generates C++ code, which we can use as a starting point for our code:

#include <Servo.h>

#include "iterators.cpp"

void setup() {
    int headlights_led_pin;
    Servo leftwheel;
    int leftwheel_pin;
    Servo rightwheel;
    int rightwheel_pin;

    leftwheel = Servo();
    rightwheel = Servo();
    headlights_led_pin = 13;
    leftwheel_pin = 2;
    rightwheel_pin = 3;
    pinMode(headlights_led_pin, OUTPUT);
    (leftwheel).attach(leftwheel_pin);
    (rightwheel).attach(rightwheel_pin);
    (leftwheel).write(90);
    (rightwheel).write(90);
    while (true) {
        (leftwheel).write(180);
        (rightwheel).write(180);
        delay(500);
        (leftwheel).write(180);
        (rightwheel).write(0);
        delay(500);
        (leftwheel).write(0);
        (rightwheel).write(0);
        delay(500);
        (leftwheel).write(0);
        (rightwheel).write(180);
        delay(500);
        (leftwheel).write(90);
        (rightwheel).write(90);
        delay(500);
        digitalWrite(headlights_led_pin, HIGH);
        delay(1000);
        digitalWrite(headlights_led_pin, LOW);
        delay(1000);
    };
}

void loop() {
}

Making a simple device abstraction layer for our device

A device abstraction layer just means creating names for the key functionality we care about. At some point, pyxie will help here, but pyxie is currently very simple and can't create functions, so we take the C++ code we have so far and work from there.

The iterators in the iterators.cpp file are not used, so we can ditch that include too.

Creating functions for functionality

So our first step is to pull out and name all the functions. While we're at it, unlike pyxie, we'll split the code into what would normally go in setup() and loop() in an arduino program.

#include <Servo.h>

int headlights_led_pin;

Servo leftwheel;
Servo rightwheel;

int leftwheel_pin;
int rightwheel_pin;

void forward() {
    leftwheel.write(180);
    rightwheel.write(180);
    delay(500);
}
void backward() {
    leftwheel.write(0);
    rightwheel.write(0);
    delay(500);
}
void left() {
    leftwheel.write(180);
    rightwheel.write(0);
    delay(500);
}
void right() {
    leftwheel.write(0);
    rightwheel.write(180);
    delay(500);
}
void stop() {
    leftwheel.write(90);
    rightwheel.write(90);
    delay(500);
}
void lights_on() {
    digitalWrite(headlights_led_pin, HIGH);
}
void lights_off() {
    digitalWrite(headlights_led_pin, LOW);
}

void setup() {
    leftwheel = Servo();
    rightwheel = Servo();
    headlights_led_pin = 13;
    leftwheel_pin = 2;
    rightwheel_pin = 3;

    pinMode(headlights_led_pin, OUTPUT);
    leftwheel.attach(leftwheel_pin);
    rightwheel.attach(rightwheel_pin);
    leftwheel.write(90);
    rightwheel.write(90);
}

void loop() {
    forward();
    right();
    backward();
    left();

    lights_on();
    delay(1000);
    lights_off();
    delay(1000);
}

Device abstraction for our robot

So the device abstraction layer for our device has the following signature:

void forward();
void backward();
void left();
void right();
void stop();
void lights_on();
void lights_off();

This is what we need to build an IOT-Kit interface for.

Minimal IOT-Kit Interface

Our starting point for our IOT-Kit interface is something minimal. Initially we'll try to cover the following parts of our device abstraction:

void forward();
void stop();
void lights_on();
void lights_off();

We'll then add the rest in.

Changes to support minimal control API

We add the following include near the top of the file:

#include <CommandHostTiny.h>

In order to make our device introspectable and controllable, we need to add in a class which subclasses "CommandHostTiny".

The skeleton of this class looks like this:

class SimplebotHost : public CommandHostTiny {
private:
    char temp_str[128];   // needed for parsing input
    int lights;           // To store state of the headlight
public:
    SimplebotHost() : lights(0) { }
    ~SimplebotHost() { }

    const char *hostid(); // Returns the name of the device
    const char * attrs(); // Returns the list of attributes(+types) that can be changed
    const char * funcs(); // Returns the list of functions the device understands.

    bool has_help(char * name); // To allow us to find out whether a given name has help.

    void help(char * name); // Returns the help for a given name - usually a function
                            // Includes machine parsable type signature

    bool exists(char * attribute); // Returns true/false for an attribute existing.

    const char *get(char * attribute); // Gets the value for an attribute

    int set(char* attribute, char* raw_value); // Sets the value for attributes

    int callfunc(char* funcname, char* raw_args); // Calls the given function with given raw_args
};

So by way of example, hostid, attrs and funcs in this case look like this:

const char *hostid() {    return "simplebot";     }
const char * attrs() {    return "lights:int";    }
const char * funcs() {    return "forward,stop";  }

Note that the name returned as host id here - "simplebot" - is used as the name to advertise the robot on the network, and that is how this line of python is made to work:

from iotoy.local import simplebot
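
The ".local" suffix suggests the advertised name is found using multicast DNS. As a purely hypothetical sketch of the idea (not the actual iotoy implementation), the import boils down to resolving the advertised name and then introspecting the device so the python object can grow matching attributes and methods:

import socket
import requests

# Works where the operating system can resolve mDNS ".local" names
address = socket.gethostbyname("simplebot.local")

# Hypothetical introspection step: ask the device what it offers
description = requests.get("http://%s/" % address).text
print(description)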

Help is implemented in two ways - firstly noting whether help is available for a given name, and then returning the help itself:

bool has_help(char * name) {
    if (strcmp(name,"forward")==0) return true;
    if (strcmp(name,"stop")==0) return true;
    return false;
}

void help(char * name) {
    if (strcmp(name,"forward")==0) Serial.println(F("forward -> - Move forward for 1/2 second"));
    else if (strcmp(name,"stop")==0) Serial.println(F("stop -> - Stop moving"));
    else Serial.println(F("-"));
}

Attribute handling is then done as follows. Note we only have one attribute - lights. And here I choose to update the LED state whenever the lights value changes:

bool exists(char * attribute) {
    if (strcmp(attribute,"lights")==0) return true;
    return false;
}

const char *get(char * attribute) {
    if (strcmp(attribute,"lights")==0) { 
        itoa (lights, temp_str, 10); 
        return temp_str; 
    }
    return "-";
}

int set(char* attribute, char* raw_value) {
    if (strcmp(attribute,"lights")==0) {
        int value = atoi(raw_value);
        lights = value;
        if (lights) {
            lights_on();
        } else {
            lights_off();
        }
        return 200;
    }
    return 404;
}

Handling function calls is pretty simple:

int callfunc(char* funcname, char* raw_args) { 
    if (strcmp(funcname,"forward")==0) { forward(); return 200; }
    if (strcmp(funcname,"stop")==0) { backward(); return 200; }
    return 404; 
}
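
Underneath the HTTP layer the device itself only speaks a simple request/response protocol over serial; the actual wire format is defined by CommandHostTiny and isn't covered in this post. Purely to illustrate the shape of the conversation - the command strings below are invented for this sketch - poking the minimal interface from python over a serial link might look something like this:

import serial  # pyserial

port = serial.Serial("/dev/rfcomm0", 9600, timeout=1)  # bluetooth serial link - adjust for your setup

def ask(request):
    # Send one request line, read one response line (protocol shape assumed, not documented here)
    port.write((request + "\n").encode("ascii"))
    return port.readline().decode("ascii").strip()

print(ask("funcs"))             # hypothetically: "forward,stop"
print(ask("callfunc forward"))  # hypothetically: "200"
print(ask("set lights 1"))      # hypothetically: "200"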

IOT-kit final step

At this stage, the command host isn't being used.

Our final step in our transformation boils down to:

  • Add the other functions from our device abstraction
  • Move the setup for the robot into a setup function in the class
  • Make sure that setup also sets up the command host
  • Make the arduino setup() function set up our robot via the command host
  • Remove the custom code from loop() and run the command host instead.

In practice this means that our final code looks like this:

#include <Servo.h>
#include <CommandHostTiny.h>

int headlights_led_pin;

Servo leftwheel;
Servo rightwheel;

int leftwheel_pin;
int rightwheel_pin;

void forward() {
    leftwheel.write(180);
    rightwheel.write(180);
    delay(500);
}
void backward() {
    leftwheel.write(0);
    rightwheel.write(0);
    delay(500);
}
void left() {
    leftwheel.write(180);
    rightwheel.write(0);
    delay(500);
}
void right() {
    leftwheel.write(0);
    rightwheel.write(180);
    delay(500);
}
void stop() {
    leftwheel.write(90);
    rightwheel.write(90);
    delay(500);
}
void lights_on() {
    digitalWrite(headlights_led_pin, HIGH);
}
void lights_off() {
    digitalWrite(headlights_led_pin, LOW);
}

class SimplebotHost : public CommandHostTiny {
private:

    char temp_str[128];   // needed for parsing input
    int lights;           // state of the headlights

public:
    SimplebotHost() : lights(0) { }
    ~SimplebotHost() { }

    const char *hostid() {    return "simplebot";     }
    const char * attrs() {    return "lights:int";    }
    const char * funcs() {    return "forward,backward,left,right,stop";  }

    bool has_help(char * name) {
        if (strcmp(name,"forward")==0) return true;
        if (strcmp(name,"backward")==0) return true;
        if (strcmp(name,"left")==0) return true;
        if (strcmp(name,"right")==0) return true;
        if (strcmp(name,"stop")==0) return true;
        return false;
    }

    void help(char * name) {
        if (strcmp(name,"forward")==0) Serial.println(F("forward -> - Move forward for 1/2 second"));
        else if (strcmp(name,"backward")==0) Serial.println(F("backward -> - Move backward for 1/2 second"));
        else if (strcmp(name,"left")==0) Serial.println(F("left -> - Spin left for 1/2 second"));
        else if (strcmp(name,"right")==0) Serial.println(F("right -> - Spin right for 1/2 second"));
        else if (strcmp(name,"stop")==0) Serial.println(F("stop -> - Stop moving"));
        else Serial.println(F("-"));
    }

    bool exists(char * attribute) {
        if (strcmp(attribute,"lights")==0) return true;
        return false;
    }

    const char *get(char * attribute) {
        if (strcmp(attribute,"lights")==0) { 
            itoa (lights, temp_str, 10); 
            return temp_str; 
        }
        return "-";
    }

    int set(char* attribute, char* raw_value) {
        if (strcmp(attribute,"lights")==0) {
            int value = atoi(raw_value);
            lights = value;
            if (lights) {
                lights_on();
            } else {
                lights_off();
            }
            return 200;
        }
        return 404;
    }

    int callfunc(char* funcname, char* raw_args) { 
        if (strcmp(funcname,"forward")==0) { forward(); return 200; }
        if (strcmp(funcname,"backward")==0) { backward(); return 200; }
        if (strcmp(funcname,"left")==0) { left(); return 200; }
        if (strcmp(funcname,"right")==0) { right(); return 200; }
        if (strcmp(funcname,"stop")==0) { backward(); return 200; }
        return 404; 
    }

    void setup(void) {
        // Set up the command host first
        CommandHostTiny::setup();

        leftwheel = Servo();
        rightwheel = Servo();
        headlights_led_pin = 13;
        leftwheel_pin = 2;
        rightwheel_pin = 3;

        leftwheel.attach(leftwheel_pin);
        rightwheel.attach(rightwheel_pin);
        leftwheel.write(90);
        rightwheel.write(90);

        pinMode(headlights_led_pin, OUTPUT);
    }
};

SimplebotHost MyCommandHost;

void setup()
{
    MyCommandHost.setup();
}

void loop() {
    MyCommandHost.run_host();
}

Final notes

So, that's a whistle-stop tour of the device layer. The fun thing now: assuming this robot has hardware serial bluetooth (a la the dagu mini), this is everything you need to do as an arduino based maker to make your device an IOT-able device. If you're not using bluetooth, then your device just assumes it's doing serial down a cable.

Either way though, as a device maker, this is all the changes you need to do to enable the python program we started with to be able to control your robot over a network.

I'll explain how this works in a later blog post, but I thought this would make a good fun first example about how IOT-Kit gets implemented by a device maker to enable a very high level of abstraction to take place.


Escaping The Panopticon of Things?

February 11, 2018 at 10:48 PM | categories: iotkit, python, iot, C++, definitions, iotoy, opinion

The Panopticon of Things

The Internet of Things. Ask 100 different people what it means to them and you get 100 different answers. I know, because I have done... When you do, in my experience you get some recurring versions and themes.

For many companies, futurists and techies though, it boils down to some variation of this:

  • People have devices, which can detect things or have small amounts of processing power added
  • These devices monitor their state or activity, or similar
  • This is published on the internet via a central service
  • This information can be aggregated and worked on, and often can be drilled into down to individual items

But is that really an internet of things? Let alone "The internet of things"? No, it's internet connected things that report information about you, your environment or your activity to a centralised system. Some extend this to the idea of connecting these centralised systems to each other.

So no, they're not really an internet of things. They're a panopticon of things.

If you're doing this, stop and think. Do you really want to build a panopticon?

A Panopticon of Internet Connected Things

The idea of the panopticon is a relatively old idea. A panopticon was a building where all (pan-) the residents could be observed (-opticon). If that sounds a little creepy, consider it was originally meant as a design for a prison...

It's actually been implemented in both the real world and in fiction. In the real world, it's been implemented as prisons in a variety of places around the world... In fiction, the most recent mainstream "up-beat" example is the floating prison in Captain America: Civil War. The best known realisation of the idea of turning the wider world into a panopticon is the world of "big brother" in 1984.

One key point: the purpose of the panopticon is NOT to benefit those staying in the panopticon. The purpose is to benefit the owner of the panopticon in some fashion.

This means that any panopticon of things is designed to benefit the person running the panopticon, not the person who owns the things (however well intentioned the maker was/is). Indeed, it can mean the panopticon views you and your things as a product to be sold (advertising, data, etc), not as customers to provide value to. This isn't universally the case, but it's common enough.

I don't buy products to benefit some random company. Do you? I buy or use products either for my benefit or for the benefit of those I buy them for. Don't get me wrong, being able to opt-in can have benefits. Google maps being able to give you a different route based on real time data is useful.

But be clear - it's based on using data from a panopticon, built on internet connected things.

Obsolescence Really Means Junk

Internet connected things aren't really a new idea. That means we've now gone through the full product cycle more than once - be it a Nabaztag, Mattel IM-ME, AIBO, or similar. I've picked these ones for a variety of reasons:

  • They might have been acclaimed
  • The manufacturer thought it was a "Big" thing, and mass produced them
  • They seemed hackable and interesting
  • They're all kinda fun or interesting from some angle, but aren't really now

They all relied on some form of central service, and as those services disappeared, they became less useful or in some cases instantly useless junk. I also picked them because they all had many active hacker groups working to make them useful for many years - often in ways the original manufacturers didn't consider.

For each of these, there are dozens of other active objects with similar issues. They all relied on some form of central service. They all became obsolete when the service they relied on to work disappeared. They all contained interesting tech.

These devices all became junk. The value was in the service, not in the device - even though all of these devices had value to their owners, and could've retained that value without the central service.

A Panopticon of Internet Connected Junk

So this is really what this sort of internet of things really means. Building a network of benefit to the owner of the network using devices that become useless when the owner decides to cease supporting the devices.

That means the creation of electrical junk, that is wasteful, and in the end of limited benefit to the customer.

Reframing the question

Rather than ask "what is the internet of things", ask yourself "what is the internet of my things?", or "what should the internet of things be - for me?". Or I could ask you "what is the internet of things of yours?". For me, the answers look more like:

  • Tracking of my "stuff"
  • Monitoring and controlling my own devices which are networked
  • Taking my networks at home and using them all in a unified manner
  • Allowing my devices to work with each other.
  • Using my data and devices in ways that benefit me, and those I get them for.

These are somewhat different answers. Similar, but different. They don't preclude working with panopticons. But they do take a different angle.

This reframed question is the reason behind IOT-KIT and IOTOY, which I'll be describing in a short series of posts:

  • IOT-KIT and its motivation
  • How to use IOT-KIT to make your own IOT devices that have longevity of value to the owner.
  • IOTOY specifications
    • Device layer
    • Web Layer
  • An overview of how this was implemented in the microbit prototype
  • How to implement this in your own systems.

The core underlying ideas were:

  • Suppose you can make an arduino (or similar) based device. You should be able to make it an IOT-KIT based device trivially. (ie low barrier to entry)

  • Suppose you know only very limited python. You should be able to use and control the IOT devices that sit on your network. (Note, this allows you to then trigger behaviour between devices.)

  • No "centre". Minimal standard interfaces making it normal for individuals, groups, companies and consortia to create their own domain specific standards. (Much like these days we use JSON, rather than centralised XML schemas for many services...)

  • Plan for "obsolescence means ongoing utilty". If your devices can continue to remain useful after the manufacturer disappears, then you build value, not junk.

These goals are effectively all designed for a low barrier to entry, while still inter-operating.

If you're interested, please like, share, comment or similar, and as always feedback welcome.


Connected Studio - Coding For Teens, Building the Dresscode

July 23, 2014 at 03:13 AM | categories: wearables, arduino, programming, iot, BBC, kidscoding

At a Connected Studio Build Studio last week, I, along with two others in Team Dresscode, prototyped tools for a proposition based around wearable tech. We worked through the issues of building a programmable garment, a programmable accessory, and a web app designed for teaching the basics of programming garments and accessories in a portable fashion.

We achieved most of what we set out to achieve, and the garment (as basic as it was) and accessory (again very basic) had a certain "I want one" effect on a number of people, with the web app lowering the barrier to producing behaviours. It was a very hectic couple of days, but as a result it's now MUCH clearer how to make this attractive to real teenagers, rather than theoretical ones - which is a bonus.

Probably the most worthwhile thing I've done at work at the BBC in fact, and definitely this year - beyond sending a recommendation to people in strategy last summer regarding coding.

The rest of this post covers the background, what we built, why we built it and an overview of how.

NOTE Just because I work in BBC R&D does NOT mean this idea is something the BBC will work on or take on. It does NOT mean that the BBC approves or disapproves of this idea. It just means that I work there, and this was an idea that's been pitched. Obviously I believe in the idea and would like it to go through, but I don't get to decide what I do at work, so we'll see what happens.

The only way I can guarantee it will go through is if I do it myself - so it's well worth noting that this work isn't endorsed by the BBC - I "just" work there.

Connected What?

So, connected studio - what is that then?

Connected Studio is a new approach to delivering innovation across BBC Future Media. We're looking for digital agencies, technology start-ups and individual designers and developers (including BBC staff) who want to submit and develop ideas for innovative features and formats.

Note that this process is open to both BBC staff and external. It's a funded programme that seeks to find good ideas worth developing, and providing seed funding to test the ideas and potentially take forward to service. People can arrive in teams or form teams at the studio.

Connected Why?

And what was the focus for this connected studio?

  • Inspiring young people to realise their creative potential through technology
  • We want to inspire Britain's next generation of storytellers, problem solvers and entrepreneurs to get involved with technology and unlock the enormous creative potential it offers
  • The challenge in this brief is to create an appealing digital experience with a coding component for teenagers aged 13-16
  • Your challenge is to make sure we inspire not just teenagers in general, but teenage girls aged 13-16 in particular
    • Ideally, we're looking for ideas that appeal to both boys and girls, however, we're particularly keen to see ideas that appeal to girls.

The early stages involve an idea development day - called a Creative Studio. At this stage people either arrive with a team and some ideas and work them up in more detail on the day, arrive with the core of an idea and look for people to help work it up, or arrive simply interested in finding an idea they think is worth working on.

That results in a number of pitches for potential services/events/etc the BBC could take forward. Out of these a number are taken forward to a Build Studio.

What happened on the way to the Build Studio?

So, that's the background. I signed up for the studio because I was interested in joining a team and interested in working with people outside our department - with a grounding either in BBC editorial and/or an external agency's viewpoint. I also had the core of an idea with me - which I'll explain first.

At a meeting the previous week an attendee asked the question "If the stereotype of boys is that they're interested in sports, what's the stereotype for girls?". The answer raised by another person there was "fashion". Now, if you're a lady reading this who says "don't be so sexist, I'm not interested in fashion", please bear in mind I don't like sports - things like the commonwealth games, olympics, world cup, etc all leave me cold. Also please bear in mind the person asking the question was a lady herself, and the answer came from another lady at the meeting.

Anyway, true or false, that idea mulled around with me over the weekend - with me wondering "what's the biggest possible draw I can think of regarding fashion?". Now the closest I get to fashion is watching Zoolander, so I'm hardly a world authority. But even I know about London Fashion Week.

So, the idea I took with me to the creative studio was essentially wearable tech at London Fashion Week. I'd fleshed out the idea in my head more like this:

  • The BBC hosts an event at London Fashion week regarding wearable tech - with the theme of costumes for TV - which widens it beyond traditional fashion to props and so on. So you can have the light up suits and dresses but also props like wristbands, shoes, animatronic parrots, and so on. (Steampunk pirate?)

  • Anyone and everyone can attend, BUT the catch is they must have created a piece of wearable tech to take to the event - if they do however, they can take their friends with them, and one of them - either themselves or their friends can wear it down the catwalk.

To support this, you could have a handful of TV programmes and trails leading up to this, a web app for designing and simulating garments/etc of your choice, tutorials for how to build garments, and in particular allow transferable skills between the web app to creating programmable garments. So the TV drives you to the web, which drives you to building something, which drives you to an event, which drives you onto TV.

Now at the creative studio, there's essentially a "I've got this kind of idea" wall first thing in the morning - where if you've got an idea you can stand in front of those interested in joining a team and describe the idea- and then say what sort of skills/etc that you're after.

In my case I described the above idea, and said that I was primarily looking for people with an editorial or business oriented perspective. After all, while the above sounds good to me, I was never a teenage girl, and let's face it I was in the geeky non-cool demographic when I was a teenage boy :-) The first person who joined me was Emily, who has worked on various events at the BBC, primarily in non-techie areas including Radio 1's Big Weekend, followed by Ben from a digital agency and Tom from Nesta - a great mix of people. Thus formed Team Mix.

Team Mix's refined idea

We worked the idea through the day - with every element up for grabs and reshaping. The idea of being limited to London Fashion Week was widened to a broader set of possibilities - after all the BBC does lots of events, from real physical ones like Radio 1's Big Weekend, through Glastonbury, the Proms and Bang Goes the Theory, to event TV including things like The Voice.

After a session with a member of the target audience, we realised that while the idea was neat, asking a teenager to build something for a catwalk is actually asking an awful lot. Yes, showing off and finding your own identity is a big thing, but losing face, or doing something that lowers your status, is a real risk - "I couldn't make that". As a result, we realised this could be sidestepped by changing whatever the event was to "this is a collection of your favourite stars wearing these programmable outfits, and you get to decide what those clothes do and how they behave". On the upside this also made the web element much clearer and better connected.

It left the accessories part slightly less connected, but still the obvious starting point for building wearable tech. This therefore left the idea as a 3 component piece:

  • Celebs with outfits that are programmable - a trend that seems to be happening anyway, and these would be worn at some interesting/cool event - to be decided in conjunction with a group running interesting/cool events :-)

  • A website for creating behaviours via simple programming - allowing people to store and share their behaviours - the idea being that they control the outfits the celebs will wear.

  • Tutorials for building wearable tech accessories - which are designed to be programmed using the same behaviours as the celeb outfits - making the skills transferable.

So the idea was pitched at the creative studio - and ours got through to the Build Studio stage.

Taking wearable tech to the Build Studio

Out of the 20-30 ideas pitched, it was one of 9 to get through to the build studio. However, another team had ideas related primarily to Digital Wrist Bands, and our team - Team Mix - was asked to join with theirs - Team DWB - resulting in the really non-sexy name Team Mix/DWB. That meant 8 teams at the studio. One of the teams didn't show, meaning 7 builds.

It transpired that out of the two teams, only myself and Emily could make it to the build studio - so we chatted to Ben and Tom about what they'd like to see achieved, and also to those on Team DWB. The core of their idea was complementary to our accessories idea so we decided to focus that part of our build on wristbands.

So, what was our ambition for the Build Studio? The ambition was pretty much to do a proof of concept across the board, and to describe the audience benefits in clearer, concrete terms. Again, there was the opportunity to bounce the ideas off a couple of teenagers.

We had help from the connected studio team in seeking assistance from other parts of FM, and after a call across all of BBC R&D and various groups in FM across the north and south, we had a volunteer called Tom who was a work placement trainee.

I also ran our aspiration for the build studio past Paul Golds at work, asked him if he wanted to be involved and in particular what would be useful, and while he couldn't make it to the Build Studio, he did provide us with a simple controllable web object.

What we built at the Build Studio

So the 3 of us set out to build the following over the 2 days of the build studio:

  • A wearable tech garment, which mirrors a web app, and in particular has a collection of programmable lights.

  • A web app for allowing a user to enter a simple program to control a representation of a web garment, with increasing levels of complexity. The app demonstrated implicit repetition, as well as sequence and selection.

  • A bracelet made of thermoplastic with programmable behaviours controlling lights on the device, perhaps including feedback/control using an LDR.

And that's precisely what we built.

The wearable garment was built as follows:

  • LEDs were sewn in strips - 8 at a time - onto felt. Conductive thread then looped through and tied on as positive and negative rails - like you would with a breadboard. Then attached underneath the garment so that the lights shone through.

  • The microcontroller for the garment was a Dagu Mini - which is a simple £10 device which is also REALLY simple to hook up to batteries and bluetooth. I've used this for building sumobots with/for cubs too. It's able to take a bit of bashing.

  • For each strip, the positive "rail" was connected to a different pin directly on the microcontroller - not necessarily the best solution, but works quite well for this controller.

  • Code for this was really simple/trivial - simple strobing of the strips. The real system would be more complex. In particular, this is the sort of code involved (see pastebin).

    However, the "real" garment would use something like this iotoy library for control - to allow it to be controlled by bluetooth and also by an IOT stack.

The web app is a simple client side affair only, and uses a small simple DSL for specifying behaviours. This is then transformed into a JSON data structure for evaluation. This uses jquery, bootstrap, and Paul's awesome little web controllable light up dress. You can see this prototype on my website.

Again, bear in mind that this is a very simple thing, and the result of a 1 - 1.5 day hack by a work placement trainee - treat it nicely :-) The fact it generates a JSON data structure internally which could be uploaded/shared is as much the point as the fact it has 3 micro-tutorials around sequencing, control and selection.
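
The real thing was client side javascript, but to illustrate the sort of transformation involved, here's a hypothetical python sketch of turning a tiny behaviour DSL into a JSON structure that a garment (or simulator) could evaluate. The DSL syntax and JSON shape here are invented for this sketch, not the prototype's actual format:

import json

def parse_behaviour(text):
    # One instruction per line, e.g. "repeat 3", "lights on", "wait 0.5"
    programme = []
    for line in text.strip().splitlines():
        command, _, argument = line.strip().partition(" ")
        programme.append({"cmd": command, "arg": argument or None})
    return programme

source = """
repeat 3
lights on
wait 0.5
lights off
wait 0.5
"""
print(json.dumps(parse_behaviour(source), indent=2))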

The bracelet was created as follows:

  • Thermoplastic was used at the wristband body - about 5-10g. This is plastic that melts in hot water (above 60 degrees), and becomes very malleable.

  • A number of LEDs and an LDR

    • The LEDs were connected to the 3 digital pins, A0 and A1, with A2 connected between an LDR and a 10K resistor.
  • A DF Robot Beetle - which is a tiny (and cheap ~£4.50) Arduino Leonardo clone - though it's the size of a 10p piece, and only has (easy) access to 6 pins.

  • This was MUCH more fiddly than expected, and there are lots of ways of avoiding that issue. In the end the bracelet was electrically sound, though having access to a soldering iron (or rather a place where soldering could take place) would've been handy. We remarked several times that building the bracelet would've been easier at home - which I think is a positive statement really.

  • Again the code part was very simple, and due to time constraints no behaviour using the LDR was implemented. On the upside, it does show how simple these things can be - http://pastebin.com/FzQqGSTv - again however, if the microcontroller was tweaked, you could use bluetooth and something like the iotoy library above - allowing really rich interaction with other devices.

In fact we had a fair amount of this working before the meet the teenagers session. As a result we had concrete things to talk about with the teenagers, who warmed up when we pointed out the website and devices were the result of just a few hours' work.

The feedback we had was really useful and great. It also meant that while physical and tech building continued on the second day, more time could be spent on the business case and audience benefit than might otherwise have been possible - though never as much as you'd like.

Feedback we had was:

  • Make it chunky!
  • Can you make it velcro-able - so we can attach it to bags, clothing etc as well?
    • It also made it more possible to add to things like belt buckles, jackets, hats etc.
  • Can you have these devices controllable / communicating?
  • Can you make the web app an Android/iOS app?
  • Can you add in more sensors?
  • Can it be repurposed?
  • Call the project DressCode - this is one of those things that is a moment of genius, and so obvious and so appropriate in hindsight.

This instantly solved a problem we had in terms of keeping the wristband small and elegant - chunky makes everything a lot simpler. Velcro was a good idea, but a bit late in our build studio to start over. The idea of controlling and communicating is right up the angle we wanted to pursue (IOT type activities using the iotoy library) but didn't have time for in a 1 - 1.5 day hack.

The idea of having a downloadable app was one that we hadn't considered, but fits right in with the system - after all this would be able to directly interact with a bluetooth wristband, and the communications stack is already written. Having more sensors was obvious, and repurposing made sense if we switched to something using velcro.

It was a really useful session as you can imagine.

The Pitch :: Dress Code

So we carried through to the pitch - DressCode - Fashion of the Tech Generation. (After all, a generation now will be learning how to code behaviours of some kind - so this is a fun application of the skills they'll be learning.)

We had said in the creative pitch what we'd allow you to do, and showed we were allowing you to do just that - as you can see from our core aspects versus our prototypes.

Some key aspects of making this work would be the development of kit forms for tech bands which could then be made and sold by 3rd parties and partners - assuming they meet a certain spec.

The ideas behind the pitch included self-expression - the ability to look like the group or different from the group - the ability to share behaviours, and to be part of something larger; for the coding to be a means of creating behaviours for something, but also a natural lead-in to coding for other devices - such as indicator bands or gloves for cyclists, shoes for runners, bottom up fund raising as happens with loom bands - and a starting project for learning about coding, with a strong/fun inspiration piece involving TV - from headlining Glastonbury through to a light up suit for Lenny Henry for Comic Relief, where each red nose is a separate pixel to be controlled.

We described the user journey from the TV to the webapp to the techband and back to events and the TV show.

Pitching doesn't come naturally to me - the more outgoing you are in such situations the more likely your idea gains currency with others. Emily led our pitch and I thought she did a sterling job.

Competition however was very fierce. If our idea doesn't go through, it won't be because it's a bad pitch or a bad idea, it'll just be because a different pitch is thought to be stronger/more appropriate for various reasons.

Hindsight

Hindsight is great. It's the stuff you think of after the time you needed it. In particular, one thing we were asked to do - and I think we kinda addressed it - at the build studio was to deal with the link between the techbands and the online web app and TV show. The way we dealt with it there was to decide to make the same DSL available to both wristbands (or blinkenbands, as I nicknamed them) and garments - ie to allow the same behaviour online or on a garment to control a blinkenband and so on.

That's pretty good, but there was a better, simpler solution staring us in the face all along:

  • The felt strips we sewed LEDs onto then wired up with conductive thread, were pretty simple to make. Even sewing in the Beetle would've been pretty simple.

  • Putting velcro onto those would've been simple - meaning each piece of felt could be a blinkenband. Also each blinkenband could be attached inside a garment (as each strip was), allowing at once both more complex behaviours and a much clearer link between the wristbands and the garments.

  • Also, attaching bands inside a pre-existing garment drastically reduces the risk element for teenagers in building a garment - if it didn't look good, you still had wristbands; if it DID work, you gain more kudos. Furthermore, there's then an incentive to make more than one wristband, with two side effects - firstly it encourages more experimentation and play (the best way to learn) and potentially leads to people selling them to each other; secondly, once they've done one, making more means they could build an outfit that's all their own.

  • Also, if you did this, you have an activity that, while a little pricey (about £10 each all in), IS the sort of activity a Guides group would do - especially at camp (assuming a pre-programmed microcontroller), in part because it's a group that tends to do more crafts type activities and would actually find the light useful in that situation. While people have ... "isn't that rather 19th century" ... views of things like Guides/Scouts, they do both cover the target demographic, and of the two it seems more likely you'd get Guides happily making felt/fabric programmable blinkenbands than Scouts - based on who the two sets attract.

    For me, it's this final thought that made me think that this would definitely be a good starting point - since it seems a realistic way of connecting with the demographic. (Ironically - getting leaders on board in groups would perhaps be harder work!)

Closing thoughts

All in all, an interesting and useful couple of days, leaving me with some clear ideas of how to take things forward - with or without support from connected studio - which I think makes this a double win. Obviously getting 6-8 weeks to work on this for a pilot would be preferable to trying to cram it into my own time, but to me the Build Studio definitely proved the concept.

As mentioned an actual real world example of this would have to:

  • Identify a realistic event
  • Make the audience benefit clearer, figure out a marketing strategy
  • Build a better/more concrete garment - I started unpicking my jacket, but didn't have time to pixelise it...
  • Build a better web app - close the loop to controlling the garment, linking to users
  • A downloadable app - which links to the online account for spreading behaviours to a device.
  • Better tutorials, and kits for building tech bands.

All in all a great and productive couple of days.

Next up, building robots for the BBC academy to teach basics of coding to BBC Staff (though I suspect they'd like tech bands too :-), but that's another day.

Later edit

(27th August)

Well, this is a later edit, and a shame to mention this, but Dresscode did not make it through the Connected Studio to Pilot. That's the nature of competitive pitching though, which is both sad, and awesome because it means a better idea did go through.

Regarding Dresscode itself, the Connected Studio team also said the following:

With regards to continuing your work on this independently, the idea is your IP and as it did not get taken to pilot the BBC doesn't retain any IP on Dress Code. The only caveat is that the idea cannot be pitched at another Connected Studio event. However, the Connected Studio modus operandi does not replace BBC commissioning, so feel free to progress the idea this way. We must add that if choosing the BBC commissioning route, it's up to you to talk to your line manager about this before any steps are taken.

So while Dresscode as a Connected Studio thing is no more, it could be something I could do independently later. We'll see.


Guild - pipelinable actors with late binding

March 07, 2014 at 11:51 PM | categories: open source, python, iot, bbc, actors, concurrency, kamaelia

Guild is a python library for creating thread based applications.

Threads are represented using actors - objects with threadsafe methods. Calling a method puts a message on an inbound queue for execution within the thread. Guild actors can also have stub actor methods representing output, which are expected to be rebound to actor methods on other actors. These stub methods are called late bind methods, and they allow pipelines of Guild actors to be created in a similar way to Unix pipelines.

Additionally, Guild actors can be active or reactive. A reactive actor performs no actions until a message is received. An active guild actor can be active in two main ways: it can either repeatedly perform an action, or more complex behaviour can use a generator in a coroutine style. The use of a generator allows Guild actors to be stopped in a simpler fashion than traditional python threads. Finally, all Guild actors provide a default 'output' late-bindable method, to cover the common case of single input, single output.

Above all, Guild actors are just python objects - actors with additional functionality - and it's designed to fit in with your code, not the other way round. This post covers some simple examples of Guild usage, and how it differs (slightly) from traditional actors.

Getting and Installing

Installation is pretty simple:

$ git clone https://github.com/sparkslabs/guild
$ cd guild
$ sudo python setup.py install

If you'd prefer to build, install and use a debian package:

$ git clone https://github.com/sparkslabs/guild
$ cd guild
$ make deb
$ sudo dpkg -i ../python-guild_1.0.0_all.deb

Example: viewing a webcam

This example shows the use of two actors - webcam capture, and image display. The thing to note here is that we could easily add other actors into the mix - for network serving, recording, analysis, etc. If we did, the examples below can be reused as is.

First of all the code, then a brief discussion.

import pygame, pygame.camera, time
from guild.actor import *
pygame.camera.init()

class Camera(Actor):
    def gen_process(self):
        camera = pygame.camera.Camera(pygame.camera.list_cameras()[0])
        camera.start()
        while True:
            yield 1
            frame = camera.get_image()
            self.output(frame)
            time.sleep(1.0/50)

class Display(Actor):
    def __init__(self, size):
        super(Display, self).__init__()
        self.size = size

    def process_start(self):
        self.display = pygame.display.set_mode(self.size)

    @actor_method
    def show(self, frame):
        self.display.blit(frame, (0,0))
        pygame.display.flip()

    input = show

camera = Camera().go()
display = Display( (800,600) ).go()
pipeline(camera, display)
time.sleep(30)
stop(camera, display)
wait_for(camera, display)

In this example, Camera is an active actor. That is, it sits there, periodically grabbing frames from the webcam. To do this, it uses a generator as a main loop. This allows the fairly basic behaviour of grabbing frames for output to be clearly expressed. Note also this actor does use the normal blocking sleep function.

The Display actor initialises by capturing the passed parameters. Once the actor has started, its process_start method is called, enabling it to create a display; it then sits and waits for messages. These arrive when a caller calls the actor method 'show' or its alias 'input'. When that happens the upshot is that the show method is called, but in a threadsafe way - and it simply displays the image.

The setup/tear down code shows the following:

  • Creation of, and starting of, the Camera actor
  • Creation and start of the display
  • Linking the output of the Camera to the Display
  • The main thread then waits for 30 seconds - ie it allows the program to run for 30 seconds.
  • The camera and display actors are then stopped
  • And the main thread waits for the child threads to exit before exiting itself.

This could be simplified (and will be), but it shows that even though the actors had no specific shut down code, they shut down cleanly this way.

Example: following multiple log files looking for events

This example follows two log files, and greps/outputs lines matching a given pattern. In particular, it maps to this kind of command line:

$ (tail -f x.log & tail -f y.log) | grep pants

This example shows that there are still some areas that would benefit from additional syntactic sugar when it comes to wiring together pipelines. In particular, this example should be writable together like this:

Pipeline( Parallel( Follow("x.log"), Follow("y.log") ),
          Grep("pants"),
          Printer() ).run()

However, I haven't implemented the necessary chassis yet (they will be).

Once again, first the code, then a discussion.

from guild.actor import *
import re, sys, time

class Follow(Actor):
    def __init__(self, filename):
        super(Follow, self).__init__()
        self.filename = filename
        self.f = None

    def gen_process(self):
        self.f = f = file(self.filename)
        f.seek(0,2)   # seek to end
        while True:
            yield 1
            line = f.readline()
            if not line: # no data, so wait
                time.sleep(0.1)
            else:
                self.output(line)

    def onStop(self):
        if self.f:
            self.f.close()

class Grep(Actor):
    def __init__(self, pattern):
        super(Grep, self).__init__()
        self.regex = re.compile(pattern)

    @actor_method
    def input(self, line):
        if self.regex.search(line):
            self.output(line)

class Printer(Actor):
    @actor_method
    def input(self, line):
        sys.stdout.write(line)
        sys.stdout.flush()

follow1 = Follow("x.log").go()
follow2 = Follow("y.log").go()
grep = Grep("pants").go()
printer = Printer().go()

pipeline(follow1, grep, printer)
pipeline(follow2, grep)
wait_KeyboardInterrupt()
stop(follow1, follow2, grep, printer)
wait_for(follow1, follow2, grep, printer)

As you can see, like the bash example, we have two actors that tail/follow two different log files. These both feed into the same 'grep' actor that matches the given pattern, and these are finally passed to a Printer actor for display. Each actor shows slightly different aspects of Guild's model.

  • Follow is an active actor. It captures the filename to follow in the initialiser, and creates a placeholder for the associated file handle. The main loop then follows the file, calling its output method when it has a line. Finally, it will continue doing this until its .stop() method is called. When it is, the generator is killed (via a StopIteration exception being passed in), and the actor's onStop method is called allowing the actor to close the file.

  • Grep is a simple reactive actor with some setup. In particular, it takes the pattern provided and compiles a regex matcher using it. Then any actor call to its input method results in any matching lines being passed through via its output method.

  • Printer is a simple reactive actor. Any actor call to its input method results in the data passed in being sent to stdout.

Work in progress

It is worth noting that Guild is not a mature library yet, but it is sufficiently useful for lots of tasks. In particular, one area Guild will improve in is specifying coordination more compactly. For example, the Camera example could become:

Pipeline( Camera(),  Display( (800,600) ) ).run()

That's a work in progress however, along with adding other chassis and other useful parts of Kamaelia.

What are actors?

Actors are threads with a mailbox allowing them to receive and act upon messages. In the above webcam example, there are 2 threads: one for capturing images, and one for display. Images from the webcam end up in the mailbox for the display, which displays the images it receives. Often actor libraries wrap up the action of sending a message to the mailbox of an actor via a method on the thread object.

The examples above demonstrate this via the decorated methods:

  • Display.show, Grep.input, Printer.input

All of these methods - when called by a client of the actor - take the arguments passed in, along with their function, and place them on the actor's mailbox (a thread safe queue). The actor then has a main loop that checks this mailbox and executes the method within its own thread.
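
To make that mechanism concrete, here's a stripped-down sketch of the general idea - not Guild's actual implementation. A decorator arranges for outside calls to be queued on the actor's mailbox, and the actor's main loop dequeues and executes them inside its own thread:

import threading, time
try:
    import queue            # python 3
except ImportError:
    import Queue as queue   # python 2

def actor_method(func):
    # Called from outside the actor: enqueue the call rather than running it directly
    def wrapper(self, *args, **kwargs):
        self._mailbox.put((func, args, kwargs))
    return wrapper

class MiniActor(threading.Thread):
    def __init__(self):
        super(MiniActor, self).__init__()
        self._mailbox = queue.Queue()
        self.daemon = True

    def run(self):
        # Main loop: execute queued calls within this actor's thread
        while True:
            func, args, kwargs = self._mailbox.get()
            func(self, *args, **kwargs)

class EchoActor(MiniActor):
    @actor_method
    def input(self, line):
        print(line)

echo = EchoActor()
echo.start()
echo.input("hello from the main thread")  # enqueued, then executed inside echo's thread
time.sleep(0.5)                           # give the actor thread time to process it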

How does Guild differ from the actor model?

In a traditional actor model, the code in the camera Actor might look like this:

import pygame, pygame.camera, time
from guild.actor import *
pygame.camera.init()

class Camera(Actor):
    def __init__(self, display):
        super(Camera, self).__init__()
        self.display = display

    def gen_process(self):
        camera = pygame.camera.Camera(pygame.camera.list_cameras()[0])
        camera.start()
        while True:
            yield 1
            frame = camera.get_image()
            self.display.show(frame)
            time.sleep(1.0/50)

  • NB: This is perfectly valid in Guild. If you don't want to use the idea of late bound methods or pipelining, then it can be used like any other actor library.

If you did this, the display code would not need any changes. The start-up code that links things together though would now need to look like this:

display = Display( (800,600) ).go()
camera = Camera(display).go()
# No pipeline line anymore
time.sleep(30)
stop(camera, display)
wait_for(camera, display)

On the surface of things, this looks like a simplification, and on one level it is - we've removed one line from the program start-up code. Our camera object however now has its destination embedded at object initialisation and it's also become more complex, with zero increase in flexibility. In fact I'd argue you've lost flexibility, but I'll leave why for later.

For example, suppose we want to record the images to disk, we can do this by adding a third actor that can sit in the middle of others:

import time, os
class FrameStore(Actor):
    def __init__(self, directory='Images', base='snap'):
        super(FrameStore, self).__init__()
        self.directory = directory
        self.base = base
        self.count = 0

    def process_start(self):
        try:
            os.makedirs(self.directory)
        except OSError, e:
            if e.errno != 17: raise

    @actor_method
    def input(self, frame):
        self.count += 1
        now = time.strftime("%Y%m%d-%H%M%S",time.localtime())
        filename = "%s/%s-%s-%05d.jpg" % (self.directory, self.base, now, self.count)
        pygame.image.save(frame, filename)
        self.output(frame)

This could then be used in a Guild pipeline system this way:

camera = Camera().go()
framestore = FrameStore().go()
display = Display( (800,600) ).go()
pipeline(camera, framestore, display) 
time.sleep(30)
stop(camera, framestore, display) 
wait_for(camera, framestore, display)

It's for this reason that Guild supports late bindable actor methods.

What's happening here is that the definition of Actor includes this:

class Actor(object):
    #...
    @late_bind_safe
    def output(self, *argv, **argd):
        pass

That means every actor has available "output" as a late bound actor method.

This pipeline called:

pipeline(camera, display)

Essentially does this:

camera.bind("output", display, "input")

This transforms to a threadsafe version of this:

camera.output = display.input

As a result, it replaces the call camera.output with a call to display.input for us - meaning that it is as efficient to do camera.output as it is to do self.display.show in the example above - but significantly more flexible.
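
A minimal sketch of what a late-bindable method amounts to (again illustrative, not Guild's actual code): the stub is just a replaceable attribute, and bind() swaps it for the destination's (already threadsafe) actor method:

class LateBindDemo(object):
    def output(self, *args, **kwargs):
        # Default stub - does nothing until rebound
        pass

    def bind(self, source_name, dest, dest_name):
        # Replace self.<source_name> with the bound method dest.<dest_name>
        setattr(self, source_name, getattr(dest, dest_name))

class Sink(object):
    def input(self, value):
        print("received: %s" % value)

cam, sink = LateBindDemo(), Sink()
cam.bind("output", sink, "input")
cam.output("frame")   # now actually calls sink.input("frame")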

There are lots of fringe benefits of this - which are best discussed in later posts, but this does indicate best how Guild differs from the usual actor model.

Why write and release this?

About a year ago, I was working on a project with an aim of investigating various ideas relating to the Internet of Things. (In particular, which definition of that really mattered to us, why, and what options it provided.)

As part of that project, I wrote a small (but just big enough) library suitable for testing some ideas I'd had about integrating ideas from Kamaelia with the syntactic sugar of the actor model. Essentially, to map Kamaelia's inboxes and messages to traditional actor methods, and its outboxes to late bound actor methods. Use of standard names and/or aliases would allow pipelining.

Guild was the result, and it's proven itself useful in a couple of projects, hence its packaging as a standalone library. Like all such things, it's a work in progress, but it also has a cleaner-to-use version of Kamaelia's STM code, and includes some of the more useful components like pipelines and backplanes.

If you find it useful or spot a typo, please let me know.
