Disabling Comments etc

February 04, 2011 at 10:21 AM | categories: python, oldblog | View Comments

In preparation for a complete replacement of this site, I'm disabling comments and posting after this one. The UI that this site uses has been interesting, but it now shows its age and is creaking; people have complained about the editor I put in place for comments (even though it's just a vanilla dojo toolkit editor), and there are some minor bugs on the server side (which never affected security). The focus of this blog was really to test a bunch of ideas, some good, some bad. The next iteration will be nicer :-)
Read and Post Comments

Kamaelia Released

December 28, 2010 at 06:43 PM | categories: python, oldblog | View Comments

I'm happy to announce Kamaelia's 4th release of 2010 (version numbers follow a Y.Y.M.r scheme). Kamaelia is a component system based around unix-like concurrency/composition & pipelining. There's a strong focus on networked multimedia systems.

Kamaelia's license changed earlier this year to the Apache 2.0 License.

The release is divided up as follows:
  • Axon - the core component framework. Provides safe and secure message based concurrency & composition using generators as limited co-routines, threads, experimental process based support, and software transactional memory. Includes examples.

  • Kamaelia - A large Ol' Bucket of components, both application specific and generic. Components vary from network systems, through digital tv, graphics, visualisation, data processing etc. These reflect the work and systems that Kamaelia has been used to build. Includes examples.

  • Apps - A collection of applications built using Kamaelia. Whilst Kamaelia includes a collection of examples, these are either releases of internal apps or exemplars created by contributors.

  • Bindings - a collection of bindings we maintain as part of Kamaelia, including things like DVB bindings. (Bindings recently changed over to using Cython to make life simpler)
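For anyone who hasn't met the "generators as limited co-routines" idea that Axon is built on, here's a toy sketch in plain Python. This is deliberately not Axon's actual API - just the underlying shape: components are generators that exchange messages via shared queues and yield control back to a simple round-robin scheduler.

```python
from collections import deque

results = []

def producer(outbox):
    # A "component": a generator that posts messages to its outbox,
    # yielding control back to the scheduler between messages.
    for word in ["hello", "world"]:
        outbox.append(word)
        yield

def upper(inbox, outbox):
    # Transforms whatever arrives on its inbox.
    while True:
        while inbox:
            outbox.append(inbox.popleft().upper())
        yield

def consumer(inbox):
    while True:
        while inbox:
            results.append(inbox.popleft())
        yield

# Wire the components together with shared queues, then round-robin them.
a, b = deque(), deque()
parts = [producer(a), upper(a, b), consumer(b)]
for _ in range(5):  # a few scheduler passes
    for p in list(parts):
        try:
            next(p)
        except StopIteration:
            parts.remove(p)  # component finished; drop it

print(results)  # ['HELLO', 'WORLD']
```

Axon adds the things this toy lacks: named inboxes/outboxes, shutdown signalling, threads, and so on.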




Overview of Changes in this release:
  • This release (primarily) rolls up 3 application and example branches. The core functionality for these, as ever, lives in the main Kamaelia.Apps namespace, meaning these applications and examples can be included in, or extracted into, other applications relatively easily. As a result they act as exemplars for things like 3D visualisation, video and audio communications, twitter mining, database interaction and analysis, and Django integration. They're also useful (and used) as standalone apps in their own right.
  • Examples (and application components) added for using the 3D graph visualisation (PyOpenGL based) - one based on visualising collaborations, another on viewing FOAF networks.
  • Whiteboard application extended such that:
    • It supports multiway video comms as well as multiway audio comms.
    • Adds support for "decks" (collections of slides which can be downloaded, saved, loaded, emailed, encrypted, etc)
    • Removes pymedia dependency
    • Changes audio over to use PyAlsaAudio directly.
    • Adds support for calibrated touch screen displays to Pygame Display.
      • For example large digital whiteboards in addition to tablets etc.
  • Adds in a "Social Bookmarking system" that does the following:
    • Harvests a semantic web/RDF data store for realtime search terms (relating to live television broadcast)
    • Uses these search terms to search twitter, to identify conversations around the semantic web data.
    • Takes the resulting tweets, and stores them in a DB
    • Analyses the tweets (including fixing language for analysis using NLTK) for a variety of aspects, storing these in the DB
    • Presents the results (graphs of buzz/popularity around the content)
    • Additionally the system attempts to identify particularly interesting/notable moments based on audience conversations, and provides links back to the actual broadcast programmes.
    • Additionally provides an API for data, generates word clouds etc.
    • Front end uses Django and web graph APIs to present data.

Mailing list:

Have fun :-)

Read and Post Comments

Europython 2010 Videos Now Online

August 14, 2010 at 01:55 PM | categories: python, oldblog | View Comments

Just a brief note to say that all the Europython videos I and my helpers recorded are now uploaded and online on blip.tv. Since not everyone is subscribed to the mailing lists, please find below the list/summary that I sent to the list. If anyone has any objections to their talk being up, let me know and I will take it down. However, please note - it's there because it's great that you were willing to stand up and talk!

Also, the real thanks have to go to John Pinner, Richard Taylor and Alex Wilmer (and the rest of the crew), without whom this year's Europython wouldn't've been the same. Likewise the same can be said about the many speakers willing to step forward and talk about something they love. And many thanks to Marijn, Richard and Walter too for their fantastic help in producing these videos :-)
Europython Community -- Awards Ceremony -- Thankyou Christian Tismer

Adewale Oshineye TDD on App Engine

Ali Afshar Glashammer

Andrew Godwin Fun with Django and Databases

Austin Bingham Python from the Inside Out

Bart Demeulenaere Pyradiso

Bruce Lawson Keynote Open Standards democratising and future proofing the web

Conference Opening Comments Housekeeping

Daniel Roseman Advanced Django ORM techniques

David Read Open Data and coding data.gov.uk

Denis Bilenko gevent network library

Donald McCarthy Python and EDA

Europython Community Awards Ceremony 4 Thanking Christian DryRun

Geoffrey Bache PyUseCase Testing Python GUIs

Guido van Rossum Appstats

Guido van Rossum Keynote

Henrik Vendelbo Real Time Websites with Python

Holger Kregel py.test rapid multipurpose testing

Jonathan Fine JavaScript 4 Pythonistas

Jonathan Fine JavaScript and MillerColumns

Jonathan Hartley Hobbyist OpenGL from Python

Kit Blake Mobi a mobile user agent lib

Kit Blake Web mobile templating in Silva

Lennart Regebro Porting to Python 3

Marc-Andre Lemburg Running Ghana VAT on Python

Mark Fink Visualizing Software Quality

Mark Shannon HotPy A comparison

Matteo Malosio Python Arduino and Mech Music

Michael Brunton-Spall Open Platform The Guardian API 1 Year On

Michael Brunton-Spall The Guardian and Appengine

Michael Foord Unittest New And Improved Part 1 of 2

Michael Foord Unittest New And Improved Part 2 of 2

Michael Sparks Arduino and Python

Nicholas Tollervey Organise a Python code dojo

Nicholas Tollervey Understanding FluidDB

Paul Boddie et al Web SIG

Paul Boddie Web Collaboration and Python

Peter Howard Aerodynamics and Pianos

PyPy Status and News Part Overview 1 of 3

PyPy Status and News Part JIT Compilation 2 of 3

PyPy Status and News Part cpyext 3 of 3

Raymond Hettinger Code Clinic 1 of 3

Raymond Hettinger Code Clinic 2 of 3

Raymond Hettinger Code Clinic 3 of 3

Raymond Hettinger Tips and Tricks 1 of 2

Raymond Hettinger Tips and Tricks 2 of 2

Richard Barrett Small The Trojan Snake

Richard Jones Keynote State of Python

Rob Collins Introduction to SMTPP

Russel Winder Keynote

Scott Wilson Flatland Form Processing

Semen Trygubenko Python and Machine Learning

Soeren Sonnenburg SHOGUN

Stefan Schwazer Robust Python Programs

Steve Holden Awards Ceremony 1 shaky

Steve Holden Awards Ceremony 2 PSF Community Service Award

Steve Holden Awards Ceremony 3 Frank Willison Award

Tomasz Walen Grzegorz Jakacki Codility Testing coders

Wesley Chun Programming Office with Python

Zeth What does it all mean

Lightning Talks:
A better pdb

Albertas upicasa

Andreas Klockner PyCUDA

Ariel Ben Yehuda cfg.parser


Brett Cannon How to properly package your apps front end code

Brian Brazil Pycon Ireland

Care Team Network

Conference Close

Ed Crewe From Shell Scripting to Config Management

Experiences from Python Barcamp Cologne

Fiona Burrows Write More Games

Headroid Arduino Robot Face

Heres What I Think hwit.org

Jonathan Fine The easiest quiz in the world

Jonathan Hartley Run Snake Run

Laurens Van Houtven Python + E == Mont-E

Luke Leighton A Cry For Help

Luke Leighton Pyjamas

Magic Folder File Syncing

Marc-Andre Lemburg Growing the PSF

Martijn Faassen How to Fail at Pyweek

Michael Brunton Spall Python Javascript and Ruby in half an hour

Michael Sparks Embracing Concurrency

Moin Moin 2.0

Monstrum and Mercurial For Legions

Plone Conference

Porting Skynet to Python 3

Pure Python Proxying httplib and urllib2

Python Status Information 1 of 2

Python Status Information 2 of 2

Richard Jones PyWeek

Richard Jones The Cheese Shop

Richard Taylor

Sarah Mount Open Ihm Richard Jones With Gui

Steve Holden PSF


Unladen Swallow

Zero 14

Other (aborted) lightning talks:




Share and Enjoy :-)
Read and Post Comments

If you were 7 again...

June 27, 2010 at 06:28 PM | categories: python, oldblog | View Comments

If you were 7 again, what would you expect to find in a book on beginning programming? I have some thoughts on this, and am going to act on them, but I'm curious about the thoughts of others.
Read and Post Comments

Python Magazine is dead ?

March 20, 2010 at 09:08 PM | categories: python, oldblog | View Comments

For the past several months the Python Magazine hasn't sent any new issues out. Indeed, since late last year they ripped out their website and put up a banner saying "We're busy building a new python magazine", with a link laughably suggesting that there is more information available. This is after getting several months behind with the magazine last year. They also said "Don't worry—your subscription and back issues are safe and will be available when the new site launches." That's fine, in theory. However, consider:
  • Whilst they may let the grass grow under their feet they haven't bothered telling their subscribers. Paid subscribers.
  • They haven't bothered updating their website telling their customers what they're doing.
  • Indeed, they appear, from a subscriber point of view, to have simply cut and run.
I can't actually think of an excuse that justifies not bothering to contact subscribers for well over half a year, but I'm sure they have one. On the flip side, they don't have any contact address on their front page, nor on their content-free "what we're doing" page. Beyond this, last year they decided, of their own volition, to charge my credit card to renew my subscription. Now, I was going to renew anyway - it's been a great magazine in the past - but charging my card without upfront consent struck me as rather dodgy.
Since they've now reneged on their half of the sale contract and not delivered, and I actually have a good reason to need to get in contact with them, I can't. This means I'm left with 2 choices:
  • Either put out a public notice in the hope that it's something that someone there will read, and actually get back in contact to let me know how to contact them
  • Contact Visa and say that they're a rogue trader, and that they should be banned from making any further transactions against my card (especially given the last one was done without my explicit consent).
Neither is particularly attractive, and hopefully someone knows how to get in contact with them because they sure aren't advertising any contact details right now.

Finally, I get that it's a small publication, one borne out of love rather than profit (at a guess, based on guesstimates of costs), but if you're having trouble getting things restarted, at least have the decency to tell your subscribers, rather than putting up content-free "information" pages.

After all, a lot can change in 1/2 a year... (Last issue I have is from August 2009...)

(Sorry to anyone who reads this who has nothing to do with the python magazine, but if you know someone there, please let me know who is "running" it these days)
Read and Post Comments

Kamaelia components from decorated generators. Pythonic concurrency?

October 04, 2009 at 10:52 PM | categories: python, oldblog | View Comments

A few months ago, there was a thread on the python-concurrency list (then a google group) about some standard forms for showing how various libraries deal with concurrent problems. The specific example chosen looked like this:
tail -f /var/log/system.log |grep pants
Pete Fein also posted an example of this using generators, based on David Beazley's talk on python generators being used as (limited) coroutines:
    import time
    import re

    def follow(fname):
        f = file(fname)
        f.seek(0,2) # go to the end
        while True:
            l = f.readline()
            if not l: # no data
                time.sleep(0.1)
                continue
            yield l

    def grep(lines, pattern):
        regex = re.compile(pattern)
        for l in lines:
            if regex.match(l):
                yield l

    def printer(lines):
        for l in lines:
            print l.strip()

    f = follow('/var/log/system.log')
    g = grep(f, ".*pants.*")
    p = printer(g)

    for i in p:
        pass

The question/challenge raised on the list was essentially "what does this look like in your framework or system?". For some reason, someone saw fit to move the mailing list from google groups and delete the archives, so I can't point at the thread, but I did repost my answer for what was called "99 bottles" for kamaelia on the python wiki.

I quite liked the example for describing how to take this and convert it into a collection of kamaelia components, primarily because by doing so we gain a number of reusable components. For me it was about describing how to move from something rather ad-hoc to something somewhat more generally usable.

For me, the point about Kamaelia is really that it's a component framework aimed at making concurrent problems more tractable & maintainable - basically so that I can get stuff done quicker, in a way that won't need rewriting completely to use concurrency, and which someone else can hack on without needing to come back to me to understand it. In practice, this also means that I tend to focus on building stuff, rather than asking "is it concurrent?". (Axon kinda ensures that it either is, or is concurrent-friendly.) This does sometimes also mean I focus on getting the job done, rather than "does this look nice"... Whilst that does matter to me, I have deadlines like the next person :-)

For example, one thing missing from the above is that when you do something like:
    tail -f /var/log/system.log |grep pants
You aren't interested in the fact this uses 3 processes - tail, grep & the parent process - but in the fact that by writing it like this you're able to solve a problem quickly and simply. It also isn't particularly pretty, though personally I view the shell version as rather elegant.

Naturally, being pleased with my version, I blogged about it. Much like anyone else, when I write something it seems like a good idea at the time :-). As sometimes happens, it made it onto reddit with some really nice & honest comments.

And what were those comments? If I had to summarise in one word "ugh!"

Whilst I don't aim for pretty (I aim for safe/correct :), pretty is nice, and pretty is fun. As a result, I've wanted to come back to this. Ugly is no fun :-( Fun matters :-)

There was also a comment that suggested using decorators to achieve the same goal. However, at that point in time I had a mental block about what that would look like in this case. So I just thought "OK, I agree, I can't quite see how to do it". I did recognise, though, that they were right to say decorators would improve this case.

In particular, the stumbling block is that the way python generators are used in the above example is effectively one-way chaining. printer pulls values from grep; grep pulls values from follow. When one of them exits, they all exit. Essentially this is pull based.

In Kamaelia, components can be push, pull, or push & pull. Furthermore, they can push and pull in as many directions as you need. At the time, mapping between the two sensibly didn't seem tractable to me. Then this morning, as I woke blearily, I realised why: the above generator form isn't really directly the same as the shell form - though it is close.

Taking grep, for example, if I do this:
grep "foo" somefile
Then grep will open the file "somefile", read it, and output lines that match the pattern and exit.

However, if I do this:
bla | grep "foo"
Then grep will read values from stdin, and output lines which match the pattern. Furthermore, it will pause outputting values when bla stops pushing values into the chain, and exit when bla exits (after finishing processing stdin). ie It essentially has two modes of operation, based on getting a value or having an absent value.

In essence, the differences about what's happening here are subtle - in the shell we pass in a symbol which represents which stream needs opening, whereas in the example above, we pass in, effectively, an open stream. Also the shell is very much a combination of push and pull, whereas the generator pipeline above is essentially pull.

This made me realise that rather than activating the generator we want to read from *outside* the generator we're piping into, if we activate the generator *inside* the generator we're piping into, the problem becomes tractable.

For example, if we change this:
def grep(lines, pattern):
    regex = re.compile(pattern)
    for l in lines: # Note this requires an activated generator, or another iterable
        if regex.match(l):
            yield l
To this:
def grep(lines, pattern):
    "To stop this generator, you need to call its .throw() method. The wrapper could do this"
    regex = re.compile(pattern)
    while 1:
        for l in lines(): # Note we activate the generator here inside instead
            if regex.search(l):
                yield l
        yield # bare yield (of None) - tells the caller our input is exhausted

We gain something that can operate very much like the command line grep. That is, it reads from its equivalent to stdin until stdin is exhausted. To indicate stdin is exhausted it simply yields - ie yields None. The caller can then go off and get more data to feed grep. Alternatively the caller can shut down this grep at any point in time by throwing in an exception.
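To make that protocol concrete, here's a plain-Python sketch (no Kamaelia involved; the batch/source names are made up for illustration) of a caller driving such a generator, treating a yielded None as "current input exhausted, feed me more":

```python
import re

def grep(lines, pattern):
    # 'lines' is a callable returning a fresh iterator each time it's called.
    regex = re.compile(pattern)
    while 1:
        for l in lines():
            if regex.search(l):
                yield l
        yield None  # current input exhausted - the caller may feed more

batches = [["cat", "dog"], ["frog", "catfish"]]

def source():
    # Each call hands over the next batch of lines, like a refilled stdin.
    return iter(batches.pop(0)) if batches else iter([])

matches = []
g = grep(source, "cat")
for item in g:
    if item is None:        # generator drained its current input
        if not batches:     # nothing left to feed it - shut it down
            g.close()
            break
        continue            # more batches remain; grep will call source() again
    matches.append(item)

print(matches)  # ['cat', 'catfish']
```

Here close() (which raises GeneratorExit inside the generator) stands in for the .throw() shutdown described above.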

Making this small transform allows the above example to be rewritten as kamaelia components like this:
import sys
import time
import re
import Axon
from Kamaelia.Chassis.Pipeline import Pipeline
from decorators import blockingProducer, TransformerGenComponent

@blockingProducer
def follow(fname):
    "To stop this generator, you need to call its .throw() method. The wrapper could do this"
    f = file(fname)
    f.seek(0,2) # go to the end
    while True:
        l = f.readline()
        if not l: # no data
            time.sleep(0.1)
            continue
        yield l

@TransformerGenComponent
def grep(lines, pattern):
    "To stop this generator, you need to call its .throw() method"
    regex = re.compile(pattern)
    while 1:
        for l in lines():
            if regex.search(l):
                yield l
        yield # input exhausted

@TransformerGenComponent
def printer(lines):
    "To stop this generator, you need to call its .throw() method"
    while 1:
        for line in lines():
            print line.strip()
        yield

Pipeline(
    follow("/var/log/system.log"),
    grep(None, ".*pants.*"),
    printer(None),
).run()

The implementations of both decorators.py and example.py above can be found here:
Similarly, if we wanted to use multiple processes, we could rewrite that final pipeline like this:
    from Axon.experimental.Process import ProcessPipeline

    ProcessPipeline(
        follow("/var/log/system.log"),
        grep(None, ".*pants.*"),
        printer(None),
    ).run()
Specifically, the above will use 4 processes: one container process, and 3 subprocesses. (ProcessPipeline would benefit from a rewrite using the multiprocessing module rather than pprocess though)

The other nice thing about this approach is that suppose you wanted to define your own generator source like this:
def source():
    for i in ["hello", "world", "game", "over"]:
        yield i
You could use that instead of "follow" above like this:
Pipeline(
    grep(source, ".*pants.*"),
    printer(None),
).run()
For me, this has a certain symmetry with the change from this
tail somefile.txt | grep ".*pants.*" | cat -
to this:
grep ".*pants.*" source | cat -
ie if you pass in an absent value, it processes the standard inbox "inbox", rather than stdin. If you pass in a value, it's assumed to be a generator that needs activating.

Stepping back, and answering the "why? What does this give you?" question, it becomes more apparent why this might be useful when you start monitoring 5 log files at once for POST requests. For example, putting that all together in a single file would look like this:
(assuming you didn't reuse existing components :)
import sys
import time
import re
import Axon
from Kamaelia.Util.Backplane import Backplane, SubscribeTo, PublishTo
from Kamaelia.Chassis.Pipeline import Pipeline
from decorators import blockingProducer, TransformerGenComponent

@blockingProducer
def follow(fname):
    f = file(fname)
    f.seek(0,2) # go to the end
    while True:
        l = f.readline()
        if not l: # no data
            time.sleep(0.1)
            continue
        yield l

@TransformerGenComponent
def grep(lines, pattern):
    regex = re.compile(pattern)
    while 1:
        for l in lines():
            if regex.search(l):
                yield l
        yield # input exhausted

@TransformerGenComponent
def printer(lines):
    while 1:
        for line in lines():
            print line.strip()
        yield

Backplane("RESULTS").activate()

for logfile in ["com.example.1", "com.example.2", "com.example.3","com.example.4","com.example.5"]:
    Pipeline(
        follow(logfile),
        grep(None, "POST"),
        PublishTo("RESULTS"),
    ).activate()

Pipeline(
    SubscribeTo("RESULTS"),
    printer(None),
).run()

Now, I don't particularly like the word pythonic - maybe it is, maybe it isn't - but hopefully this example does look better than last time! The biggest area needing work, from my perspective, in this example is the names of the decorators.

Since this will be going into the next release of Axon - any feedback - especially on naming - would be welcome :-).

(Incidentally, follow/grep have already been added to kamaelia, so this would really be simpler, but it does make an interesting example IMO :-)
Read and Post Comments

Restarting Python Northwest. 24th Sept

September 14, 2009 at 11:36 PM | categories: python, oldblog | View Comments

A few people will have already noticed some small comments about this, but we're plotting to restart python northwest. Specifically, we're restarting this month.

  • When: Thursday 24th September, 6pm
  • Who: Who can come? If you're reading this YOU can (assuming you're sufficiently close :-)
    More specifically anyone from beginners, the inexperienced through deeply experienced and all the way back to the plain py-curious.
  • What: Suggestion is to start off with a social meet, and chat about stuff we've found interesting/useful/fun with python recently. Topics likely to include robots and audio generation, the recent unconference, and europython.

How did this happen? I tweeted the idea, a couple of others seconded it, then David Jones pointed out "it's easier to arrange for a specific 2 people to meet than it is to e-mail a vague cloud of people and get _any_ 2 to meet anywhere", so that's where we'll be.

If twitter feedback is anything to go by, we're hardly going to be alone, so please come along - the more the merrier :-) Better yet, please reply to this post saying you're coming along!

More generally, assuming this continues, pynw will probably be every third thursday in the month, maybe alternating between technical meets and social ones. (probably topic for discussion :-)

Please forward this to anyone you think may be interested!

See you there!

Read and Post Comments

Traffic Server to be Open Source?!

July 07, 2009 at 12:02 PM | categories: python, oldblog | View Comments

If this happens this will be awesome. Traffic Server is some really nice code. It's a large codebase, but it's really cool, and it *scales*. (I used to work at Inktomi, so have been inside the code as well). For those that don't know what it is, it's a very high performance web caching proxy, with a plugin architecture, allowing for the addition of other protocols. It used to support HTTP (& obvious friends), NNTP, RTSP, RTP, WMV, etc.

That's pretty much made my day that has.
Read and Post Comments

Europython Videos Transcoding

July 07, 2009 at 01:02 AM | categories: python, oldblog | View Comments

Since I've had a few questions about this, a short status update. At Europython last week I was recording all the talks I was attending. Including the lightning talks this means I have video from 55 talks. The video files from the camera are too large for blip.tv, so I'm transcoding them down to a smaller size, before uploading them. Since these 55 talks are spread over nearly 80 files, that naturally takes time.

Fortunately/obviously, I'm automating this, and it'll come as no shock to some that I'm automating it using kamaelia. This automation needs to be stoppable, since for practicality reasons I can only do this overnight.

Anyway, for those curious, this is the code I'm using to do the transcode & upload. You'll note that it saturates my CPU, keeping both cores busy. Also, it interleaves an IO bound process (the ftp upload) with a CPU bound one (the transcode).

import os
import re
import Axon

from Kamaelia.Chassis.Graphline import Graphline
from Kamaelia.Chassis.Pipeline import Pipeline

class Find(Axon.Component.component):
    path = "."
    walktype = "a"
    act_like_find = True
    def find(self, path = ".", walktype="a"):
        if walktype == "a":
            addfiles = True
            adddirs = True
        elif walktype == "f":
            addfiles = True
            adddirs = False
        elif walktype == "d":
            adddirs = True
            addfiles = False

        deque = []
        deque.insert(0,  (os.path.join(path,x) for x in os.listdir(path)) )
        while len(deque)>0:
            try:
                fullentry = deque[0].next()
                if os.path.isdir(fullentry):
                    if adddirs:
                        yield fullentry
                    try:
                        X = [os.path.join(fullentry,x) for x in os.listdir(fullentry)]
                        deque.insert(0, iter(X))
                    except OSError:
                        if not self.act_like_find:
                            raise
                elif os.path.isfile(fullentry):
                    if addfiles:
                        yield fullentry
            except StopIteration:
                deque.pop(0)

    def main(self):
        gotShutdown = False
        for e in self.find(path = self.path, walktype=self.walktype):
            self.send(e, "outbox")
            yield 1
            if self.dataReady("control"):
                gotShutdown = True
                break

        if not gotShutdown:
            self.send(Axon.Ipc.producerFinished(), "signal")
        else:
            self.send(self.recv("control"), "signal")

class Sort(Axon.Component.component):
    def main(self):
        dataset = []
        while 1:
            for i in self.Inbox("inbox"):
                dataset.append(i)
            if self.dataReady("control"):
                break
            yield 1
        dataset.sort()
        for i in dataset:
            self.send(i, "outbox")
            yield 1
        self.send(self.recv("control"), "signal")

class Grep(Axon.Component.component):
    pattern = "."
    invert = False
    def main(self):
        match = re.compile(self.pattern)
        while 1:
            for i in self.Inbox("inbox"):
                if match.search(i):
                    if not self.invert:
                        self.send(i, "outbox")
                else:
                    if self.invert:
                        self.send(i, "outbox")
            if self.dataReady("control"):
                break
            yield 1
        self.send(self.recv("control"), "signal")

class TwoWayBalancer(Axon.Component.component):
    Outboxes=["outbox1", "outbox2", "signal1","signal2"]
    def main(self):
        c = 0
        while 1:
            yield 1
            for job in self.Inbox("inbox"):
                if c == 0:
                    dest = "outbox1"
                else:
                    dest = "outbox2"
                c = (c + 1) % 2

                self.send(job, dest)
                job = None
            if not self.anyReady():
                self.pause()
            if self.dataReady("control"):
                break
        R = self.recv("control")
        self.send(R, "signal1")
        self.send(R, "signal2")

class Transcoder(Axon.ThreadedComponent.threadedcomponent):
    command = 'ffmpeg >transcode.log 2>&1 -i "%(SOURCEFILE)s" -s 640x360 -vcodec mpeg4 -acodec copy -vb 1500000 %(ENCODINGNAME)s'
    def main(self):
        while 1:
            for sourcefile in self.Inbox("inbox"):
                shortname = os.path.basename(sourcefile)
                encoding_name = shortname.replace(".mp4", ".avi")
                finalname = sourcefile.replace(".mp4", ".avi")
                # Do the actual transcode
                print "TRANSCODING", sourcefile, encoding_name
                os.system( self.command % {"SOURCEFILE": sourcefile, "ENCODINGNAME":encoding_name})

                # file is transcoded, move to done
                print "MOVING DONE FILE", sourcefile, os.path.join("done", sourcefile)
                os.rename(sourcefile, os.path.join("done", sourcefile))

                # Move encoded version to upload queue
                upload_name = os.path.join( "to_upload", encoding_name)
                print "MOVING TO UPLOAD QUEUE", encoding_name, upload_name
                os.rename(encoding_name, upload_name )

                # And tell the encoder to upload it please
                print "SETTING OFF UPLOAD",upload_name, finalname
                self.send( (upload_name, finalname), "outbox")
                print "-----------------"
            if self.dataReady("control"):
                break
        self.send(self.recv("control"), "signal")

class Uploader(Axon.ThreadedComponent.threadedcomponent):
    command = "ftpput --server=%(HOSTNAME)s --verbose --user=%(USERNAME)s --pass=%(PASSWORD)s --binary --passive %(UPLOADFILE)s"
    username = < editted :-) >
    password = < editted :-) >
    hostname = "ftp.blip.tv"
    def main(self):
        while 1:
            for (upload_name, finalname) in self.Inbox("inbox"):
                print "UPLOADING", upload_name
                os.system( self.command % {
                                     "HOSTNAME": self.hostname,
                                     "USERNAME": self.username,
                                     "PASSWORD": self.password,
                                     "UPLOADFILE": upload_name,
                                     } )
                print "MOVING", upload_name, "TO", os.path.join("encoded", finalname)
                os.rename(upload_name, os.path.join("encoded", finalname))
                print "-----------------"

            if self.dataReady("control"):
                break
            if not self.anyReady():
                self.pause()
        self.send(self.recv("control"), "signal")

Graphline(
    FILES = Pipeline(
                Find(path=".", walktype="f"),
                Sort(),
                Grep(pattern="(done|encoded|to_upload|transcode.log)", # skip files already processed/queued
                     invert = True),
            ),
    SPLIT = TwoWayBalancer(), # Would probably be nicer as a customised PAR chassis
    CONSUME1 = Pipeline(
                   Transcoder(),
                   Uploader(),
               ),
    CONSUME2 = Pipeline(
                   Transcoder(),
                   Uploader(),
               ),
    linkages = {
        ("FILES","outbox") : ("SPLIT","inbox"),
        ("FILES","signal") : ("SPLIT","control"),
        ("SPLIT","outbox1") : ("CONSUME1","inbox"),
        ("SPLIT","signal1") : ("CONSUME1","control"),
        ("SPLIT","outbox2") : ("CONSUME2","inbox"),
        ("SPLIT","signal2") : ("CONSUME2","control"),
    }
).run()

It should be fairly clear that this will go as fast as it can, so please be patient :-)
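Because the heavy lifting happens in external programs (ffmpeg and ftpput launched via os.system), plain threads are enough to get the overlap that the TwoWayBalancer provides - two transcode/upload chains running side by side. A minimal sketch of that idea, with made-up job names and sleeps standing in for the real work:

```python
import threading
import time

results = []
lock = threading.Lock()

def transcode_and_upload(name):
    # Stand-ins for the CPU-heavy transcode (external ffmpeg) and the
    # IO-bound upload (external ftpput) - both just sleep here.
    time.sleep(0.01)
    time.sleep(0.01)
    with lock:
        results.append(name)

jobs = ["talk1.mp4", "talk2.mp4", "talk3.mp4", "talk4.mp4"]

def worker(my_jobs):
    for job in my_jobs:
        transcode_and_upload(job)

# Two workers, like CONSUME1/CONSUME2: each takes alternate jobs.
workers = [threading.Thread(target=worker, args=(jobs[i::2],)) for i in range(2)]
for w in workers:
    w.start()
for w in workers:
    w.join()

print(sorted(results))  # ['talk1.mp4', 'talk2.mp4', 'talk3.mp4', 'talk4.mp4']
```

What the Kamaelia version adds on top of this is the shutdown signalling and the ability to rewire the same components into other systems.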

Read and Post Comments

Autoloading in python

June 21, 2009 at 04:14 PM | categories: python, oldblog | View Comments

Before I started using python, I'd used perl for several years, and one thing which I'd liked about perl was its autoload facility. Now in python the closest equivalent that I've seen is __getattr__ for classes, but not __getattr__ for a module. This seemed like a real shame since there are times when autoload can be incredibly useful.
If it seems chaotic, consider the Unix PATH variable. Any time you type a name, the shell looks in lots of locations and runs the first one it finds. That's effectively the same sort of idea as autoloading. Yes, you can do some really nasty magic if you want, but then you can do that with the shell too, and generally people get along fine.
Anyway, vaguely curious about it I decided to do some digging around, and came across this post by Leif K Brookes, which suggests this:
You could wrap it in an object, but that's a bit of a hack.

import sys

class Foo(object):
     def __init__(self, wrapped):
         self.wrapped = wrapped

     def __getattr__(self, name):
         try:
             return getattr(self.wrapped, name)
         except AttributeError:
             return 'default'

sys.modules[__name__] = Foo(sys.modules[__name__])

That looked reasonable, so I created a file mymod.py which looks like this:
import sys

def greet(greeting="Hello World"):
   print greeting

class mymod_proxy(object):
    def __init__(self):
        super(mymod_proxy, self).__init__()
        self.wrapped = sys.modules["mymod"]
    def __getattr__(self, name):
        try:
            return getattr(self.wrapped, name)
        except AttributeError:
            def f():
                print name.replace("_", " ")
            return f

sys.modules["mymod"] = mymod_proxy()
And tried using it like this:
~> python
Python 2.5.1 (r251:54863, Jan 10 2008, 18:01:57)
[GCC 4.2.1 (SUSE Linux)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import mymod
>>> mymod.hello()
>>> from mymod import Hello_World
>>> Hello_World()
And as you can see, it seems to work as expected/desired.

Now the reason I'd been thinking about this, is because I'd like to retain the hierarchy of components in Kamaelia that we have at the moment (it's useful for navigating what's where), but given we tend to use them in a similar way to Unix pipelines it's natural to want to be able to do something like:
from Kamaelia import Pipeline, ConsoleReader, ConsoleWriter
Rather than the more verbose form specifically pulling them in from particular points. Likewise, we don't really want to import every single module in Kamaelia.py, because of the large number of components there (Kamaelia is really a toolbox, IMO, where things get wired together, and Axon is the tool for making new tools), the majority of which won't be used in every application!
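For what it's worth, the wrapper trick above could be pointed at exactly this problem: on attribute miss, hunt through a list of submodules and cache whatever you find. A sketch, using a couple of stdlib modules as stand-ins for the Kamaelia namespace (the class name and the search list are made up):

```python
import importlib
import sys
import types

class AutoloadModule(types.ModuleType):
    # On attribute miss, hunt through candidate submodules, PATH-style:
    # first module that has the name wins.
    search = ["os.path", "json.decoder"]

    def __getattr__(self, name):
        for modname in self.search:
            mod = importlib.import_module(modname)
            if hasattr(mod, name):
                value = getattr(mod, name)
                setattr(self, name, value)  # cache: __getattr__ won't fire again
                return value
        raise AttributeError(name)

sys.modules["flat"] = AutoloadModule("flat")

from flat import join, JSONDecoder  # resolved lazily, on first access
print(join("a", "b"))               # os.path.join, found by the autoloader
```

Nothing is imported until the first attribute access, and the caching means the magic only happens once per name.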

Now, I haven't done this yet, and wouldn't do it lightly, but the fact that you can actually make autoload functionality work seems kinda cool, and a nice opportunity. But I'm also now wondering just how nasty this approach seems to people. After all, Leif describes it as "a bit of a hack", and whilst it's neat, not everyone will take the positive view. I'm interested in any views on better ways of doing autoload in python, and also whether people view it as a nice thing at all. (One person's aesthetic is another person's pain, after all...)
Read and Post Comments

Next Page »