Why did coding at school level disappear?

October 18, 2011 at 11:11 PM | categories: kidscoding

As before, this post covers a fair range of points which are, I hope, enough to support my summary. If you just want the short version, you can skip to the summary.

In my earlier post Children and Computers I outlined some of the key issues relating to kids and coding. There were 2 points I specifically mentioned that I think deserve dealing with first.

The first of these two I've already written about, and had some great responses. This post is about the second. I don't think anyone can claim to have absolute truth in these matters, since the UK educational system is not 100% homogeneous. (Indeed, the archaic, divisive system of grammar schools and the eleven plus still survives in this area.) However, I'm trying to describe things as I saw them, both at the time and with the benefit of hindsight.

So, why did coding disappear at school level? Is that even the right question? Not really, so let's ask a better one: why was 'computer studies' replaced with 'IT'?

Where we left off

In my previous post I was talking about the early eighties - primarily when I was in the juniors at primary school. In days gone by that was called years 3 and 4 of junior school; these days it'd be called years 5 and 6. I talked a lot about the environment that led to it being socially acceptable, and pragmatically beneficial, for even a low income family to consider owning a micro. I summed up one of the main reasons why kids learnt to code: access to free software (free as in gratis, for those who get antsy about definitions).

Again, as with so many things, a lot was happening at once.

  • The BBC Computer Literacy Project was in its heyday. Computers were going into family homes and schools. Programmes on both the BBC and ITV were inspiring people as to the possibilities these devices could provide. Interestingly, the BBC focussed on the less commercial aspects of computers, whereas ITV focussed on games. The two were complementary.
  • Schools started grappling with what they could do with computers: how best to assist with teaching. There was very little software for these machines, so usage was influenced by people like Seymour Papert, and in particular things like LOGO. Whilst those micros were less capable than the Arduinos and similar of today, they were also as powerful as the sort of computers that put people into space.
  • Programmes like The Great Egg Race were on TV (76-86 or so) - invention was part of the psyche of the time, and celebrated.
  • In homes, parents saw the opportunity for micros to give their children the chance of a better, well paid job when they grew up. Bear in mind that this was a time of recession, following the recessions of the 70s, with millions of people unemployed. The idea that any kid could write a game and make a fortune from their expertise was something that inspired kids and adults alike. I suspect that most kids really thought "wow", whereas adults (used to longer term planning :-) saw the potential.
  • Kids saw the opportunity for free games to play with their friends.
  • Games were distributed on tapes, which could be copied even at home.

That's a pretty heady mix. Out of this, lots of kids learnt to write games to play with their friends - games that people like their friends would be willing to buy - which led to success stories and a virtuous circle.

I want to argue here that this industry, this legacy, is a side effect of the law of unintended consequences. Consequences of a clear intention: let's educate the population to be computer literate. It's a fantastic legacy, but one that doesn't appear to have been an initial aim when the literacy project was started - they initially planned to make a 10-part series targeted at adults, weren't sure as late as 1981 who the audience would really be, and were thinking of targeting an even more niche audience. It's also a consequence of all the other things that were clearly going on at the same time.

Schools

So let's get into schools. This really means 2 things:

  • Computers and computing as a subject worth learning and in need of teaching
  • Computers as a tool for teaching and enabling learning.

These are two very different activities. The latter is where the bulk of the rest of this post goes; the former is an idea that really was ahead of its time. Some people, even then, had been way ahead of everyone else. For those who, like me, like books, even today there's a great book with a terrible title - The New Media Reader. It has many seminal papers and extracts of seminal books. One of these is an excerpt of a book by Seymour Papert: Mindstorms: Children, Computers, and Powerful Ideas. This book was published in 1980, which itself seems early, but to give you a flavour of what it's like:

In the LOGO environment the relationship is reversed: The child, even at preschool ages, is in control: The child programs the computer. And in teaching the computer how to think, children embark on an exploration about how they themselves think.
...
This powerful image of child as epistemologist caught my imagination while I was working with Piaget. In 1964, after 5 years at Piaget's Centre for Genetic Epistemology in Geneva, I came away impressed by his way of looking at children as active builders of their own intellectual structures.
...
when a child learns to program, the process of learning is transformed. It becomes more active and self-directed. In particular, the knowledge is acquired for a recognizable personal purpose. The child does something with it. The new knowledge is a source of power and is experienced as such from the moment it begins to form in the child's mind.

Seen in this light, LOGO was always really designed to support the latter point - being a tool for enabling learning - but was mis-categorised by many as being about learning about computers and how they work. In practical terms, yes, it fits both categories, and yes, by definition it can help with teaching 'computers' as a subject, but this also explains its longevity - it is 44 years old, having been created in 1967. Yep, Seymour Papert was interested in teaching children using computers before man had stepped on the moon, when the vast majority of computers were still programmed with punch cards.

As a result, given this long legacy, interest in LOGO was natural. People built turtle robots. People wrote LOGO interpreters for micro computers, including the BBC Micro, and many a school ended up teaching preschool children to control little robots to run around the floor, either following a line or drawing one. Thing is, they didn't really know how or why to go further at primary school. If you think about it, that really makes sense - very few adults - teachers specifically - had had access to computers prior to the advent of the micro. How can we integrate it into teaching? How can we teach what's relevant to a 12 year old, if we don't know ourselves?
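
To give a flavour of the kind of geometry-through-code exercise this enabled, here's a minimal sketch using Python's turtle module - itself a direct descendant of LOGO's turtle graphics. (The LOGO original would have been a one-liner along the lines of REPEAT 5 [FD 100 RT 72].)

```python
# A sketch of the classic LOGO geometry exercise, in Python's turtle
# module (a descendant of LOGO's turtle graphics). The child varies
# `sides` and discovers that the turns always total 360 degrees -
# Papert's "Total Turtle Trip".
import turtle

def polygon(t, sides, length):
    """Walk the turtle around a regular polygon."""
    for _ in range(sides):
        t.forward(length)          # FD in LOGO
        t.right(360 / sides)       # RT in LOGO

t = turtle.Turtle()
polygon(t, 5, 100)                 # a pentagon: five 72-degree turns
turtle.done()                      # keep the window open until closed
```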

That's not to say that people didn't try, but it wasn't really co-ordinated beyond these basics. There also wasn't any internet to enable bottom-up co-ordination and sharing of knowledge, let alone lesson plans. Heck, even as early as 1983, people tried creating custom versions of prolog (yes, prolog!) for children - microprolog - and found that yes, given the right tools, children could learn programming, and to their benefit. There were journals for teachers, published by Heinemann, with names like "Computers in Schools" and "Micro-scope". People even looked at teaching children COMAL...

The problem is, aside from a few far-sighted people, the question of how to integrate this with the curriculum wasn't really clear. Heck, this was before the days of a national curriculum, which made it even harder. As a result, it shouldn't really come as any surprise that once people had used LOGO essentially to learn about geometry - learning LOGO as they went - the focus was mainly on "enabling or enhancing learning".

So what happened?

Well, if we take a snapshot in time, in the early part of the eighties:

  • Children in primary school learnt to use LOGO, essentially to learn geometry, though under the guise of "learning computers"
  • Children in secondary school, in the first 3 years, used educational software to enhance existing learning. This was often in the form of maths-related games. It was pretty painful, horrid stuff that most kids, used to better games at home, really didn't enjoy. That's a rather important detail - they didn't enjoy "edutainment" software, as it came to be called.
  • Then in the later years, there were CSEs and GCE O-Levels. It was common to select the subjects you would do; some would be new - a new language, technical drawing, woodwork - and some would be old - maths, English, the sciences. Both of these predated me, so it's worth noting that CSEs were targeted at a lower ability group, and O-Levels at a higher ability group. As a result, it was possible to create a subject - computer studies - which was aimed solely at O-Level students, who could be assumed to be more capable.
  • At sixth form, you had GCE A-levels. Again, computer studies was a subject that was made available.

As you can imagine, with a new technology in an area that most adults had never encountered, there was a need for new teachers for these areas. (If anyone from the US reads this: secondary school is like high school, and sixth form is like senior year, except it's 2 years - ages 16-18 - rather than 1.)

Schools, Computers and Programming in the early-mid 80's

Marty McFly started his time travelling, and Supergirl was writing code, baffling her computing teacher. (yes, really, rewatch it if you don't believe me)

As a result, the Computer Studies O-Level was like a simplified subset of the subjects you'd learn at A-Level, and the A-Level course itself was like a simplified version of a polytechnic's first year course in computing. So children doing O-Level computing would be taught how to use computers, broadly how they were put together and used, and the basics of how programs were written. Heck, there was even a programming project that kids had to write aged 15. In a time dominated by exams as the sole metric, that in itself was quite unusual, but highly appropriate. This project would almost certainly have been written on a BBC Micro for most kids doing the O-Level. (Though clearly those with a micro of any kind at home would stand a better chance with this subject.)

What this means, of course, is that in the early-mid 80's not everyone in a school learnt to code, and not every year group at secondary school learnt coding - but kids generally shared a school with someone who could code. It wasn't something "adults did after university". It wasn't something "you did at sixth form, maybe". It wasn't something that you couldn't just go and ask, you know, another kid about. It was a part of life - at least for kids.

In a way that was also part of its downfall. There were many kids who'd learnt to write code who didn't see any value in doing that O-Level, because they already knew the stuff. As a result, if you think about it, even then the curriculum failed children. It got them started at primary school, they had the means to carry on at home, and by the time the school was ready to work with them again it was too late.

Now, I can't give more detail about what that was like in secondary school in the mid 80's because I only started secondary school in 1985, so we need to skip forward a couple of years.

Schools and Information Technology, late 80's and onwards

Skip forward to 1988, and things have changed:

  • The family computing market was transitioning to 16/32-bit micros. The common computers in homes were changing from Spectrums and Commodore 64s to Atari STs and Commodore Amigas.
  • Business computing was established as something that was common - PC Clones ran DOS (mainly) and were commonly used for tasks we'd recognise today.
  • The BBC Micro was a long-dead machine, technologically inferior in almost every way to the dominant machines of the day.
  • Acorn established ARM as a means of creating RISC-based CPUs to power its new 32-bit machines, the primary one being the Acorn Archimedes. Thanks to Acorn's historic dominance in the educational market, and by making sure that their machines interoperated with - and in some cases ran - BBC Micro programs, they provided schools with a way of "keeping up with the times" without throwing away their investment.
  • Macs were around and highly sought after, but extortionately expensive. Their influence, though, was felt through the fact that the Atari ST, Amiga and Archimedes all used a windows/icons/menus/pointer environment - or WIMP, as it used to be called. Macs, as today, were heavily used in artistic industries.
  • Changes in education, and educational approach.

The point is that computers were becoming categorised by usage: entertainment, education and work. They were being seen more and more as tools for fun, learning and doing business. Naturally, with a new generation of tech in place, people started realising that they hadn't really thought about "how do we teach people to be ready to use these things in business?"

So now we hit the final piece of the puzzle - changes in education. In 1987, the last set of people ever took their CSEs and GCE O-Levels, with both qualifications replaced by a new-fangled qualification called "the GCSE". Now, you can convince me of many things, but you can't convince me that the name GCSE isn't derived from taking GCE and CSE and smashing the two together until you get something that roughly fits.

GCSEs were designed as a piece of social engineering. It's easiest to be anti-social in describing this, so I will be. CSEs were subjects and exams taken by thickos, and GCE O-Levels were subjects and exams taken by know-it-all brainiacs. (Being offensive to both groups seems equally fair, and simpler than thinking of a politically correct approach.)

To understand where that comes from: in the post-war period the idea of grammar schools (for know-it-all brainiacs) and secondary modern schools (for thickos) was formalised, with children sorted by a rather offensive test called the eleven plus. As noted, there are pockets of the country where this still exists, and I find it rather sad that I currently live in one. Primarily because it's wrong to label someone for life as a thicko simply because they were born at the wrong time of year, making 3 short exams harder for them than for people almost a year older. (The eleven plus thing has similarities to the hockey player effect described by Malcolm Gladwell in his book Outliers.) As a result, the thickos got one exam and the brainiacs got the other.

Now, this was considered - rightly in my view - deeply offensive, wrong, and for lots of reasons that Outliers makes sense of (IMO) pretty dumb. This led in the 60's to the creation of the comprehensive school system, where everyone in an area would attend the same kind of school and have an equal chance to shine as they went through it - rather than relying on a brief test at age 10 or 11. This in turn led to a change in general ideology whereby, 20-odd years later, it seemed like a good idea to stop labelling people as thickos or brainiacs in terms of what exams they could do.

Thus the GCSE was born - a single grading structure to cover everyone of every ability. This adds a challenge: every GCSE provided by an exam board was a subject that had to deal with the varying levels of natural ability you find in the world. Everyone's crap at something, right? Most of us are pretty good at something. Where does that leave Computer Studies? A subject that includes programming, which programmers agree is a hard thing. It puts it on shaky ground.

That's not all though.

1988 also saw the introduction of the national curriculum. Now, there are again lofty, positive goals behind the national curriculum, and I'm not really here to talk politics. I know that it's liked in principle and hated in practice by many, and also vice versa.

By the time I took my GCSEs in 1990, they'd been established for a few years - ours was the first year to have been taught from the first year of secondary school all the way through with a plan of education that was designed for GCSEs. This means we also saw some of the teething issues - though not the kind the poor buggers of 1988 and 1989 had to deal with; we had all fresh ones. It also means that we were amongst the first group of kids to have our exam results comparable across the country. A GCSE grade A from a south Cambridgeshire exam board covered the same curriculum as a GCSE grade A from Cumbria. At least that's the theory; I'm not sure about the practice.

Now, this is all well and good, but it led inexorably to two ideas: that subjects taught should be applicable to as wide a range of abilities as possible, and - since it was now possible to rank schools - that the subjects taught should reduce the risk of making the school look bad. GCSEs slowly became a means of measuring school quality rather than student performance or student learning. (Why else would the numbers passing, and passing at higher grades, increase every year? People aren't getting inherently smarter, after all.)

On the ground, in schools, this is where it really hit. Our school ceased teaching computer studies in 1988. So when we started our GCSEs in 1988 (for people outside the UK: they're a 2 year course, so for me 88-89 and 89-90), Computer Studies wasn't even available as an option - it was information technology or nothing.

To get a sense of what a bummer that was for me and people like me: between 1986 and 1988 we'd had a Commodore Plus/4 (my writing a basic LOGO implementation on the ZX81 had, I think, been noticed; the Plus/4 was for us to grow into, and it was being sold off cheap). I'd learnt enough from all those Usborne books and computer magazines to be able to write my own games, move sprites around, manually create bitmaps, encode those as binary, and enter that as hex. I'd learnt 6502 machine code (which the 8501 used) to the extent that I'd managed to implement my own memory manager and task switcher, such that I could run 2 BASIC programs side by side, each having half the screen, in a form of basic pre-emptive multitasking. Not bad for a 13 year old kid. So then I'd outgrown that, and my parents upgraded me and my brother in the summer of '88 to an Amiga 500. Looking back, I'm not sure how they afforded that, but I'm certain it was on credit.
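
As an aside, for anyone wondering what "encode those as binary, and enter that as hex" means in practice, here's a minimal modern sketch in Python. The sprite shape is made up for illustration; back then the conversion was done on squared paper and the hex typed into BASIC DATA statements.

```python
# Illustrative reconstruction of the paper-and-pencil process
# described above: draw an 8x8 sprite as rows of bits, then encode
# each row as a hex byte, ready to be typed in as DATA statements.
# (The sprite here is made up; this is not the original Plus/4 code.)
sprite = [
    "00111100",
    "01111110",
    "11011011",
    "11111111",
    "11111111",
    "11011011",
    "01100110",
    "00111100",
]

hex_rows = [f"{int(row, 2):02X}" for row in sprite]  # binary -> hex byte
print(" ".join(hex_rows))  # prints: 3C 7E DB FF FF DB 66 3C
```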

So then I started my GCSEs... and we couldn't do Computer Studies; we had to do information technology instead.

What was IT? Well, it was largely as it is today - about what computers could do, and what they were used for. Not only that, it was evaluated solely on coursework - no exam. You had to demonstrate that you could drive a word processor (albeit one on RISC OS rather than the "MS Word" of today). You had to show you could drive a spreadsheet. And so on. This actually makes sense given the design features of the course:

  • Something which anyone of any ability can pick up
  • Is a collection of life skills which someone will need to learn in order to be productive in the future
  • Makes good use of the hardware available.

For someone who has a pretty good idea of where to start writing those apps themselves, though (or at least thinks it's possible), it's a rather sad thing.

Now, as noted, GCSEs were new. Evaluation of people based on coursework, not exams, was new. Hardware was still expensive. People had paired up on their IT coursework because in an IT class there were only enough machines for about half the kids. Not only this, our school had had a miscommunication with the exam board: the school thought working in pairs was fine; the exam board said "no" and gave the school 3 options - all the kids redo the work, each pair split the work in half and each kid redo half of it, or one from each pair keep the coursework and the other redo it all.

I asked our examinations tutor if instead I could be entered for the computer studies GCSE. I knew that it still had a large exam element - 50-70% - and a large project worth 30-50% of the mark, but I was also confident of my abilities. My exams tutor negotiated with the IT teacher, who essentially said "If he thinks he's good enough, I've got last year's papers here; let's test him on those to show him it's not practical". So that morning, without warning, I was asked to sit the previous year's papers, and was marked accordingly. Given I'd've got an A on the basis of that, they really had to enter me for the exam as a result of their balls-up. (Ironically, the IT teacher was called Mr Balls...) For my project I wrote a music paint program. (You quite literally painted notes.) I'll let you guess how it all turned out... :)

Why was 'computer studies' replaced with 'IT'?

Anyway, whilst that's all a rather odd thing to discuss here, in my view it illustrates some interesting points quite keenly. First off, the fact that the computer studies GCSE existed, but that the school refused, even that early - the late 80s - to offer it. The fact that the GCSE still had a programming element then. That the school resisted requests to do that GCSE, even though it meant no extra teaching. The question you really have to ask there is: why?

I've periodically thought about that, and today I think it'd be because they'd be loath to risk low grades unless they had to. I think there are elements of practicality - if everyone has to do a programming project and there aren't enough machines, then you simply don't offer it. Then there's also the fact that people who could do it simply chose not to, because they saw little value in being bored for a couple of years in order to get a piece of paper.

However, I think what the real issue boils down to is this: there was 1 IT teacher for 600 children (20 classes, I'd guess). That's a pretty large load, and the last thing you want there is too much customisation. 1 kid who you can just leave to get on with it, who can talk at your level? That's probably OK, and the limit; but offering 2 different GCSEs would've doubled an already heavy load. You can buy more machines, or split classes, make them smaller, timeshare - heck, even get parents to buy machines by shopping at a particular supermarket of the day.

You can't however magically create more teachers with the knowledge.

And so it all came down, I think, to a choice: despite sufficient kit, given limited human resource and the fact that everyone has to take IT, do you offer the more technical (and IMO creative) aspects of computing, or just train kids to use the tools and packages of the day? (At least back then they still had to teach principles rather than MS Office...) Do you favour the subset of those interested in making stuff, or the majority being able to use the tech?

Since in schools a teacher's time is actually the scarcest resource, since bad grades make the school look bad, and since risky "harder" subjects have higher rates of low grades and alienate people, why wouldn't you drop Computer Studies? If you do, it does mean you drop programming from the curriculum - and by this time next year you'll be able to get all the young women who took computer studies at A-Level and put them in a room together.

As a result, due to a collection of reasons that each make sense in isolation, we now have a world where programming is not taught on the curriculum, and we are creating a new vicious cycle: generations of adults and parents who never, as kids, knew anyone writing software. Which is sad, since it probably traces back to a decision to try to make things fairer, more relevant and more useful - by throwing the baby out with the bath water.

The real reason, though, is that programming, computer studies, and the wider ideas of computational thinking (as we call it today) simply did not fit in with the curriculum of the day. These days, again, teaching it in complete isolation would be fraught with the same consequences - it would be cut as an option when resources are tight.

If, however, it could boost and augment the capabilities of the child, of the teacher and of the school - if it could be treated by them as a superpower-granting exoskeleton - then it'd be more likely to be adopted, especially if it was done in a cross-curriculum way.

In a world where businesses are code, where trading on the stock market to make real money is algorithm design, where artistry is the creation of physics models, where cartoons are built using rendering pipelines, and where extras in CGI films are code, I think I'm with Seymour Papert, who I'll give (almost) the last word:

As with writing, so with music-making, games of skill, complex graphics, whatever: The computer is not a culture unto itself but it can serve to advance very different cultural and philosophical outlooks. For example, one could think of the Turtle as a device to teach elements of the traditional curriculum, such as notions of angle, shape, and co-ordinate systems.

... so why stop there?
