Preconditions for widespread literacy

I’ve talked quite a bit here about “universal programming literacy” — the prospect of a large proportion of our population being able to get computers to do things with the power and flexibility that programmers take for granted. But to understand what the right preconditions for this would be, it might be useful to look at historical precedents.

It seems that dramatic increases in literacy come about by a confluence of motivation and technological enablement. For example, the 1951 introduction of the Fender electric bass, followed closely by the electric guitar and similar innovations, allowed young people to perform their own music — in modern parlance, to communicate as “makers” — to a large audience of their peers, without the need for access to highly specialized and expensive venues. Most of these young people had little or no formal musical training, yet six decades later the revolution they started still dominates popular music.

And just in the last several years, widespread access to affordable digital video cameras and digital editing software, as well as free worldwide distribution (the last thanks to YouTube), has led to an enormous increase in filmmaking literacy among young people, similarly motivated by a desire to communicate as “makers” with a large cohort of their peers. Again, most of these kids had little or no formal training, yet the influence of this new literacy is clearly going to be transformative in the decades to come.

There are many similar examples throughout history of a means of production going viral and thereby leading to a new form of widespread literacy, and they all seem to follow the same basic pattern.

For programming, what would be the equivalent combination of (1) a set of technological innovations that lead to widespread enablement of programming production and distribution, and (2) motivation to communicate via programming as a “maker” to a large cohort of one’s peers?

3 thoughts on “Preconditions for widespread literacy”

  1. I think to a large extent, what you’re describing happened in the late 1970s (at least as far as #1 is concerned). Take a look at the first wave of mass-market personal computers introduced then – the Commodore PET, the Radio Shack TRS-80, the Apple II (with the C64 following a few years later). When you turned any of these on, they greeted you with the prompt “READY”. This was an invitation to start coding in BASIC, which was already built into the machine’s ROM, ready to go. There really wasn’t much you could do with the machine until you either typed in a BASIC program or loaded one from a cassette tape.

    Sharing was a bit clumsier without the Internet, but there were still vibrant communities making and sharing code. Popular computer magazines of the day often published listings of BASIC code, and local user groups swapped tapes with BASIC programs.

    Several years down the road, the “READY” prompt disappeared, replaced with the Mac & DOS/Windows environments we see today. These invite you to click on canned software (“apps”) written by somebody else. A software development environment (like BASIC) was no longer a built-in part of the machine, but an esoteric purchase.

    A lot of veteran software developers got their start on early PCs using BASIC; if you were sitting in front of a TRS-80 in 1979 at the local Radio Shack, programming was about the -only- thing you could even do with the computer.

    Today, anybody sitting in front of a web browser has an avalanche of resources for learning to program (Codecademy, StackOverflow, JSFiddle, GitHub, the list is huge). But it’s not the first thing greeting you when you turn the computer on.

    READY

  2. Perhaps Twitter’s prepended hashtag and Facebook’s ‘@’ + username to tag a user will come to be seen as some of the first steps towards a sort of mass programming literacy.

    I can imagine such simple bits of everyday, task-specific ‘code’ accruing over time into an increasingly usable rudimentary programming language of sorts. For instance, once ordering a pizza is as simple as tweeting, say with a few keywords and a symbol or two, it seems like the necessary logical structures to describe and refine a complicated order could be grasped almost intuitively.

    I suppose such a haphazard accretion of code would be a computer scientist’s idea of a complete mess. I imagine cave painting was messy too.

  3. Andras: I agree with you on this one. Quite a few years ago I was trying to work out a computer language that would be embedded in IM messages, and I strongly suspect that others have tried similar things. One interesting question is how such a language evolves. It might be ideal for an open platform to allow users to experiment on their own, in a sort of crowdsourcing fashion. It’s not clear to me that Twitter would agree to support such an open development platform, because of potential security concerns (programmers can do powerful things once they know how).

    I wouldn’t worry about the haphazardness. After all, computer scientists have made some very beautiful and elegant computer languages, yet the current de facto standard programming language for the Web — around which the very impressive computer scientists at Google have rallied — is anything but elegant.
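The tweet-as-order idea in comments 2 and 3 can be made concrete with a tiny parser. The `#pizza` syntax below is entirely hypothetical — a minimal sketch of what such an everyday micro-language might look like, not any real Twitter or Facebook feature:

```python
def parse_order(tweet: str) -> dict:
    """Parse a hypothetical '#pizza' micro-language tweet into an order.

    Assumed (invented) syntax:
      '#pizza'      selects the task
      'small|medium|large'  sets the size
      '@topping'    adds a topping
      '-topping'    removes a default topping
    """
    order = {"size": "medium", "add": [], "remove": []}
    tokens = tweet.split()
    if not tokens or tokens[0] != "#pizza":
        raise ValueError("not a pizza order")
    for tok in tokens[1:]:
        if tok.startswith("@"):
            order["add"].append(tok[1:])      # '@mushroom' -> add mushroom
        elif tok.startswith("-"):
            order["remove"].append(tok[1:])   # '-cheese' -> hold the cheese
        elif tok in ("small", "medium", "large"):
            order["size"] = tok
        else:
            raise ValueError(f"unknown token: {tok}")
    return order

print(parse_order("#pizza large @mushroom @olive -cheese"))
```

Even this toy grammar shows how a few symbols could accrete into the “rudimentary programming language” the comments imagine, with refinements (quantities, conditionals) layered on by users over time.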
