April 2, 2015

The Future and History of Learning to Code

Category: Digital Learning

Much of the discussion around ‘learning to code’ is couched in futuristic terms. By learning to code, we are told, young people will be equipped to become the innovators, tech entrepreneurs and civic leaders of the future. Yet much less is said about the history underpinning learning to code, or about how an appreciation of that past might enrich our understanding of its future.

Future Codes

Before considering its past, it is worth reviewing some current claims about learning to code and its potential contribution to the future. For example, a recent UK report entitled “Young Digital Makers” claims that learning to code is a key part of growing a nation of “digital makers,” individuals with the skills and knowledge to create the digital products of the future, grow the economy, and contribute to societal improvement:

“A huge expansion is needed if we are to grow a nation of digital creators who can manipulate and build the technology that both society and industry are increasingly reliant on. This expansion cannot be left exclusively to professionals, however, as we simply don’t have enough of them. It will require the mobilisation of enthusiasts and interested amateurs, from parents and non-expert teachers, to those working in the tech industry, working and learning alongside young people to help meet this demand.”

The report provides a really useful summary of the state of digital making activities, including learning to code, and campaigns strongly for expansion in this area in order to achieve future social, economic and industry aspirations. Likewise, in order to address the need for more “digital professionals,” the BBC is even planning to give a “Micro Bit” coding device to every high school student in the UK (around a million devices in all), as part of its ambitious “Make It Digital” campaign in support of the new nationwide computing curriculum.

Another way in which the case for learning to code has been expressed is in relation to the emergence of ‘smart cities.’ The UK city of Glasgow has recently been awarded government funding to become a “future city” pilot. Much of the money is being spent on a high-tech command centre and on upgrading the digital infrastructure of the city. But part of it, too, is being spent on a “future city literacies” programme that emphasizes the skills and competencies required by citizens to participate in and contribute to the running of the city.

In order to promote these smart city literacies, Glasgow’s Future Makers programme provides an “innovative coding education programme” to develop programming and coding skills among young people. Future Makers consists of coding clubs and workshops, all aimed at enabling young people to help shape and sustain the Future City. Related activities in the Glasgow Future City include “hack days” that put citizens, programmers, designers and government staff together in teams to focus on coding citizen-centred solutions to urban problems. Future Makers thus acts in part as a pipeline, ensuring that young people are equipped with the relevant technical expertise in coding and computational thinking to help “hack” or “code” the future of the city.

In the U.S. context, similar aspirations have been expressed. The smart cities thinker Anthony Townsend, for example, has argued that learning how to code will be an important prerequisite for civic improvement. Townsend is himself involved in an initiative called MakerCities that is premised on the idea that digital makers, coders and hackers “are starting to reimagine the systems that surround them. They are bringing the ‘maker mindset’ to the complex urban challenges of health, education, food, and even citizenship. Makers will make the future of their cities.”

There is a clear correspondence in such claims between arguments about learning to code and arguments about the social and technical organization of the future of everyday life and civic participation. According to these projects, then, learning to code is about the production of the literate citizen of the future smart city, a “smart citizen” subtly asked to participate in the construction and maintenance of the urban environment and its services.

It is not just cities that will benefit from the skills of young people who have learned to code. An astonishingly utopian view of the capacity of coders and makers is evident in Silicon Valley tech incubator programmes. Y Combinator is a programme for startups that invests in and supports new business and tech innovations. As its “philosophy” claims:

“We think hackers are most productive when they can spend most of their time hacking. Our goal is to create an environment where you can focus exclusively on getting an initial version built. … We realize that, as it gets cheaper to start a company, the balance of power is shifting from investors to hackers. We think the way of the future is simply to offer hackers the best possible deal.”

Y Combinator thus articulates a new kind of business model for the future, one in which hackers and coders, rather than investors, will lead the way.

Similarly, the Thiel Fellowship program, established by PayPal founder Peter Thiel, proposes a future where educational institutions are redundant. Each year, selected fellows of the program are “given a grant of $100,000 to focus on their work, their research, and their self-education while outside of university. Fellows are mentored by our community of visionary thinkers, investors, scientists, and entrepreneurs, who provide guidance and business connections that can’t be replicated in any classroom.” Recipients of the fellowships are all aged between 17 and 20, and all possess highly impressive track records in entrepreneurship and technical innovation. The program encourages them to reject higher education, or even school, and engage in self-directed technical research. Much of this work looks genuinely valuable and high-end. The point is that the Thiel Fellowship is the logical endpoint of the movement toward unschooling and “startup schools,” where non-school coding clubs, digital making, makerspaces and hackerspaces are understood to offer more powerful forms of hands-on learning than formal education permits. Learning to code is therefore part of a futuristic revision of education itself.

Historical Code

Economic development, smart cities and startup incubator programmes are three dominant ways in which learning to code has been attached to highly future-oriented aspirations. But, what else can we learn from considering the histories that have shaped learning to code, rather than just the future aspirations on which it is based?

Of course, there is a fairly straightforward history to trace back from learning to code to the “constructionism” associated with Seymour Papert at MIT in the 1980s. Papert’s work on programmable microworlds, “cybernetics for children,” and “thinking like a computer” has been accommodated into initiatives like Code Club. The legacy of Papert’s LOGO programming environment is continued in MIT’s Scratch, usually the first programming environment that young people encounter if they enroll in any learning to code scheme.

But, a more intriguing history of learning to code can be found in recent studies of the social history of computing, programming and coding itself. You could, for example, go back to the 1960s and the formation of the first computer science curriculum for undergraduate and graduate students. As Nathan Ensmenger claims in his wonderful history of computing, “The Computer Boys Take Over: Computers, Programming and the Politics of Technical Expertise,” the Association for Computing Machinery produced a curriculum for computer science in 1968 that helped establish computer science itself as a scientific discipline. Its emphasis was on the study of the algorithm: a precisely defined set of instructions for executing a task.

But, computer science and the study of the algorithm are not synonymous with learning to code. Learning to code looks much more like software development than computer science. Indeed, this reflects a set of sharp historical divisions in the disciplines of computing — between computer science, computational science, and software development — that Brian Hayes has traced out in a recent brief history of the “Cultures of Code.” While computer science is concerned with understanding underlying algorithms, software development is concerned with the production of tangible artefacts, and computational science treats the computer not as an object of study but as a scientific instrument. And the divisions run deeper than this. Computer scientists, computational scientists and software developers work in different settings, attend different conferences, belong to different professional associations, and have very different ways of working, worldviews, systems of thinking, and professional practices. They are, as Hayes argues, distinctly different “cultures of code.”

These differences are important for situating learning to code and its possibilities. In the “Cultures of Code” article, Brian Hayes welcomes the fact that “a new generation discovers that coding is cool,” and the “hacker enthusiasm” for the “nerdy side of life,” but he is cautious about the long-term contribution of learning to code initiatives to the field of computing:

“How will the members of this exuberant new cohort distribute themselves over the three continents of computer science, computational science, and software development? What tasks will they put on their agendas? At the moment, most of the energy flows into the culture of software development or programming. The excitement is about applying computational methods, not inventing new ones or investigating their properties. … Everyone wants to pick up the knack of coding, but the more abstract and mathematical concepts at the core of computer science attract a smaller audience.”

Learning to code, therefore, needs to be understood in terms of the longer history of the separation of coding from computer science, and potentially seen as an unhelpfully “cool” deviation from the far less funky business of advancing the future of computing itself. The culture of code associated with computer science, and the professional identities held by those who practice it, are distinct from the culture of code associated with learning to code, and from the potential professional identities available to those who pursue coding further.

And this, in turn, leads to a final point of history. As Nathan Ensmenger has shown in “The Computer Boys Take Over,” what is meant by “coding” is itself a historical artefact. When computing first emerged as a professional practice in the 1940s and 50s, a “coder” was seen merely as a “glorified clerical worker,” and the task of coding was almost exclusively performed by women who could “code into machine language the higher-level mathematics developed by male scientists and engineers. Coding implied manual labor, and mechanical translation or rote transcription; coders were obviously low on the intellectual and professional hierarchy.” The actual art of “programming,” as it came to be known in the late 1940s, consisted of six steps (including mathematical conceptualization, algorithm selection, and numerical analysis), only the last of which was the “coding” done by female coders.

History from the Future

Hopefully, such a sharp division of labor can be combated by the learning to code movement, which is partly driven by a gender equality agenda. However, it is important to consider the longer historical lineages (and disciplinary bifurcations) that have given us learning to code. At a time when software development has made technologies “brain-dead easy to use,” and coding can be taught to a 5-year-old, might it be the case that the learning to code movement will simply reproduce the notion that coding is a straightforward clerical task? Will learning to code produce leading-edge scientists, or just churn out code monkeys for the software industries?

The emphasis on learning to code, perhaps at the expense of computer science, introduces infants to a distinctive culture of code that is now associated with the tech-entrepreneurial worldview of software developers, with its utopian and future-facing emphasis on “technical solutionism.” This reinforces a much longer history of disciplinary separation and labor division in the computing field. As learning to code gets rolled out — by, for example, supplying coding devices to kids, then enrolling the best of them into tech startup incubators — it might be a good idea to start thinking a bit like a historian, documenting our present from the future. Would a historian of computing, looking back 50 years from now, view learning to code as a historically decisive catalyst in technological innovation — the pedagogic invention that ultimately allowed cities, economies and industries to be reinvented — or as a clumsy, commercialized recruitment campaign for the software industry? Would the culture of code associated with computing pioneers such as Alan Turing, Claude Shannon and John von Neumann simply appear as an historical subculture to the triumphant technoculture of the Silicon Valley startup?

Banner image credit: Josh Graciano