Why teaching code matters

November 6, 2014 | 12:27
Since Snowden's revelations about NSA spying, more and more people are concerned about online privacy and freedom. Current discussions about possible solutions divide those who think computer users should be educated in code from those seeking solutions in policy and law. But how can good policy decisions and legal frameworks be made if legislators, politicians and citizens don't even know the inner workings of the information networks they rely on?

Mug shot business
Last October, at the Dutch Royal Academy of Science (KNAW), the American journalist Adam Tanner presented his book What Stays in Vegas: The World of Personal Data -- Lifeblood of Big Business -- and the End of Privacy as We Know It. Tanner spoke about the relentless digital harvesting and aggregation of personal information by Big Business.

Tanner began his talk with an account of how, in 1988, when he was in Dresden doing research for the travel guide Eastern Europe and Yugoslavia on $25 a Day, his every move was recorded by ten KGB agents following him on the streets. As he found out later, they had created a minute-by-minute log of his walks through the city, complemented by photographs. "What ten KGB agents found out about me in one day," Tanner told the audience, "is nothing compared to what one company finds out about you when you search online, pay in a shop with your credit card or simply stroll the streets with a mobile phone in your pocket."

Tanner gave many examples of business models being invented around the data-gathering and storage capacities of digital networks, such as the 'mug shot' business of Kyle Prall. Prall harvested millions of police photographs of arrested suspects and posted them on a website, creating a public repository in which each person was framed as a criminal. In this public online sphere, completely divorced from the context in which the photographed people were arrested, these photos gained the power to ruin people's opportunities for jobs, partners or houses. Prall earned big bucks with this, offering to take a picture down within 24 hours if people paid him $108.

Sleeping with the NSA
In 2013 Edward Snowden told the world about the intrusive, far-reaching and violent data-gathering practices conducted by the NSA and other secret services. Adam Tanner tells the same story, this time from the perspective of business. Because of the work of storytellers like Snowden and Tanner, a growing public is becoming aware that the shiny connected gadgets in their homes, beds, pockets and on their bodies have a significant dark side. Sleeping with a phone next to you may be akin to sleeping next to the NSA. Buying an Apple iPhone means buying a perpetual tracking device. Opting in to a special offer in a casino, supermarket or car rental agency means being tracked, profiled and targeted by more personalized offerings intended to create profit for businesses at your expense.

Most people did not intend to use their gadgets in these ways when they bought them. People buy phones to stay in touch with friends, family and business partners. They upload pictures to the cloud to invite responses from people whose opinion they value. And they sign up for special offers because they can use the money. Yet what stories like Snowden's and Tanner's show us is that in our information society we cannot predict all the other purposes that our pictures, emails, online comments and medical or criminal records will one day be used for. As long as people believe they can build viable business models around the information-gathering and storage capacities of ICTs, and as long as governments feel they can reduce risk and threat through them, these information systems will take on functions nobody can fully anticipate.

Code, policy and law
Tanner's presentation ended with audience members asking heated questions about the real cause of the problems and the nature of the solution. Since Snowden's revelations, such debates take place more and more often: in online and offline newspapers, on mailing lists and at conferences. Proposed solutions typically fall within a spectrum. At one end are those who argue that computer users need to be educated in code, enabling them to understand the systems they work with or to imagine and create new ones. At the other end are those who argue for solutions in the realm of policy and law, with international standards and regulations precisely defining when and under which conditions data gathering is acceptable and when it is not.

Most people take a stance somewhere in the middle of the spectrum, arguing for a combination of both types of solutions. Yet there are moments when the two seem diametrically opposed. At the Information Influx conference, organized in the summer of 2014 by the Institute for Information Law (IViR) in debate center De Rode Hoed in Amsterdam, keynote speaker Deirdre Mulligan, professor of law at the Berkeley School of Information, took a clear stance at one end of the spectrum. After positioning herself firmly against the massive privacy-infringing practices of secret services and global corporations, Mulligan argued that "teaching people to code will not solve the problem." Mulligan referred to a test conducted amongst professional programmers in which none of them was able to detect all the errors in a particular program. "If not even professional coders understand algorithms fully," Mulligan proposed, "the idea that we are going to understand how an algorithm works is kind of a long shot (...)" For her, the solution needs to come from "oversight and accountability" in the sphere of politics and law.

Others argue, by contrast, that faith in the power of policy and law to protect basic principles of social conduct in the online sphere is misplaced, precisely because politicians and most legislators are code-illiterate: they don't understand how these principles translate into, and are reinforced by, code. Technology critic and journalist Brenno de Winter argued this point in his 2012 discussion of ICT decisions made by the Dutch government.

One such ICT decision concerned the implementation of electronic voting. After parliament had decided on implementation, Dutch hacker Rop Gonggrijp initiated a campaign that eventually proved the chosen voting computer unreliable and the digital voting process not secret. De Winter argued: "No politician could explain comprehensively how these machines work; no one had ever seen the code of this machine." By opting for this system, the elections ran the risk of becoming "an unaccountable process, with the power of control taken away from citizens."

"User-friendly" design
Mulligan's sober statement that it is a long shot for every citizen to understand the algorithms of the systems they rely on seems realistic in a world where many people simply don't interact with their technologies at the level of code. On the contrary: since the mid-1980s, design principles have focused on making computer technologies more 'intuitive' and 'user-friendly'. In the process, information technologies have come to appear to us as friendly environments, not as mechanisms that operate on the basis of pre-programmed code. This design bias has spread along with the idea that code is tedious and difficult, and that people want graphics and frictionless, shiny user interfaces.

Yet, precisely because this is the default design principle of human-computer interaction, it is important to make code visible to computer users once again. Not with the purpose of training an entire population to become professional code-builders, but for a very different reason: in the process of trying to learn the code of their technological environment, people will become aware of how difficult it is to find out about that code. They would experience personally that their informational environment does not invite them to interact with technology in a 'friendly' way, but forces them to do so: Digital Rights Management (DRM) software, copyrights, patents, hardware designs, warranty stipulations and terms of service prevent people from opening up their computers or looking at the code of the programs they rely on in their daily lives. It is because of these mechanisms and strategies, consciously applied by corporations, that data hunting and gathering can occur unchecked and often invisible to computer users themselves.

Barbed wire
The Information Influx conference ended with Danja Vasiliev and Julian Oliver's NETworkshop: a workshop teaching computer users Linux commands and tricks for finding out what happens 'under the hood' of information networks. Having followed this workshop several times, I have not turned into a professional programmer, nor do I think I will ever understand code the way professional programmers do. Yet it showed me that our gadgets transmit a lot of information that anyone with the right hardware and software can scan, read and interpret. It also showed me that it is possible, and not all that difficult, to build your own network and to turn your own laptop into a server accessible from the internet.
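To make that concrete: the short Python sketch below shows the kind of exercise such a workshop builds toward, turning your own machine into a tiny server that answers network requests. This is an illustration of the general technique, not material from the NETworkshop itself; the local address, port handling and greeting message are all made up for the example.

```python
# A minimal sketch: a laptop acting as a server on the local loopback address.
# Every connection exposes the client's address and port -- the same metadata
# that network observers can read in transit.
import socket
import threading

def run_server(host="127.0.0.1"):
    """Listen on an OS-assigned local port and answer one request."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))          # port 0: let the OS pick a free port
    srv.listen(1)
    port = srv.getsockname()[1]  # find out which port was assigned

    def handle():
        conn, addr = srv.accept()
        # Greet the client with its own address and port, then clean up.
        conn.sendall(f"hello {addr[0]}:{addr[1]}\n".encode())
        conn.close()
        srv.close()

    threading.Thread(target=handle, daemon=True).start()
    return port

# Act as the client: connect to our own server and read the reply.
port = run_server()
cli = socket.create_connection(("127.0.0.1", port))
reply = cli.recv(1024).decode().strip()
cli.close()
print(reply)  # e.g. "hello 127.0.0.1:54321"
```

Nothing here requires special hardware: the same few lines, bound to a public network interface instead of the loopback address, are the skeleton of a server reachable from the internet.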

But most importantly, it made me more aware that when it comes to the existing infrastructure of the internet, there is so much we cannot enter, alter, look into or find out about. And this is not only because the technology is complex, but because of the deliberately opaque institutional environment in which information technologies operate. As Oliver and Vasiliev pointed out, this fact is perhaps best symbolized by the barbed wire, video surveillance and armed guards that typically protect datacenters.

A recent parliamentary study showed that the Dutch government tends to embrace information systems that are unreliable and that compromise the protection of personal data. The reason is not merely that politicians can't read code. The real problem is that they, and the larger public, don't even think of asking to see the code of the machines they rely on for voting, for expressing thought or for exchanging medical data. Apparently, in our current society technological opacity is not only a technical and institutional reality but also a mindset. This is why technical education is necessary: not because it should aim to turn everyone into a professional techie, but because it can change this mindset, making people curious about how information networks work and concerned about the unaccountability of the online sphere.