(Posted in ITsecurity.com: 11 April, 2002. The following article was commissioned by IEEE Spectrum, submitted to them on the 6th March 2002, and was due to appear in the April 2002 issue. However, their editors insisted on changing the text in ways that introduced material inaccuracies. For example, they insisted on crediting IBM with opposing export controls on intangibles, when at all material times IBM was the staunchest supporter among large IT companies of the crypto policy line taken by the US and UK governments. I refused to accept these edits and the IEEE cancelled the article.)
Esther Dyson famously argued that as the world will never be perfect, whether online or offline, it is foolish to expect higher standards on the Internet than we accept in `real life'.
Legislators are now turning this argument round, claiming that they must restrict traditional offline freedoms in order to regulate cyberspace.
A shocking example is an export control bill currently before Britain's parliament. This will enable Tony Blair's government to impose licensing restrictions on collaborations between scientists in the UK and elsewhere; to take powers to review and suppress scientific papers prior to publication; and even to license foreign students taught by British university teachers.
The justification offered for this is a European agreement to control the `intangible export' of technology.
During the late 1990s, arms export regulations prevented US nationals making cryptographic software available on their web pages, or sending it abroad by email. Phil Zimmermann, the author of the popular PGP encryption program, was investigated by a Grand Jury for letting it `escape' to the Internet. The law was ridiculed by students wearing T-shirts printed with encryption source code (`Warning - this T-shirt is a munition!') and challenged in the courts as an affront to free speech. Meanwhile, European engineers made crypto software freely available.
The Clinton administration fought back, with Al Gore pushing European governments to fall in line. After Tony Blair was elected in 1997, the British government became eager to help, but their first attempt, in 1998, to impose export controls on intangibles was blocked by Parliament. They then tried an `end run' around Parliament by quietly negotiating a Europe-wide agreement which they now say we have no choice about implementing.
Individual European countries have a lot of latitude about how they implement this agreement, but the British approach is draconian. The proposed law will give ministers wide powers to regulate the transfer of technologies that could have harmful effects. Ministers admitted in parliament that their overriding concern was to leave no loopholes: no T-shirts, no bar codes, no faxes, no covert channel through which controlled information could lawfully leave the country. The law even allows the government to control `non-documentary transfers' (read: speaking to foreigners) in cases where the technology may be used for certain types of weapons, such as guided missiles. As I am currently sitting in an office at MIT, on sabbatical, and helping US students think about integrating inertial navigation with sensor networks, it's lucky the bill isn't law yet. This new research topic only came up last week in a seminar, and I was able to pitch in some ideas at once. If I had needed an arms export licence to take part in the discussion, obtaining one would have taken weeks or even months, and the value of spontaneous interaction would have been lost.
Controlling physical exports is easy, at least in principle; but once you try to control the electronic export of software, designs, specifications and technical support, it is hard not to end up controlling speech as well - the dividing line is too blurred. So is the concept of `abroad'. It is quite common for an email between two British scientists to travel via the USA, and an email sent to me at Cambridge, England, will be forwarded to Cambridge, Massachusetts, if that's where my body happens to be. Now if you give officials enough regulatory discretion to deal with all this, you give them the power to interfere with speech too - and much else. For example, the UK bill extends the scope of arms export controls from a few hundred `obvious' armament vendors to thousands of innocuous software companies. And what about the millions of people who use online services in foreign countries? Will it become an offence for a Brit who works with high technology to have an email account at a US provider, like AOL, to which messages get forwarded when she's travelling?
While the struggle to amend this particular bill is primarily a matter for Britain's scientific and engineering establishment, it is an example of a wider and worrying trend - of toxic overspill from attempts to regulate the Internet.
There are many more examples. In the USA, Hollywood's anxiety about digital copying led to the Digital Millennium Copyright Act. This gives special status to mechanisms that enforce copyright claims: their circumvention is now an offence. So manufacturers are now bundling copyright protection in with other, more objectionable, mechanisms, such as accessory control. For example, one games console manufacturer builds into its memory cartridges a chip that performs some copyright control functions but whose main purpose appears to be preventing other manufacturers from producing compatible devices. There is no obvious way to reconcile the tension between public policies on copyright and on competition.
The anti-terrorism laws that many nations now have give us yet more examples of regulatory overspill and overkill. In Britain, for example, terrorism is defined as acting in concert with others, for political or religious purposes, using certain means (including violence, property damage or interfering with a computer system) that achieve certain ends (including death, property damage or risks to public health). This definition followed police scaremongering about cyberterrorism, and has the following curious effect. Should I, here on U.S. soil, voice support for the Icelandic Medical Association's boycott of that country's controversial genetic database - which according to the government in Reykjavik is degrading the information flows they need to manage public health - then I would become an international terrorist on the spot. (Perhaps I'd better say no more.)
Meanwhile, worries about cybercrime are leading to a Europe-wide arrest warrant which overturns the time-honoured principle of dual criminality - that you can only be extradited from one country to another if there is prima facie evidence that you've done something that's a crime according to the laws of both of them. Now Germany has strict hate speech laws - `Mein Kampf' is a banned book - while Britain does not. Right now, I could put an excerpt from that book on my website in the UK (or the USA) but not in Germany. However, the new arrest warrant would allow the German police to extradite me from Britain, for an offence that doesn't exist in British or American law. Thus, free speech rights online may be reduced to the lowest common denominator among the signatory nations.
At a conference in Berlin in 2000, the German federal justice minister said that her proudest achievement in office had been to stop Amazon selling `Mein Kampf' in Germany, and that her top ambition was to stop them selling it in Arizona, too. European arrest warrants do not quite go that far. But in the near future, if Amazon sold a copy of this book to a history professor in Finland, and Jeff Bezos were later passing through Madrid, the Germans could have him hauled off to Berlin for trial. (Meanwhile, the copyright in the book belongs to the State of Bavaria, so there is an easy way for the German government to prevent its distribution. But they seem determined to do it the hard way.)
Why do we get so many bad laws about information? Several factors are at work.
First, the Internet is no different from any other new frontier in that businessmen compete to make money out of it, while bureaucrats compete to build empires regulating it. The `dot-com' bubble is being followed by a `dot-gov' version. However, while poorly-thought-out business plans run out of cash and disappear, poorly-thought-out laws remain, together with irrelevant services and bureaucratic overheads.
Second, the Internet is different from (say) the Wild West in that the often harsh law enforcement of those times could be replaced and updated as new states were formed. There is no such natural opportunity to revise cyberlaw.
Third, the laws in newly created states were written by people elected by the folks who lived there. This isn't true at all for cyberspace, which is regulated by the same politicians and senior officials who run meatspace (and are beholden to its vested interests).
Fourth, there are issues of understanding as well as motivation. Cyberspace is more different from Arizona than Arizona is from New York. As politics is about managing the trade-offs between competing legitimate rights and interests, good public policy requires a good understanding of how causes and effects are related. The lack of this makes most governments incompetent at balancing public policy goals that affect cyberspace. It's hard enough to exchange email with a government department, let alone teach it how to draft laws that catch only the phenomena they are intended to.
Fifth, many of the bad laws have to do (in some broad sense) with computer security, or at least with the perceived vulnerability of the internet to hackers, bomb makers, credit card thieves, pornographers, and other undesirables. There is a huge amount of hype from the computer security industry - when people get fed up with hearing about hackers, the story becomes one of `cyberterrorism'. There are few or no balancing voices, as the interests of almost everyone involved in the security industry - vendors, government agencies, regulators, researchers - lie in talking up the threats. Journalists like the scare stories more than the rebuttals. As with Y2K, the still small voice of reason goes unheard.
What is to be done?
In the shorter term, there is much that individual engineers can do. Engineers and lawyers have at last started to talk to each other about technology policy, while colleagues and I are currently promoting cross-disciplinary research at the boundary between information security and economics.
In the longer term, much of the cyberlaw that has been rushed through in the last few years will need substantial revision. In the USA, that might happen through the Supreme Court, though it might be unwise to rely on that completely. In the European Union, engineers should be seeking to influence the constitutional negotiations getting underway for the community's enlargement in 2005. We could try to introduce a mechanism whereby technology policy directives were automatically sent for revision every five to ten years.
But whatever the mechanisms, we technologists need more influence over the development of technology law. Our profession has grown rapidly in numbers over the last quarter century, and our contribution to economic development is decisive. However, our political clout hasn't grown to match. We have been too busy making the world a better, richer place to spend time infiltrating the citadels of power. Fixing this political deficit is now not just in our own interest, but in everybody's.