PRIVACY AND LAW IN TECHNOLOGY
Jennifer Farrell*
A. Introduction – Why the Internet has created privacy problems.
Technological change, digitalization and the internet have led to an extraordinary increase in the amount of personal information available and, with it, the potential to deprive us entirely of what we have traditionally regarded as privacy. The efficiencies of the computer revolution have introduced insecurities in data control that were unthinkable in paper filing systems, whose inherent inefficiencies provided a measure of security. This has led to a transformation in privacy attitudes1 and a fear of technological abuse.
Privacy, which has protected us from being judged out of context and misdefined, appears to be almost nonexistent in a technological world “of short attention spans, a world in which information can easily be confused with knowledge.”2 Privacy is no longer represented by a physical space – in the “market place” or in the “home” – but by an informational profile. Privacy in the digital world represents an escape from connection to the information exchange – inaccessibility through disconnection – a separateness from the System. It is no longer the “right to be let alone” but the freedom to choose detachment from the digital world – the right to opt out, to switch off the computer, to set limitations on the amount of information that is released, manipulated or passed on by those who collect it.
B. What is Privacy?
“Privacy is dynamic. Its face is constantly changing...The solutions that we find acceptable today...may be an unacceptable or ineffectual solution tomorrow.”3
The concept of privacy can be understood in relation to the context in which the word is used, as well as the age. In ancient Greece the division between public and private was related to the division in a physical sense between the home (the sphere for slaves and women) and the market place (for free men who could engage in trade). By the time that Warren and Brandeis analysed the right to privacy at the end of the 19th century, the term had come to mean ‘the right to be let alone’ and to be safe from intrusion. In the internet era it can again be understood in the context of public and private spheres, however not in a physical sense as much as in the sense of connection or disconnection from technology.
Privacy has been defined as “the interest that individuals have in sustaining a ‘personal space’, free from interference by other people and organisations.”4 There is no general right to privacy at common law. It has also been defined as “the interest of a person in sheltering his or her life from unwanted interference or public scrutiny.”5 The lack of precision in defining the concept of privacy was recognised by the Chief Justice of the Australian High Court, Gleeson CJ, in Australian Broadcasting Corporation v Lenah Game Meats Pty Ltd6. He explained at [42] that there is “no bright line which can be drawn between what is private and what is not”. He understood the foundation of what is protected in relation to privacy to be “human dignity”, because disclosure of the information would be “highly offensive to a reasonable person of ordinary sensibilities”.7
Privacy in the context of Cyberspace is concerned with informational privacy and the privacy of communications. As Alan Westin has explained, it is:
“[t]he claim of individuals, groups, or institutions to determine for themselves how, when, and to what extent information about them is communicated to others.”8
The existence of a vast amount of information in databanks that is linked and integrated “implies that society has a right to surveil its subjects and to define individual identities separate from the inherent nature of personhood.”9 The creation of national identity systems, in particular, “demeans the political values of identity by substituting ersatz-identities for identities based on personhood.” This concept was expanded by Justice Douglas in United States v White10, who in his dissenting judgment referred to the philosophy of privacy underlying the “unreasonable searches and seizures” clause of the Fourth Amendment to the US Constitution and the analysis of Ramsey Clark.11 Privacy was understood by Clark to be the basis of individuality. He explained that solitude allows the personality to develop through reflection and self-examination, free from the influence of “uncontrolled external social stimulations”. The invasion of this privacy “demeans the individual”, degrading people and allowing them to degrade each other, while limiting “opportunities for individual fulfilment and national accomplishment.”
The expectation of privacy is important psychologically, sociologically, economically and politically, so that people feel free to behave and associate with others, to innovate and speak freely.12 As Sir Zelman Cowen wrote in 1969:
“A man without privacy is a man without dignity; the fear that Big Brother is watching and listening threatens the freedom of the individual no less than the prison bars.”13
It is, however, not simply Big Brother who is watching. As Daniel Solove has explained, the Orwellian totalitarian metaphor has its limits. He views Kafka’s dystopian vision of bureaucratic decision-making processes as more threatening because of the “web of thoughtless decisions made by low-level bureaucrats, standardized policies, rigid routines, and a way of relating to individuals and their information that often becomes indifferent to their welfare.”14
The unrestricted flow of vast quantities of data over the internet is forcing us to redefine what privacy is, what information we need to control and how we are going to control it.15
C. Threats to Privacy?
1) Surveillance
Surveillance has been described as “the greatest leveller of human privacy ever known”16 and by Roger Clarke17 as “one of the elements of tyranny”18. Its main purpose is to collect information or data about individuals or groups. People can now
be watched through their records, transactions and related physiological features. “Dataveillance”19 is inherently intrusive and ‘threatening’ and is often arbitrary.
The application of new technologies to data collection, in both the private and public spheres, is increasing the quality and quantity of the material while decreasing the cost. Surveillance to an extent not even imagined in 1984 has therefore been made possible by the use of techniques such as sniffers, biometrics20, saliva scans, low energy wave cameras, odour and universal sensors, vein maps and T-ray cameras. Even colour laser printers can play a role in surveillance. Some printer companies encode the serial number and the manufacturing code of their colour printers on every document the machine produces. This enables the printers to be used to track counterfeiters. The distributors obtain information about purchasers when a printer is sold and this is maintained in a database. The purchaser’s identity is associated with the serial number of the machine, and this enables documents printed on it to be traced back to the purchaser of the printer.21
The FBI’s custom-built internet surveillance software, DCS-1000 (originally called Carnivore because it could get at the “meat” of suspicious communications), was designed to sift through all traffic going through a particular ISP’s network, capturing considerably more information than traditional devices on land-line phone systems. It was viewed by the Electronic Frontier Foundation22 as having the “potential to turn into mass surveillance systems” by “over collection of personal information”, thereby lowering individuals’ expectations of privacy.23 This program has since been abandoned in favour of commercial surveillance software.24
Pervasive “privacy-destroying technologies” are available to governments and to millions of people throughout the world. Since the first use of digital cellular phones in the early 1990s mobile phones have become widely used, and they have recently become more invasive. The addition of cameras to these phones was supposedly introduced to enable professionals to take family photos with them when travelling. However, many other uses have been found for the camphone, both beneficial and invasive.
Nokia has estimated that there will be over half a billion camphone25-wielding people by the end of 2005. Almost any image can be captured. Less than an hour after a London double-decker bus was blown apart on 7 July 2005, an image taken with a camphone was available on the internet. Camphones have also been used to take pictures of women on Sydney beaches26, to capture private pictures of celebrities and for industrial espionage27, and it has been reported that one mobile phone was used by its owner to send photos of his suicide to his former girlfriend28.
Websites can create surveillance problems because they collect information about users and their computers from web forms, cookies, web beacons, web server logs and sometimes by adding spyware.29 Most of these methods can be controlled by the user if care is taken not to disclose sensitive information and cookie settings are adjusted to manage or decline them. Spyware presents a more difficult surveillance problem. According to Gartner analysts, John Girard and Mark Nicolett:
“Spyware has evolved from being an occasional nuisance to something that wastes IT user and technical support resources, and compromises the integrity of corporate systems, applications and data.”30
Anywhere between 67% and 90% of personal computers were estimated to be infected with spyware in 2004.31 Almost 80% of IT professionals believe that employees do not act safely online and introduce spyware to company computers by opening unsolicited emails and attachments and by downloading malware32 while surfing dubious websites.33 Some of the malware is distributed through ad-supported software, which is widely distributed on the internet. This software can be used by companies to sell products and to support a website, or it can be used to collect information. CoolWebSearch is an example of adware that hijacks homepages, changes browser settings, adds pornography links to the list of favourites, deposits large numbers of files throughout the user’s computer and triggers endless pop-up ads on users’ computers.34
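The mechanics of cookie-based tracking described above can be illustrated with a deliberately simplified sketch. The names used (visitor_id, SiteTracker) are invented for illustration and do not correspond to any particular product; the point is simply that an identifier set on the first visit, and returned by the browser on every later request, allows separate page views to be linked into a single profile.

```python
# A minimal sketch of cookie-based tracking; all names are illustrative only.
import uuid
from http.cookies import SimpleCookie

class SiteTracker:
    """Links repeat visits to one browser via a persistent cookie."""

    def __init__(self):
        self.visit_log = {}          # visitor_id -> list of pages viewed

    def handle_request(self, page, cookie_header=""):
        cookie = SimpleCookie(cookie_header)
        if "visitor_id" in cookie:
            visitor_id = cookie["visitor_id"].value      # returning browser
        else:
            visitor_id = uuid.uuid4().hex                # first visit: mint an ID
        self.visit_log.setdefault(visitor_id, []).append(page)

        # The Set-Cookie header sent back to the browser; a long expiry makes
        # the identifier persist across sessions unless the user clears it.
        set_cookie = f"visitor_id={visitor_id}; Max-Age=31536000; Path=/"
        return visitor_id, set_cookie

tracker = SiteTracker()
vid, header = tracker.handle_request("/home")                     # first visit
tracker.handle_request("/products/shoes", f"visitor_id={vid}")    # later visit, same ID
print(tracker.visit_log)   # both pages are now linked to one profile
```

Declining, restricting or regularly clearing such cookies, as suggested above, breaks this linkage.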
2) Data Mining
“The internet is a data miner’s paradise”35
The internet provides an enormous, relatively current database perfect for data mining36. Increases in computer speed, the use of parallel processing, which runs similar algorithms on different parts of the data, and the adaptation of statistical algorithms have made data mining a powerful business tool. The process is also known as “knowledge discovery in databases” or the “nontrivial process of identifying valid, novel, potentially useful and ultimately understandable patterns in data”37. It uses complex algorithms, artificial intelligence, neural networks and genetic-based modelling to discover unknown facts in a database and answer questions no-one knew to ask.38 The computers search for hidden patterns and make predictions with minimal intervention.
Data mining involves the application of a number of processes once the data is organized into aggregations or data warehouses. The data is “cleansed” by the data manager, who discards unreliable information. Clustering then divides the database into homogeneous sub-groups or clusters according to the patterns in the data itself. Descriptive and predictive inquiries are then made, which describe the data as it is and predict future behaviour. Patterns of variables that usually associate with each other are discovered through the application of rules of sequential pattern discovery, or similar time-sequence discovery, examining the links between sets of data.
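The clustering stage described above can be illustrated with a small, hypothetical example: grouping customers into homogeneous sub-groups on the basis of two behavioural variables using the common k-means algorithm. The data, the segment interpretations and the use of the scikit-learn library are assumptions for illustration only.

```python
# Illustrative only: cluster hypothetical customers into homogeneous sub-groups,
# as in the "clustering" stage of data mining described above.
import numpy as np
from sklearn.cluster import KMeans

# Each row is one (hypothetical) customer: [visits per month, average spend].
customers = np.array([
    [2, 15], [3, 20], [2, 18],      # infrequent, low spend
    [12, 60], [11, 75], [13, 66],   # frequent, moderate spend
    [25, 300], [22, 280],           # very frequent, high spend
])

# Divide the database into 3 homogeneous sub-groups based on the data itself.
model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(customers)

for label, row in zip(model.labels_, customers):
    print(f"cluster {label}: visits={row[0]}, spend=${row[1]}")

# A predictive inquiry: which segment would a new customer fall into?
print(model.predict([[10, 70]]))
```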
Data mining is used for the detection of fraud as well as to promote customer service. Online advertising makes use of customer profiles, with tailored ads for every customer. It is the selectivity of the data that threatens privacy. The group selected is pushed into a new market to purchase products in which there was no initial interest. This leads to intrusion through the use of unsolicited emails; to manipulation, where profiling allows hidden marketing with advertisements tailored to each customer according to a specific profile; and to discrimination, by ignoring sections of the market.
The intrusive nature of data mining selectivity is illustrated by Tal Zarsky in his article, “Mine Your Own Business!”, by the use of hypothetical customers, such as Mr Orange, who purchases groceries over the internet but has recently stopped buying cigarettes. To enable the retailer to sell more cigarettes, Mr Orange finds that the web site he visits for shopping presents cigarette ads to him and “complimentary” cigarettes arrive in his grocery order. Mr Black pays high life insurance premiums because he has had two recent heart attacks. Data mining by an employee of the insurance firm reveals his physical condition and this information is sold to his employer, leading to his dismissal.
The reality of these hypotheticals can be seen in Australia in the recent case of Rebecca Hartford39, who three years ago received a letter from her insurer refusing her request to increase her death and disability insurance because the insurer had discovered she had haemochromatosis40. Under Australian law this information could have been passed on to a potential employer, to relatives or to anyone else. It led to a call for the federal government to introduce laws to protect genetic privacy41.
The commercial exploitation of medical databases and the associated privacy concerns can be illustrated by the establishment in Iceland of a central database of tissue samples and genetic information by a private company, deCODE Genetics42. The company has analysed data from over 100,000 volunteers (50% of the adult population) together with genealogical data linking the entire present-day population stretching back over 1,100 years. The data was mined to trace inherited components of a disease, pinpointing the key disease genes and specific markers within the genes. The company obtained a licence from the government to build the database. This licence has been challenged in Iceland’s Supreme Court on the basis that it does not comply with the constitutional guarantee of “freedom from interference with privacy, home, and family life” (Article 71)43.
Serious breaches of privacy in medical records were exposed in 2003 when the private medical records of over 13 patients were inadvertently posted on the St Vincent’s Hospital website. Sensitive information relating to mental illness and HIV status was revealed.44 There are, however, considerable benefits to be gained by maintaining electronic records, including the reduced cost of research through the avoidance of duplication in testing and the application of the information to screening and preventative health measures. Professor Colin Thomson has expressed the view that researchers are finding privacy regulations obstructive to medical research and has called for the streamlining of privacy legislation to avoid delays such as those faced by scientists in a Victorian study of trauma treatment in gaining access to data from 140 sources.45
Once data has been collected, the problem arises of how this information is to be managed, secured and used in the future and, more importantly, who will have control over it. The data itself may well be useful for research and provide enormous public benefit, but what also has to be considered is the cost to the individual, particularly the invasion of privacy from the accidental or deliberate disclosure of private information.
3) Profiling
Profiling is the recording and classification of behaviours46 through the aggregation of information from multiple sources to build comprehensive profiles of individuals. These profiles can then be used to predict behaviour, to target individuals for specialized messages, instructions or treatment, to sell products, and to isolate groups that present security threats to society. Diagnostic profiles exist at “the intersection of actual and virtual worlds.”47 Generally, decisions are made about individuals based on their computerized profile alone.48
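The aggregation that underlies profiling can be sketched as a simple record-linkage exercise: records held by different sources are joined on a common identifier, such as an email address, to build one composite profile. The sources, fields and email address below are invented for illustration.

```python
# Illustrative sketch of profile aggregation: joining records from
# multiple (hypothetical) sources on a shared identifier.
from collections import defaultdict

retailer_records = [
    {"email": "m.orange@example.com", "purchases": ["groceries", "cigarettes"]},
]
survey_records = [
    {"email": "m.orange@example.com", "age_band": "45-54", "postcode": "2000"},
]
website_logs = [
    {"email": "m.orange@example.com", "pages": ["/health/quit-smoking"]},
]

def aggregate(*sources):
    """Merge every record that shares an email address into one profile."""
    profiles = defaultdict(dict)
    for source in sources:
        for record in source:
            key = record["email"]
            profiles[key].update({k: v for k, v in record.items() if k != "email"})
    return dict(profiles)

profiles = aggregate(retailer_records, survey_records, website_logs)
print(profiles["m.orange@example.com"])
# One composite profile now combines shopping habits, demographics and browsing,
# which is exactly the linkage that makes profiling privacy-invasive.
```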
There are many forms of profiling, depending on the focus of the information and target group. Some examples include commercial and government profiling to target groups most amenable to certain products and services, racial profiling directed at suspects or groups because of their race, or specific medical profiling such as DNA profiling used by Interpol as an investigative tool.49
In the process of profiling, individuals are divided into groups with categorized sub-groups. In commercial profiling these can include such groups as “Affluentials” (Young Influentials, New Empty Nests, Boomers & Babies, Suburban Sprawl, Blue-Chip Blues); “Rustic Living” (Blue Highways, Rustic Elders, Back Country Folks, Scrub Pine Flats, Hard Scrabble); or “Urban Cores” (Single City Blues, Hispanic Mix, Inner Cities).50 The profiles can be purchased for moderate amounts and they are often bought and sold over the internet indiscriminately, with insufficient regard to the privacy or security of the information. They can be of considerable commercial benefit to direct marketing organizations, such as Dunhill International Lists Co. Inc.
Profiling can target any individual and is used by business and governments alike. The US military began creating a database of over 3 million high school graduates and close to 5 million college students in 2002 to target potential recruits.51 The database contains information on social security numbers, grade-point averages, email addresses and ethnicity. Privacy groups have expressed concern that such sensitive information would be controlled by a private firm and would possibly be more vulnerable to misuse.
Problems with profiling can arise from errors in the data collected, the way it has been aggregated or the manner in which it is applied. Privacy problems can also arise from insecurity in databases caused by inadequate protocols. The LEAP52 database used by the Victorian police for their investigations was discredited by the release of some 20,000 pages of confidential information on hundreds of people, including criminals and victims.53 The government’s response was to create a new position, Commissioner for Law Enforcement Data Security, to supervise the police computer system, indicating that technological solutions alone are rarely sufficient to provide adequate privacy protection.
Racial profiling has created special problems following the terrorist attacks in the USA on 11 September 200154. Investigation focussed on foreign nationals from Middle Eastern countries, particularly on young males who had recently entered the US from countries linked to terrorism. The USA Patriot Act, introduced in October 2001, made surveillance and the collection of data easier, as well as aggregation and profiling. While the Chief Deputy Attorney General, Peter Siggins, conceded that ethnic identity should be combined with other factors exposed through investigation and analysis to avoid disparate treatment of all Middle Eastern men, he also concluded that:
“Protection is going to have to be accomplished through infiltration and surveillance, so all of us have to get used to new levels of government intrusion.”55
4) Emails
“Email exemplifies the empowering capacity of the Internet, virtually abolishing spatial and time constraints on communication.”56
Email is one of the most vulnerable, permanent and public forms of communication while appearing to the user seductively private and secure. It combines “the intimacy of the telephone with the infinite irretrievability of a letter”.57 Emails can be much more difficult to destroy than letters, although companies such as Liquid Machines58 and ZipLip59 offer secure messages by automatically encrypting information and Webroot Software Inc60 provides software like Window Washer and Spy Sweeper to assist in security management. Many people write emails informally and with little thought. Emails can be forwarded to thousands of addresses intentionally or unintentionally. Unless encrypted, they are exposed to surveillance, hacking, spam and phishing, misinterpretation and manipulation.
The misplaced trust in email security has been shown in many high-profile cases, such as the 1991 Rodney King case, where an email message sent by a Los Angeles police officer stated, “Ooops! I haven’t beaten anyone so bad in a long time”.61 Recently Justice Wilcox in the Federal Court of Australia in Universal Music Australia Pty Ltd v Sharman License Holdings Ltd62 concluded that he could not rely on the expert witness, Professor Ross63, after email evidence was submitted which showed a solicitor for Clayton Utz had crossed out a sentence in his report and suggested a substitute. Professor Ross left the solicitor’s words in the draft and replied to the solicitor, “...But if you say it is so, then fine by me”. This has raised serious questions about the preparation of expert witnesses after Justice Wilcox found at [231], “Professor Ross was prepared seriously to compromise his independence and intellectual integrity.”
Two additional examples illustrate some of the problems that follow when private information is judged out of context and subjected to “overly intrusive forms of social scrutiny”64. The first concerns Professor Lawrence Lessig65, who had downloaded Microsoft’s Internet Explorer hoping to win a PowerBook in a contest. When he discovered his Netscape bookmarks had been erased he sent an email to a Netscape acquaintance quoting a Jill Sobule song, “Sold my soul, and nothing happened.” This email was produced in 1997 during the Microsoft antitrust dispute after Judge Thomas Penfield Jackson had chosen Lawrence Lessig to advise him as a “special master”. Microsoft claimed Lessig was biased and he resigned. The email had been taken out of context, and he resigned for technical reasons rather than because of its contents. What was most disturbing for Professor Lessig was that he was not given the opportunity to explain the truth. According to Jeffrey Rosen, Lessig was neither biased nor interested in listening to Jill Sobule but prefers listening to Gregorian chants. Rosen’s comment was that “once the backstage curtain is lifted, Lessig and those who know him can only put the information in context by revealing even more private information...[which]...must be earned by the slow, reciprocal sharing of personal information,...[leading]...to greater intimacy, understanding, and trust.”66 This example also illustrates the dangers of accepting the “free” offer, the giveaways and prizes in exchange for permission to track and monitor. This Jeffrey Rosen regards as “about as rational as allowing a camera into your bedroom in exchange for a free toaster.”67
The second example concerns two legal secretaries at the Sydney offices of a large law firm.68 Emails were exchanged between the two employees concerning a missing sandwich of “ham, some cheese slices and two slices of bread”. The subsequent emails included more pointed accusations about one being a dumb blonde and the other “Miss, Can’t Keep A Boyfriend.” The emails began to circulate throughout the office after someone clicked the “reply all” button. Soon the conversation had been forwarded to many lawyers and investment bankers in Sydney and Melbourne and eventually to the inboxes of Allens’ partners. The conversation should have remained private. There was no “backstage area” to protect these individuals from “the burden of justifying differences that no one in a pluralistic society should be forced to subject to communal inspection and debate.”69
Spam, or unsolicited junk email, is usually seen by those receiving it as a privacy-invading, expensive productivity drain. It is a difficult problem to solve because spammers invent new techniques to overcome technological defence mechanisms. A former spammer, Scott Richter, settled for $US7 million after sending out over 20 billion spam messages in a year70. Services such as Tumbleweed Dynamic Anti-Spam produce over 500 new rules a day to stay ahead of creative spammers.71
It has been reported recently that spam volumes have declined due to effective filtering programs from companies such as Postini72, a leading provider of email security; however, phishing73 has been increasing, with over 19 million phishing messages recorded in July 2005 out of 14 billion emails processed. Nevertheless, spam remains a significant problem, with the cost of lost output estimated at about US$50 billion worldwide and about US$3 billion being spent on anti-spam technology.74
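A toy sketch of the rule-based filtering mentioned above follows: each rule that matches adds to a score, and messages above a threshold are treated as spam. The rules, weights and threshold here are invented; commercial filters apply hundreds of constantly updated rules.

```python
# Toy rule-based spam scorer; rules and weights are illustrative only.
import re

RULES = [
    (re.compile(r"viagra|cheap meds", re.I), 3.0),
    (re.compile(r"100% free|act now", re.I), 2.0),
    (re.compile(r"[A-Z]{10,}"), 1.5),          # long runs of capital letters
    (re.compile(r"click here", re.I), 1.0),
]
THRESHOLD = 3.0

def spam_score(message: str) -> float:
    return sum(weight for pattern, weight in RULES if pattern.search(message))

def is_spam(message: str) -> bool:
    return spam_score(message) >= THRESHOLD

print(is_spam("ACT NOW!!! CHEAP MEDS, 100% FREE, click here"))   # True
print(is_spam("Minutes of Tuesday's project meeting attached"))  # False
```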
A phishing scam is sent by email and is a form of online identity theft that uses technical tricks and social engineering75 to steal personal information, particularly financial data such as bank account details, passwords, user names and credit card numbers. Because phishers hijack the websites of well-known banks, ISPs or credit card companies, they can often be very successful. The link provided in the email takes the recipient to a counterfeit website designed to look like the actual site, and information entered into the counterfeit website is routed directly to the phisher. The usual request is for the online customer to verify identity and passwords and arrives in the form of an email from people such as “Bakeline D. Flaked” or sometimes from more believable addresses such as “custservice”. The Anti-Phishing Working Group76 reported that the country hosting the most phishing websites in June was the United States, with phishing reports increasing from 6,957 in October 2004 to 15,050 by June 2005. The financial services sector was the most targeted industry, accounting for almost 91% of attacks. There is also a trend towards automated phishing systems using trojaning77 methods with a generic keylogger or crimeware. Examples include mistyping a popular domain name and being directed to a counterfeit website with malicious code that sends crimeware to the computer, or the use of search engines which automatically download crimeware simply through searching.
It was estimated by Gartner Inc78 that phishing attacks on 57 million people in 2003 cost over US$1.2 billion. InternetPerils Inc has developed a software program known as PerilScope to try to control phishing by using an early warning system that finds computers used to launch attacks. It is like “long range radar for cyberspace”79, with active visualization used as an investigative and predictive tool.
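One simple defensive check follows from the mechanics just described: compare the domain a link claims to belong to with the domain it actually points to. The sketch below uses only the Python standard library and invented example links; real anti-phishing tools combine many such heuristics with blacklists and the public suffix list.

```python
# Heuristic check for the classic phishing trick: link text that names one
# domain while the underlying URL points somewhere else. Examples are invented.
from urllib.parse import urlparse

def registered_domain(url_or_host: str) -> str:
    """Crude domain extraction (last two labels); real tools use the public suffix list."""
    host = urlparse(url_or_host).hostname or url_or_host
    return ".".join(host.lower().split(".")[-2:])

def looks_like_phishing(displayed_text: str, actual_href: str) -> bool:
    return registered_domain(displayed_text) != registered_domain(actual_href)

# "www.examplebank.com" is shown to the user, but the link goes to an unrelated host.
print(looks_like_phishing("www.examplebank.com",
                          "http://examplebank.com.login-verify.example.net/acct"))  # True
print(looks_like_phishing("www.examplebank.com",
                          "https://www.examplebank.com/login"))                     # False
```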
While email is therefore “empowering” for users, it also has the capacity to expose private information in a way unimagined in the pre-internet era.
5) Identity Theft
One of the most dangerous threats to privacy comes from identity theft and the use of spyware to secretly collect confidential information, usually without the victim knowing what has been taken. Theft of identity in relation to the internet is the theft of information, in particular financial information, for the purpose of committing fraud. Between 1998 and 2003 over 27 million Americans were victims of identity theft.80 The spyware software gives access to everything done online, including passwords, usernames, shopping purchases and emails. The spyware is installed remotely, without physical access to the computer. The software can arrive via email or by “drive-by downloads”: when certain websites are visited the software hijacks the user’s homepage and search engine and installs pop-up generators.81 Tax file numbers and Medicare numbers can be used to take advantage of government benefits, purchases can be made on credit cards, bank accounts opened and loans taken out.
Identity theft is reported by the National Criminal Intelligence Service in London to be “the fastest growing crime in the world” with costs in Australia of more than $1 billion a year.82 It has become such a problem that Dr Harold Kraft recently founded MyPublic Info, Inc83 to provide identity theft prevention tools and data checks of publicly available database records84. The company provides a product called Public Information Profile (PIP) for proactive identity management at a cost of $US79.95 to check eight major areas, including real estate, criminal records and professional licenses.
A 2005 research report in the US by the Better Business Bureau and Javelin Strategy & Research indicated that identity theft is more prevalent offline than online and that those managing their financial activities online are able to detect theft earlier and experience lower levels of financial loss – an average of $551 online and $4,543 offline.85 However, the banking sector is concerned by the decline in take-up rates for online banking, blaming the security threats, especially the increased sophistication and complexity of more recent criminal activity. The earlier hoax emails in ‘pidgin’ English have been replaced with believable content and even hoax security warnings.86
6) Legislation increasing security and the use of ID cards
The prevalence of acts of terrorism from 2001 to the present day has led to the introduction of restrictive security legislation by governments in countries such as the US, the UK and Australia. This security legislation is inherently invasive and as such raises serious questions about the maintenance of an acceptable balance between security and individual privacy.
In Australia this legislation has included: the Border Security Legislation Amendment Act 2002 (Cth), Security Legislation Amendment (Terrorism) Act 2002 (Cth), Criminal Code Amendment (Anti-Hoax and Other Measures) Act 2002 (Cth), Criminal Code Amendment (Suppression of Terrorist Bombings) Act 2002 (Cth), Criminal Code Amendment (Espionage and Related Matters) Act 2002 (Cth), Criminal Code Amendment (Offences against Australians) Act 2002 (Cth), Criminal Code Amendment (Terrorist Organisations) Act 2004 (Cth), Anti-Terrorism Act 2004 (Cth), Anti-Terrorism Act (No 2) 2004 (Cth), Criminal Code Amendment (Terrorism) Act 2003 (Cth), ASIO Legislation Amendment (Terrorism) Act 2003 (Cth), Suppression of the Financing of Terrorism Act 2002 (Cth), Telecommunications Interception Legislation Amendment Act 2002 (Cth), and the Telecommunications (Interception) Amendment Act 2004 (Cth). In commenting on this legislation the Commonwealth Attorney-General stated:
“The Government’s legislative response to terrorism has strengthened and reinforced the democratic processes so vital to both our national and human security.”87
The real cost of the strengthening of security is often loss of privacy. Without the provision of safeguards, such as judicial review and a bill of rights, there is no guarantee that strong democratic processes will be retained.
Linked to the war against terror is the proposal to introduce national ID cards to increase security by enabling the holder to be identified and authenticated. A national database of biometric data, including fingerprints, palm prints, eye scans or DNA, would be required. This would have numerous benefits but would also raise data privacy and protection issues. In Europe identity cards are common, some used voluntarily, as in France and Portugal, some compulsorily, as in Germany and Spain. The use of ID cards has been seen by the UK government as a significant weapon to fight identity theft, illegal immigration and criminal behaviour. There is concern, however, that there will be ‘data creep’: an ID card that begins as “an identity management infrastructure ends up as an all-purpose database for controlling citizenry.”88
The Australian Privacy Foundation in July 200589 warned of the problems with the introduction of a national ID system, in particular the limited effectiveness of such a card unless it was part of a “massive and complex system featuring a centralised database”. Such a system would be vulnerable to hacking, manipulation and corruption, would involve high costs and would face public opposition, and the universal and unique personal identification number (UUPIN) could be used to “index, link, track and profile our movements, transaction and personal affairs, combining records in large scale and routine ways, not currently possible”.
The impact on privacy would be profound. As John Howard stated in 1987 in opposition to the Australia Card:
“the assumption of the Australia Card legislation is that every Australian is a cheat...it involves establishing a level of intrusion of a draconian kind into the day to day activities of many people.”
As Prime Minister of Australia, John Howard now faces similar arguments in opposition to the current proposal for the introduction of ID cards. These ID cards, which link digital dossiers to enormous cross-referenced databases, have the potential to expose individual informational privacy through security lapses and technological weaknesses.
D. Privacy Protection
Privacy and the demand for readily accessible information are conflicting issues in the compilation of centralised electronic databases. Some solutions to these problems can be found in the education of internet users, in the use of newer technologies and in the application of specific privacy provisions in legislation.
1) Education & Technology
“...the most effective way of controlling information about oneself is not to share it in the first place.”90
When information must be disclosed, vigilance about the content of that information can minimize the invasion of privacy. Regulation of e-commerce and legislation will not be successful in protecting privacy unless users take charge of protecting their own privacy by disabling cookies, checking website privacy policies and using firewalls and anti-virus software. No legislation will protect consumers who freely divulge their personal information or leave doors unlocked. Internet users can limit the amount of personal data given to businesses and government bodies, protect passwords, maintain updates for applications, shred sensitive files, carefully read the privacy policies of websites and refuse to complete warranty cards or to purchase from telemarketers and catalogues. The success of social engineering by spammers and phishers can be limited by careful consideration of incoming mail and unsubstantiated requests for information.
According to David Bell, CEO of the Australian Bankers’ Association (ABA), “All banks are working on educating customers about threats. That activity is now feeding back into the bank with customers now much more likely to report activity they consider suspicious.” Therefore education is seen as the critical first line of defence.91
Technological solutions such as encryption can contribute to privacy protection. One of the internet standards for email encryption, Pretty Good Privacy (PGP), is free software which can be downloaded from the Massachusetts Institute of Technology (MIT) website.92 This is a high-level encryption program and cannot be defeated by standard code-breaking methods. It uses a public key and a private, or secret, key. The public key allows encryption and is made available to email senders, while the private key is kept secret and allows decryption of a message encrypted with the public key. One-way encryption programs are also available93 and are suitable for infrequent and non-confidential messages. Other basic technological solutions include the use of anti-virus, anti-spyware and anti-spam software.
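The public-key arrangement described above can be sketched in a few lines. The example below is not PGP itself but an illustration of the same hybrid idea, using the widely available Python ‘cryptography’ package: a fresh symmetric session key encrypts the message, and the recipient’s public key encrypts the session key.

```python
# Sketch of a hybrid public-key scheme in the spirit of PGP (not PGP itself).
# Requires the third-party 'cryptography' package.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.fernet import Fernet

# Recipient generates a key pair; the public half is given to email senders.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

def encrypt_message(plaintext: bytes, recipient_public_key):
    session_key = Fernet.generate_key()                 # fresh symmetric key per message
    ciphertext = Fernet(session_key).encrypt(plaintext) # message encrypted symmetrically
    wrapped_key = recipient_public_key.encrypt(         # session key wrapped with RSA
        session_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None))
    return wrapped_key, ciphertext

def decrypt_message(wrapped_key: bytes, ciphertext: bytes, recipient_private_key):
    session_key = recipient_private_key.decrypt(
        wrapped_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None))
    return Fernet(session_key).decrypt(ciphertext)

wrapped, secret = encrypt_message(b"Meet at noon.", public_key)
print(decrypt_message(wrapped, secret, private_key))    # b'Meet at noon.'
```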
A technological and legal framework designed to improve privacy on the internet, and which entitled individuals to ownership of personal information, was put forward in 199894. The aim of this proposal was to protect individual privacy while supporting electronic commerce and providing different privacy choices. The Autonomous Computer Contracting Privacy Technology guidelines (ACCPT) promote awareness and enable users to negotiate privacy preferences with websites. The proposal supported the Platform for Privacy Preferences (P3P) sponsored by the World Wide Web Consortium (W3C).95 It is intended to redefine what information should be private within a context-sensitive framework, providing a new understanding of “public” and “private” depending on who has access to the information, how it is used, and how much control the individual has.
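The automated negotiation that ACCPT and P3P envisage can be illustrated with a deliberately simplified sketch: a site publishes its data practices in machine-readable form and the user agent compares them with the user’s stated preferences before any data is handed over. The policy structure and field names below are invented for illustration and are not the actual P3P vocabulary.

```python
# Simplified illustration of preference matching in the spirit of P3P/ACCPT.
# The policy structure and field names are invented, not the real P3P schema.

site_policy = {
    "data_collected": {"email", "purchase_history"},
    "purposes": {"order_fulfilment", "marketing"},
    "shared_with_third_parties": True,
    "retention_days": 730,
}

user_preferences = {
    "allowed_purposes": {"order_fulfilment"},
    "allow_third_party_sharing": False,
    "max_retention_days": 365,
}

def policy_acceptable(policy, prefs):
    """Return (ok, reasons) so the user agent can warn before data is released."""
    reasons = []
    if not policy["purposes"] <= prefs["allowed_purposes"]:
        reasons.append("site uses data for purposes the user has not allowed")
    if policy["shared_with_third_parties"] and not prefs["allow_third_party_sharing"]:
        reasons.append("site shares data with third parties")
    if policy["retention_days"] > prefs["max_retention_days"]:
        reasons.append("site keeps data longer than the user permits")
    return (not reasons), reasons

ok, reasons = policy_acceptable(site_policy, user_preferences)
print(ok)        # False: the agent would warn the user or block the form submission
print(reasons)
```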
2) Law
“Political, social, and economic changes entail the recognition of new rights, and the common law, in its eternal youth, grows to meet the demands of society.” (“The Right to Privacy”: Warren & Brandeis, Harvard Law Review Vol IV, 15 December 1890, No 5)
Legislative responses to privacy threats have been piecemeal and fragmented, even if extensive, with governments trying to maintain a balance between the provision of privacy and the maintenance of security and free speech.
The universal principles of fundamental rights in Art 17 of the International Covenant on Civil and Political Rights relate to the privacy of individuals and provide:
“1. No one shall be subjected to arbitrary or unlawful interference with his privacy, family, home or correspondence, nor to unlawful attacks on his honour and reputation.
2. Everyone has the right to the protection of the law against such interference or attacks.”
The achievement of this ideal through legislative change has been extremely difficult.
In the US, privacy law can be traced back to 1890 and the influential article, “The Right to Privacy”, by Samuel Warren and Louis Brandeis, which was written in response to the impact of the new technologies of photography and the print media. By 1960 four distinct torts could be recognised: intrusion upon seclusion; public disclosure of private facts; false light; and appropriation. Privacy protection is also safeguarded by the First, Fourth and Fifth Amendments to the US Constitution. Over twenty statutes have been passed from the 1970s to the present day to deal with specific privacy problems.
The weaknesses in US privacy law can be attributed to the collision between the expectation of “information, candour, and free speech” and the expectation of privacy. Since 2001, laws such as the Patriot Act have been passed to provide security from terrorism. These laws also collide with privacy protection, as intrusion is essential to their effectiveness. Some of the provisions of the Patriot Act allow for instant police access to credit reports, new authority to compel information from Internet Service Providers and limited judicial oversight of surveillance. Perhaps, as Al Gore has suggested, “We need an electronic bill of rights for this electronic age.”96
In Europe the legislation has been more successful in introducing comprehensive and effective privacy protection. Article 8 of the European Human Rights Convention protects the right to privacy and the general provision can be found in the European Privacy Protection Directive (1995). Many European countries have adopted strict data protection laws. Germany has one of the strictest in the European Union. It covers the collection, processing and use of personal data. The “right of informational self-determination” which is limited by the “predominant public interest” was acknowledged by the Federal Constitutional Court in a case in 1983.97 The Court considered that a society in which the people are under permanent technical control is unconstitutional and harmful to liberal democracy. It also held that every individual
has the right to know and to determine which personal data is processed and how it is used. Dr Thilo Weichert, the Deputy Privacy Protection Commissioner of Schleswig-Holstein, in discussing the German legislation at a conference in 2000, nevertheless stressed the importance of public consciousness of the defence of civil liberties in the public sphere and saw democratic discussion as a precondition for the defence of privacy.98
Australia, like the United States, has had limited success in providing comprehensive privacy protection. While in the US fundamental rights have been related to privacy protection under the American Constitution, in Australia there is no constitutional right to privacy and no bill of rights. Julian Burnside QC commented recently that the Australian Security Intelligence Organisation Legislation Amendment Act 2002 involves “significant erosion of ordinary civil liberties” and that “now more than ever it seems necessary” to entrench fundamental rights and freedoms in law99. The Privacy Act 1988 (Cth) and similar State legislation provide some, if not comprehensive, protection.
In Australian Broadcasting Corporation v Lenah Game Meats Pty Ltd [2001] HCA 63, the High Court of Australia considered whether Australian law recognises a tort of invasion of privacy and its relevance to the implied freedom of political communication under the Constitution.
This case concerned an application for an interlocutory injunction to restrain the broadcasting of a film of a “brush tail possum processing facility” which showed the “stunning and killing of possums” and was to be included in a program, the “7.30 Report”.100 Gleeson CJ considered at [41] that “the lack of precision of the concept of privacy is a reason for caution in declaring a new tort...”. Further, he held at [43] that “There is no bright line which can be drawn between what is private and what is not.”
He regarded that the law of breach of confidence provided sufficient remedy in this case, although considered at [54] that “the reference to the gratuitously humiliating nature of the film ties in with the first of the four categories of privacy adopted in United States law, and the requirement that the intrusion upon seclusion be highly offensive to a reasonable person.”
Kirby J at [189] chose to postpone any determination of whether an actionable wrong of invasion of privacy exists in Australian law. Callinan J at [323] considered that “Any principles for an Australian tort of privacy would need to be worked out on a case by case basis in a distinctly Australian context.”
By 2003 some glimmer of hope for the recognition of a right of privacy emerged in Grosse v Purvis [2003] QDC 151 where it was held that a common law right of privacy existed in the circumstances of this case. Justice Heerey in Kalaba v Commonwealth of Australia [2004] FCAFC 326 considered that the circumstances of any claim would need to come within the four categories as determined by the US Courts.
It appears therefore that some progress towards the recognition of the rights of privacy has been made but this is threatened by the urgent demands for increased security.
E. Solution – Disconnection?
Is disconnection the only solution when we appear to have lost so much control over our private information?
When discussing the security and privacy of computers, Bruce Schneier considered:
“The only secure computer is one that’s turned off, locked in a safe, and buried twenty feet down in a secret location – and I’m not completely confident about that one, either.”101
Similarly, J J Luna’s instructions for removing private information from a computer’s hard drive state:
“Sand the surface of the disk with a belt sander, melt it down, or hammer it into tiny pieces and then feed them slowly into a fast-moving river.”102
Drastic destructive measures provide some protection from software such as Guidance Software’s EnCase Forensic edition, which can search for deleted files and is used by police departments and organizations such as the FBI. Luna does, however, offer to provide privacy protection by teaching us to become invisible. He advises using pseudonymous and anonymous remailers103, not using wireless networks or file-sharing programs, using a notebook computer with a removable hard drive, firewalls and antivirus software, and never entering your correct address, phone number or other personal details when registering your computer. Luna’s book also offers to provide “a step-by-step guide to hiding assets, identity and your life.”104
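Short of Luna’s belt sander, the usual software countermeasure to forensic recovery is to overwrite a file’s contents before deleting it. The sketch below is illustrative only; on journaling file systems, SSDs and backed-up drives, overwriting in place gives no guarantee that every copy of the data is gone.

```python
# Illustrative secure-delete sketch: overwrite a file with random bytes before
# unlinking it, so a simple undelete cannot recover the contents.
# Caveat: journaling filesystems, SSD wear-levelling and backups may retain copies.
import os

def overwrite_and_delete(path: str, passes: int = 3) -> None:
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))   # replace the original bytes
            f.flush()
            os.fsync(f.fileno())        # push the overwrite to disk
    os.remove(path)                     # only then remove the directory entry

# Example usage:
# overwrite_and_delete("old_tax_records.txt")
```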
Unless we disconnect from the System, the alternative appears to be to “surrender to technological determinism”105 and accept the transparent society and Scott McNealy’s advice:
“You have zero privacy. Get over it”.106
Perhaps the analysis of Professor Froomkin107 holds more hope when he stated in 2000 that “all is not lost – yet”.108 He considered that “despite the warnings of information privacy pessimists” and “the rapid deployment of privacy-destroying technologies by governments and businesses”, which threaten to make “informational privacy obsolete”109, there is a great deal that the law can do, particularly to regulate data collection, retention and use. This legal response together with a social response “that is at least as subtle and multifaceted as the technological challenge”110 will ensure that information privacy is maintained. Jeffrey Rosen also considered in 2000 that “the battle for privacy must be fought on many fronts – legal, political and technological – and each new assault must be vigilantly resisted as it occurs.”111
Daniel J Solove writing more recently has not found the solution in legislation because he sees that “the law still harbours conceptions of privacy that are not responsive to the realities of the Information Age.” 112 His solution lies in the architecture of the internet where “a particular social structure” needs to be established “that ensures individual participation in the collection and use of personal
information and responsibilities for entities that control that data.”113 The concept of architecture is used in a broad sense, similar to its application by Lawrence Lessig, in describing computer code and the design of information systems. The internet within this concept has a design that affects the way people communicate, the way data is transferred and the extent of privacy. Architecture in this sense can influence human behaviour, attitudes and interactions and play “a profound role in the structure of society.”114
F. Conclusion
An ethical, liberal and pluralistic society is one that should provide “backstage” private space where hyperbole, rich varieties of social behaviour and attitudes are free to develop without fear of censure. If there are to be alternatives to disconnection from the internet, a ‘rigorous and wide ranging debate’ is needed about the role of privacy as a ‘fundamental part of the fabric of society’115 and further legal and “non-legal mechanisms for discouraging privacy invasion”116 explored. Privacy is important – we don’t have to accept its demise.
NOTES
*Jennifer Farrell, Federal Court of Australia, Sydney Registry.
1 Rachel Lebihan: “Online privacy a mounting concern”, Australian Financial Review 28.10.04 page 25; Westin, Dr Alan, “What Consumers Have to Say About Information Privacy” 8.05.01 Prepared witness testimony, The House Committee on Energy and Commerce (USA). http://energycommerce.house.gov
2 Jeffrey Rosen: “The Eroded Self” at page 2 http://www.nytimes.com
3 Malcolm Crompton: “What is privacy?” A paper delivered by the Australian Federal Privacy Commissioner at the Privacy and Security in the Information Age Conference, 16 – 17 August 2001, Melbourne at page 13
4 Roger Clarke: Privacy Introduction and Definitions at http://www.anu.edu.au/people/Roger.Clarke/DV/Intro.html
5 Butterworths Australian Legal Dictionary at page 921
6 [2001] HCA 63 at [41]
7 [2001] HCA 63 at [43]
8 Alan Westin, “Privacy and Freedom”, (Atheneum 1967)
9 Richard Sobel: “The Demeaning of Identity and Personhood in National Identification Systems”. Harvard Journal of Law & Technology Vol 15, No 2, Spring 2002, page 319 at page 322.
10 401 U.S. 745, 764 (1971)
11 Ramsey Clark was appointed Attorney-General in 1967 by President Lyndon B Johnson.
12 Roger Clarke: Visual Surveillance and Privacy 8 August 2005 http://www.anu.edu.au/people/Roger.Clarke/DV/VisSurv0508.html
13 Zelman Cowen, 1969, ‘The Private Man’, The Boyer Lectures, Australian Broadcasting Commission p 9 – 10.
14 Daniel Solove: “The Digital Person: Technology and Privacy in the Information Age” at page 41
15 C Baekkeland et al: “A Framework for Privacy Protection” http://cyber.law.harvard.edu/courses/Itac/privacy.html
16 United States v White 401 US 745, at 461
17 Roger Clarke is an author and public speaker, as well as a consultant specialising in strategic and policy aspects of business, information infrastructure, and data surveillance and privacy. He is a visiting Professor at the University of Hong Kong (in eCommerce), at the University of NSW (in Cyberspace Law & Policy) and at ANU (in Computer Science).
18 Roger Clarke: “Information Technology and Dataveillance” http://www.anu.edu.au
19 fn 18 dataveillance: “is the systematic use of personal data systems in the investigation or monitoring of the actions or communications of one or more persons.”
20 Biometrics refers to the automatic identification or identity verification of living persons using their enduring physical or behavioural characteristics and includes fingerprinting, facial recognition, iris/retina scanning and voice verification. http://www.eff.org “Biometrics. Who’s watching you”
21 http://www.eff.org “Investigating Machine Identification Code Technology in Color Laser Printers.”
22 http://www.eff.org The EFF(Electronic Frontier Foundation) was founded in 1990 and is a leading global non-profit organisation supporting free expression, privacy and the rights of individuals in an open society.
23 Statement of the Electronic Frontier Foundation before US House of Representatives subcommittee – “The Fourth Amendment and Carnivore” (July 28, 2000).
24 “FBI Ditches Carnivore Surveillance System” – Fox News 18 January 2005 http://www.foxnews.com/printer_friendly_story/0,3566,144809,00.html
25 camphone: a cellphone with an inbuilt camera. The first was produced by Sharp Corporation and marketed in November 2000. Major manufacturers are Nokia, Samsung, Motorola, Siemens, Sony Ericsson and LG Electronics with resolutions from 2 – 7 megapixels. http://en.wikipedia.org
26 http://www.smh.com.au 5 April 2005
27 http://en.wikipedia.org “Camera phone”
28 http://www.livingroom.org.au “Man Films His Suicide Live on Phone.”
29 web forms are the online information forms completed by the users; Cookies are small text files that websites leave on user’s computers and are used to deliver personalised content; web beacons are transparent images embedded on web pages to count visits and determine the effectiveness of the site; web server logs are the record of the number of requests for the page, the length of the visit and other data; spyware is unwanted software that tracks user activity, such as a keystroke logger and can provide a third party with personal and financial information.
30 Angus Kidman: “Knee-deep in Computer Nasties”. Italive, The Australian 20 September 2005 at page 4.
31 fn 25 and http://www.idc.com
32 spyware with malicious code.
34 http://www.spyware-reviews.com
35 Tal Z Zarsky: “Mine Your Own Business!: Making the Case for the Implications of the Data Mining of Personal Information in the Forum of Public Opinion.” 15 Yale Symp. L & Tech. 1 2002 at page 10.
36 Data mining originated in the 1990s and originally referred to the correlation of ridiculous facts such as between the stock market and the number of cows in a given area. It has been traced to research by Prof. Usama Fayyad in identifying latent defects in General Motors products – Zarsky, Tal Z.
37 fn 35
38 fn 35
39 L Dayton & K Dearne: “Damned by your own DNA” The Australian 12 September 2005 at page 10.
40 Primary haemochromatosis is an inherited disease thought to be caused by a gene mutation known as C282Y. It is a disease caused by excess iron in the body. http://www.netdoctor.co.uk
41 South Australian Democrats Senator, Natasha Stott Despoja put forward a motion in the Australian Senate in early September 2005, although she has been campaigning on the issue of genetic privacy since 1997. Also in May 2003 a report (“Essentially Yours”) on the protection of human genetic information by the Australian Law Reform Commission and the Australian Health Ethics Committee was tabled in Parliament. (http://www.alrc.gov.au ) The only response by the government so far has been the allocation of $7.6 million to establish an advisory body on human genetics.
43 http://www.justiceinitiative.org
44 Pollard, Ruth: “Privacy and safety are conflicting issues in the debate over electronic medical records” The Sydney Morning Herald Weekend Edition June 11- 12, 2005 at page 15
45 Dayton, Leigh; Dearne, Karen: “Privacy laws ‘hinder medical research’” The Australian 19 September 2005 at page 2.
46 http://www.epic.org “Privacy and Consumer Profiling”
47 William Bogard: “The Simulation of Surveillance: Hypercontrol in Telematic Societies” Cambridge University Press 1996 at page 27
48 Karl D Belgum: “Who Leads at Half Time? Three Conflicting Versions of Internet Privacy Policy”, 6 Rich. J.L. & Tech. 1.8 (Symposium 1999).
50 Electronic Privacy Information Center – “Privacy and Consumer Profiling” at pages 2 - 3
51 http://www.occupationwatch.org/headlines/archives/2005/06/military_enlist.html Mark Mazzetti, “Military Enlists Marketer to Get Data on Students for Recruits” 23 June 2005 Los Angeles Times.
52 Law Enforcement Assistance Program commissioned in 1993 and in an audit in 1996 found to be providing significant benefits to the Victorian Police Force. www.audit.vic.gov.au
53 Michael Bachelard: “Insecure police database scrapped” The Australian 23 August 2005
54 Four US commercial aircraft were hijacked by 19 terrorists; two destroying the twin towers of the World Trade Centre in New York, the third striking the Pentagon and the fourth crashing in Pennsylvania.
55 Peter Siggins: “Racial Profiling in an Age of Terrorism” – a talk delivered to a Markkula Center for Applied Ethics forum 12 March 2002.
56 Whitaker, Reg: “The End of Privacy – how total surveillance is becoming a reality” The New Press, New York 1999 at page 104
57 Jeffrey Rosen: “The Eroded Self” http://www.nytimes.com
58 http://www.liquidmachines.com
61 http://www.wynnwilliams.co.nz On 3 March 1991 the beating of an African American, Rodney King, by four policemen was video taped by George Holliday.
62 [2005] FCA 1242 5 September 2005. This case concerned the operation of the Kazaa Internet P2P file-sharing system and the authorisation of copyright infringement
63 Professor Ross is the Professor of Computer Science at the Polytechnic University in Brooklyn, New York.
64 Jeffrey Rosen: “The Purposes of Privacy: A Response” Georgetown Law Journal Vol 89 2001 at page 4 http://papers.ssrn.com
65 Lawrence Lessig is a Professor of Law at Stanford Law School and founder of the school’s Center for Internet and Society. He is author of “Free Culture” (2004), “The Future of Ideas”(2001) and “Code and Other Laws of Cyberspace” (1999).
66 Jeffrey Rosen: “The Purposes of Privacy: at page 8
67 Jeffrey Rosen: “The Eroded Self” at page 15
68 http://www.theage.com.au Helen Westerman and Rebecca Urban: “Secretaries eat their words after an email to dine out on” and http://www.webpronews.com Jim Hedger, “Email Argument Costs Two Secretaries Their Jobs.”
69 fn 64 at page 4
70 Tom Pullar-Strecker, “Microsoft finds NZ spam bill hard to swallow” 22 August 2005 http://smh.com.au
73 “Phishing is a scam in which the perpetrator sends out legitimate-looking emails appearing to come from legitimate e-commerce sites in an effort to obtain personal and financial information from the recipient. With consumers’ personal information, tech-criminals then commit credit-card fraud, identity theft and even perform unauthorized bank account transfers.” www.postini.com “Postini Reports Sizzling Summer Phishing Season.”
74 “Spam in decline as users wake up” The Economist 23 August 2005
75 social engineering is the use of psychological techniques of persuasion and influence to manipulate the human tendency to trust – see Granger, Sarah “Social Engineering Fundamentals, Part I: Hacker Tactics” http://www.securityfocus.com and Rusch, Jonathan: “The ‘Social Engineering’ of Internet Fraud” http://www.isoc.org
76 http://www.antiphishing.org
77 A Trojan horse is a destructive software program that appears to be a benign application such as one that can rid a computer of viruses but instead it infects the computer.
78 Gartner Inc is the world’s largest research and advisory company http://www.gartner.com
79 http://www.internetperils.com
80 http://www.spywareguide.com “Identity Theft and Spyware – The New Threat.”
81 Nicole Manktelow: “Home Invasion” 5 August 2004 http://smh.com.au
82 David Humphries: “Same old card trick” The Australian Weekend Edition July 23 – 24 2005 page 34
83 http://www.mypublicinfor.com
84 These include financial records, property ownership records, government licenses, law enforcement records and other Federal, State and County records.
85 http://www.bbbonline.org “New Research Shows That Identity Theft is More Prevalent Offline with Paper than Online.” 26 January 2005
86 Andrew Birmingham: “The criminal element” 6 September 2004 http://smh.com.au
87 The Hon Philip Ruddock MP: “Australia’s Legislative Response to the Ongoing Threat of Terrorism” UNSW Law Journal Vol 27(2) at page 254
88 Paul Beynon-Davies: “Personal Identification in the Information Age: The case of the National Identity Card in the UK.” European Business Management School, University of Wales Swansea, Singleton Park, Swansea, UK.
89 http://www.privacy.org.au “A new ‘Australia Card’: the costs outweigh the benefits” – an open letter to Coalition MPs.
90 A Michael Froomkin: “The Death of Privacy?” Stanford Law Review Vol 52 page 1461 at 1464
91 Andrew Birmingham: “The criminal element” http://smh.com.au 6 September 2004
92 http://web.mit.edu/network/pgp.html
94 C Baekkelund et al: “A Framework for Privacy Protection” http://cyber.law.harvard.edu/courses/Itac98/privacy/html
95 P3P has been in development since 1997. It is a protocol designed to inform Web users about the data-collection practices of Web sites. It provides a way for a Web site to encode its data-collection and data-use practices in a machine-readable XML format known as a P3P policy. Under this system the organization maintaining the server makes a declaration of its identity and privacy practices. It is a more flexible solution than the Platform for Internet Content Selection (PICS), allowing greater freedom for negotiation. Websites, however, have no incentive to adopt P3P because legally they can collect information without negotiation. It is nevertheless emerging as an industry standard in the US.
96 Al Gore, former Vice President of the USA
97 BVerfGE 65, 1.
98 22 – 24 February 2000, European Conference on Video Surveillance: Video Surveillance – A Crime Prevention Instrument in European Comparison, at the University of Gottingen. http://www.datenschutzzentrum.de
99 Kate Gibbs: “Big guns push for Bill of Rights” Lawyers Weekly Issue 247 1 July 2005 at page 1
100 Lenah at [69]
101 J.J. Luna: “How to be Invisible” Thomas Dunne Books, New York 2004 at page 211
102 fn 101 at page 222
103 These are services that will strip away old identities, provide a new one and send the email; the anonymous remailer is more secure because the email will be sent via a string of remailers.
104 http://www.howtobeinvisible.com
105 Jeffrey Rosen: “The Eroded Self” http://www.nytimes.com 30 April 2000
106 Scott McNealy, CEO of Sun Microsystems, Inc., in 1999 at a product launch.
107 Professor of Law, University of Miami School of Law.
108 A M Froomkin: “The Death of Privacy?” Stanford Law Review Vol 52 1461
109 fn 108
110 fn 108 at 1543
111 Jeffrey Rosen: “The Eroded Self” April 30, 2000 http://www.nytimes.com
112 Daniel J Solove: “The Digital Person: Technology and Privacy in the Information Age” New York University Press 2004
113 fn 112 at page 123
114 fn 112 at page 98
115 Malcolm Crompton: “What is privacy?” A paper presented at the Privacy and Security in the Information Age Conference, 16 – 17 August 2001, Melbourne at page 11.
116 Jeffrey Rosen: “The Purposes of Privacy: A Response” at page 51
Copyright 2005. Greek Legal and Medical Conference