Cyberspace privacy issues
Graham Greenleaf, revised 26 April 2002
This Reading Guide contains material prepared for other subjects,
but it may be useful to students in the broadening course 'ITLaw@hku.hk'.
Objectives
This Reading Guide aims to help provide an understanding of the following:
-
What is special about privacy invasions in cyberspace?
-
The application of IPPs (and, potentially, surveillance laws) to cyberspace
transactions
-
Internet business practices affecting privacy, and the effect of IPPs on
them
-
Privacy-invasive technologies (PITs), and whether IPPs can control them
-
Privacy-enhancing technologies (PETs), and their relationship to legal
protection of privacy
-
The particular problem of the 'borderless' nature of cyberspace for privacy
1. Special nature of cyberspace privacy issues
Some suggested reasons why cyberspace privacy issues are different follow:
-
In most cyberspace interactions, consumers/citizens are potentially identifiable
(at least to some extent). Identifiability is the default condition of
cyberspace interaction.
-
The remote nature of cyberspace transactions often creates pressures for
a higher level of identification than would be required in physical space
interactions.
-
Intrusive/invasive interactivity can take place without full identification,
with only partial identification or location.
-
The traditional barriers of incompatible files and cost are steadily disappearing.
What other reasons can you suggest?
2. Application of IPPs to cyberspace transactions
There are two main categories of issues here:
(i) Is there 'personal information' so that IPPs can apply
at all? - This is always the threshold issue when looking at cyberspace
privacy issues.
-
Section 3.1, 'What is "personal information"?', in G Greenleaf 'Private
sector privacy: Problems of interpretation' [2001] CyberLRes 3 - The rest
of the article gives illustrative examples, for each of the NPPs, of
cyberspace-related privacy issues, and problems with some key definitions
(such as 'personal information').
(ii) What do IPPs require in cyberspace transactions?
-
Graham Greenleaf Privacy
Principles - irrelevant to cyberspace? (1996) 3 PLPR 114 - Covers the
applicability of each of the Australian Federal IPPs (public sector) to
privacy issues relating to the internet. (As to the Australian private
sector NPPs, see below and article above).
-
Federal Privacy Commissioner Draft
National Privacy Principle Guidelines (May 2001) - Throughout the Commissioner's
draft Guidelines there are Internet examples, but they are not conveniently
located at any one point.
(iii) Are IPPs adequate to address cyberspace issues? - Current
sets of IPPs may no longer be sufficient, even in principle. New IPPs may
be needed. For an early discussion of these issues, see:
3. Internet business practices affecting privacy
3.1. Internet business practices involving privacy
breaches
The purpose of this section is to illustrate the simple point that breaches
of privacy often occur because of defective business practices, not principally
because of the technologies employed (as to which, see the next section).
-
Tim Dixon Public
debacles prompt privacy rethink (2001) 7 PLPR 149 - Dixon catalogues
a host of privacy breaches by major organisations during 2000, including
Real Networks; DoubleClick; PSINet; Toysmart; Amazon; Toysrus; and the
Australian Taxation Office. In most cases, the problems arose because of
defective business practices.
We can use Dixon's examples to test the applicability of IPPs to cyberspace
problems. Try to apply the IPPs in the Hong Kong, Australian or other legislation
to these situations.
-
Real Networks - 'Real Jukebox' software extracted information from users'
hard disks about their listening habits and sent this to Real Networks.
-
DoubleClick - DoubleClick collected IP addresses of consumers viewing
ads, and could collate this with their web browsing habits; it proposed
to combine this with home address, name and purchasing habit information
bought from Abacus; it suspended the proposal after a consumer backlash and
the loss of 1/3 of its share price (an estimated loss as high as US$2.2BN). [See now the
terms of settlement
of the class actions against DoubleClick - note the 8 terms proposed,
and the breadth of the actions on which they are based.]
-
PSI Net - For a payment, the ISP allowed its servers to be used to send
out 5-20M spam messages, while its stated policy was to suspend users who
did this.
-
Toysmart - The bankrupt business proposed to sell its consumer database
of online purchasers, despite having promised never to sell consumer information.
The FTC insisted it should only sell the database as part of the whole business.
A shareholder (Disney) paid to destroy the database.
-
Amazon - After Toysmart's problems, Amazon retrospectively amended
its privacy policy to explicitly allow it to sell its consumer database
as a separate asset.
-
ToysRUs - Outsourced customer data for analysis, where the third
party doing the analysis was also entitled to use the data for its own
purposes.
-
Breaches of security - CD Universe and an ISP each had customers' credit
card details stolen by hackers, while IKEA, Powergen and the Australian
Tax Office inadvertently left client data visible on public web pages,
where it was then found by web search engines or other customers.
3.2. Privacy policies on web sites
Privacy policies on web sites may establish contractual or other obligations
to those who browse those sites aware of or relying on those policies.
This is of diminishing relevance in jurisdictions with legislative IPPs
which make some form of 'privacy policy', including the disclosure of
collection practices, compulsory.
However, in addition to the question of whether they comply with the
legislative IPPs, privacy policies on web sites may be important because:
-
They are on sites run by small businesses (in Australia) or other entities
not bound by IPPs; and
-
They promise a degree of privacy protection going beyond what is required
by the IPPs (eg 'We will never disclose your personal information ...')
For a detailed discussion of the legal significance of website privacy
policies, see Mark Berthold
Website
Privacy Policy Statements And The Changing Face Of E-Commerce [2002]
PLPR (Issues 9 and 10). Part 2 mainly concerns Australian law, but the
other parts of the article are of more general relevance.
Some surveys of web site privacy policies:
`[EPIC] reviewed 100 of the most frequently visited web sites
on the Internet. We checked whether sites collected personal information,
had established privacy policies, made use of cookies, and allowed people
to visit without disclosing their actual identity. We found that few web
sites today have explicit privacy policies (only 17 of our sample) and
none of the top 100 web sites meet basic standards for privacy protection.'
3.3. Examples of practices and policies of particular
organisations
4. Privacy-invasive technologies (PITs)
Cyberspace has made possible the development of many technologies which
can be used for invasion of privacy which do not have exact equivalents
in the physical world.
4.1. General resources
Roger Clarke provides some of the best classifications and explanations
of technologies affecting privacy.
4.2. Cookies
Technical operation
What are `cookies'? `A cookie is a record stored on a user's machine as
a result of a web-server instructing a web-browser to do so. It is sent
to an appropriate web-server along with a request for pages.' (Clarke).
Netscape describes them as:
`Cookies are a general mechanism which server side connections
(such as CGI scripts) can use to both store and retrieve information on
the client side of the connection. The addition of a simple, persistent,
client-side state significantly extends the capabilities of Web-based client/server
applications.'
A lot of people regard them as a serious privacy invasion: The Electronic
Privacy Information Centre (EPIC) says `a cookie is a mechanism that allows
a web site to record your comings and goings, usually without your knowledge
or consent'. "I basically equate cookies to the notion of a store being
able to tattoo a bar code on your forehead, and then laser-scan you every
time you come through the doors" (Simson Garfinkel). Others think they
are one of the few ways to overcome the `statelessness' of web protocols
and are essentially benign.
Application of IPPs
For class discussion
4.3. Single pixel GIFs (aka 'invisible GIFs'
or 'Web bugs')
Technical operation
Related to cookies are single pixel GIFs, which are graphics that are usually
invisible to web users because they are 1x1 pixel in size, with no border
and the same colour as the page background. They are also known as 'web
bugs', 'web beacons', '1-by-1 GIFs', 'clear GIFs' and 'invisible GIFs'.
Single pixel GIFs have many different surveillance uses. Kaman
Tsoi explains (see reference below) how these 'web bugs' are used by network
advertisers:
But when the user views the ad host's home page, in addition
to any cookie which may be set by the ad host itself, the network advertiser
serves a cookie to the user's browser. And because the banner ad graphic
is operating as a web bug, the network advertiser receives information
including the IP address of the user's computer, the URL of the ad host's
home page, the time that the page was viewed and the type of browser being
used by the user.
If the user then clicks on the banner ad to link through to
the advertiser's website, the further movements of that user would be monitored
to the extent that the network advertiser had invisible web bugs on any
pages of the advertiser's site. Each time a web-bugged page is viewed by
the user, the network advertiser receives the same information about the
IP address, page URL, and time and browser type, along with the cookie
value that was set when the banner ad was first viewed. Unless the user
deletes the cookie, this monitoring could occur even if the user did not
view the advertiser's site immediately or via the banner ad link.
...
While this is all impressive, particularly in comparison to
other forms of advertising, what really takes web bugs into mind-boggling
territory is simply this: for each network advertiser, there are many more
ad hosts, advertisers and users. What this means is that by using the same
cookie wherever the network advertiser has banners or web bugs on ad host
or advertiser sites, the network advertiser can consolidate the data related
to a particular cookie to form a detailed profile of browsing habits which
could include the types of sites visited. The network advertiser can then
add value to its advertisers by using these cookie profiles to determine
what ad is shown the next time a user with that cookie is identified visiting
an ad host's site. The major network advertisers hold hundreds of millions
of these consumer profiles between them. The AltaVista search engine can
be used to search for web bugs, and one recent search reported more than
four million web bugs planted by 30 vendors on the internet.
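The information flow Tsoi describes can be sketched as follows. When the invisible GIF is fetched, the browser's request itself carries the embedding page's URL, the browser type, the user's IP address and any previously-set cookie; the advertiser's server only has to log them. All names and values here are invented for illustration.

```python
# Sketch of what a network advertiser learns from one web-bug request.
from urllib.parse import urlencode, urlparse, parse_qs

# The ad host embeds something like:
#   <img src="http://ads.example.net/bug.gif?page=..." width=1 height=1>
bug_url = "http://ads.example.net/bug.gif?" + urlencode(
    {"page": "http://adhost.example.com/home"})

# One log entry on the advertiser's server, assembled from the request:
log_entry = {
    "ip": "203.0.113.7",                       # from the TCP connection
    "page": parse_qs(urlparse(bug_url).query)["page"][0],
    "cookie": "visitor_id=abc123",             # replayed by the browser
    "user_agent": "Mozilla/4.0",
    "time": "2002-04-26T10:15:00",
}
print(log_entry["page"])   # http://adhost.example.com/home
```

Joining such entries on the cookie value, across every site carrying the advertiser's bugs, is what produces the consolidated browsing profile described in the quotation above.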
More information:
Application of IPPs
For class discussion
4.4. Spam - unsolicited bulk email
According to Hormel Foods,
manufacturers of the 'popular' luncheon meat 'SPAM':
"Use of the term "spam" was adopted [in order to describe Unsolicited
Bulk Email (UBE) or Unsolicited Commercial Email (UCE)] as a result of
the Monty Python skit in which a group of Vikings sang a chorus of "spam,
spam, spam . . . " in an increasing crescendo, drowning out other conversation.
Hence, the analogy applied because UCE / UBE was drowning out normal discourse
on the Internet. "
Why spam is a problem
According to the Australian (NOIE) survey:
-
Commercial surveys suggest that spam may account for 10-20 per cent of
all email traffic, with significant consequential losses for public and
private sector organisations.
-
Spam is spreading beyond email to other forms of electronic messaging,
such as relay chat and instant messaging.
-
There are reports of spam causing IT system and security problems. For
example, some spamming operations appear to be overloading or temporarily
closing overwhelmed servers and networks of innocent intermediaries. Ultimately
this has implications for the stability of Internet services especially
if spam campaigns are deliberately used to deliver viruses.
-
Some Australian businesses are being 'spoofed' by spammers, when nuisance
email is being routed through, and appears to come from, those firms. Their
commercial reputation is then at risk, and their owners and managers are
obliged to spend inordinate time rectifying this, including attempting
to respond to many annoyed spam recipients.
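The 'spoofing' point above rests on a simple fact: the From header of an email is ordinary message data set by the sender, and basic SMTP does nothing to verify it. A minimal sketch (all addresses are invented):

```python
# A forged From header is just data attached to the message; nothing in
# the basic mail protocols checks it against the actual sender.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "accounts@innocent-firm.example"   # set at will by the spammer
msg["To"] = "victim@example.org"
msg["Subject"] = "Special offer"
msg.set_content("Unsolicited bulk email body...")

print(msg["From"])   # accounts@innocent-firm.example
```

Complaints and bounces then flow back to the innocent firm named in the forged header, which is the reputational and clean-up burden the NOIE survey describes.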
Law regulating spam
Resources on spam
4.5. Location extraction
4.6. Data aggregation
5. Technical protections - 'Privacy enhancing technologies'
(PETs)
This part catalogues a number of what many describe as 'Privacy enhancing
technologies' or 'PETs' - forms of technological 'self-help'. In many cases
these have been promoted (particularly in the USA) as alternatives to legislative
protection of privacy, but in other countries they are more usefully considered
from the following perspectives:
-
As technologies that may provide part of the protection that legislation
requires, and are possibly more effective in the context of legislation.
-
As alternatives where legislative protection can't reach, particularly
where privacy invasions (eg SPAM) originate from outside your own jurisdiction.
5.1. Overviews of PETs
The best starting points are:
Some of these PETs are discussed further below.
5.2. How serious are PETs as privacy solutions?
Advocates of PETs as a principal method of privacy protection include:
5.3. The W3C's Platform for Privacy Preferences
(P3P)
The World Wide Web Consortium (W3C) has
developed the Platform for Privacy Preferences
(P3P), which has been described as `a framework within which trust
can be achieved between web services providers and consumers' (Clarke's
description in 1998, though he would reject it now).
Explanations
Here is the official description from the P3P web pages:
What is P3P? (2002 description) The Platform for Privacy
Preferences Project (P3P), developed by the World Wide Web Consortium,
is emerging as an industry standard providing a simple, automated way for
users to gain more control over the use of personal information on Web
sites they visit. At its most basic level, P3P is a standardized set of
multiple-choice questions, covering all the major aspects of a Web site's
privacy policies. Taken together, they present a clear snapshot of how
a site handles personal information about its users. P3P-enabled Web sites
make this information available in a standard, machine-readable format.
P3P enabled browsers can "read" this snapshot automatically and compare
it to the consumer's own set of privacy preferences. P3P enhances user
control by putting privacy policies where users can find them, in a form
users can understand, and, most importantly, enables users to act on what
they see.
P3P Project in a nutshell (1998 description) P3P*
is a privacy assistant: users can be informed, in control, and use P3P
to simplify and help them make decisions based on their individual privacy
preferences. The P3P specification will enable Web sites to express
their privacy practices and users to exercise preferences over those practices.
P3P products will allow users to be informed of site practices, to delegate
decisions to their computer when possible, and allow users to tailor their
relationship to specific sites. Sites with practices that fall within the
range of a user's preference could, at the option of the user, be accessed
"seamlessly," otherwise users will be notified of a site's practices and
have the opportunity to agree to those terms or other terms and continue
browsing if they wish. P3P gives users the ability to make informed decisions
regarding their Web experience and their ability to control the use of
their information. Sites can use P3P to increase the level of confidence
users place in their services, as well as improve the quality of the services
offered, the customization of content, and simplify site access. P3P allows
one to make statements about privacy practices and preferences in a flexible
manner. P3P uses RDF/XML for making privacy statements as well as for exchanging
data under user control. P3P will support future digital certificate and
digital signature capabilities as they become available. P3P can be incorporated
into browsers, servers, or proxy servers that sit between a client and
server. * For brevity, we often refer to the P3P project, activity, specifications,
or products as "P3P."
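The browser-side comparison the W3C descriptions refer to can be sketched in a few lines: the site publishes a machine-readable summary of its practices, and the user agent checks it against the user's stated preferences before deciding whether access is 'seamless' or the user must be notified. The field names below are invented placeholders, not actual P3P vocabulary.

```python
# Sketch of matching a site's declared practices against user preferences,
# in the spirit of P3P's automated comparison. All names are illustrative.
site_policy = {"collects": {"clickstream", "contact"},
               "shares_with_third_parties": True}

user_prefs = {"allow_collection": {"clickstream"},
              "allow_third_party_sharing": False}

def acceptable(policy, prefs):
    """True if every declared practice falls within the user's preferences."""
    if policy["shares_with_third_parties"] and not prefs["allow_third_party_sharing"]:
        return False
    return policy["collects"] <= prefs["allow_collection"]

print(acceptable(site_policy, user_prefs))
# False -> the user agent would notify the user rather than proceed
```

Note that, as the criticisms below stress, a check of this kind only compares declarations against preferences; it says nothing about whether the site actually complies with what it declared.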
The W3C's general approach to privacy issues is on its Privacy
Activities page.
More details, including all technical specifications, are available
from the P3P Project web pages.
Please browse
Roger Clarke 'Platform
for Privacy Preferences: An Overview' (April 1998) - provides a simple
explanation, and technical summaries as well.
Comments
[The following comments are from a paper I wrote in 1998 - they may now
need some revision: GG]
P3P is a protocol which is intended to be able to be applied to
support negotiations in a variety of internet contexts, including explicit
data provision (eg answers to questions on web forms), implicit data provision
(eg capture of the `click stream' or URLs of pages visited in succession),
and explicit data provision from third sources (eg a web user's stored
profile of preferences, demographic details etc). How it can be applied
to some extensions to basic HTML such as cookies, Java etc is not yet determined.
P3P allows web users to have multiple personae (digital pseudonyms), allowing
a user to choose between a `data-poor' or `data rich' personality depending
on the site visited[1].
P3P is the first important privacy initiative to have emerged
from the consultative and self-regulatory structures of internet governance
(although dominated by W3C staff members), and for that reason alone is
of considerable significance.
Clarke compares what P3P is attempting to deliver against the
OECD privacy Guidelines [2] (http://www.anu.edu.au/people/Roger.Clarke/DV/P3PCrit.html),
and concludes that it only addresses parts of three of the OECD Principles
(data collection directly from the individual concerned; limitations on
use and disclosure, and openness about use and disclosure policies), but
does not address other principles relating to collection from third parties,
subject access to data held by the web-site operator, retention of data
and security. This is not necessarily a criticism, merely a limitation
of one tool, but it would seem that some of these matters could be addressed
by the same protocol in order to give more comprehensive privacy protection.
The more substantial criticism is that P3P says nothing about
measures to ensure that it is complied with. If the web service provider
breaches the practices that it has told the user that it adopts during
a P3P `negotiation' what can the user do about it (assuming he or she ever
finds out in the first place)? Some aspects of this problem are:
-
P3P does not require the web service provider to log access and uses of
the data it collects.
-
P3P is not a certification scheme, and provides no guarantee of audits
or similar protective measures. Some industry based initiatives like TRUSTe
could supplement it.
-
There is no guarantee that the P3P framework provides any linkage to a
particular country's laws (such as contract laws or data protection laws)
as Clarke points out, so P3P `promises' may be legally meaningless. There
is an `assurance statement' in the Protocol where an attempt could be made
to provide either contractually binding or legally descriptive statements,
but its use is not compulsory.
P3P may become `one important element among many others' (as Clarke concludes),
but it will be of little use unless it meshes with law and organisational
practices. Until it does that, it could be little more than a framework
for deception.
The Electronic Privacy Information Center (EPIC) identifies a
different danger: it considers that what is in effect a framework
for efficient collection of personal information as a condition of entry
to web sites (with the possibility of increasing exclusion of those who
value their privacy) may be counter-productive to privacy, compared with
simply opposing the increased collection of personal information.
Critiques of P3P
There is now a wide range of papers available about P3P.
-
Testimonials
for P3P 1.0 (2002)
-
W3C's P3P pages Papers, Presentations,
Critiques, and Media Coverage - a very wide range of papers, pro-,
con- and neutral, but only by the P3P Working Group members.
Please browse for papers of interest.
-
Roger Clarke P3P
Re-visited (2001) 7 PLPR Issue 10 - Clarke is now openly hostile to
P3P (compare below), joining other critics such as Catlett and Rotenberg:
"P3P is nothing more than a 'privacy policy declaration' standard. That's
a mere fraction of what it was meant to be, and of what the situation demands."
[This paper is not listed in the W3C's pages, though other Clarke papers
are.]
-
Roger Clarke 'Platform
for Privacy Preferences: A Critique' (April 1998) - Clarke situates
P3P in the history of IPPs, showing which aspects of standard privacy concerns
it does and does not address. He also discusses the enforcement aspects.
He reaches an `open verdict' on its utility.
-
Karen Coyle P3P: Pretty Poor Privacy?
A Social Analysis of the Platform for Privacy Preferences (P3P) (1999):
"P3P is the software equivalent of Mr. Potato Head. It is an engineer's
vision of how humans decide who to trust and what to tell about themselves.
It has a set of data elements and neatly interlocking parts, and it has
nothing at all to do with how people establish trust. "
-
Jeremy A. Birchman 'Is
P3P the Devil?' (University of Miami School of Law, student paper)
May, 1998
Implementations of P3P
How will the defaults be set in software that implements P3P?
6. The 'borderless' problem: Internet privacy invasions
from overseas
More than any other form of privacy problem, cyberspace issues are likely
to involve complaints of invasions of privacy by overseas organisations.
This leads to a premium on self-help (PETs).
What are legal systems doing about this?
-
Where there is a local law, allowing non-residents to take action under
it.
-
Data export prohibitions
-
Some enforcement cooperation between Privacy Commissioners
-
Some attempts to develop international standards
But a TRIPS or WTO for privacy (an international standard which must be
enforced locally even though it is foreigners who are affected) is not yet
on the horizon outside Europe.
7. Multi-purpose ID Card / number systems
7.1. General resources
7.2. Privacy issues in ID cards
-
It's always a database as well, not just a card
-
Multi-purpose cards are inherently dangerous
-
Cancellation of multiple rights - `outlawing'
-
Risks of identity theft may be higher
-
Aggregation of separate personal data
-
`Function creep' is likely - no inherent limits
-
ID card + chip + digital signature (DigSig) may become a `cyberspace passport'
-
Uses may become compulsory (by law or de facto)
-
The long-term dangers of repressive use
7.3. Hong Kong
Hong Kong's SMARTICS ID smart card, to operate from mid-2003, will be one
of the most ambitious ID card systems in the world (a multi-purpose smart
card, with no defined limit to its uses, and potentially with digital signature
attached), and therefore one with very great potential dangers to privacy.
Questions:
-
What are the dangers / abuses of the existing ID card?
-
What dangers does the smart ID card pose above and beyond the existing
ID card?
-
Are the steps proposed by the Administration to prevent abuses of the smart
ID card sufficient to deal with possible abuses? What further steps could
/ should be taken?
-
What changes to legislation does the introduction of the smart ID card
require if privacy is to be adequately protected?
Existing HK ID card
PCO Code of Practice:
The SMARTICS ID smart card (from 2003)
Summary from the government statement Digital
21: 2001 HK Digital 21 Strategy: Key Result Area 5 :
"We will replace the existing Hong Kong citizens' identity
cards with a new generation of 'smart' identity cards from 2003 onwards.
This will cover a population of around seven million people. The identity
card replacement exercise presents us with a unique opportunity to capitalise
on the use of smart card technology for developing a user-friendly platform
to provide more efficient, better quality and value-added services to the
community. We have proposed that the new identity card should take the
form of a multi-application smart card with capacity to support different
types of applications. This will be a significant step forward in enhancing
our overall information infrastructure and achieving our aim to position
Hong Kong as a leading digital city. It will also facilitate the adoption
of e-business in the community. We are conducting feasibility studies to
examine how smart card technology can be used to provide additional value-added
functions through the new identity cards. We will carry out public consultation
on whether these functions should be adopted. We will also adopt comprehensive
measures to ensure that the smart identity cards are secure and to address
privacy and personal data protection. We target to roll out the new smart
identity cards with multi-application capacity starting from 2003."
G Greenleaf Slides
on Legal/technical protection of Internet privacy (go to slide 'The
HK `smart' ID card')
Official documents:
All of these documents are important.
Read as many as you can.
-
ITBB LegCo Panels briefing Non-immigration
Applications for Incorporation into the Smart ID Card (20 Dec 2001)
and slide
show ; some points to note -
-
Digital signature - At this stage, only HK Post's e-Cert can be
included on the ID card; digital signatures from other recognised CAs cannot
be included, though this is said to be 'under continuous review' [8].
-
Driver's licence - Details will only be held on the Transport Department's
back-end computer system, not on the card, and Police will use the ID number
to interrogate the computer online to verify licence details [12]. However
'many thousands of people' will still need to obtain a physical driver's
licence [12] in order to prove to others that they have one (car hire firms,
employers, foreign driving etc). ITBB claims that, since no data is being
held on the card, this change is 'voluntary' [30]. In fact, you will have
to opt out of having your driver's licence held only on the TD computer
by requesting a physical licence as well.
-
Library card - Will be 'voluntary' in that 'library users will have
the option to be issued with plastic library cards' [17] - again, it seems
they will have to opt out of using only the ID card as a borrowing card.
(Is any online checking involved here? Will LCSD do online checks using
ID number? Will the ID card be used to enter libraries and borrow books
by being swiped over readers?)
-
Change of address - Requires use of HK Post E-cert [18].
-
Legislative amendments for added functions - Will only be required
(by addition to the Schedule in the ROP Ordinance) if they 'require the storage
of additional data in the chip or printing of additional information on
the card surface' [23]. The E-cert will require this [24]. Neither the driver's
licence nor library card applications require a Schedule amendment [25], but
allowing the ID card to serve as a library card requires a change to the
Libraries Regulations [25], and changes to traffic Ordinances are needed,
but only to remove the need to carry a physical licence [26].
-
Thumbprints on the card will only be a template [Annex] - but does the ROP
Ordinance guarantee that? There is no indication as to who is allowed to use
it to 'authenticate the card['s] identity holder to prevent identity theft'
(ImmD only?).
-
Individual Departments involved will maintain their own databases [Annex].
-
Registration
of Persons (Amendment) Bill 2001
-
Bills
Committees page
-
Legco
Brief on Registration of Persons (Amendment) Bill 2001
-
Legislative
Council Brief on Hong Kong Special Administrative Region Identity Card,
18 October 2000; issued by the Security Bureau - This is the base document
providing a public explanation and justification for the smart ID card
-
HK LegCo Panel on Security Papers
on the HKSAR Identity Card Project - links to government papers and
submissions received
-
Administration's information paper "HKSAR
Identity Card Project - Initial Privacy Assessment Report" Feb 2001
(presented to LegCo Panel on Security)
-
Administration's paper on "HKSAR
Identity Card Project-Initial View on Legislative Amendments" Nov 2000
(presented to LegCo Panel on Security)
-
Administration's paper "Progress
of the HKSAR Identity Card Project-Privacy Issues" Nov 2000 (presented
to LegCo Panel on Security); The paper concludes "More Privacy Impact Assessments
will be conducted at different stages of the project from the planning
stage to the post implementation stage. The Privacy Commissioner will be
informed of the findings of each assessment and his views will be taken
into account as data protection measures are formulated or upgraded. The
relevant laws will be observed at all times. This will guarantee that adequate
privacy safeguards are in place."
-
Submission
from the Office of the Privacy Commissioner for Personal Data 8 November
2000 (presented to LegCo Panel on Security)
Press articles:
7.4. United States
The US does not have a national ID card.
-
EPIC's National ID card
pages - One of the most current resources for US developments. EPIC
opposes an ID card for the US.
7.5. Australia
Australia is also relatively unusual in not having a national ID card,
following the defeat of the 'Australia Card' proposal in the 1980s.
Why was the Australia Card defeated? Does it make any difference?
The Australia Card - a defeated ID card scheme?
It is a decade since the defeat of the `Australia Card' proposals in late
1987, which led directly to the political compromise
of the Tax File Number (and thereby, a few years later, the Commonwealth's
mass data matching scheme) and the Privacy Act 1988 in the following
year.
The defeat of the Australia Card is still the most important object
lesson in Australia in how popular resistance can defeat a mass surveillance
proposal - but the story was always far more complex than that. A decade
later, we can still ask `have governments and the private sector achieved
everything they hoped for from the Australia Card, and more, by more subtle
means?' - and we do in fact ask it in the Question
`Who needs the Australia Card?'.
Here are some articles, written at the time, which chart the rise,
meaning, and demise of the Australia Card:
-
Why was it dangerous? - see Graham Greenleaf `Australia
Card: towards a national surveillance system' (Law Society Journal
(NSW) Vol 25 No9, October 1987). This is a long article analysing the Australia
Card Bill 1986 and its implications, but the scheme is summarised in the
attached Tables
-
What killed the Ozcard? - for one version of the complex story of the Card's
demise, see Graham Greenleaf Lessons
from the Australia Card - deus ex machina? The Computer Law and
Security Report, Vol 3 No 6, March/April 1988, p 6.
-
For another view, see Roger Clarke Just
Another Piece of Plastic for Your Wallet: The Australia Card
Prometheus,
5,1 June 1987 Republished in Computers & Society 18,1 (January
1988), with an Addendum in Computers & Society 18,3 (July 1988)
- This article covers the whole story.
Data matching and the Tax File Number: a story
of function creep
The following articles and papers track (in roughly historical order)
the history of the expanding use of the Tax File Number into the Commonwealth's
data-matching system (under the Data-matching
Program (Assistance And Tax) Act 1990 (Cth), and otherwise), one of
the world's more extensive mass surveillance systems.
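Mechanically, the data-matching system described above joins records from separate agencies on the Tax File Number and flags discrepancies for follow-up (the 'show cause' letters discussed in the articles below). A minimal sketch, with all TFNs, field names and the threshold invented for illustration:

```python
# Sketch of data matching on the Tax File Number: join two agencies'
# records on TFN and flag inconsistent declarations for review.
tax_records = {
    "123456782": {"declared_income": 45000},
    "987654321": {"declared_income": 0},
}
welfare_records = {
    "123456782": {"benefit": "unemployment", "declared_income": 0},
    "555000111": {"benefit": "pension", "declared_income": 0},
}

# Flag anyone receiving a benefit while declaring substantial income
# to the tax agency (illustrative threshold).
flagged = [tfn for tfn, w in welfare_records.items()
           if tfn in tax_records
           and tax_records[tfn]["declared_income"] > 10000]

print(flagged)   # ['123456782']
```

The privacy objection canvassed below is not to the join itself but to its scale: every person on both databases is run through the match, not just those suspected of fraud.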
-
Roger Clarke `Tax
File Number Scheme: Case Study of Political Assurances and Function Creep'
Policy,
1991 - Documents the numerous ways in which promises about the limited
use which would be made of the Tax File Number were broken. New variation
on the old themes of `how far can you trust politicians' and `it's the thin
end of the wedge'.
-
G Greenleaf `Can
the data matching epidemic be controlled?' (1991) 65 ALJ 220-23 (reprinted
in vol 7 Computer Law and Security Report, 1989 15-17) - mainly
about the mechanics (and some dangers) of the Data-matching legislation.
-
Roger Clarke `The
sad tale of the parallel data matching program' (1994) 1 PLPR 8; Clarke
claims DSS had the scheme approved by Cabinet and Parliament on the basis
of fraudulent estimates; The actual gains are at best 10% of the estimates
($30M for DSS) and at worst (Clarke's analysis) a net loss; About 12,000
people get an intimidating 'show cause' letter each year without any action
ultimately being taken (action follows in only about 1,500 cases); It involves 6-9 runs
x 10 million attempted matches per year. Basically, Clarke argues that
other means of enforcement which are less privacy invasive of those not
involved in fraud, would give at least as good a result.
-
Roger Clarke - review of the Australian National Audit Office `Efficiency
Audit - Department of Social Security - Data matching' (1993) 1 PLPR
12; the ANAO found that data matching did not outperform random selection drawn
from specific client groups re cancellations and downward reviews.
-
Tim Dixon `Data-matching programs reviewed' (1995) 2 PLPR 13; concludes
that the 4th set of reports (1993-94) 'confirms trends identified in Clarke's
analysis'; DSS seems to claim a $63M net gain (previous estimate only $21.1M!),
but other agencies claim to have made virtually nothing or lost money (ATO);
the cost/direct benefit ratio is falling - it will be 1/2 in 97-98,
but DSS claims an estimated $90M extra recouped through 'voluntary compliance';
-
G Greenleaf `Data
matching in Australia - the facts' (1995) 2 PLPR 114 (Review
of Privacy Commissioner (Cth) Regulation of Data-matching in Commonwealth
Administration (Report to the Attorney-General) September 1994 ). This
Report surveyed data matching by Commonwealth agencies other than
that regulated under the data matching legislation. The Commissioner recommended
to the Attorney-General that the Privacy Act be amended to include uniform
controls for all data-matching.
-
Despite the findings of the above Report in 1994, there is now significant
data matching between Commonwealth agencies and the private sector and
State agencies - federal tax and welfare agencies routinely access major
databases, such as those of employers and higher education institutions
to match against their own client lists. No legislation of the kind recommended
in the above Report has been enacted.
-
Various
Guidelines on data matching issued by the Federal Privacy Commissioner
are available. The Federal Privacy Commissioner's Annual Reports since 1991
contain considerable information about the extent of data matching both
under the statutory scheme and otherwise (see for example the 1999-2000
Annual Report).