11. Cyberspace privacy issues
Graham Greenleaf, revised 26 April 2002
This Reading Guide aims to help provide an understanding of what is different about privacy issues in cyberspace. Some suggested reasons why cyberspace privacy issues are different follow:
- What is special about privacy invasions in cyberspace?
- The application of IPPs (and, potentially, surveillance laws) to cyberspace
- Internet business practices affecting privacy, and the effect of IPPs on them
- Privacy-invasive technologies (PITs), and whether IPPs can control them
- Privacy-enhancing technologies (PETs), and their relationship to legal
protection of privacy
- The particular problem of the 'borderless' nature of cyberspace for privacy
What other reasons can you suggest?
There are two main categories of issues here:
- In most cyberspace interactions, consumers/citizens are potentially identifiable (at least to some extent). Identifiability is the default condition of cyberspace interaction.
- The remote nature of cyberspace transactions often creates pressures for a higher level of identification than would be required in physical space.
- Intrusive/invasive interactivity can take place without full identification, but only partial identification or location.
- The traditional barriers of incompatible files and cost are steadily being eroded.
(i) Is there 'personal information' so that IPPs can apply at all? - This is always the threshold issue when looking at cyberspace privacy issues.
(ii) What do IPPs require in cyberspace?
See section 3.1, 'What is "personal information"?', in G Greenleaf 'Private sector privacy: Problems of interpretation' CyberLRes 3 - The rest of the article gives illustrative examples of cyberspace-related privacy issues for each of the NPPs, and problems with some key definitions (such as 'personal information').
(iii) Are IPPs adequate to address cyberspace issues? - Current sets of IPPs may no longer be sufficient, even in principle. New IPPs may be needed. For an early discussion of these issues, see:
The purpose of this section is to illustrate the simple point that breaches of privacy often occur because of defective business practices, not principally because of the technologies employed (as to which, see the next section).
- Graham Greenleaf 'Information Privacy Principles - irrelevant to cyberspace?' (1996) 3 PLPR 114 - Covers the applicability of each of the Australian Federal IPPs (public sector) to privacy issues relating to the internet. (As to the Australian private sector NPPs, see below and the article above.)
- Federal Privacy Commissioner
Draft National Privacy Principle Guidelines (May 2001) - Throughout the
Commissioner's draft Guidelines there are Internet examples, but they are not
conveniently located at any one point.
We can use Dixon's examples to test the applicability
of IPPs to cyberspace problems. Try to apply the IPPs in the Hong Kong,
Australian or other legislation to these situations.
Dixon, 'Public debacles prompt privacy rethink' (2001) 7 PLPR 149 - Dixon catalogues a host of privacy breaches by major organisations during 2000, including Real Networks; DoubleClick; PSINet; Toysmart; Amazon; Toysrus; and the Australian Taxation Office. In most cases, the problems arose because of defective business practices.
Privacy policies on web sites may establish contractual or other obligations to those who browse those sites aware of or relying on those policies. This is of diminishing relevance in jurisdictions with legislative IPPs which make some forms of 'privacy policies', including the disclosure of collection practices, mandatory.
- Real Networks - 'Real Jukebox' software extracted information from users' hard disks about their listening habits and sent this to Real Networks.
- DoubleClick - DoubleClick collected IP addresses of consumers viewing ads, and could collate this with their web browsing habits; it proposed to combine this with home address, name and purchasing habit information bought from Abacus, but suspended the proposal after consumer backlash and loss of 1/3 of its share price (estimated loss as high as US$2.2BN). [See now the terms of settlement of the class actions against DoubleClick - note the 8 terms proposed, and the breadth of the actions on which they are based.]
- PSINet - For a payment, the ISP allowed its servers to be used to send out 5-20M spam messages, while its stated policy was to suspend users who did so.
- Toysmart - The bankrupt business proposed to sell its consumer database of online purchasers, despite promising never to sell consumer information. The FTC insisted it should only sell the database as part of the whole business. A shareholder (Disney) paid to destroy the database.
- Amazon - After Toysmart's problems, Amazon retrospectively amended its privacy policy to allow customer information to be transferred as a business asset.
- ToysRUs - Outsourced customer data for analysis where the third party doing the analysis was also entitled to use it for its own analysis.
- Breaches of security - CD Universe and an ISP each had credit card details of customers stolen by hackers; whereas IKEA, Powergen and the Australian Tax Office inadvertently left data on clients visible on public web pages, which was then found by web search engines or other customers.
However, in addition to the question of whether they comply with the legislative IPPs, privacy policies on web sites may be important because:
- They are on sites run by small businesses (in Australia) or other entities not bound by IPPs; and
- They promise a degree of privacy protection going beyond what is required by the IPPs (eg 'We will never disclose your personal information ...').
For a detailed discussion of the legal significance of website privacy policies, see the article in PLPR (Issues 9 and 10). Part 2 mainly concerns Australian law, but the other parts of the article are of more general relevance.
Some surveys of web site privacy policies:
`[EPIC] reviewed 100 of the most frequently visited web sites on the Internet. We checked whether sites collected personal information, had established privacy policies, made use of cookies, and allowed people to visit without disclosing their actual identity. We found that few web sites today have explicit privacy policies (only 17 of our sample) and none of the top 100 web sites meet basic standards for privacy protection.'
The Internet has made possible the development of many technologies which can be used for invasion of privacy and which do not have exact equivalents in the physical world. Roger Clarke provides some of the best classifications and explanations of technologies affecting privacy.
What are `cookies'? `A cookie is a record stored on a user's machine as a result of a web-server instructing a web-browser to do so. It is sent to an appropriate web-server along with a request for pages.' (Clarke). Netscape describes them as follows:
`Cookies are a general mechanism which server side connections (such as CGI scripts) can use to both store and retrieve information on the client side of the connection. The addition of a simple, persistent, client-side state significantly extends the capabilities of Web-based client/server applications.'
A lot of people regard them as a serious privacy invasion. The Electronic Privacy Information Centre (EPIC) says `a cookie is a mechanism that allows a web site to record your comings and goings, usually without your knowledge or consent'. "I basically equate cookies to the notion of a store being able to tattoo a bar code on your forehead, and then laser-scan you every time you come through the doors" (Simson Garfinkel). Others think they are one of the few ways to overcome the `statelessness' of web protocols and are essential to useful web services.
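The mechanism Clarke and Netscape describe can be sketched in a few lines. This is a minimal illustration only (the cookie name and value are invented), using Python's standard http.cookies module to build the Set-Cookie header a server sends and to parse the Cookie value a browser later returns:

```python
# Minimal sketch of the cookie mechanism: a server stores a record on the
# user's machine via a Set-Cookie header, and the browser sends it back with
# later requests. The cookie name/value here are invented for illustration.
from http.cookies import SimpleCookie

# 1. Server response: instruct the browser to store a record.
response = SimpleCookie()
response["visitor_id"] = "abc123"   # hypothetical tracking identifier
response["visitor_id"]["path"] = "/"
set_cookie = "Set-Cookie: " + response["visitor_id"].OutputString()
print(set_cookie)   # e.g. Set-Cookie: visitor_id=abc123; Path=/

# 2. A later request: the browser returns the stored record, so the server
# can recognise the user across otherwise 'stateless' HTTP requests.
request = SimpleCookie()
request.load("visitor_id=abc123")
print(request["visitor_id"].value)  # abc123
```

This statelessness-bridging role is why defenders consider cookies essential, and the persistence of the stored identifier is exactly what critics such as EPIC and Garfinkel object to.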
Related to cookies are single pixel GIFs, which are graphics that are usually
invisible to web users because they are 1x1 pixel in size, with no border and
the same colour as the page background. They are also known as 'web bugs', 'web
beacons', '1-by-1 GIFs', 'clear GIFs' and 'invisible GIFs'.
Single pixel GIFs have many different surveillance uses. Kaman Tsoi explains (see reference below) how these 'web bugs' are used by network advertisers to track users across the sites on which they place banner ads:
But when the user views the ad host's home page, in addition to any
cookie which may be set by the ad host itself, the network advertiser serves a
cookie to the user's browser. And because the banner ad graphic is operating as
a web bug, the network advertiser receives information including the IP address
of the user's computer, the URL of the ad host's home page, the time that the
page was viewed and the type of browser being used by the user.
If the user then clicks on the banner ad to link through to the
advertiser's website, the further movements of that user would be monitored to
the extent that the network advertiser had invisible web bugs on any pages of
the advertiser's site. Each time a web-bugged page is viewed by the user, the
network advertiser receives the same information about the IP address, page
URL, and time and browser type, along with the cookie value that was set when
the banner ad was first viewed. Unless the user deletes the cookie, this
monitoring could occur even if the user did not view the advertiser's site
immediately or via the banner ad link....
While this is all impressive, particularly in comparison to other
forms of advertising, what really takes web bugs into mind-boggling territory
is simply this: for each network advertiser, there are many more ad hosts,
advertisers and users. What this means is that by using the same cookie
wherever the network advertiser has banners or web bugs on ad host or
advertiser sites, the network advertiser can consolidate the data related to a
particular cookie to form a detailed profile of browsing habits which could
include the types of sites visited. The network advertiser can then add value
to its advertisers by using these cookie profiles to determine what ad is shown
the next time a user with that cookie is identified visiting an ad host's site.
The major network advertisers hold hundreds of millions of these consumer
profiles between them. The AltaVista search engine can be used to search for
web bugs, and one recent search reported more than four million web bugs
planted by 30 vendors on the internet
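Tsoi's description can be made concrete with a short sketch. All of the host names below are invented; the point is simply that requesting an invisible 1x1 image from the network advertiser's server leaks the URL of the page being viewed (alongside the request's IP address, browser type and any cookie value):

```python
# Hypothetical sketch of web-bug tracking as described by Tsoi. The tracker
# and page URLs are invented. The invisible 1x1 image causes the browser to
# request bug.gif from the network advertiser, passing the page URL along.
from urllib.parse import urlencode, urlparse, parse_qs

def web_bug_tag(tracker_url, page_url):
    """HTML for an invisible single-pixel GIF reporting the hosting page."""
    return '<img src="%s?%s" width="1" height="1" alt="">' % (
        tracker_url, urlencode({"page": page_url}))

tag = web_bug_tag("http://ads.example.net/bug.gif",
                  "http://adhost.example.com/home.html")
print(tag)

# What the network advertiser's server can recover from the resulting request
# (it would also see the IP address, User-Agent and any cookie it has set):
bug_url = tag.split('"')[1]
logged = parse_qs(urlparse(bug_url).query)
print(logged["page"][0])   # http://adhost.example.com/home.html
```

Repeating this on every bugged page, keyed by the same cookie value, is what lets the advertiser consolidate page views into the browsing profiles described above.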
The origin of the term is explained by Hormel Foods, manufacturers of the 'popular' luncheon meat 'SPAM':
"Use of the term "spam" was adopted [in order to describe Unsolicited Bulk Email (UBE) or Unsolicited Commercial Email (UCE)] as a result of the Monty Python skit in which a group of Vikings sang a chorus of "spam, spam, spam . . . " in an increasing crescendo, drowning out other conversation. Hence, the analogy applied because UCE / UBE was drowning out normal discourse on the Internet."
According to the Australian (NOIE) survey:
- Commercial surveys suggest that spam may account for 10-20 per cent of all email traffic, with significant consequential losses for public and private organisations.
- Spam is spreading beyond email to other forms of electronic messaging, such as relay chat and instant messaging.
- There are reports of spam causing IT system and security problems. For example, some spamming operations appear to be overloading or temporarily closing overwhelmed servers and networks of innocent intermediaries. Ultimately this has implications for the stability of Internet services, especially if spam campaigns are deliberately used to deliver viruses.
- Some Australian businesses are being 'spoofed' by spammers, when nuisance email is routed through, and appears to come from, those firms. Their commercial reputation is then at risk, and their owners and managers are obliged to spend inordinate time rectifying this, including attempting to respond to many annoyed spam recipients.
This part catalogues a number of what many describe as 'privacy enhancing technologies' or 'PETs' - forms of technological 'self-help'. In many cases these have been promoted (particularly in the USA) as alternatives to legislative protection of privacy, but in other countries they are more usefully considered from the following perspectives:
- As technologies that may provide part of the protection that legislation requires, and are possibly more effective in the context of legislation.
- As alternatives where legislative protection can't reach, particularly where privacy invasions (eg spam) originate from outside your own jurisdiction.
There are also advocates of PETs as a principal method of privacy protection. Some of these PETs are discussed further below. The best starting points are:
The World Wide Web Consortium (W3C) has developed the Platform for Privacy Preferences (P3P), which has been described as `a framework within which trust can be achieved between web services providers and consumers' (Clarke's description in 1998, though he would reject it now). Here is the official description from the P3P Project pages:
What is P3P? (2002 description)
The Platform for Privacy Preferences Project (P3P), developed by the World Wide
Web Consortium, is emerging as an industry standard providing a simple,
automated way for users to gain more control over the use of personal
information on Web sites they visit. At its most basic level, P3P is a
standardized set of multiple-choice questions, covering all the major
aspects of a Web site's privacy policies. Taken together, they present a clear
snapshot of how a site handles personal information about its users.
P3P-enabled Web sites make this information available in a standard,
machine-readable format. P3P enabled browsers can "read" this snapshot
automatically and compare it to the consumer's own set of privacy
preferences. P3P enhances user control by putting privacy policies where
users can find them, in a form users can understand, and, most importantly,
enables users to act on what they see.
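As a concrete (and much simplified) sketch of this snapshot-and-compare idea: a site publishes a machine-readable policy, and a P3P-aware user agent checks the declared purposes against the user's preferences before proceeding. The element names below follow the P3P 1.0 vocabulary, but the policy content and the preference-matching logic are invented for illustration, not taken from any real implementation:

```python
# Toy sketch of the P3P mechanism: parse a site's machine-readable policy and
# compare its declared purposes with the user's preferences. Element names are
# from the P3P 1.0 vocabulary; the policy content itself is invented.
import xml.etree.ElementTree as ET

POLICY = """
<POLICY xmlns="http://www.w3.org/2002/01/P3Pv1"
        discuri="http://www.example.com/privacy.html">
  <STATEMENT>
    <PURPOSE><admin/><pseudo-analysis/></PURPOSE>
    <RECIPIENT><ours/></RECIPIENT>
    <RETENTION><stated-purpose/></RETENTION>
    <DATA-GROUP><DATA ref="#dynamic.clickstream"/></DATA-GROUP>
  </STATEMENT>
</POLICY>
"""

NS = "{http://www.w3.org/2002/01/P3Pv1}"

def declared_purposes(policy_xml):
    """Collect the purposes for which the site says it uses collected data."""
    root = ET.fromstring(policy_xml)
    return {el.tag.replace(NS, "")
            for purpose in root.iter(NS + "PURPOSE")
            for el in purpose}

# User preference: refuse sites that use data for individually targeted uses.
refused = {"individual-analysis", "individual-decision", "telemarketing"}
site = declared_purposes(POLICY)
print(sorted(site))               # ['admin', 'pseudo-analysis']
print(site.isdisjoint(refused))   # True - access can proceed "seamlessly"
```

Note that nothing in this comparison verifies that the site actually behaves as its policy declares, which is precisely the enforcement gap discussed later in this section.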
P3P Project in a nutshell (1998 description)
P3P* is a privacy assistant: users can be informed, in control, and use P3P to simplify and help them make decisions based on their individual privacy preferences.
The P3P specification will enable Web sites to express their privacy practices and users to exercise preferences over those practices. P3P products will allow users to be informed of site practices, to delegate decisions to their computer when possible, and allow users to tailor their relationship to specific sites. Sites with practices that fall within the range of a user's preferences could, at the option of the user, be accessed "seamlessly"; otherwise users will be notified of a site's practices and have the opportunity to agree to those terms or other terms and continue browsing if they wish.
P3P gives users the ability to make informed decisions regarding their Web experience and their ability to control the use of their information. Sites can use P3P to increase the level of confidence users place in their services, as well as improve the quality of the services offered, the customization of content, and simplify site access.
P3P allows statements about privacy practices and preferences to be made in a machine-readable form. P3P uses RDF/XML for making privacy statements as well as for exchanging data under user control. P3P will support future digital certificate and digital signature capabilities as they become available. P3P can be incorporated into browsers, servers, or proxy servers that sit between a client and server.
* For brevity, we often refer to the P3P project, activity, specifications, or products as "P3P."
The W3C's general approach to privacy issues is set out on its web pages. More details, including all technical specifications, are available from the P3P Project web pages. Please browse:
- 'Platform for Privacy Preferences: An Overview' (April 1998) - provides a simple explanation, and technical summaries as well.
[The following comments are from a paper I wrote in 1998 - they may now need
some revision: GG]
P3P is a protocol which is intended to be able to be applied to support
negotiations in a variety of internet contexts, including explicit data
provision (eg answers to questions on web forms), implicit data provision (eg
capture of the `click stream' or URLs of pages visited in succession), and
explicit data provision from third sources (eg a web user's stored profile of
preferences, demographic details etc). How it can be applied to some extensions
to basic HTML such as cookies, Java etc is not yet determined. P3P allows web users to have multiple personae (digital pseudonyms), allowing a user to choose between a `data-poor' or `data-rich' personality depending on the site being dealt with.
P3P is the first important privacy initiative to have emerged from the consultative and self-regulatory structures of internet governance (although dominated by W3C staff members), and for that reason alone is of considerable significance.
Clarke compares what P3P is attempting to deliver against the OECD privacy Principles, and concludes that it only addresses parts of three of the OECD Principles
(data collection directly from the individual concerned; limitations on use and
disclosure, and openness about use and disclosure policies), but does not
address other principles relating to collection from third parties, subject
access to data held by the web-site operator, retention of data and security.
This is not necessarily a criticism, merely a limitation of one tool, but it
would seem that some of these matters could be addressed by the same protocol
in order to give more comprehensive privacy protection.
The more substantial criticism is that P3P says nothing about measures to
ensure that it is complied with. If the web service provider breaches the
practices that it has told the user that it adopts during a P3P `negotiation'
what can the user do about it (assuming he or she ever finds out in the first
place)? Some aspects of this problem are:
- P3P does not require the web service provider to log access and uses of the data it collects.
- P3P is not a certification scheme, and provides no guarantee of audits or similar protective measures. Some industry-based initiatives like TRUSTe could provide this.
- There is no guarantee that the P3P framework provides any linkage to a particular country's laws (such as contract laws or data protection laws), as Clarke points out, so P3P `promises' may be legally meaningless. There is an `assurance statement' in the Protocol where an attempt could be made to provide either contractually binding or legally descriptive statements, but its use is optional.
P3P may become `one important element among many others' (as Clarke concludes), but it will be of little use unless it meshes with law and organisational practices. Until it does that, it could be little more than a framework for deception.
The Electronic Privacy Information Center (EPIC) identifies a different danger: it considers that what is in effect a framework for efficient collection of personal information as a condition of entry to web sites (with the possibility of increasing exclusion of those who value their privacy) may be counter-productive to privacy, compared with simply opposing the increased collection of personal information.
There is now a wide range of papers available about P3P. A key question is: how will the defaults be set in software that implements P3P?
More than any other form of privacy problem, cyberspace issues are likely to involve complaints of invasions of privacy by overseas organisations. This leads to a premium on self-help (PETs).
- Testimonials for P3P 1.0 (2002) - on the W3C's P3P pages
- Papers, Presentations, Critiques, and Media Coverage - a very wide range of papers, pro-, con- and neutral, as selected by the P3P Working Group members. Please browse for papers of interest.
- Roger Clarke 'P3P Re-visited' (2001) 7 PLPR Issue 10 - Clarke is now openly hostile to P3P (compare below), joining other critics such as Catlett and Rotenberg: "P3P is a fraction of what it was meant to be, and of what the situation demands." [This paper is not listed in the W3C's pages, though other Clarke papers are.]
- Roger Clarke 'Platform for Privacy Preferences: A Critique' (April 1998) - Clarke situates P3P in the history of IPPs, showing which aspects of standard privacy concerns it does and does not address. He also discusses the enforcement aspects. He reaches an `open verdict' on its utility.
- Karen Coyle
P3P: Pretty Poor Privacy? A Social Analysis of the Platform for Privacy
Preferences (P3P) (1999): "P3P is the software equivalent of Mr. Potato
Head. It is an engineer's vision of how humans decide who to trust and what to
tell about themselves. It has a set of data elements and neatly interlocking
parts, and it has nothing at all to do with how people establish trust. "
- Jeremy A. Birchman 'P3P the Devil?' (University of Miami School of Law, student paper) May 1998
What are legal systems doing about this? Approaches include:
- Where there is a local law, allowing non-residents to take action under it
- Data export prohibitions
- Some enforcement cooperation between Privacy Commissioners
- Some attempts to develop international standards
But a TRIPS or WTO equivalent for privacy (an international standard which must be enforced locally even though it is foreigners who are affected) is not yet on the horizon outside Europe.
Paragraph summarised from Clarke, `Overview'.
Roger Clarke `Platform for Privacy Preferences: A Critique'.