This chapter and the next take up the question of why data protection laws exist. In attempting to provide answers to this question, an examination is made in this chapter of the catalysts for the emergence of data protection laws. The following chapter examines the various interests these laws embody and the various values they aim to safeguard or promote. There is some overlap between the discussion in the two chapters, as the catalysts point to many of the relevant interests and values.
Both chapters provide important background material for resolving the main issues taken up in Part III and Part IV. More specifically, it should be noted that many of the developments described in this chapter affect not just persons as individuals but also groups and organisations. A large number of the values and interests described in Chapter 7 have a similar multiple relevance. Accordingly, the analysis in Part II can (and should) be considered from both the perspective of individual persons and the perspective of collective entities. The implications of this insight are examined in greater detail in Part III. Furthermore, many of the developments set out in this chapter help to explain the increasing popularity of profiling practices, while the material in Chapter 7 provides a register of values and interests against which the desirability of (regulating) such practices can be assessed. The implications of this insight are analysed in more detail in Part IV.
The catalysts for the emergence (and continued existence) of data protection laws fall into three broad categories: (i) technological and organisational developments, and the factors that drive them; (ii) public fears about these developments; and (iii) legal factors. In the following, I deal with each of these categories in the order they are listed above.
The emergence of data protection laws, along with their continued existence, cannot properly be explained without taking account of developments in information technology (hereinafter also termed "IT"), particularly from the onset of the computer age in the 1950s. These developments have brought vastly expanded possibilities for amassing, linking and accessing personal data. The discourse of the 1960s and 1970s out of which data protection laws emerged shows a preoccupation with these possibilities. Hence, it is with some justification that data protection laws have been characterised as regulatory reactions to technological developments. At the same time, it should not be forgotten that the core concern in data protection laws for safeguarding personal privacy and related values has roots that reach much further back in time than the onset of the computer age.
One aspect of the technological developments alluded to above has been the strengthening of the technical capabilities of IT in terms of the operational inter-connectivity, speed, bandwidth and intelligence of computers. A related aspect has been the increasing miniaturisation and ubiquity of IT products. Both these aspects have been of some concern in the discourse out of which data protection laws have sprung. But of more immediate concern have been certain patterns (partly facilitated by the above-mentioned technical strides) in processing of data, particularly personal data.
During the early discourse on data protection issues, some of these patterns were little more than theoretical possibilities; in later years, all of them have become more or less manifest in concrete processes. In the following, I give a summary description of these patterns and processes. To some extent, they overlap with each other.
Two phenomena figure centrally: (i) growth in the amount of data - especially personal data - held by various types of organisations; and (ii) integration of these data holdings into centralised databanks. Early manifestations of these phenomena were the proposals during the 1960s to establish centralised population registers, and the plans by several European governments to carry out national population censuses around 1970. Further manifestations can be seen in efforts to introduce common criteria (eg, multi-purpose Personal Identification Numbers (PINs)) for referencing stored data. All of these schemes provided considerable fuel for the public debates that helped set in motion the first round of investigative and legislative processes for enactment of data protection laws.
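The linking role of a multi-purpose PIN can be illustrated with a minimal sketch (all registers, fields and values here are hypothetical): a single identification number shared across agencies serves as a common key by which otherwise separate records can be merged into one composite profile.

```python
# Hypothetical registers keyed by a shared multi-purpose PIN.
tax_register = {
    "010150-1234": {"name": "A. Citizen", "declared_income": 310_000},
}
health_register = {
    "010150-1234": {"gp_visits_last_year": 7},
}

def link_by_pin(pin, *registers):
    """Merge every register's record for the given PIN into one profile."""
    profile = {"pin": pin}
    for register in registers:
        profile.update(register.get(pin, {}))
    return profile

profile = link_by_pin("010150-1234", tax_register, health_register)
# The composite profile now combines data originally collected by
# different agencies for unrelated purposes.
```

The point of the sketch is simply that a common reference criterion reduces cross-agency linkage to a trivial lookup; without a shared PIN, matching would require error-prone comparison of names, addresses and the like.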
Concern has also been directed at a variety of other, related phenomena. One phenomenon is the increasingly extensive and routine sharing of data across traditional organisational boundaries. Part and parcel of this process is a growing interest of organisations in basing their decisions on data that already exist in structured format in databases maintained by themselves or other organisations. This interest is manifest, for example, in government efforts to co-ordinate and standardise the various information systems of public sector agencies such that the systems can effectively communicate with each other.
Also of concern is that the re-use of data often entails that the data are put to purposes other than those for which the data were originally collated. In short, the re-use of data tends to lead to their re-purposing. This is frequently the case, for example, when data collected by government agencies for administrative purposes are exploited commercially. It is also frequently the case when data are processed in connection with profiling operations. One potential consequence of re-purposing is that data are used contrary to the expectations of the data subject. Another potential consequence is that data are misconstrued and, accordingly, misapplied - ie, used for purposes for which they are not suited.
Instances of such misapplication are numerous. They are linked to a broader problem concerning inadequacies in the quality of data held by organisations. There is an accumulating body of evidence to suggest that significant amounts of these data are insufficiently precise, correct, complete and/or relevant in relation to the purposes for which they are processed. It goes without saying that such weaknesses can have a detrimental impact on the welfare of the data subjects.
Of equal concern is that the re-use of data can diminish the role played by the data subjects in decision-making processes affecting them. Exacerbating this problem is the increasing automatisation of organisational decision-making processes. Computers are beginning to execute assessments that have customarily been the preserve of human discretion - eg, in the context of determining persons' credit ratings, insurance premiums or social welfare entitlements. Today, such assessments tend to be based primarily on data collected directly from the data subjects in connection with the assessment at hand. But, in line with the trends noted above, it is also likely that these assessments will increasingly be based on pre-collected data found in the databases of third parties. Indeed, with effective communication links between the databases of large numbers of organisations, sophisticated software to trawl these databases, plus appropriate adaptation of the relevant legal rules (ie, an "automationsgerechte Rechtsetzung"), it is easy to envisage computerised decision-making processes that operate independently of any specific input from the affected data subjects.
In the wake of increasing automatisation of decision-making processes comes also a fear of automatic acceptance of the validity of the decisions reached. In the words of the EC Commission,
the result produced by the machine, using more and more sophisticated software, and even expert systems, has an apparently objective and incontrovertible character to which a human decision-maker may attach too much weight, thus abdicating his own responsibilities.
Of related concern is that, at the same time as data subjects' role in organisational decision-making processes is diminishing, the role of their registered data-images is growing. From such an image emerges a "digital persona" that is increasingly attributed a validity of its own by data controllers. Indeed, in the context of modern database systems, the digital persona threatens to usurp the constitutive authority of the physical self, despite the necessarily attenuated nature of the former relative to the latter. With (the threat of) usurpation comes (the threat of) alienation.
Also of concern is that the increases in the magnitude and complexity of cross-organisational data flows, and the resultant blurring of organisational lines, are making it more difficult for data subjects to trace the flow of data on themselves. Thus, data subjects' ability to control what happens with their various digital personas is threatened. Similarly, their ability to identify who or what is responsible for each of the myriad transactions involving their data, plus the full parameters of these transactions, tends to be reduced. Such difficulties potentially alter the foundations for, and the terms of, the "social contracts" that are implicit in the relations between data subjects and the organisations with which they deal.
It would be wrong to see all of the developments outlined above as the sole catalysts for the emergence of data protection laws. Just as important are the forces driving these developments. One such force is modern organisations' enormous appetite for information. This is not to say that the informational appetite of individual persons fails to play any role; organisations are, of course, partly constituted by the desires and needs of individual persons. Nevertheless, it is the appetite for information on the part of persons acting collectively as formal organisations which is most significant for present analytical purposes.
Several scholars claim it was mainly this appetite that brought on the widespread public discussion in the 1960s and 1970s about the need for privacy and data protection. In the words of James Rule and his colleagues:
The main source of the privacy controversies of the 1960s and 1970s has been the demands of formal organizations for information on the people with whom these organizations must deal.
Concomitantly, it is claimed that the role played by technology - first and foremost in the form of mainframe computers - was essentially to exacerbate tensions in the populace caused by organisations' appetite for information.
While these claims have surface plausibility, we should be wary of explanations that attempt to single out one fundamental cause for the controversies out of which data protection laws emerged. And we should keep in mind that, in terms of cause and effect, the interaction between organisational phenomena and technological developments is two-way. Thus, organisations' informational appetite not only stimulates but is whetted by technological developments.
It should also not be overlooked that the enormity of this appetite, which at times appears to border on the insatiable, is itself a result of a complex array of other, non-technological factors. At a general level, it expresses what Anthony Giddens calls the "reflexivity" of modern society. By "reflexivity" is meant a condition in which social practices are examined and altered in the light of new information about those practices. While Giddens acknowledges that reflexivity is a characteristic of all purposive human behaviour, it is intensified in modern society: "With the advent of modernity, reflexivity [...] is introduced into the very basis of system reproduction, such that thought and action are constantly refracted back upon one another". Concomitantly, appeals to tradition, along with "inertia of habit", play a minor role in modern society relative to pre-modern cultures.
Part and parcel of this intensified reflexivity is a concern for rationalisation. Numerous social scientists, beginning with Max Weber, identify rationalisation as a fundamental trend of modern society. It is a process in which human activities are subjected to administration and classification based on formalised, impersonal rules of procedure and an emphasis on optimising performance efficiency. Rationalisation entails large-scale bureaucracy with well-developed facilities for collecting, storing and utilising information; indeed, access to (relevant) information is a sine qua non of rational administration. This point is extended by Alan Westin, who observes:
Rational government has an article of faith that what is missing from the policy-making process is better information about social processes, social problems, the actual effects of existing programs, and the possible effects of new programs.
This in turn
spurs the creation of more and more advanced types of computerized information systems with two basic principles at their core: full consolidation of relevant data and full circulation of this [sic] data to all appropriate agencies.
At the same time, sight should not be lost of the fact that the very essence of rationalisation on the informational plane is to search out, and take exclusive account of, information deemed relevant to the organisational tasks at hand. Thus, a truly rational organisation is as concerned with weeding out and disregarding information as it is with collating it. This, of course, is necessary if an organisation's decision-making processes are to function effectively. In the words of James Beniger, "rationalization might be defined as the destruction or ignoring of information in order to facilitate its processing". Nevertheless, the ambit of relevance has increased in step with the expansion of organisational responsibilities; and developments in IT (increasing data storage capacity, etc) have reduced the detrimental impact that collection of irrelevant information can have on the efficiency of organisational decision-making processes. Hence, in many cases, the concern for relevance has not resulted in a lessening of the amount of information gathered by organisations, nor a lessening of their informational appetite.
At a very general level, then, contemporary organisations' enormous appetite for information, along with their concomitant interest in developing and utilising IT, can be viewed as a function of the importance of reflexivity and rationalisation in modern society. This, of course, raises the more challenging question of why reflexivity and rationality have become so important. To find a satisfactory answer to this question, however, would rapidly transcend the boundaries of this thesis. Accordingly, I let the question lie.
More specifically, the interest in information and advanced IT reflects a concern to improve organisations' efficiency of performance, expressed, for example, in terms of maximising output and minimising costs. Other concerns have also been relevant, such as a desire on the part of government agencies to de-politicise a crisis by attributing its root cause to lack of necessary information. Kenneth Laudon, for instance, identifies this sort of desire behind the proposal in the late 1960s to establish a National Data Centre in the USA. Alternatively, an organisation's use of advanced IT can be aimed at giving an impression of efficiency, thus enhancing the organisation's status or attractiveness vis-à-vis, say, potential resource providers. In this regard, it should not be forgotten that IT is more than just a set of tools; it has symbolic/totemic dimensions too (eg, as an icon of progress and modernity). Concomitantly, emotional factors can play a role in modern organisations' enthusiasm to take on board new IT products. For instance, the growing sophistication of IT appeals to humans' innate fascination with the "technically sweet" in the form of advanced, push-button gadgetry.
Nevertheless, it is the concern to improve performance efficiency which has been of primary importance in motivating organisations to intensify their utilisation of information and advanced forms of IT. In the private commercial sector, the main end of increased efficiency is typically economic gain. For public sector organisations, improved efficiency typically serves other goals as well. One such goal is the defence and extension of national sovereignty. Another goal - at least in liberal democratic states - is the enhancement of citizens' welfare. All three goals, along with measures to realise them, have often been inter-linked; the connections between capitalism, military activity and welfare politics are numerous and significant. Many of the conditions for the emergence of large information systems in the civil sector have been laid through the prior bureaucratisation of military processes, while the latter have frequently been stimulated by, and generators of, capitalist economic enterprise.
These connections notwithstanding, the enhancement of citizens' welfare has been prominent in motivating and justifying expansion of civil sector agencies' gathering of data on citizens in liberal democratic states. The establishment of social welfare schemes has gone hand-in-hand with growth in the amount of citizen data collected by civil sector agencies. The more ambitious and/or discriminating these schemes have become, the greater has been the need for "fine-grained" assessments of citizens based on correspondingly detailed personal data. This need has usually been justified in terms of ensuring that the distribution of services and benefits is just; ie, ensuring that these goods flow only to those citizens who need and/or legally qualify for them.
In recent years, more entrepreneurial considerations have also played a role in motivating civil sector agencies' intensified processing of personal data. Under the sway of economic exigencies and the business-inspired ideals of "New Public Management", many current Western governments appear to be primarily concerned with cost-cutting. Their intensified utilisation of citizen data seems increasingly motivated by a fiscal imperative, a desire to reduce waste, fraud and abuse with respect to government services. A good example of this fiscal imperative at work is the dramatic increase over the last 15 years or so in systematic computerised matching of personal data stored in different government agencies' registers. Analysis of this development in the USA and Australia shows that these matching schemes aim mainly at detecting instances in which persons have received excessive government benefits or failed to pay appropriate taxes. The recent effort by various governments to foster the building of so-called "information infrastructures" and "information superhighways" is also inspired to a large degree by economic concerns. These concerns embrace more than simply a desire to cut government costs. Also evident is a desire to stimulate domestic economic productivity. Realisation of these goals is usually envisaged as depending on, and being driven by, a vigorously competitive private sector. Thus, there is growing pressure to open up the telecommunications industry to competition and improve information synergy between the public and private sectors.
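The kind of computerised data matching driven by this fiscal imperative can be sketched in minimal form (all register names, thresholds and figures here are hypothetical): records from a benefits register are compared against a wage register to flag persons who receive a benefit despite income above an eligibility threshold.

```python
# Hypothetical eligibility threshold for the benefit in question.
INCOME_THRESHOLD = 25_000

# Hypothetical registers held by two different agencies, keyed by PIN.
benefits_register = {"PIN-1": "unemployment", "PIN-2": "unemployment"}
wage_register = {"PIN-1": 0, "PIN-2": 40_000, "PIN-3": 30_000}

def match_registers(benefits, wages, threshold):
    """Return PINs of benefit recipients whose registered wages
    exceed the eligibility threshold."""
    return [pin for pin in benefits if wages.get(pin, 0) > threshold]

flagged = match_registers(benefits_register, wage_register, INCOME_THRESHOLD)
# flagged contains "PIN-2" only: a candidate for administrative review,
# not in itself proof of fraud.
```

The sketch also illustrates why data quality matters in such schemes: a stale or mistyped wage entry would flag an innocent person, which is precisely the sort of misapplication of data noted earlier in this section.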
Underlying these economic concerns is an awareness that much of contemporary economic activity is based on the production and exchange of information. Concomitantly, information is increasingly being regarded as a valuable resource in itself. The concept of "information resource(s) management" (IRM), for example, has become a popular element in current managerial policy making. IRM treats information as a resource of considerable economic (and, to some extent, administrative and political) value to be managed like other important resources. Closely linked to the IRM concept in both concern and connotation are notions such as "data warehousing" and "data mining".
There exists also a rapidly growing market in information services, a market in which information as such, and particularly personal data, can be bought and sold for significant financial sums. This market is not limited to the private sector; it increasingly embraces the data holdings of public sector organisations as well, leading to a growing cross-sectoral movement of data. To some extent, this cross-sectoral movement is officially sanctioned and encouraged by policies designed to improve synergy between the public and private sectors. But there are also examples of extensive cross-sectoral trading of personal data which is in breach of law. This illegal information trade underlines again the considerable economic value that has come to attach to particular kinds of personal data.
In the context of the discourse out of which data protection laws have sprung, the developments canvassed above would scarcely have aroused concern but for two inter-related factors. The first is that these developments have contributed to a marked increase in the ability of organisations to monitor systematically the activities of those with whom they deal. Expressed alternatively, the developments have augmented the surveillance capabilities of organisations. The second factor is that with this increase has come an enhancement of organisations' control capabilities - ie, their ability to exert influence over those who are the subjects of surveillance. In practice, the two types of capability are intimately linked: surveillance is usually carried out for control purposes, and has a controlling effect.
Numerous sociologists have observed that the scope of surveillance and social control in contemporary society is at an unprecedented high. Thus, David Lyon characterises contemporary society as a "surveillance society" in which "[p]recise details of our personal lives are collected, stored, retrieved and processed every day within huge computer databases belonging to big corporations and government departments". According to Lyon, "to participate in modern society is to be under electronic surveillance". Anthony Giddens describes surveillance (ie, "the supervisory control of subject populations") as one of the four basic "institutional dimensions" of modernity. Similarly, James Rule et al view "development of efficient systems of mass surveillance and control" as "one of the distinctive sociological features of advanced societies". They go on to claim that
[n]ever before our own era have large organizations been able to remain in direct interaction with literally millions of persons, both keeping abreast of their affairs and reaching out with authoritative bureaucratic action in response to such monitoring.
In light of the focus of Part III, it bears reminding that modern systems for surveillance and control are directed not simply at persons qua individuals but also at groups and organisations.
At the same time, though, surveillance and control levels have not increased uniformly across the board. While there has been growth of systems of mass surveillance and control (ie, systems by which organisations monitor and influence the behaviour of large numbers of people - both as individuals and as collective entities), this has occurred against the background of a relative decline, at least in numerous Western countries, in the levels of surveillance and control exercised by many small-scale groups, particularly families and neighbourhoods. Concomitantly, we should be careful about assuming that we have less privacy now than persons enjoyed previously, or that the loss of privacy engendered by mass surveillance is more distressing than the loss of privacy engendered by traditional, local surveillance.
Further, the growing pervasiveness of mass surveillance and control systems is not mainly the result of some sort of clandestine, organisational conspiracy with purely repressive intent. Many surveillance and control systems have been introduced for benign purposes and supported by large sections of the populace. In the words of Lyon, "the story is a subtle one, and cannot be reduced to any crude categories that assume that surveillance is born of malign collusion of economic and political power".
The emergence of systems of mass surveillance and control is the reverse side of the technological and organisational developments canvassed previously in this section. Surveillance is, for example, a kind of "reflexivity" - as Giddens notes. It is also an integral part of rational administration.
In his seminal work, Private Lives and Public Surveillance, James Rule identifies three major factors that have contributed to the development of systems of mass surveillance. The first is the growth in social scale. "On a number of counts", Rule writes, "the growth of scale and the extension of surveillance capacity virtually presuppose each other".
Another factor is the increasing symbiosis of the various surveillance systems run by different organisations. This symbiosis is hinted at previously in this section. As Rule notes, there are powerful economic incentives for increasing symbiosis: independently collecting data is costly for organisations, and symbiosis improves their ability to build a fuller picture of their clients and to carry out cross-sectoral punishment of client behaviour that is harmful to their interests.
A third factor, also noted previously in this section, is the growth of "fine-grained concern" by organisations for the affairs of their clients. By "fine-grained concern" is meant organisations' interest in dealing with clients discriminatingly, "according to precise reckoning of subtle differences in each client's circumstances and background". This concern is not always foisted by organisations on an unwitting public; rather, it is very often engendered by popular pressure to see justice done in various ways.
The above three factors are undoubtedly central in explaining the growth of modern systems of mass surveillance and control, at least in Western democracies. But they are by no means the only relevant factors. Closely linked to the symbiosis factor is the tendency for organisational demands for information to spawn other such demands, thus leading to the creation of new surveillance systems. Factors of more ideological character are relevant too. Lyon, for example, points to the rise in the 1980s of "New Right" ideology as contributing to the growth of organisational surveillance in many Western countries. Such ideology opens the way for extensive monitoring of both the political and consumerist behaviour of citizens. This is due to its emphasis on buttressing the power of the state to fight criminality and protect national security on the one hand, and to stimulate commercial enterprise on the other. More generally, sight should not be lost of the pivotal role played by military factors in establishing the foundations for contemporary systems of mass surveillance and control.
Finally, there are technological factors. These deserve closer analysis here, not least because their exact role in relation to surveillance and control is frequently at issue in data protection discourse.
It is sometimes claimed that technologies are neutral. While this can be true in the abstract, technologies are never introduced into, or used in, a social vacuum. In practice, the context in which technologies are used tends to undermine any a priori neutrality they might enjoy. Moreover, technologies often have an inherent logic or bias of their own which strongly influences (though does not necessarily determine) the way in which they are used. New technologies can also roll back various constraints that have prevented occurrence of a particular kind of activity. Thus, they help to shift the parameters of social interaction, creating new opportunities for activity and magnifying existing opportunities. In doing so, they can also create new conflicts as well as accentuate old ones.
These remarks should be borne in mind when considering the impact of technological factors on developments in mass surveillance and control. For these developments are in large part facilitated by new forms of technology. At the same time as the latter create new avenues for human interaction and self-realisation, they often provide new opportunities for registering and disclosing data about ourselves. We see this, for example, with the increasing amounts of data registered in connection with activity on the Internet.
To some extent, the developments in mass surveillance and control are also driven by new forms of technology. This is particularly the case in respect of technologies with an inherent bias towards enhancing surveillance and/or rolling back privacy. Remote sensing satellites that gather data on private land use and movement of persons; automated dialling machines that are programmed to call selected telephone numbers and make pre-recorded sales pitches; passive millimeter wave imagers (PMWIs) that "see through" ordinary clothing and walls - these are just a few of the multitude of technologies with such a bias.
Nevertheless, it should not be forgotten that other technological developments can, and do, enhance personal privacy and autonomy. In the words of Rule, "innovations in the technology of control ... have as their counterpart innovations in the technology of evasion". There is a range of technological mechanisms - often termed "privacy-enhancing technologies" or "PETs" - for directly reducing or eliminating the collection and further processing of data that can be used to identify persons. More subtly, telecommunications technology in many of its various applications (teleshopping, telebanking etc) frees persons from having to make face-to-face contact with service providers, thereby also freeing them from a situation in which transactional behaviour is traditionally monitored and moulded. In the domestic sphere, privacy and autonomy have been enhanced to an unprecedented extent by a wide range of appliances and tools, including automobiles, telephones, television sets, washing machines, bathrooms and air conditioning. Koen Raes goes so far as to claim that privacy today "is as much a result of modern technology as technology is a threat to the private lives of citizens".
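One common PET technique is pseudonymisation, which can be sketched as follows (the key, field names and record are hypothetical): direct identifiers are replaced by keyed hashes before data are passed on, so that records can still be linked to one another but cannot, without the key, be linked back to the person.

```python
import hashlib
import hmac

# Hypothetical secret key, held only by the original data controller.
SECRET_KEY = b"held-only-by-the-data-controller"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed pseudonym
    (an HMAC-SHA256 digest of the identifier)."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"pin": "010150-1234", "diagnosis": "X"}
safe_record = {
    "pseudonym": pseudonymise(record["pin"]),  # replaces the raw PIN
    "diagnosis": record["diagnosis"],
}
# A recipient can link records sharing the same pseudonym, but cannot
# recover the original PIN without the secret key.
```

The design choice here is deliberate: because the pseudonym is stable, longitudinal analysis remains possible, while re-identification is pushed back to whoever controls the key.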
Thus, if we consider technologies as an undifferentiated mass (ie, as technology rather than as various technologies), we see they have a certain double-sidedness in relation to, say, privacy. Technology both enhances and detracts from privacy. It gives with the one hand and takes with the other. Stefano Rodotà terms this a "paradox of privacy". I would rather term it a paradox of technology. This paradox occurs not just in relation to technology as an undifferentiated mass; numerous individual technologies also both enhance and detract from privacy. Telephones are an obvious case in point: they free us from face-to-face contact at the same time as they provide another point of contact through which our privacy can be disturbed. A similar double-sidedness can be noted in the impact of technology on personal and organisational empowerment. Technologies frequently have the potential to empower persons and organisations at the same time as they have the potential to disempower them. Video cameras are one such technology.
There is solid evidence to suggest, though, that the degree to which each type of potential is realised tends to follow existing power structures. For instance, a range of studies show that new forms of IT tend to be employed in ways that primarily serve the interests of dominant organisational elites. Hence, the main effect of introducing advanced computer networks in government bureaucracies seems usually to be the further consolidation of existing bureaucratic power, with improvement of citizen participation in government being only a secondary effect at best. The evidence of such studies suggests that a great deal of caution should be exercised before embracing claims that new forms of IT - particularly computer networks - are likely to bring about the demise of totalitarianism or hierarchies generally. Certainly, new computer networks will increase the difficulties experienced by totalitarian regimes - indeed, any regimes - in controlling data flow, but this does not mean that such regimes are thereby bound to fall. Furthermore, sight should not be lost of the fact that the networks open up new avenues for surveillance which can render illusory much of the freedom and privacy of using them.
While a great deal of data protection discourse has been preoccupied with technological threats to privacy and related values, it nevertheless appears to be infused by growing awareness of the double-sided character of technology as described above. We see clear evidence of this awareness in the recent burgeoning of proposals to apply IT in the service of enhanced privacy and data protection. At the same time, there is little evidence to indicate that the architects of data protection laws share a deep-seated hostility to computers and other forms of IT. Certainly, they share a suspicion of the potential dangers of such technology. Further, they share a desire to ensure that technological developments are subjected to assessment and, to some extent, regulated. Moreover, it is not to be denied that many early data protection laws have singled out computerised data processing as their sole object of control. But data protection laws are far from neo-Luddite in aim or inspiration.
What are the key features of modern systems of mass surveillance and control which have aroused concern in data protection discourse? One feature is their growing pervasiveness. This growth has mainly occurred along two axes. First, there has been an expansion across national boundaries. We see, for instance, the development of increasingly sophisticated transnational systems for policing as evidenced in the establishment of computerised information systems that are accessible by, and run for the benefit of, police agencies in multiple countries. Secondly, while systems of mass surveillance and control have traditionally been linked primarily to state institutions, we see such systems spreading into the private sector from the 1960s. Commercial transactions and consumption patterns in the private sector are increasingly subject to systematic monitoring for a variety of purposes - credit assessment, marketing, product evaluation, criminal investigation, etc - which tend ultimately to serve, directly or indirectly, the ends of social control. Part and parcel of this development is the relatively recent emergence of large numbers of private organisations (eg, credit reporting agencies, marketing firms) for which such monitoring is the sole or primary activity. Further, many surveillance and control functions that have traditionally been the preserve of public sector regulatory agencies, such as the police, are devolving gradually to private organisations.
Another key feature of concern has to do with surveillance techniques: these are now automated, de-personalised, capital-intensive operations to a far greater extent than in the past. As a result, physical proximity between the human watchers and the watched is decreasing. Today's techniques are often less physically obtrusive than their earlier counterparts, and more capable of transcending light conditions, physical barriers and limitations of time and space. Concomitantly, they allow for the gathering of information that previously could only have been collated by resorting to traditional coercive methods of intrusion. They are aimed increasingly at forestalling undesired action rather than simply tracking down such action once it has been carried out. Thus, instead of just targeting specific individuals, they tend also to place large numbers of persons under suspicion. An obvious case in point is the growing use of advanced optical, audio and sensory surveillance tools (eg, video cameras, microphones, infra-red sensors) as a substitute for, or supplement to, the use of security personnel. Another case in point is the ever more extensive practice of profiling for a variety of control purposes.
Profiling is just one instance of a range of increasingly used and increasingly refined techniques for monitoring and/or anticipating human behaviour through analysis of data in computerised record systems. The growing importance of such techniques for surveillance and control is a reflection of the fact that computer systems increasingly mediate, facilitate and register human activities. This fact reflects, in turn, the growing ubiquity, miniaturisation and inter-connectivity of computer systems. The important point, though, is that our transactions leave behind them an ever-richer variety of electronic trails. These trails attach not just to extraordinary transactions but to routine patterns of life. Systems for electronic funds transfer, related systems for electronic ordering and purchase of goods, systems for electronic logging and control of access to buildings, roads, etc - all of these generate data on our everyday activities. Also noteworthy here are recently established systems for monitoring Internet activity (eg, through the use of so-called "cookies").
Common to all of these systems is that they automatically generate and register enormous masses of transactional data that can be linked to large numbers of persons, as individuals and/or as groups. These data are additional to the already extensive amounts of information we are specifically asked to provide to organisations in return for various services, or which organisations otherwise gather on us independently of transactions with them. These transactional data are often registered, or capable of being registered, without the data subjects' knowledge. Concomitantly, in the absence of data protection laws, data subjects tend to have little if any control over what is registered once they have come in contact with the system in question. While such transactional data are usually of trivial significance on their own, they can reveal much about the behaviour and personalities of the respective data subjects when linked with other data. Thus, the data can be of more than trivial significance for a variety of organisations. In the commercial sector, for example, the data will typically be helpful for the marketing of products; in the police sector, they will typically be useful for the investigation of crime. It is obvious that such uses of transactional data can threaten privacy and related values.
It should be emphasised that, although the above features of modern systems for mass surveillance and control have figured prominently in data protection discourse, they are not always directly or adequately addressed in data protection laws. Certainly, we find some aspects of these features reflected in the legislation. For instance, the increasing involvement of the private sector in mass surveillance and control explains the tendency for data protection laws to regulate both public and private sectors. Further, the privacy-invasive potential of apparently trivial transactional data is reflected in the fact that most data protection laws do not require (at least prima facie) personal data to have a predefined level of sensitivity in order to qualify for legal protection. Nevertheless, other aspects of modern surveillance and control systems, such as their increasingly transnational character, are only now in the process of being addressed by data protection laws - and not always adequately. Still other aspects have yet to receive prominent attention in data protection discourse generally. One such aspect is the growth in numbers of individual persons (as opposed to organisations) who possess, in their private capacity, the technological means to process (and, in particular, disseminate) massive amounts of personal data with increasing ease and decreasing expense.
Another major catalyst for the emergence and continued existence of data protection laws is an accumulating body of evidence to suggest that the quality of data/information utilised by numerous organisations is deficient; ie, that the data/information are insufficiently precise, correct, complete and/or relevant in relation to the purposes for which they are processed. The exact scale of the problem, though, is difficult to gauge as detailed empirical studies of data/information quality are lacking for many organisational sectors. Nevertheless, some such studies have generated alarming results.
One such study which is widely cited comes from the USA. There a survey was carried out in the early 1980s of the completeness, accuracy and ambiguity of records on persons' criminal histories and arrest warrants kept in information systems operated by the Federal Bureau of Investigation (FBI). These information systems were the National Crime Information Center Computerized Criminal History (NCIC-CCH), the Identification Division (Ident) and the National Crime Information Center Wanted Person System (NCIC-WPS). Also examined were the criminal-history record systems of three states. The study found that 54.1 percent of the records in the NCIC-CCH and 74.3 percent of the records in the Ident system, were incomplete, inaccurate or ambiguous. As for the three state criminal-history record systems, just 12.2 percent of the data in one of these systems (run by a state in the south-east) were found to be complete, accurate and unambiguous. Corresponding figures for the other two states were 18.9 percent and 49.4 percent respectively. With regard to the NCIC-WPS, 11.2 percent of recorded warrants were found to be no longer valid, 6.6 percent were found to be inaccurate in their classification of offence, and 15.1 percent were probably not capable of prosecution because they were more than 5 years old.
Similarly alarming are the results of a survey carried out in January 1998 by the Public Interest Research Group (PIRG) of the quality of credit reports maintained by the three largest US credit-reporting agencies (Experian, Equifax and Trans Union). The survey found, ia, that 29 percent of the reports contained "serious errors - false delinquencies or accounts that did not belong to the consumer - that could result in the denial of credit"; 41 percent contained invalid personal demographic data; 20 percent lacked relevant financial data ("major credit, loan, mortgage or other consumer accounts that demonstrate the creditworthiness of the consumer"); and 26 percent contained "credit accounts that had been closed by the consumer but incorrectly remained listed as open".
In other countries, too, there is sporadic evidence of significant deficiencies in the quality of data/information held in major information systems. Often this evidence emerges not from planned empirical studies of data/information quality but more incidentally (eg, in the wake of various attempts by organisations to match data in their respective registers for control purposes).
Part and parcel of the above-mentioned problems has been a paucity of academic and managerial attention to data/information quality. On the academic front, Christopher Fox et al note:
[d]espite the importance of data quality, it has received little attention. [...] No common framework for studying data quality problems, nor an agreed-on terminology for discussing data quality, has emerged from the modest efforts to date.
On the managerial front, Donald Marchand makes the following claim in relation to the private sector:
For better or worse, managers exhibit a tendency to take information quality for granted as they navigate through the many sources of information available to them. Information overload seems to be a more pressing concern than the relative degree of information quality which their information sources represent. When executives do, on occasion, perceive that a failure in decision-making is due to the poor quality of the information they used, they tend to treat the occurrence as exceptional or anecdotal, rather than as a deficiency in either the information services, sources or systems which are available to them in the company. That is, many executives would not consider a deficiency in information quality as a problem worthy of more systematic management attention!
These comments are probably also applicable to many bureaucrats working in the public sector. In Sweden, for instance, the National Audit Office (Riksrevisionsverket (RRV)) undertook a series of studies from the mid- to late 1980s of information quality in ten computerised information systems run by state agencies. The Audit Office found that data controllers were often unaware of the information quality in their respective systems. In addition, few analyses had been carried out by data controllers to establish both desired levels of information quality and the consequences of not reaching those levels. Similarly, results from an investigation of record-keeping practices of US federal government agencies carried out in 1985 by the US Office of Technology Assessment (OTA) indicated that few agencies conducted audits of the quality of the data in their record systems. In Norway, the Directorate of Public Management (Statskonsult) has recently found that Norwegian authorities have yet to develop a comprehensive strategy for defining, measuring and securing adequate quality in relation to data/information; concomitantly, there is a paucity of systematic surveys of the degree to which customers and users of government information systems are satisfied with the systems' performance.
It is noteworthy that these problems have existed despite the fact that they often have significant financial costs. More important (from the perspective of this thesis) is the fact that these problems have occurred despite the existence, in many cases, of legal rules aimed at minimising, if not eliminating, them. Many of these rules are found in data protection laws. We are thereby tempted to put a question-mark against the efficacy of such laws in ensuring adequate data/information quality. Are the relevant rules in these laws sufficiently clear as to what is required of data controllers in terms of quality assurance? Are the rules sufficiently stringent? Or do many organisations that are supposed to comply with the rules fail to do so for reasons of ignorance, apathy, indifference and/or an attitude that compliance is too burdensome? Aspects of these questions are taken up in Part IV where they have considerable bearing on determining the ability of data protection laws to minimise the potentially detrimental impact of profiling practices on data subjects.
Finally, it should be noted that a multitude of factors affect the quality of data/information. Some of these factors are basically technological in character (eg, faults with hardware and software); some are essentially organisational (eg, the extent to which information is actually used by the persons or organisations engaged in its processing); while others are primarily legal (eg, the availability and utilisation of access rights). Still others pertain primarily to human cognition. For present purposes, it is unnecessary to canvass all of these factors in detail. For the purposes of the discussion in Part IV, however, it is pertinent to stress that the factors relating to human cognition play a relatively large role in determining data/information quality. In other words, poor information quality can often be a reflection of poor thinking.
The latter point is borne out by, ia, a comprehensive study from the late 1970s of the quality of statistical data on production and consumption of energy in the USA. The study found that the most significant problem with the quality of these data was not their lack of validity (accuracy, precision or completeness) but their misinterpretation and consequent misapplication. According to Andrew Loebl, one of the study researchers, "in virtually every case where an energy analysis was said to be vulnerable on the grounds of inaccurate data, it was subsequently found to have used reasonably accurate data in an inappropriate (nonrelevant) manner". Data misapplication tended to occur because the understanding of the problems at hand, and hence of which data were relevant to those problems, was faulty. For example, data that were supposed to measure energy conservation actually measured energy consumption, while data meant to measure energy consumption in fact measured energy sales.
Another illustration of poor "cognitive quality" leading to misapplication of data is the outcome of a matching program initiated by a Swedish municipality, Kungsbacka, in the early 1980s. The aim of the program was to identify persons in illegal receipt of housing aid, and involved the matching of income data held in various data registers. The matching resulted in a large number of spurious "hits", primarily because account was not taken of the fact that the matched data registers operated with different concepts of "income". The results of this matching program illustrate the obvious but important point that many terms (such as "income"), which we use to categorise data, can have different underlying referents. This is a point that those responsible for the Swedish matching program failed to appreciate.
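The mechanics of such spurious "hits" can be sketched in a few lines of code. The following is a minimal, purely illustrative example with hypothetical data and register names (none of which come from the Kungsbacka case itself): two registers each record an "income" figure for the same persons, but one records gross income while the other records taxable income after deductions. A naive match that treats the two fields as semantically identical flags a discrepancy as a suspect case even though both figures are perfectly accurate.

```python
# Hypothetical registers: both store a field called "income", but the
# underlying concept differs (gross vs taxable income after deductions).
register_a = {"person_1": 300_000, "person_2": 250_000}  # gross annual income
register_b = {"person_1": 240_000, "person_2": 250_000}  # taxable income

def naive_match(a, b, tolerance=0):
    """Flag anyone whose 'income' figures differ between registers,
    on the (faulty) assumption that the two fields measure the same thing."""
    return [p for p in a if p in b and abs(a[p] - b[p]) > tolerance]

hits = naive_match(register_a, register_b)
# person_1 is flagged as a "hit" even though both recorded figures are
# correct -- they simply have different referents.
print(hits)  # ['person_1']
```

The point of the sketch is that the error lies not in the data but in the matching logic's conceptual assumption; no amount of record-level accuracy checking would catch it.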
These examples of poor cognitive quality are important to keep in mind when considering the potential problems posed by profiling practices. These problems are taken up in Part IV.
While the developments outlined in section 6.2 have each contributed, albeit to varying degrees, to the emergence and/or continued existence of data protection laws, they are not sufficient causes of such legislation. What has helped transform them into issues of legislative concern is a congeries of public fears about some of these developments' potential and actual effects. One set of fears relates to increasing transparency, disorientation and disempowerment of data subjects vis-à-vis data controllers. Another set of fears concerns loss of control over technology. A third set pertains to dehumanisation of societal processes.
Anxiety over increasing transparency, disorientation and disempowerment of data subjects revolves mainly around the effects of two developments: (i) growth in the amount of data gathered and shared by organisations; (ii) diminishing participation by data subjects in decision-making processes affecting them. On the one hand, these developments involve increases in the knowledge organisations have about the individuals, groups and other organisations with whom they deal. With a rational basis in Francis Bacon's adage "knowledge is power", many people fear that this knowledge increase will make it easier for organisations to influence data subjects' behaviour in ways that unfairly undermine their autonomy and integrity. At the same time, they fear the possibility that the data disseminated within and between organisations may be invalid, misconstrued or misapplied in some way, thereby leaving the data subject(s) vulnerable to unwarranted interference. On the other hand, the above developments involve increases in the complexity of cross-organisational data flows and a blurring of organisational lines. Accordingly, it is feared that these developments will tend to make it more difficult for data subjects to trace the flow of data on themselves. This difficulty will threaten, in turn, data subjects' control over what happens with their various virtual personas. Similarly, their ability to identify who or what is responsible for each of the myriad transactions involving their data, plus the full parameters of these transactions, will tend to be reduced.
The sum of these fears is a general anxiety that the above developments, if unchecked, will result in an unprecedented aggregation of power in large organisations, thereby threatening the bases for democratic, pluralistic society. This anxiety is well-expressed in the report of the Australian Law Reform Commission (ALRC) recommending enactment of data protection legislation:
If privacy protection were not strengthened, it would be difficult for Australian society to maintain its traditions of individual liberty and democratic institutions in the face of technological change, which has given to public and private authorities the power to do what a combination of physical and socio-legal restraints have traditionally denied to them.
Similarly, in a famous decision of 1983 - responsible, in part, for the subsequent strengthening of German data protection legislation - the German Federal Constitutional Court (Bundesverfassungsgericht) observes that modern forms of data processing threaten the free development of personality by making it increasingly difficult for citizens to determine who knows what about them. The Court goes on to note that this difficulty can have a chilling effect on citizens' social engagement, thereby impairing pluralism and democracy.
The second set of fears revolves mainly around the spiralling complexity of IT, information flows and organisational patterns. People fear that the environment resulting from this complexity will elude full human comprehension. They warn of a future in which humans will increasingly come under the sway of runaway technology that cannot be effectively steered. Such fears are exemplified in the following comments by Justice Michael Kirby, who has been central in international efforts to arrive at common standards on data protection and security of information systems. For Kirby, the "fundamental risk" to human society
derives from the apparent incapacity of the international community and of the representative democratic process to keep pace with the social implications of technology. [...] The dazzling complexity of modern technology leaves many bureaucrats and lawyers bemused, even intimidated. There is, occasionally, a sense of despair that the subject matter of proposed regulation will ever be understood. If understood, the chances are that the target will move before the snail-pace procedures of regulation are set in train, let alone adopted.
The third set of fears revolves mainly around the encroachment of automated/machine processes on human interaction. These fears envision the gradual development of an instrumental, mechanistic conception of humans. Concomitantly, they portend a future in which human relations are subjugated by an unfeeling, purely instrumental rationality, of which computer technology is one manifestation. In such a society, it is claimed, human spirit will give way to moral indifference and fatalism. This set of fears, together with the second-mentioned set, is best expressed in the works of Jacques Ellul and Joseph Weizenbaum, though neither of these works has figured prominently in the discourse dealing specifically with data protection.
Of the three sets of fears described above, it is the first-mentioned that has predominated in data protection discourse and played the greatest role in kick-starting enactment of data protection legislation. To some extent, though, all three sets of fears overlap with each other. Moreover, many of the themes of the second and third-mentioned sets are reflected in the debate over use of PINs and the automation of decision-making processes.
While all three sets of fears ultimately show concern for the quality of human society generally, their primary concern has been for the privacy, autonomy, integrity and dignity of individual, natural/physical persons. In some instances, though, concern has also embraced the interests of private sector groups and organisations in the role of data subjects. And, very occasionally, concern has even embraced the interests of certain state institutions.
The above fears have been nourished by certain concrete experiences. Of especial importance in this respect have been the traumas of fascist oppression prior to and during World War Two. Also important, particularly in the USA, has been the Watergate scandal of the early 1970s.
The above fears have also been nourished by particular concrete manifestations of information technology. In this respect, it was arguably the mainframe computer in the form of the IBM 360 series which played a significant role in nourishing public fears during the 1960s and 1970s. For the average person, what seemed especially threatening about the mainframes was a combination of their physical bulk, their placement outside the public domain and their concomitantly mysterious but (for those times) powerful data-processing potential. In the course of the last 15 years, though, these threatening characteristics have lost much of their impact due to computers' increasing ubiquity, miniaturisation and user-friendliness. In terms of technology, what arguably tends to nourish public fears now is not so much any one image of a certain type of computer as a more variegated image of a web of interconnected technologies (video surveillance cameras, smart cards, vehicle tracking devices, etc) able to track people's myriad patterns of behaviour.
It is not just concrete experiences and concrete manifestations of IT which have nourished the above fears; certain dystopian visions have also played a significant role. In data protection discourse, the most salient of these visions stems from George Orwell's novel, Nineteen Eighty-Four, published in 1949. The novel's depiction of a mercilessly totalitarian future has become the primary point of reference for envisaging the potential endpoint of present tendencies towards greater surveillance and control. What makes Orwell's envisaged dystopia particularly relevant for contemporary discourse on these tendencies is the insightful depiction in the novel of the dynamics of social control. Control is seen as resting, in part, on state monopolisation and manipulation of information (witness, for instance, the activities of the "Ministry of Truth"). Control is also seen as resting on the use of relatively sophisticated technology for keeping tabs on the citizenry (witness the ubiquitous telescreen monitors). Other advanced forms of IT, however, are absent from the novel. Especially significant with the telescreen monitors is that the citizenry are never certain as to when they are in fact being watched. Orwell shows how this uncertainty tends, of itself, to produce behaviour in conformity with the wishes of the state.
This last point is central to a second dystopian vision of note; namely, the vision of "panopticism" expounded initially by Michel Foucault on the basis of Jeremy Bentham's famous prison plan of 1791. For Foucault, the control dynamics of Bentham's Panopticon involve inducing in the prisoner
a state of conscious and permanent visibility that assures the automatic functioning of power; ... in short, that the inmates should be caught up in a power situation of which they themselves are the bearers.
The Panopticon achieves this by "dissociating the see/being seen dyad": the seer (guard) can see all without being seen; the seen (inmates) are totally exposed without ever seeing. Control, then, rests upon an informational imbalance between the observers and the observed. The latter are made transparent vis-à-vis the former, but not vice-versa.
While the extent to which these control dynamics actually permeate contemporary societies is debatable, Foucault's vision of panopticism alerts us to the intimate connection between surveillance and control, and to the subtlety with which the latter mechanisms can work. From a data protection perspective, the linking of control to informational imbalance between observers and the observed is particularly important. Also important is how panopticism shows that the mere registration of personal data - quite apart from the actual use of the data in decisions affecting the data subject(s) - has disciplinary potential. Accordingly, the notion of panopticism figures increasingly in data protection discourse. Numerous scholars are taking up Foucault's analysis and adapting it to take account specifically of modern applications of information technology. Nevertheless, it is my impression that the notion of panopticism is still not as prominent in the general public consciousness as Nineteen Eighty-Four. And the latter has undoubtedly played a more significant role in igniting debate in Western societies on the need for greater privacy and data protection to counter the growing pervasiveness of systems of mass surveillance.
Some of the fears described above - particularly the first set - are reflected in recent surveys of public attitudes to privacy and data protection issues. Although such surveys often suffer from methodological weaknesses, they do provide evidence of high levels of public concern for privacy and data protection, at least in the abstract. In the USA, where there is a relatively long history of such surveys, this concern appears to have increased significantly over the last two decades, though it seems to have levelled off in the mid-1990s. The surveys also provide evidence of a growing feeling amongst people that they are losing their privacy and/or ability to control how data on themselves are being used. Accompanying this feeling are low levels of public trust that organisations will not misuse personal information.
On the basis of the survey material from the USA and Canada, Westin concludes that the extent to which persons express concern for threats to their privacy tends to increase the less they trust that their interests will be cared for by organisations and technology. Westin has also expressed trust levels in terms of "alienation", claiming that the more alienated persons feel from technology and organisations, the more likely they are to be concerned about threats to their privacy.
These levels of alienation and distrust, along with the fears they manifest, are arguably part of a more general trend in contemporary society whereby human interaction and self-perception are increasingly pervaded by consciousness of risk. By "risk" is meant the possibility of human action triggering events with detrimental consequences for society. The theme of risk has come into focus in recent sociology, primarily through the work of Ulrich Beck. According to Beck and others, a major distinguishing feature of present-day society is that human behaviour is weighed down by a growing awareness of threat, vulnerability and unpredictability. We experience a gradual loss of "cognitive sovereignty" over the parameters and consequences of our actions. We feel less able to divine what is dangerous and what is safe for ourselves. At the same time, our apprehension of danger is focused increasingly on what we do not see, what we do not feel, what we do not know. We are more and more wary of "safe" appearances, promises and assurances. Our actions are increasingly motivated by anticipated dangers of the future, so-called "Not-Yet-Events".
While sociological discourse on risk society often focuses on threats to the natural environment brought on by industrial processes, it is clear that the growing pervasiveness of systems of mass surveillance and control, and associated developments in utilisation of personal data, help to constitute the above features of risk consciousness. As noted previously in this chapter, we are faced with information systems of growing complexity and diminishing transparency; data on ourselves - both as individuals and as members of various collective entities - are being handled by many persons and organisations of which we know little or nothing. Exacerbating the anxiety brought on by this loss of cognitive sovereignty are the dystopian visions of Orwell, Foucault and others.
The expansion of risk consciousness makes it apposite to view data protection laws as concerned with shoring up public trust in modern information systems or, more specifically, in the way organisations process personal data. This concern manifests itself in, ia, the basic principles of data protection laws, particularly those principles, such as that of purpose specification, which are directly aimed at promoting foreseeability in data-processing outcomes and thereby reducing deficits in data subjects' cognitive sovereignty. However, the effort at generating trust in information systems is manifest not just in the contents of data protection laws but in their very legality. As Burkert points out, data protection laws attempt to generate trust in information systems largely by utilising public trust in the efficacy of legal norms.
At the same time, the way in which data protection laws - particularly the first pieces of such legislation - have been drafted shows traces of a deficit in cognitive sovereignty on the part of legislators and other policy makers. The prominence in these laws of procedural controls and relatively diffuse, open-ended rules, together with the creation of data protection authorities, are partly symptomatic of legislative uncertainty about the appropriate regulatory response to the fears outlined above. Somewhat paradoxically, though, it is likely that certain of these features - namely, the use of relatively diffuse, open-ended rules - can have a debilitating effect on the generation of legal certainty and, thereby, of public trust.
Despite heightened risk consciousness and large numbers of people expressing concern for privacy and data protection, actual examples of large-scale, popular movements with such concern figuring prominently on their agenda are few. The dry remark of a former member of the US Congress seems appropriate: "privacy is an issue in which public concern is a mile wide and an inch deep". Concomitantly, the process leading to enactment of data protection laws has been steered in most cases only indirectly by pressure from the general public. Of greater influence have been the prescriptions of a relatively small, transnational network of concerned experts, such as Spiros Simitis, Michael Kirby, Jan Freese and Hans Peter Gassmann.
The three sets of fears set out previously in this section are not the only fears that have acted as catalysts for the emergence of data protection laws. A fourth set of fears, touched upon in Part I, has also had an impact. Unlike the other three sets, this fourth set is primarily economic in character and is shared mainly by governments and business organisations. Moreover, it helps to explain not the initial emergence of data protection laws but the adoption of data protection instruments after the first wave of laws was in place.
One aspect of this set of fears revolves around the desire by governments and business groups to stimulate consumer interest in participating in various electronic transactions, particularly those of a commercial nature. It is feared that, without data protection legislation in place, consumers will not have sufficient confidence to engage in these transactions. This fear has manifested itself mainly in the last few years, as full-scale electronic commerce has become technically feasible and economically desirable.
Another aspect of this fourth set of fears revolves around the fact that many data protection laws allow for restrictions to be put on the flow of personal data to countries without sufficient levels of data protection; thus, there is a possibility for these laws to hinder transnational data flows, thereby disrupting commercial and/or governmental processes. This possibility has helped prompt national governments to enact data protection laws that are recognised as adequate by countries already in possession of such laws. The clearest example of this process at work is the passage of the UK Data Protection Act of 1984: the desire by the UK government to avoid restrictions on the flow of data into the country was decisive in its decision to enact the legislation. However, it is in relation to the adoption of international data protection instruments, especially the OECD Guidelines and EC Directive, that fear of disrupted data flows has had the greatest impact. This fear has played a significant role in stimulating adoption of these instruments. For instance, Justice Michael Kirby, who was Chairman of the expert group responsible for drafting the OECD Guidelines, writes:
It was the fear that local regulation, ostensibly for privacy protection, would, in truth, be enacted for purposes of economic protectionism, that led to the initiative of the OECD to establish the expert group which developed its Privacy Guidelines. The spectre was presented that the economically beneficial flow of data across national boundaries might be impeded unnecessarily and regulated inefficiently producing a cacophony of laws which did little to advance human rights but much to interfere in the free flow of information and ideas.
Similarly, work on drafting the EC Directive was motivated to a large extent by fear that disharmony between the various data protection regimes of EC/EU member states would hinder realisation of the internal market. However, the possibility - alluded to by Kirby - of national data protection laws being passed for the purposes of economic protectionism seems to have been absent from the concerns of the EC organs when they set about drafting the Directive.
Fears about economic protectionism were aired mainly in North American quarters during the late 1970s and early 1980s. They tended to result in allegations that an underlying motivation for the enactment of many of the national data protection laws in Europe was to protect the nascent, European data-processing industries from foreign competition. A typical example of this claim is found in an article by Pinegar, who writes:
The salutory motive of privacy protection is not the only and perhaps not the primary impetus behind the European data protection legislation. There can be no question that, either intentionally or consequentially, these laws (particularly the Austrian Act and others granting protection to legal entities) are effective non-tariff barriers to the free flow of commercial and other information. [...]
Th[e] predominant position of the US in high technology fields has been an economic thorn in the flesh of European and Third-World industries. Some of these countries fear and detest what is perceived to be technological and economic dependence on the US as a result of its leading role in the development of the data processing industry. Thus, in addition to the need for privacy protection, these nations have been prompted economically and politically to enact restrictive data protection legislation.
Such allegations reflected unease, especially on the part of US trade representatives, over the spate of European data protection laws enacted in the mid- to late 1970s. It was feared that these laws had been introduced too quickly, without adequate discussion of their economic consequences, and would hinder the international expansion of the data-processing industry, which was dominated by American firms. Criticism focused upon two features of these laws. The first was that the laws contained provisions restricting transborder flows of personal data in certain circumstances. The second feature, essentially an expansion of the first, was that some of the laws protected data on legal persons in addition to data on individuals, with the consequence that the scope of the restriction on transborder data flows was widened.
Very little solid evidence has been provided to back up the allegations of economic protectionism. Statements made in 1978 by Louis Joinet (former French Minister of Justice and a principal drafter of the French data protection law of 1978) and by Gerhard Stadler (who helped shape the Austrian data protection legislation of 1978) have been cited as relevant evidence. However, it is difficult to infer anything conclusive from these statements about why the French and Austrian data protection Acts were passed in the form they were, particularly as few details are given about the context in which the statements were made. The statements are very general and, in the case of Stadler, rather ambiguous. It is even more difficult, of course, to infer from them anything conclusive about why other European countries enacted data protection laws.
Regarding the legislative history of the Norwegian Personal Data Registers Act, the Act's preparatory documents make no mention of any need to protect Norwegian industry from foreign competition. As for the actual consequences of the Act's regulation of transborder data flows, empirical studies have found no evidence of this regulation being practised in a protectionist manner. The same applies with respect to regulation of transborder data flows pursuant to the first data protection laws of Germany, Austria, Sweden, France and the UK. In relation to Denmark's Private Registers Act, however, Blume notes that the rationale for the Act's rules on transborder data flows is not concerned solely with protection of individual persons; the rules are also grounded upon a desire to build up a national computer industry, such that public or private enterprise in Denmark can continue to operate independently of happenings in other countries. Nevertheless, it would appear that the motivating force behind Danish regulation of transborder data flows is not economic protectionism as such but a desire to ensure that enterprises in the country are able to continue functioning in the event of some foreign crisis.
The final piece of evidence advanced in support of the protectionism theory is the fact that some of the European data protection laws expressly protect data on legal persons. In the opinion of Pinegar and Grossman, this means that these laws cannot have been passed simply in order to protect the right of privacy; hence, they must also have been passed for the purpose of economic protectionism. This argumentation rests upon two assumptions: (i) that the purpose of "pure" data protection laws is only to safeguard privacy; and (ii) that privacy as a concept and legal right can embrace only natural/physical persons. As Part III of this thesis shows, both assumptions are highly questionable.
While the protectionism theory seems to lack validity in relation to national data protection laws passed in Europe in the 1970s and 1980s, it is perhaps less easily refuted with respect to the EC Directive. There is plenty of evidence indicating that the EC Commission, together with the Council of Ministers, first took up the issue of data protection in the 1970s largely out of concern for fostering development of the internal market and European IT-industry. We find traces of such a concern also in the Commission Communication setting out the first proposal for the data protection Directive. But to what extent this concern accurately reflects a desire for economic protectionism is unclear; equally unclear is the extent to which final adoption of the Directive took place in order to fulfil such a desire. Nevertheless, it can scarcely be overlooked that implementation of the Directive - particularly Arts 25 and 26 - might well have protectionist benefits for data controllers established within the EU.
A range of legal factors have contributed to the emergence and continued existence of data protection laws. These factors can be divided into two main categories according to the kind of contribution they have made. First, there are factors that have served as sources of inspiration for the development of data protection laws by positively providing the latter with a normative basis. In the following, these factors are termed "positive legal factors" for short. Secondly, there are factors that have contributed to the emergence of data protection laws by failing to tackle adequately the problems arising as a result of the developments outlined in sections 6.2 and 6.3. In the following, these factors are termed "negative legal factors" for short. As shown below, the two categories are not mutually exclusive; some of the legal factors concerned work both "positively" and "negatively".
Legal sources of inspiration for the development and continued existence of data protection laws are spread over a variety of instruments: international treaties, national Constitutions, other national legislation and judicially created doctrines. In the following, I deal with relevant aspects of each of these types of instruments in the order the instruments are set out above. I focus mainly on those factors that exercise, or have exercised, a substantial influence on the development of data protection laws.
Much of the formal normative basis for law on data protection is provided by catalogues of fundamental human rights as set out in certain multilateral instruments, notably the Universal Declaration of Human Rights (UDHR) of 1948, the International Covenant on Civil and Political Rights (ICCPR) of 1966, and the main regional human rights instruments. The normative significance of these catalogues is expressly recognised in some of the data protection laws themselves, with the CoE Convention and the EC Directive being two prime examples. A variety of provisions in the catalogues inspire the central principles of data protection laws. Examples here are provisions proclaiming rights to liberty, freedom of thought, freedom from discrimination and freedom from torture.
However, it is provisions proclaiming a right to privacy or private life which constitute the most direct inspiration for the principles of data protection laws. The central significance of such provisions is manifested, ia, in the CoE Convention and EC Directive, both of which single out the "right to privacy" from other rights of data subjects as being especially pertinent in the context of data protection. It is also manifested in case law developed pursuant to Art 17 of the ICCPR and Art 8 of the ECHR. The former provision reads:
1. No one shall be subjected to arbitrary or unlawful interference with his privacy, family, home or correspondence, nor to unlawful attacks upon his honour and reputation.
2. Everyone has the right to the protection of the law against such interference or attacks.
Article 8 of the ECHR reads:
1. Everyone has the right to respect for his private and family life, his home and correspondence.
2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.
Both provisions have been authoritatively interpreted as embracing many of the core principles of data protection laws. Indeed, case law developed pursuant to both provisions indicates that each has the potential to embrace all of these core principles. The clearest indication of such potential is found in General Comment 16 of the (UN) Human Rights Committee. This states that Art 17 of the ICCPR requires implementation of essential data protection guarantees in both the public and private sectors:
The competent public authorities should only be able to call for such information relating to an individual's private life the knowledge of which is essential in the interests of society as understood under the Covenant. [...] The gathering and holding of personal information on computers, databanks and other devices, whether by public authorities or private individuals and bodies, must be regulated by law. Effective measures have to be taken by States to ensure that information concerning a person's private life does not reach the hands of persons who are not authorized by law to receive, process and use it, and is never used for purposes incompatible with the Covenant. In order to have the most effective protection of his private life, every individual should have the right to ascertain in an intelligible form, whether, and if so, what personal data is stored in automatic data files, and for what purposes. Every individual should also be able to ascertain which public authorities or private individuals or bodies control or may control their files. If such files contain incorrect personal data or have been collected or processed contrary to the provisions of the law, every individual should have the right to request rectification or elimination.
Although the data protection guarantees listed here by the Committee are significantly truncated relative to the guarantees specified in the data protection instruments canvassed in Part I, it is extremely doubtful that the Comment is intended to delineate exhaustively the extent to which Art 17 embraces data protection. In other words, there is little reason not to treat the Committee's General Comment as laying down some, but not all, of the data protection guarantees capable of specification pursuant to Art 17.
A second important source of legal inspiration for the emergence of data protection laws are various provisions in national Constitutions (or Basic Laws). Sometimes the link between data protection laws and Constitutional provisions is expressly recognised in the former. More commonly, though, the link is expressly provided for in the Constitutions. Some of the latter contain an express right to data protection. Other Constitutions expressly require that data protection legislation be enacted. Constitutions often also contain a broad range of other provisions that help form the normative underpinnings of data protection laws. These provisions are expressed in terms of protecting such values as human dignity, personality, privacy and the like.
Though the latter sorts of values are relatively diffusely formulated, their normative relevance - both actual and potential - for the development of data protection laws has been made manifest in judicial decision making, notably that of the Federal German Constitutional Court. In a famous and influential decision of 15.12.1983, the Court struck down parts of the federal Census Act (Volkszählungsgesetz) of 1983 for breaching Arts 1(1) and 2(1) of the Federal Republic's Basic Law. Article 1(1) provides: "Human dignity is inviolable. To respect and protect it is the duty of all state authority". Article 2(1) provides: "Everyone has the right to the free development of his personality insofar as he does not violate the rights of others or offend against the constitutional order or against morality". The Court held that the two provisions give individuals a right to "informational self-determination" ("informationelle Selbstbestimmung"); ie, a right for the individual "to determine for himself whether his personal data shall be disclosed and utilised". The Court went on to hold that, though this right is not absolute, it will be infringed if personal data are not processed in accordance with basic data protection principles. Of the latter, the Court focused especially on the principle of purpose specification.
Another important judicial decision in this context is the ruling of 9.4.1991 by the Hungarian Constitutional Court in which census legislation was struck down for violating Art 59(1) of the national Constitution. In reaching its decision, the Court followed substantially the same line as that taken by the German Federal Constitutional Court in the Census Act judgment. It laid particular emphasis on the purpose specification principle and stipulated, concomitantly, that the creation of a general, uniform PIN for unrestricted use is unconstitutional.
Both of the above decisions have had a significant impact on the development and conceptualisation of data protection law in Germany and Hungary respectively. The Census Act decision helped stimulate efforts to revise and strengthen, ia, Germany's federal data protection legislation. The impact of the judgment of the Hungarian Constitutional Court is seen in, ia, Art 7(2) of Hungary's data protection Act which states that "unlimited, general and uniform personal identification codes shall not be used".
Facets of administrative law provide a third important source of inspiration for the emergence of data protection legislation. Traditional rules on due administrative process - as manifest in legislation on government decision-making procedures or in common law on judicial review of government decisions - embody principles that are precursors to some of the central rules of data protection laws. These principles require, ia, that government decision makers: (i) be unbiased or disinterested in the matter which is decided; (ii) base their decisions on relevant evidence; and (iii) give an opportunity to be heard to persons whose interests will be adversely affected by the decisions. There are strong links between the first two of these principles and those provisions of data protection laws dealing with information quality. There are equally strong links between the third-listed principle and those provisions in data protection laws dealing with data subject participation and control. Some of the latter provisions - particularly those concerning the access rights of data subjects - also parallel the thrust of legislation on public access to government-held information (hereinafter termed legislation on "freedom of information" (FOI)).
At a higher level of abstraction, we can discern within data protection laws considerable influence from older doctrines on "rule of law". Such doctrines are broadly concerned with regulating power relations between the state and citizens by curbing arbitrariness in the exercise of state power. In furtherance of this concern, they stipulate the importance of subjecting state power to legal controls that promote foreseeability and accountability in government decision-making processes. Data protection laws embrace this concern while broadening its focus from administrative decision making to data processing more generally.
This is not to say that the concerns of doctrines on rule of law are fully commensurate with the concerns of data protection laws. While doctrines on rule of law - and, concomitantly, administrative law - are traditionally limited to governing the relationship between state organs and citizens, most data protection laws also regulate directly the relationship between private organisations and citizens. Moreover, doctrines on rule of law - and, concomitantly, large parts of administrative law - focus traditionally on specific administrative decision-making processes to which a private individual or organisation is a party, whereas the focus of data protection laws is on the processing of personal information. Such processing need not be directly related to a specific decision-making process, though it often is. Further, doctrines on rule of law tend to encompass a range of issues - eg, the quality of legal norms and the quality of judicial operations - with which data protection laws are not directly concerned.
A fourth major source of inspiration for the emergence of data protection laws are rules in national legislation and case law which lay down rights to privacy and/or personality. Rules dealing with defamation, wrongful discrimination and intellectual property are also pertinent, though to a lesser degree. All of these rules prefigure the basic thrust of data protection laws in that they prohibit various kinds of behaviour, including certain ways of processing personal data, in order to protect the autonomy, integrity, dignity and/or privacy of the data subject(s).
There is little doubt that general doctrines on property rights have also played a role in inspiring data protection laws, though the exact importance and extent of this role are difficult to gauge. Much depends on how one defines property rights. These can be defined at such a level of generality that they are taken as providing the fundamental basis for enormous tracts of the legal system. If we define property rights as conferring ownership of some object or thing, in the sense that the rights holder is given a legally enforceable claim to exclude others from utilising that object/thing, some reflection of such rights can be discerned in those provisions of data protection laws that make the processing of personal data conditional on the consent of the data subject(s). At the same time, however, these provisions are frequently watered down by exemptions that make it difficult to see the resultant level of data ownership (in the above-defined sense) as much more than symbolic. Moreover, there tend to be few, if any, other direct and obvious manifestations of property rights doctrines in data protection laws or their travaux préparatoires. This is not to deny, however, the possibility that several of the core principles of the legislation can serve to protect, albeit indirectly, the idea(l) of data subjects owning their data (again, in the above-defined sense of ownership). It is also noteworthy that some of the early and influential contributors to the discourse out of which data protection laws emerged have championed property rights doctrines as providing a desirable basis for data protection regimes. A similar line has also been advanced by some of the more recent contributors to data protection discourse. Nevertheless, just as many, if not more, contributors to this discourse - especially outside North America - are sceptical of such an approach.
It would be wrong to see the existence of each of the legal factors canvassed above as a necessary precondition for the enactment of data protection laws. For instance, some countries - such as the UK and Germany - have enacted data protection laws without having comprehensive FOI legislation already in place. To take another example, some countries - such as Australia and the UK - have enacted data protection laws without specifically recognising a right to privacy in their respective legal systems.
It should also be emphasised that the links between each of the above-cited legal factors and data protection laws are not always directly recognised in the travaux préparatoires of the latter or in other related commentary. Likewise, awareness of such links has varied from jurisdiction to jurisdiction and from period to period. In Norway, for example, the enactment of the Personal Data Registers Act was accompanied by considerable awareness of the close similarity between data protection concerns and administrative law doctrines, whilst the links to human rights as formalised, say, in the ECHR and ICCPR were downplayed. In recent years, however, data protection discourse in Norway has shown increasing recognition of the normative importance of human rights law for data protection. To take another example, legislators in some European countries, such as France, appear to have failed to see the close connections between laws on data protection and laws on FOI, at least at the time these laws were first enacted. This is in contrast to Canada and Hungary where the two types of laws have been enacted in single, co-ordinated legislative packages.
Finally, it should be emphasised that the development and existence of data protection laws have inspired - and will continue to inspire - changes in other legal fields, including those to which the above-cited legal factors belong. There is, in other words, an ongoing cross-fertilisation of legal influences. This process is most apparent in the interaction of data protection laws and human rights law. On the one hand, greater readiness to construe treaty provisions on the right to privacy as containing data protection guarantees is partly inspired by the emergence of data protection laws. On the other hand, such readiness serves to stimulate the enactment of data protection laws in countries where such laws do not already exist, or to stimulate the strengthening of existing laws. Such readiness also serves to anchor data protection laws more firmly in traditional human rights doctrines, thereby influencing the way these laws are conceptualised.
In relation to some legal fields, we see only the beginnings of a potential cross-fertilisation process. An example here is the interaction of data protection laws with competition law. In at least one jurisdiction (Belgium), the enactment of data protection law is leading to changes in traditional doctrines on "fair competition", with the latter being infused with elements of the former. However, the full extent and manner of such impact remain to be seen, as do the ways in which competition law might rub off on the practice and/or conceptualisation of data protection laws.
Data protection laws would not have emerged had legislators believed that pre-existing legal rules could assuage public fears over the technological and organisational developments outlined in section 6.2. Thus, the introduction of data protection laws has been preceded by a range of studies concluding, for the most part, that other rules already in existence lack the precision and/or breadth to tackle these fears sufficiently.
In some cases, pre-existing legal rules have also been found to have the potential to exacerbate threats to personal privacy and integrity. This is best exemplified in Sweden, which has a long-standing tradition of open government enshrined in constitutional provisions granting citizens a right of access to government documents. While concern in the late 1960s about this access right was focused initially on the prospect of the right being curtailed because of its possible inapplicability to machine-readable data, there emerged subsequent concern that computerisation might well lead to a situation in which exercise of the right facilitated the fast and easy dissemination of large amounts of personal data. Accordingly, the enactment of Sweden's Data Act of 1973 can be viewed as "a qualification of the principle of freedom of information, made in recognition of the threat to personal privacy posed by the age of computers".
It is noteworthy that some legal instruments previously judged inadequate from a data protection perspective have subsequently shown considerable potential to embrace data protection principles. An example is the ECHR. Work by the CoE on drafting its early Resolutions on data protection, followed by its 1981 Convention on the same matter, arose out of a perception that the ECHR did not provide sufficient protection for individuals in the face of computerised processing of personal data, particularly in the private sector. However, as noted in section 6.4.1, the Strasbourg organs have since exhibited increasing willingness to read basic data protection principles into Art 8 of the ECHR.
 See also Chapter 1 (section 1.1).
 In the USA, see particularly A F Westin, Privacy and Freedom (New York: Atheneum, 1967), chapts 7 & 12; A R Miller, The Assault on Privacy: Computers, Data Banks and Dossiers (Ann Arbor: University of Michigan Press, 1971), chapts I-III. Both works have also been influential outside the USA. In the UK, see, eg, M Warner & M Stone, The Databank Society: Organizations, Computers, and Social Freedom (London: Allen & Unwin, 1970); P Sieghart, Privacy and Computers (London: Latimer, 1976), espec chapts 2 & 3. In Norway, see, eg, E Samuelsen, Statlige databanker og personlighetsvern (Oslo: Universitetsforlaget, 1972), 11-12 & chapt 4; Offentlige persondatasystem og personvern, NOU 1975:10, espec 10ff; Persondata og personvern, NOU 1974:22, espec 6-7, 28ff. In Sweden, see particularly Data och integritet, SOU 1972:47, espec 30-32 & chapts 3-7. In Switzerland, see espec Botschaft zum Bundesgesetz über den Datenschutz vom 23.3.1988, 4-5. For a general overview of this discourse and the issues motivating it, see Bennett, supra n 10, espec chapt 2.
 See, ia, S Simitis, "Auf dem Weg zu einem neuen Datenschutzrecht" (1984) Informatica e diritto, no 3, 97, 105 ("Die Datenschutzgesetze sind durchweg Reaktionen auf die radikal veränderten Formen der Verarbeitungstechnik. Keine Datenschutzvorschrift lässt sich deshalb richtig verstehen, wenn nicht zugleich der technologische Hintergrund sorgsam bedacht wird" - "Data protection laws are, without exception, reactions to the radically changed forms of processing technology. No data protection provision can therefore be properly understood unless the technological background is carefully considered at the same time"); Swire & Litan, supra n 239, 50 ("computers are the key reason for data protection rules"); Burkert, supra n 33, 170 ("Data protection regulations are ... part of a new type of regulations caused by technological changes").
 See, ia, section 6.4.1.
 The most high-profile instance was the proposal to set up a National Data Center in the USA which would consolidate in one database all information on US citizens held by federal government agencies, ostensibly for the purpose of improving social planning: see further Miller, supra n 355, 54-67.
 See generally, Bennett, supra n 10, 51-53.
 See, eg, the Scandinavian countries' comprehensive, public sector schemes for referencing personal data by way of unique identification codes. For overviews of these schemes, see K S Selmer, "Hvem er du? Om systemer for registrering og identifikasjon av personer" (1992) LoR, 311, 322ff (detailing the Norwegian system); Personnummer: Integritet och effektivitet, SOU 1994:63 (outlining the Swedish system); P Blume, "The Personal Identity Number in Danish Law" (1989-90) 3 CLSR, no 5, 10-13 (describing the Danish system).
 See generally, Bennett, supra n 10, 46-53. It is worth noting, though, that little public debate about privacy and data protection issues accompanied the introduction of national PIN schemes in the Scandinavian countries, mainly because the schemes were put in place prior to widespread use of computers: see Blume, supra n 360, 10; Selmer, supra n 360, 323 & 332. Nevertheless, in subsequent years these schemes have generated a large amount of debate on privacy and data protection issues - not least in countries outside Scandinavia. See, eg, CoE, The Introduction and Use of Personal Identification Numbers: The Data Protection Issues (Strasbourg: CoE, 1991), espec 20ff; Flaherty, supra n 298, espec 15-16, 77-78, 166.
 For further discussion, see, ia, J Bing, "The informatics of public administration: introducing a new academic discipline" (1992) Informatica e diritto, no 1-2, 23, 28ff.
 A fairly recent example from Norway is the program "Nasjonal infrastruktur for EDB", administered by the Directorate of Public Management (Statskonsult) in the period 1990-1992.
 For further discussion, see, ia, H Burkert, "The Commercial Use of Government Controlled Information and its Information Law Environment in the EEC", in W F K Altes, E J Dommering, P B Hugenholtz & J J C Kabel (eds), Information Law Towards the 21st Century (Deventer/Boston: Kluwer Law & Taxation Publishers, 1992), 223-246; P Blume, "Kommercialisering af offentlig information", in Ret & Privatisering (Copenhagen: GadJura, 1995), 65-84.
 See further Chapter 17.
 See further, eg, H Burkert, "Data-Protection Legislation and the Modernization of Public Administration" (1996) 62 Int Rev of Administrative Sciences, 557, espec 564-565.
 Examples are provided in section 6.2.3 below.
 See further section 6.2.3. For closer analysis of the various components of data/information quality, see Chapter 7 (section 7.2.5).
 A recent instance of this trend is the use of computer programs by insurance companies to calculate the degree to which a car-crash victim has suffered neck injury ("whiplash") as a result of the accident. See Aftenposten (morning edition), 3.3.1997, 4.
 The phrase is taken from Herbert Fiedler's article of the same name: "Automationsgerechte Rechtsetzung" (1974) 9 Datareport, no 2, 12-17. See also J Bing, "Automationsvennlig lovgivning" (1977) TfR, 195-299. The phrase denotes the adaptation of legal rules to facilitate automated decision-making processes. For a more recent analysis in English of what such adaptation entails, see J Bing, "Three Generations of Computerized Systems for Public Administration and Some Implications for Legal Decision-Making" (1990) 3 Ratio Juris, 219-236. As Bing points out, a basic thrust of the process involves the substitution of rule structures based on "strict", easily quantifiable criteria for rule structures based on "vague", less quantifiable criteria. The former rule structures tend to be larger and less context-dependent than the latter.
 For a concrete example of an administrative decision-making system with a large, though not total, degree of such automation, see J Bing, "The Emergence of a New Law of Public Administration. Research Issues Related to the Norwegian Housing Aid System", in H W K Kaspersen & A Oskamp (eds), Amongst Friends in Computers and Law: A Collection of Essays in Remembrance of Guy Vandenberghe (Deventer/Boston: Kluwer Law & Taxation Publishers, 1990), 229-240 (describing the decision-making system for assessment of benefits under the Norwegian Housing Aid Scheme).
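The contrast drawn above between "vague" and "strict" rule structures can be sketched in code: a criterion such as "reasonable need for housing aid" resists mechanical evaluation, whereas a rule recast in easily quantifiable terms can be computed directly. The thresholds and field names below are invented for illustration and do not reproduce any actual benefit scheme.

```python
# A sketch of an "automation-friendly" rule structure: entitlement is
# decided purely on strict, easily quantifiable criteria, so the rule
# can be applied by a program without human judgment. All thresholds
# and field names are invented for illustration.

def entitled_strict(applicant: dict) -> bool:
    return (applicant["income"] < 150_000            # quantifiable income ceiling
            and applicant["household_size"] >= 2     # quantifiable household test
            and applicant["housing_cost"] > 40_000)  # quantifiable cost floor

# A "vague" criterion - eg, "the applicant has a reasonable need for
# housing aid" - cannot be reduced to such a test and would require
# human assessment at this point in the decision-making process.

print(entitled_strict({"income": 120_000, "household_size": 3,
                       "housing_cost": 50_000}))
```

The point of the sketch is merely structural: automating a decision process pushes legislators towards rules of the first kind, which, as Bing notes, tend to be larger and less context-dependent than vague standards.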
 COM(92) 422 final - SYN 287, 15.10.1992, 26.
 For further discussion, see, ia, R A Clarke, "The Digital Persona and its Application to Data Surveillance" (1994) 10 The Information Society, 77-92; M Poster, The Mode of Information: Poststructuralism and Social Context (Cambridge: Polity Press, 1990), 97-98.
 See also J Bing, Personvern i faresonen (Oslo: Cappelen, 1991), 12-13, 69; S Bråten, Dialogens vilkår i datasamfunnet. Essays om modellmonopol og meningshorisont i organisasjons- og informasjonssammenheng (Oslo: Universitetsforlaget, 1983), 60.
 Joachim Benno sums up these difficulties in terms of the "anonymization" of transactions. By this he means, in essence, the dissolution and merger of transactional contours, together with the resultant problems in identifying them. See J Benno, "Transaktionens anonymisering och dess påverkan på rättsliga problemställningar", in R Punsvik (ed), Elektronisk handel - rettslige aspekter (Oslo: Tano Aschehoug, 1998), 50-75; J Benno, "The 'anonymisation' of the transaction and its impact on legal problems", The IT Law Observatory Report 6/98, Swedish IT Commission, Stockholm, 1998.
 See further Part III (espec Chapter 12).
 J Rule, D McAdam, L Stearns & D Uglow, "Preserving Individual Autonomy in an Information-Oriented Society", in L J Hoffman (ed), Computers and Privacy in the Next Decade (New York: Academic Press, 1980), 65. Alan Westin makes a similar point in his article, "Civil Liberties and Computerized Data Systems", in M Greenberger (ed), Computers, Communications, and the Public Interest (Baltimore & London: The Johns Hopkins Press, 1971), 151, 156.
 See, eg, Westin, ibid, 165 ("What the computer is doing, in effect, is sharpening to a razor edge the existing problems of bureaucracy, social control, and government-citizen relations created by extensive record keeping in a complex society"); J Rule, D McAdam, L Stearns & D Uglow, The Politics of Privacy: Planning for Personal Data Systems as Powerful Technologies (New York: Elsevier, 1980), 11-12 ("The key condition for the emergence of the privacy issue in the 1960s and 1970s ... is not any particular technology so much as new social relationships [emerging from] ... distinct demands (often but not always abetted by new technologies) for personal information").
 For further discussion of this interaction, see infra n 421 et seq and accompanying text.
 A Giddens, The Consequences of Modernity (Cambridge: Polity Press, 1990), 36ff.
 Ibid, 38.
 See, eg, M Weber, The Theory of Social and Economic Organization, trans A M Henderson & T Parsons (New York: The Free Press of Glencoe, 1964), 337 ("The primary source of bureaucratic administration lies in the role of technical knowledge which, through the development of modern technology and business methods in the production of goods, has become completely indispensable"). See also ibid, 339 ("Bureaucratic administration means fundamentally the exercise of control on the basis of knowledge").
 Westin, supra n 377, 157.
 Hence, for example, data controllers are concerned to reduce the amount of information on persons with whom they deal to what is deemed relevant for the organisational tasks at hand. This aspect of rationalisation helps explain why, in the context of modern database systems, the digital persona threatens to usurp the constitutive authority of the data subject's physical self despite the former's attenuated nature relative to the latter: the attenuation is intentional.
 J R Beniger, The Control Revolution: Technological and Economic Origins of the Information Society (Cambridge, Massachusetts: Harvard University Press, 1986), 15.
 A sophisticated and persuasive attempt to answer aspects of the question is made by Beniger in The Control Revolution, ibid. In a nutshell, Beniger's thesis is that the increasing concern for reflexivity and rationality (manifest as concern for developing more sophisticated systems for processing information) is a response to a long-term "crisis of control". This crisis was brought about by the rapid transformations in production and distribution of goods during the Industrial Revolution, transformations that outstripped the rudimentary technological and organisational control mechanisms then available. According to Beniger, the expansion over the last 150 years of formal bureaucracies, along with the development and application of other, increasingly advanced forms of IT, have constituted a "control revolution" aimed at containing this crisis.
 K C Laudon, Computers and Bureaucratic Reform. The Political Functions of Urban Information Systems (New York: John Wiley & Sons, 1974), 50 (the proposal embodied the idea "that the answer to many of the nation's social problems lay in the consolidation of existing government records, full circulation of government information from point of collection to point of need by a decision maker, and the bringing to bear of this mother-lode of data on the critical issues of the day. In theory, the integration of government records involved no changes in political jurisdiction, authority, or funding, thus appeared to be politically viable. Moreover, the use of computers as the mechanisms for transferring government information conferred to computer projects an aura of rationality (if not prestige) that had been denied those advocating radical democratic reforms").
 See, eg, R Kling, "Automated Welfare Client-Tracking and Service Integration: The Political Economy of Computing" (1978) 21 Communications of the ACM, 484-493 (documenting concrete instances of this factor motivating the computerisation strategies of public sector organisations in the USA).
 See S Beckman, "A world-shaping technology", in M Karlsson & L Sturesson (eds), The World's Largest Machine: Global Communications and the Human Condition (Stockholm: Almqvist & Wiksell International, 1995), 260, 271 ("the functional versatility and growing power of IT is due to the fact that it is not only perceived as a collection of technical objects, tools, that we can do various things with; it is also a powerful medium for expressing modernity, a totem which speaks with the commanding voice of progress").
 See generally C Dandeker, Surveillance, Power and Modernity: Bureaucracy and Discipline from 1700 to the Present Day (Cambridge: Polity, 1990), chapt 3 and references cited therein.
 See further the observations by James Rule et al, infra n 417.
 See generally D Lyon, The Electronic Eye: The Rise of Surveillance Society (Cambridge: Polity Press, 1994), 88ff and references cited therein.
 See, eg, R A Clarke, "Dataveillance by Governments: The Technique of Computer Matching" (1994) 7 Information Technology & People, no 2, 46, 49ff.
 For a detailed presentation of US, UK and EU initiatives in this regard, see S J Saxby, Public Policy and Legal Regulation of the Information Market in the Digital Network Environment, CompLex 2/96 (Oslo: Tano, 1996), espec chapts 2-4.
 These concerns are neatly exemplified in the influential "Bangemann Report" issued by an EC task force headed by Martin Bangemann: see Europe and the Global Information Society. Recommendations to the European Council (Brussels, 26.5.1994). The main conclusions of the report have been endorsed by the European Commission in its communication, Europe's Way to the Information Society. An Action Plan (COM(94) 347 final, 19.7.1994).
 It is tempting to claim that economic activity today is based to an unprecedented extent on information. This assumes, though, that the economic dimensions of information use in past times have been fully analysed and are commensurable with analyses of the current role of information in economic activity. Such an assumption is tenuous. As Carolyn Marvin argues, it might be more accurate to say that information has always constituted a crucial element in economic production and consumption, and that what has changed over time "is not the total contribution of information to economic activity, but the forms of energy in which information is captured and exchanged, and the nature of its social classification": C Marvin, "Information and History", in J D Slack & F Fejes (eds), The Ideology of the Information Age (Norwood, New Jersey: Ablex Publishing Corporation, 1987), 49, 57.
 For an introductory overview of the philosophy and central elements of IRM, see I Wormell, Understanding Information (Copenhagen: Danmarks Biblioteksskole, 1992), 97-132.
 See further Chapter 17 (section 17.2).
 See, eg, E Novek, N Sinha & O Gandy, "The value of your name" (1990) 12 Media, Culture and Society, 525-543 (analysing the burgeoning trade in customer lists as commodities in their own right).
 See, eg, European Commission, Guidelines for improving the synergy between the public and private sectors in the information market (Luxembourg: Office for Official Publications of the EC, 1989). Guideline 1 notes that the information materials of public organisations "have value beyond their use by governments, and their wider availability would be beneficial both to the public sector and to the private industry." Accordingly, Guideline 1 goes on to state that public organisations "should, as far as practicable and when access is not restricted for the protection of legitimate public or private interests, allow [their] basic information materials to be used by the private sector and exploited by the information industry through electronic information services".
 The most startling evidence I have come across is from Australia, where a multi-million-dollar illicit trade in confidential personal data from government records was revealed in the early 1990s: see New South Wales Independent Commission Against Corruption (ICAC), Report on Unauthorised Release of Government Information (Sydney: ICAC, 1992). Officers of government agencies at local, state and federal levels were involved in the trade, as were lawyers, police officers, debt collectors, private investigators, real estate agents, banks and insurance companies. It is, of course, difficult to gauge the extent to which similar practices occur in other countries, particularly given the illicit nature of such activity. But the former head of ICAC, Ian Temby, has claimed that "[t]here is every reason to believe that what ICAC uncovered in Australia is replicated ... throughout the so-called developed world": The Weekend Australian, 18.9.1993, 11.
 Lyon, supra n 394, 3.
 Ibid, 4.
 See, eg, A Giddens, Modernity and Self-Identity: Self and Society in the Late Modern Age (Cambridge: Polity Press, 1991), 15; Giddens, supra n 380, 57-58. The other three dimensions are industrialism, capitalism and military power.
 Rule et al, supra n 377, 68.
 See J Rule, Private Lives and Public Surveillance (New York: Schocken Books, 1974), 342 ("If bureaucratic, corporate control is growing in importance, the extended family, the neighbourhood and other small-scale, primary groups are losing their grip"). More generally, see also N Christie, Hvor tett et samfunn? (Oslo: Universitetsforlaget, 1982, 2nd rev ed).
 See also Rule, supra n 409, 332-333 ("Probably any effort to compare 'total privacy' afforded in different social settings is bound to require matching incommensurables. Again, how is it possible to weigh the exposure of personal information to scores of indifferent operatives, in a modern surveillance system, against disclosure to a handful of neighbours in a tightly-knit community? How, for that matter, is one to weigh the discursive but relatively evanescent information transmitted in gossip against the terse ... but enduring data held in dossiers or computer files?").
 Lyon, supra n 394, 10-11.
 Giddens, Modernity and Self-Identity, supra n 406, 149.
 Rule, supra n 409, 301.
 Ibid, 308ff.
 See the description above of the push to link various organisations' information systems in an effort to enhance sharing of personal data between them.
 Rule, supra n 409, 321.
 Rule et al, supra n 378, 134 ("People seek their own 'just desserts', in terms of the credit privileges, insurance rates, tax liability, passport use, or whatever to which they feel themselves entitled. At the same time, the public also demands effective discriminations against welfare cheaters, poor credit risks, dangerous drivers, tax evaders, criminals and the like. And these discriminations in the treatment of persons by organizations can only be achieved by recourse to personal-data keeping"). See also ibid, 43.
 Rule et al, supra n 378, 42 ("Organizations demand authoritative information on the people with whom they deal, and these demands give rise to other formal organizations to provide this information. These new organizations typically must authenticate the personal data which they use for their own decision-making purposes, and so a new cycle of demands begins").
 Lyon, supra n 394, 54-55.
 See Dandeker, supra n 392, 57-58 ("The technical demands of war not only led to an enhancement of the ability of the state to exercise surveillance over its armed forces but also to a tightening of the networks of surveillance over the rest of society. Although this process was rooted in the industrialization of war in the nineteenth century - particularly in the equation linking citizenship and military service - the main connections between war and ... the rise of the `security state' were forged during the two world wars and the subsequent nuclear age. Whilst some of the advanced societies have lessened their dependence on mass conscripted armies, it remains the case that defence and security issues have provided important grounds for supervising the civilian population in time of war or threat of war"). For more detailed analysis of these factors, see ibid, chapt 3.
 For instance, a recent survey by the US Federal Trade Commission of some 1,400 commercial Internet sites run by US-based companies found that approximately 90 percent of these sites collect at least one type of personal information (including e-mail addresses) from site visitors. See Federal Trade Commission, Privacy Online: A Report to Congress, issued June 1998, available at URL <http://www.ftc.gov/reports/privacy3/priv-23a.pdf> (last visited 30.5.1999), 23. A large proportion of these sites also collect additional types of personal information from site visitors. Ibid, 24ff.
 PMWIs enable their users to see through certain types of materials, much as an x-ray apparatus does. But rather than sending out x-rays, PMWIs register electromagnetic waves emitted by humans. Objects that obstruct these waves, such as guns or explosives concealed beneath a person's clothing, show up on a monitor on the imager device. See "New X-ray gun raises privacy concerns", USA Today, 12.8.1996.
 Rule, supra n 409, 342.
 For a general overview of PETs, see H Burkert, "Privacy-Enhancing Technologies: Typology, Critique, Vision", in P E Agre & M Rotenberg (eds), Technology and Privacy: The New Landscape (Cambridge, Massachusetts: MIT Press, 1997), 125-142. For an overview of PETs in relation to electronic payment transactions, see M Froomkin, "Flood Control on the Information Ocean: Living with Anonymity, Digital Cash, and Distributed Databases" (1996) 15 U of Pittsburgh J of Law and Commerce, 395, Part III (available at URL <http://www.law.miami.edu/~froomkin/articles/oceanno.htm> (last visited 1.7.1999)).
 K Raes, "The Privacy of Technology and the Technology of Privacy: The Rise of Privatism and the Deprivation of Public Culture", in A Sajó & F B Petrik (eds), High-Technology and Law: A Critical Approach (Budapest: Institute of Political and Legal Sciences, Hungarian Academy of Sciences, 1989), 78.
 Rodotà, supra n 33, 263.
 See espec the work on social analysis of information systems carried out by researchers at the University of California, Irvine. For an overview of the work, see K L Kraemer & J L King, "Social Analysis of Information Systems: The Irvine School, 1970-1994" (1994) 3 Informatization and the Public Sector, 163-182.
 See, eg, the basic conclusion of the studies set out in W B H J van de Donk, I T M Snellen & P W Tops (eds), Orwell in Athens: A Perspective on Informatization and Democracy (Amsterdam: IOS Press, 1995).
 One of the most exuberant of such claims is found in George Gilder, Life After Television: The Coming Transformation of Media and American Life (New York: Norton, 1994), 60-61 (asserting that the computer "will blow apart all the monopolies, hierarchies, pyramids, and power grids of established industrial society. It will undermine all totalitarian regimes. Police states cannot endure under the advance of the computer because it increases the power of the people far faster than the powers of surveillance. All hierarchies will tend to become 'heterarchies' - systems in which each individual rules his own domain"). Greater caution is displayed in M Ethan Katsh, The Electronic Media and the Transformation of Law (New York: Oxford University Press, 1989), 114 ("The existence of a totalitarian state whose foundation is control over information is less probable in an era of widespread electronic communication than at any previous time. Rather than being an ally of state power, the new media are more likely to be a force that will undermine state control and authority").
 M Ethan Katsh appears to overlook this point when extolling the "hypertext environment" as supporting "the image of an individual with power and discretion, an individual who has tools to exploit opportunities for expression and association": M E Katsh, Law in a Digital World (New York: Oxford University Press, 1995), 236. Further, Katsh does not consider how many individuals may fit such an image. It could well be that such an image is relevant only for a relatively small number of persons who belong to an elite. Concomitantly, Katsh does not consider the extent to which "the hypertext environment" is or will be typical for the broad stream of future social activity.
 See, eg, the references cited supra n 355.
 See, eg, Information and Privacy Commissioner of Ontario & Registratiekamer of the Netherlands, "Privacy-Enhancing Technologies: The Path to Anonymity", August 1995, available at URL <http://www.ipc.on.ca/web_site.eng/matters/sum_pap/summary.htm> (last visited 31.5.1999); E Boe, "Pseudo-Identities in Health Registers? Information Technology as a Vehicle for Privacy Protection" (1994) 2 The Int Privacy Bulletin, no 3, 8-13; Pseudonyme helseregistre, NOU 1993:22.
 See further Chapter 2 (section 2.4.2).
 See also Simitis, supra n 356, 115-116 (emphasising that data protection was and is not a document of blind regression into the supposedly pastoral idyll of an automation-free age; rather, it manifests the obligation not merely to celebrate each step towards better processing as a triumph of ingenious technology, but also, and especially, to measure that step against its political and legal consequences).
 The Schengen Information System (SIS) is an excellent example. For a useful description of the system, see M Baldwin-Edwards & B Hebenton, "Will SIS be Europe's Big Brother?", in M Andersen & M den Boer (eds), Policing Across National Boundaries (London/New York: Pinter Publishers, 1994), 137-157.
 See generally Lyon, supra n 394, chapt 8; O H Gandy Jr, The Panoptic Sort: A Political Economy of Personal Information (Boulder: Westview Press, 1993), chapt 3.
 Note, eg, the recent trend in some countries to place private financial institutions under a legal duty to report to the police suspected attempts at laundering ("white-washing") illegally generated monies. For a brief description of this trend in Norway, see J P Berg, "Finansinstitusjonenes rapporteringsplikt til ØKOKRIM ved mistanke om hvitvasking av penger - et gjennombrudd for 'informant'-samfunnet?" (1996) 23 Kritisk Juss, 147-163.
 For more detailed analysis of these developments in surveillance techniques, see, ia, G T Marx, Undercover: Police Surveillance in America (Berkeley: University of California Press, 1988), chapt 10; Lyon, supra n 394, chapt 3.
 Witness, for instance, the massive expansion of closed-circuit television (CCTV) systems along public thoroughfares. For overviews of this development in the UK, see S Davies, "Surveillance on the streets" (1995) 2 PLPR, 24-26; Der Spiegel, 5.7.1999, 122-124.
 See further Chapter 17 (section 17.2).
 See generally I Mestad, Elektroniske spor. Nye perspektiver på personvern, CompLex 3/86 (Oslo: Universitetsforlaget, 1986), chapt 2. See also Bing, supra n 33, 252ff.
 See further Chapter 17 (section 17.2). In essence, cookies are transactional data about a user's Internet activity which an Internet server automatically stores on the user's computer, often without the user's knowledge. The primary aim of cookies is to allow an Internet service to be customised for the user's subsequent use of that service or linked services. For a description of the way cookies are generated and of the issues they raise for data protection laws, see V Mayer-Schönberger, "The Internet and Privacy Legislation: Cookies for a Treat?" (1998) 14 CLSR, 166-174.
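The exchange just described can be sketched schematically. The header names (Set-Cookie, Cookie) follow the HTTP cookie convention; the domain and identifier values below are invented for illustration.

```python
# Schematic sketch of the cookie mechanism: on a first visit the server
# asks the browser to store an identifier; on later visits the browser
# returns it automatically, letting the service recognise the user and
# customise itself - typically without the user's knowledge.
# Domain and identifier values are invented for illustration.

def first_response_headers(session_id: str) -> dict:
    # Server -> browser: please store this identifier.
    return {"Set-Cookie": f"session_id={session_id}; Domain=example.org"}

def later_request_headers(cookie_jar: dict) -> dict:
    # Browser -> server: the stored data are sent back automatically.
    return {"Cookie": "; ".join(f"{k}={v}" for k, v in cookie_jar.items())}

resp = first_response_headers("abc123")
jar = {"session_id": "abc123"}   # as stored on the user's computer
req = later_request_headers(jar)
print(resp["Set-Cookie"])
print(req["Cookie"])
```

The data protection concern canvassed in this section flows from the last step: the transactional record travels back to the server on every visit, whether or not the user is aware of it.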
 For instances of the various uses for which private actors might exploit transactional data, see, eg, B Leuthardt, Leben online. Von der Chipkarte bis zum Europol-Netz: Der Mensch unter ständigem Verdacht (Hamburg: Rowohlt, 1996), 137ff. See further Chapter 17 (section 17.2).
 A famous case in point is the use by the (West) German Bundeskriminalamt of electricity billing records in tracking down the Hamburg residence of the wanted terrorist, Rolf-Clemens Wagner, in early 1980. For a short description of the main facts of the case, see H P Bull, Datenschutz oder Die Angst vor dem Computer (Munich: Piper, 1984), 239-240.
 Not infrequently, though, transactional data will also serve to protect the interests of the data subjects as, ia, consumers (eg, in complaints over incorrect billing for telephone services) or as suspected criminals (eg, in verifying alibis). This point serves to illustrate again the double-sided nature of many of today's systems of mass surveillance and control: at the same time as they can enhance the power of data controllers, they can often help to ensure that certain interests of data subjects are respected.
 See further Chapter 2 (section 2.4.3).
 See further Chapter 2 (section 2.4.1).
 See further the assessment of regulation of profiling practices in Chapters 18-19.
 A problem actualised especially in the context of the Internet. See further, ia, St meld 44 (1998-99), Datatilsynets årsmelding 1998, 20-21.
 The study was carried out by Kenneth Laudon with funding from the (now defunct) Office of Technology Assessment (OTA) attached to the US Congress. The study results are set out in K C Laudon, "Data Quality and Due Process in Large Interorganisational Record Systems" (1986) 29 Communications of the ACM, no 1, 4-11; and Laudon, supra n 125, 135-145.
 The most common problem "was lack of court disposition information (where present at all), ambiguity of record, or some combination of the above": Laudon, supra n 450, 8.
 Ibid, 9. For the purposes of the study of criminal-history records, such a record was considered to be incomplete if it noted an arrest but no formal court disposition had been recorded within a year of the date of the arrest, or if it noted conviction of "attempt" without stating the specific crime, or if it set out sentencing information without noting conviction information, or if it failed to present correctional information at the same time as it presented other data. A criminal-history record was deemed to be inaccurate "when the arrest, court disposition, or sentencing information on [it] ... does not correspond with the actual manual court records". A criminal-history record was deemed to be ambiguous when it "shows more charges than dispositions or more court dispositions than charges", or it contains dates that do not correspond with each other, or it sets out "a number of arrest charges followed by a single court disposition where it is not clear for which particular crime the individual was convicted": ibid, 6.
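Laudon's criteria for incompleteness amount to a set of mechanical checks on each record. A minimal sketch, with the record structure and field names invented (the study itself compared records against manual court files), might run:

```python
# Sketch of Laudon's completeness criteria for a criminal-history
# record, as summarised above. Field names and the record structure
# are invented for illustration.
from datetime import date, timedelta

def is_incomplete(record: dict, today: date) -> bool:
    # Arrest noted, but no court disposition recorded within a year.
    if record.get("arrest_date") and not record.get("disposition"):
        if today - record["arrest_date"] > timedelta(days=365):
            return True
    # Conviction of "attempt" without the specific crime stated.
    if record.get("conviction") == "attempt" and not record.get("specific_crime"):
        return True
    # Sentencing information present without conviction information.
    if record.get("sentence") and not record.get("conviction"):
        return True
    # (Laudon's fourth criterion - correctional information missing when
    # other data are presented - is omitted here for brevity.)
    return False

print(is_incomplete({"sentence": "2 years"}, date(1984, 1, 1)))
```

The sketch simply makes explicit that each criterion is a test applicable record by record, which is what made audits of the kind discussed in the following note feasible in principle.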
 Ibid, 8-9. The results of this study are even more disturbing given the fact that the US Department of Justice had issued regulations in 1975 requiring, ia, state agencies to conduct annual audits of the quality of criminal history records. Already in 1980, however, a study instituted by the OTA found that few state agencies complied with this requirement. A summary of the findings is set out in Laudon, supra n 125, 181-185. In the words of Laudon (ibid, 183): "In 80 percent of the states no audit [of criminal history records] has ever been conducted, most states have no adequate procedure to monitor incomplete records, many states (33 percent) cannot trace the flow of information down to the individual level with transaction logs, and nearly 80 percent of the states rarely if ever review transaction logs ...".
 The survey findings are available at URL <http://www.pirg.org/consumer/credit/mistakes/index.htm> (last visited 31.5.1999).
 See also Schwartz & Reidenberg, supra n 62, 299 (noting data quality problems suffered by the US credit-reporting industry). These problems occur in the face of legal rules requiring credit information to be as accurate as possible: see the federal Fair Credit Reporting Act (15 USC § 1681a et seq): "Whenever a consumer reporting agency prepares a consumer report it shall follow reasonable procedures to assure maximum possible accuracy of the information concerning the individual about whom the report relates" (15 USC § 1681e(b); see also 15 USC §§ 1681d(d)(4), 1681k(a)(2) & 1681l). The legislation also provides data subjects with access and rectification rights (see 15 USC §§ 1681g-1681i).
 For an extensive set of examples, with particular focus on Norway, see L A Bygrave, "Ensuring Right Information on the Right Person(s): Legal Controls of the Quality of Personal Information - Part I", Manuscript Series on Information Technology and Administrative Systems, University of Oslo, 1996, vol 4, no 4, 13ff.
 A noteworthy example of such evidence is the results of a matching program carried out by the social welfare offices of three neighbouring municipalities in Norway in 1993. The program was initiated in order to identify persons who were illegally claiming and receiving social security benefits from more than one of the offices at the same time. An initial match revealed a substantial number of social security clients apparently engaged in "double-dipping". Subsequent analysis of the matching results revealed, however, that there was no fraud. That persons were registered as clients of more than one municipal social welfare office was due to the failure of the offices to update their respective client registers when a client moved residence from one municipality to another. See Computerworld Norge, 11.3.1994, 18. Similarly, an on-going matching program carried out by Norway's Labour Directorate in order to identify persons in illegal receipt of unemployment benefits has revealed that a large proportion of "hits" (ie, cases of apparent fraud) stem from poor data quality in the matched data registers rather than from fraud. See Computerworld Norge, 15.4.1994, 4; Computerworld Norge, 10.6.1994, 8.
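The matching exercises described in this note are, at bottom, joins of client registers on a personal identification number, and the false "hits" arise when a stale entry survives in the old municipality's register after a client moves. A minimal sketch (all register contents invented):

```python
# Sketch of the register-matching described above: a client registered
# in more than one municipal register is flagged as an apparent
# "double-dipper". All register contents are invented for illustration.

def apparent_double_dippers(*registers):
    """Return identifiers appearing in registers of different municipalities."""
    seen, hits = {}, set()
    for municipality, client_ids in registers:
        for pid in client_ids:
            if pid in seen and seen[pid] != municipality:
                hits.add(pid)
            seen[pid] = municipality
    return hits

# A client who moved from A to B, but whose entry in A's register was
# never removed, produces a "hit" even though no fraud occurred - the
# data-quality failure the Norwegian follow-up analyses uncovered.
reg_a = ("Municipality A", {"010150-12345"})                   # stale entry
reg_b = ("Municipality B", {"010150-12345", "020260-67890"})
print(apparent_double_dippers(reg_a, reg_b))
```

The sketch shows why a raw "hit" rate says nothing by itself about fraud: the match cannot distinguish a genuine double claim from an out-of-date register entry.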
 Fox et al, supra n 86, 10. See also K Ivanov, Quality-control of Information. On the Concept of Accuracy of Information in Data-Banks and in Management Information Systems (Stockholm: Royal Institute of Technology, 1972), espec chapt 1 (noting little agreement in the field of computer and information science on how to define data/information quality as a concept and on how to measure it).
 D Marchand, "Managing Information Quality", in I Wormell (ed), Information Quality - Definitions and Dimensions (London: Taylor Graham, 1990), 7, 8.
 There was a concomitant failure to work out in detail the purpose(s) for which information was processed, and how it was to be categorised. Moreover, the uncertain parameters of many of the information systems, along with the fact that these parameters sometimes failed to follow traditional organisational boundaries, made it difficult for data controllers to establish the extent of their respective spheres of responsibility for the quality of information they processed. See Riksrevisionsverket (RRV), supra n 127. See also En ny datalag, SOU 1993:10, 331ff (listing miscellaneous instances in which data subjects have sought compensation for harm caused them by poor information quality in personal data registers kept by Swedish authorities).
 A total of 142 agencies were asked by the OTA for the results of any data quality audits they conducted on their record systems falling within the ambit of the Privacy Act of 1974 and computerised record systems maintained for law enforcement, investigative and/or intelligence purposes. Of the 127 agencies that responded, only 13 percent stated that they conducted such audits, and of these, just one agency provided any audit results. See US Congress, Office of Technology Assessment, Federal Government Information Technology: Electronic Record Systems and Individual Privacy, OTA-CIT-296 (Washington, DC: US Government Printing Office, June 1986), 111. See also US National Research Council, Computer Science and Telecommunications Board, System Security Study Committee, Computers at Risk: Safe Computing in the Information Age (Washington, DC: National Academy Press, 1991), 3 (noting that US government initiatives to enhance computer and communications security have focused traditionally on safeguarding the confidentiality of information (ie, on ensuring that information is not disclosed to unauthorised persons), with relatively little attention paid to safeguarding other aspects of security/quality, such as ensuring that information is correct, complete and relevant in relation to the purposes for which it is used).
 Statskonsult, Utvikling av metode for kartlegging av datakvalitet i grunndataregistre, Report 4204.20 (Oslo: Statskonsult, 19.3.1996), 3 ("Offentlige dataregistre har i dag ikke et helhetlig kvalitetssystem for registerinnhold med etablerte måltall. Kundene/brukerne gis heller ikke deklarasjoner eller garantier for datakvalitet, og systematiske målinger av kundenes tilfredshet mangler"). See also the findings of a 1993 study instigated by the Nordic Council of Ministers of the information security practices and needs of selected Nordic governmental institutions in the civil sector. The study report concluded, ia, that measures taken by these institutions to safeguard the confidentiality of information were generally quite strong, whereas measures taken to safeguard the quality of information were relatively weak. See Nordic Council of Ministers, supra n 106.
 In Sweden, the National Audit Office attempted to estimate in 1991 some of the financial costs of poor information quality in computerised address registers used by the Swedish postal service. The study found that errors in these registers cost the postal service SEK 100-200 million per annum: Riksrevisionsverket (RRV), Fel data kostar! Exemplet Postens kostnader för fel i adressregister, F 1992:2 (Stockholm: RRV, 1992), 37. It was noted that such errors also result in considerable costs for a large number of other organisations that apply or rely on the data in the address registers. For other examples of the potentially high financial cost of relatively simple data errors, see E T O'Neill & D Vizine-Goetz, "Quality Control in Online Databases" (1988) 23 Annual Rev of Information Science and Technology, 125, 128; "Devil in Your Data", Information Week, 31.8.1992, 48-54.
 See, eg, supra nn 453 & 455.
 See further Chapter 3 (espec section 3.5) and Chapter 18 (espec section 18.4.4).
 For more detailed analyses of these factors, see, eg, Bygrave, supra n 456; R W Bailey, Human Error in Computer Systems (Englewood Cliffs, New Jersey: Prentice-Hall, 1983); & F B Cohen, Protection and Security on the Information Superhighway (New York: John Wiley & Sons, 1995), 33-56.
 See, eg, Bailey, supra n 466, 6, 22 (noting that a large proportion of informational error is attributable to faulty calculations, judgements and classifications).
 The study was carried out between 1978 and 1981 by the Oak Ridge National Laboratory, on the initiative of the US Energy Information Administration.
 A S Loebl, "Accuracy and Relevance and the Quality of Data", in G E Liepins & V R R Uppuluri (eds), Data Quality Control: Theory and Pragmatics (New York: Marcel Dekker, 1990), 105, 139.
 For further examples of faulty models, see ibid, 133-135.
 See, eg, B Nyberg, Samkörning av personregister, IRI-rapport 1984:2 (Stockholm: Institutet för Rättsinformatik, 1984), 16-21; Bing, supra n 33, 251-252; & J Freese, Den maktfullkomliga oförmågan (Stockholm: Wahlström & Widstrand, 1987), 94-96.
 ALRC, supra n 295, vol 1, 17.
 See further section 6.4.1 below.
 65 BVerfGE, 1, 43 ("Wer nicht mit hinreichender Sicherheit überschauen kann, welche ihn betreffende Informationen in bestimmten Bereichen seiner sozialen Umwelt bekannt sind, und wer das Wissen möglicher Kommunikationspartner nicht einigermassen abzuschätzen vermag, kann in seiner Freiheit wesentlich gehemmt werden, aus eigener Selbstbestimmung zu planen oder zu entscheiden. [...] Wer unsicher ist, ob abweichende Verhaltensweisen jederzeit notiert und als Information dauerhaft gespeichert, verwendet oder weitergegeben werden, wird versuchen, nicht durch solche Verhaltensweisen aufzufallen. Wer damit rechnet, dass etwa die Teilnahme an einer Versammlung oder einer Bürgerinitiative behördlich registriert wird und dass ihm dadurch Risiken entstehen können, wird möglicherweise auf eine Ausübung seiner entsprechenden Grundrechte ... verzichten. Dies würde nicht nur die individuellen Entfaltungschancen des einzelnen beeinträchtigen, sondern auch das Gemeinwohl, weil Selbstbestimmung eine elementare Funktionsbedingung eines auf Handlungs- und Mitwirkungsfähigkeit seiner Bürger begründeten freiheitlichen demokratischen Gemeinwesens ist").
 M D Kirby, "Information Security - OECD Initiatives" (1992) 3 J of Law and Information Science, no 1, 25, 26, 29-30; also published in (1992) 8 CLSR, 102, 103-104. Cf M D Kirby, "Privacy in Cyberspace" (1998) 21 University of New South Wales LJ, no 2, available at URL <http://www.austlii.edu.au/au/other/ unswlj/thematic/1998/vol21no2/kirby.html> (visited 1.6.1999).
 See espec J Ellul, The Technological Society, trans J Wilkinson (New York: Vintage Books, 1964) - originally published as La technique ou l'enjeu du siècle (Paris: Armand Colin, 1954); and J Weizenbaum, Computer Power and Human Reason. From Judgement to Calculation (San Francisco: W H Freeman & Company, 1976). See also A Mowshowitz, The Conquest of Will: Information Processing in Human Affairs (Reading, Massachusetts: Addison-Wesley, 1976).
 See further Part III.
 Note the discussion in Germany about maintaining "Informationsgleichgewicht" between branches of government: see Chapter 2 (section 2.3).
 See, ia, K S Selmer, "Elektronisk databehandling og rettssamfunnet", in Forhandlingene ved Det 30. nordiske juristmøtet, Oslo 15.-17. august 1984 (Oslo: Det norske styret for De nordiske juristmøter, 1984), Part II, 41, 44 (noting that a major impulse for enacting data protection law in West European countries was a desire to prevent a repetition of the bitter experiences of Nazi dictatorship during the 1930s and 1940s).
 See, ia, Rule et al, supra n 378, 62 ("The Watergate drama ... did perhaps more than anything else to force official treatment of personal data into the arena of public controversy. Every element of the scandals seemed to involve some conflict over official use of personal data - bugging, snooping into psychiatric records, misuse of the surveillance capacities of the IRS, or whatever"); Bennett, supra n 10, 72 ("The Privacy Act would not have been passed in 1974 had it not been for Watergate").
 As Alan Westin allegedly commented in 1972, "you do not find computers on streetcorners or in free nature, but in big, powerful organizations": cited in Bing, supra n 33, 247.
 At the same time, there are aspects of Nineteen Eighty-Four - most notably, its focus on state power and on blatantly violent control measures - which diminish its relevance for discourse on contemporary surveillance and control, at least in relation to the majority of citizens in Western liberal democracies. I would suggest that the more subtle, pleasurable and insidious forms of control depicted in Aldous Huxley's Brave New World (1932) better describe what these citizens are likely to experience.
 See M Foucault, Discipline and Punish: The Birth of the Prison, trans A Sheridan (Harmondsworth: Penguin, 1977), 195-228. Bentham's plan was for the building of a prison he termed the Panopticon. The prison would allow for the constant surveillance of prisoners from a central watch tower but prevent (through special lighting devices) prisoners from identifying when and by whom they were watched.
 Ibid, 201.
 Ibid, 202.
 Of course, this imbalance in itself is not sufficient to lead the observed to behave in conformity with the observers' wishes independently of actual coercion. The observed must also be made aware of their transparency and of the behaviour the observers wish them to exhibit. Usually, they also need to be made aware of the sanctions threatened in the event of not exhibiting such behaviour. In his discussion of Bentham's Panopticon, Foucault appears to take such awareness for granted.
 For a sensitive discussion of the present and future sociological utility of the notion of panopticism, see Lyon, supra n 394, 71ff, 166ff & 202ff.
 See, eg, K Robins & F Webster, "Cybernetic capitalism: Information, technology, everyday life", in V Mosco & J Wasko (eds), The Political Economy of Information (Madison: University of Wisconsin Press, 1988), 44-75; Gandy, supra n 436; Marx, supra n 438, 220ff; Poster, supra n 373, 91ff. Surprisingly, Foucault makes no mention of modern computer technology in his discussion of panopticism.
 See, eg, W H Dutton & R G Meadow, "A tolerance for surveillance: American public opinion concerning privacy and civil liberties", in K B Levitan (ed), Government Infostructures (New York: Greenwood Press, 1987), 147, 167 ("Surveys focused on privacy and civil liberty issues create a context that can artificially inflate public expressions of interest and concern. A second problem is that those most concerned about privacy are likely to be underrepresented in surveys because they disproportionately decline to be interviewed. A third and more fundamental problem is definitional. Much of the debate over privacy is clouded by definitional differences in what constitutes an invasion of privacy, which range from gossipy neighbours to electronic surveillance").
 For Australia, see Privacy Commissioner, Community Attitudes to Privacy, Information Paper 3 (Canberra: AGPS, 1995), 7. For Canada, see Louis Harris & Associates (in association with A F Westin), Equifax Canada Report on Consumers and Privacy in the Information Age (Ville d'Anjou: Equifax Canada, 1995), 4; see also Fédération nationale des associations de consommateurs du Québec (FNACQ) & Public Interest Advocacy Centre (PIAC), Surveying Boundaries: Canadians and Their Personal Information (Ottawa/Québec: FNACQ/PIAC, 1995), 1, 54-55. For Denmark, see Instituttet for Fremtidsforskning, Danskernes holdninger til informationsteknologi (Copenhagen: Post Danmark, 1996), 50 & 96; see also AIM Media- og sociale undersøgelser, Holdning til registre, Job nr 7585 (Copenhagen: AIM, December 1986), 7. For (West) Germany, see H Becker, "Bürger in der Modernen Informationsgesellschaft", in Informationsgesellschaft oder Überwachungsstaat, Symposium of the Hessian State Government held in Wiesbaden, 3-5 September 1984 (Wiesbaden: Hessendienst der Staatskanzlei, 1984), 343, 411. For Norway, see E Gulløy, Undersøkelse om personvern: Holdninger og erfaringer 1997, Notat 97/46 (Oslo: Statistisk sentralbyrå, 1997), 17 & 29. For Sweden, see, eg, Central Bureau of Statistics ("Statistiska Centralbyrån" (SCB)), Data och integritet: Allmänhetens kunskaper och attityder allmänt och till SCB (Stockholm: SCB, March 1985), 27-28. For the UK, see Data Protection Registrar, Tenth Report of the Data Protection Registrar, June 1994 (London: HMSO, 1994), 79. For the USA, see, eg, Louis Harris & Associates (in association with A F Westin), Harris-Equifax Mid-Decade Consumer Privacy Survey 1995 (Atlanta: Equifax, 1995), 7.
 The surveys show that privacy and data protection appear, by and large, to be second-order concerns of the public. Moreover, they show that people tend not to forgo receiving some sort of service (eg credit, insurance) in order to protect their privacy. See, eg, J E Katz & A R Tassone, "Public Opinion Trends: Privacy and Information Technology" (1990) 54 Public Opinion Quarterly, 124, 127, 137-138; Gulløy, supra n 490, 17, 20-21, 30, 41-42; Data Protection Registrar, supra n 490, 79; Instituttet for Fremtidsforskning, supra n 490, 13, 55-60; Central Bureau of Statistics, supra n 490, 27-28.
 See Louis Harris & Associates (in association with A F Westin), supra n 490, 17.
 See, eg, Privacy Commissioner, supra n 490, 7 & 10; Louis Harris & Associates (Canada), supra n 490, 6 & 23; Central Bureau of Statistics, supra n 490, 25; Katz & Tassone, supra n 491, 128-129, 138; Louis Harris & Associates, supra n 490, 24 & 37. Note also the results of a Norwegian Scan-Fact survey undertaken in 1996, in which over 50% of respondents felt that development of the "datasamfunn" will lead to worsening "personvern" (defined in terms of the ease with which data about oneself can be accessed by others). These results were formerly available at URL <http://odin.dep.no/it/scanfact/kap6.html> (visited 12.2.1998); I have a printed copy of them on file.
 See, eg, I Székely, "New Rights and Old Concerns: Information Privacy in Public Opinion and in the Press in Hungary" (1994) 3 Informatization and the Public Sector, 99, 102-103; Becker, supra n 490, 412ff; Central Bureau of Statistics, supra n 490, 14-16, 34-36; Louis Harris & Associates, supra n 490, 40; Gandy, supra n 436, 140-141. Cf Instituttet for Fremtidsforskning, supra n 490, 54 & 80 (indicating relatively high levels of trust on the part of Danes). Cf also Hälsodataregister Vårdregister, SOU 1995:95, Appendix 2, 258 (indicating a slight reversal in 1995 of earlier drops in the percentage of Swedes supporting dissemination of data in the central population register to various organisations).
 Louis Harris & Associates, (Canada), supra n 490, vi-viii; Louis Harris & Associates, supra n 490, 12-14.
 See, eg, Louis Harris & Associates (in conjunction with A F Westin), The Dimensions of Privacy: A National Opinion Research Survey of Attitudes toward Privacy (Stevens Point, Wisconsin: Sentry Insurance, 1979), 100-101. Here Westin attempts to measure such alienation by analysing the extent to which persons agree or disagree with the following statements: (1) Technology has almost gotten out of control; (2) Government can generally be trusted to look after our interests; (3) The way one votes has no effect on what the government does; (4) In general, business helps us more than it harms us. Agreement with statements 1 and 3, or disagreement with the remaining two statements, is taken to indicate alienation. Note also R L Esquerra, Personal Privacy in a Computer Information Society, doctoral dissertation, University of Arizona, 1982 (Ann Arbor: University Microfilms International, 1986). Employing the same "alienation index" as Westin, Esquerra found "definite systemic relationships between the degree of alienation an Arizona resident feels and his or her attitude toward issues of privacy and perceptions of computer benefits and dangers": ibid, 292. See further chapt 14 of Esquerra's thesis.
 See espec U Beck, Risikogesellschaft. Auf dem Weg in eine andere Moderne (Frankfurt am Main: Suhrkamp Verlag, 1986) - published in English as Risk Society: Towards a New Modernity (London: Sage Publications, 1992). Other prominent sociologists, such as Niklas Luhmann and Anthony Giddens, also take up the theme: see espec N Luhmann, Soziologie des Risikos (Berlin/New York: Walter de Gruyter, 1991) - published in English as Risk: A Sociological Theory (Berlin/New York: Walter de Gruyter, 1993); Giddens, The Consequences of Modernity, supra n 380, 124ff; Giddens, Modernity and Self-Identity, supra n 406, espec 27ff.
 Beck, supra n 497, 53.
 Ibid, 33.
 Beck, for example, employs the concept of risk to signify threats (primarily to the natural environment and human life) arising out of "industrial overproduction": ibid, 21. Cf Luhmann, supra n 497, in which risk is seen as essentially a product of human perception and communication. Concomitantly, Luhmann views the preoccupation with risk in modern society as not just technology-induced, but also a result of expansion of the administrative sphere, a concern to secure certainty, stability and calculability, and growth in our body of knowledge.
 See further Chapter 7 (section 7.3). This dimension of data protection laws is emphasised particularly in the work of Herbert Burkert: see espec his article "Systemvertrauen: Ein Versuch über einige Zusammenhänge zwischen Karte und Datenschutz" (1991) à la Card Euro-Journal, no 1, 52-66 (also available at URL <http://www.gmd.de/People/Herbert.Burkert/CardTrust.html> (visited 31.5.1999)).
 See further Chapters 7 (section 7.2.5) and 18 (section 18.4.2).
 Burkert, supra n 501 ("Die Funktion des Datenschutzes ist es, das generelle Vertrauen in das allgemeine Rechtssystem mit dem Vertrauen in das technische System `Information und Kommunikation' zu koppeln und letzteres von ersterem profitieren zu lassen").
 See also, ia, Burkert, supra n 342; S Simitis, "New Trends in National and International Data Protection Law", in J Dumortier (ed), Recent Developments in Data Privacy Law (Leuven: Leuven University Press, 1992), 17.
 I write "partly" symptomatic because, in some cases, these features of data protection laws are arguably also the result of a desire not to fundamentally upset organisations' existing data-processing practices: see further Chapter 7 (section 7.3).
 See generally Bennett, supra n 10, 146 & 243.
 Glenn English, quoted in Dutton & Meadow, supra n 489, 148.
 See Bennett, supra n 10, 127ff.
 See, eg, Industry Canada & Justice Canada, Task Force on Electronic Commerce, The Protection of Personal Information - Building Canada's Information Economy and Society (Ottawa: Industry Canada/Justice Canada, 1998), 6 ("In an environment where over half of Canadians agree that the information highway is reducing the level of privacy in Canada, ensuring consumer confidence is key to securing growth in the Canadian information economy. Legislation that establishes a set of common rules for the protection of personal information will help to build consumer confidence ..."). See also the Bangemann Report, supra n 397, 33 ("Without the legal security of a Union-wide approach [with regard to privacy], lack of consumer confidence will certainly undermine the rapid development of the information society. Given the importance and sensitivity of the privacy issue, a fast decision from Member States is required on the Commission's proposed Directive setting out general principles of data protection").
 See Chapter 4 (section 4.5).
 See I J Lloyd, "The Data Protection Act - Little Brother Fights Back?" (1985) 48 Modern L Rev, 190, 190-191; Bennett, supra n 10, 141-143.
 M D Kirby, "Legal Aspects of Transborder Data Flows" (1991) 5 Int Computer Law Adviser, no 5, 4, 5-6.
 See, eg, Commission Communication of 13.9.1990 on the protection of individuals in relation to the processing of personal data in the Community and information security (COM(90) 314 final), 4 ("The diversity of national approaches and the lack of a system of protection at Community level are an obstacle to completion of the internal market"). See also recitals 3, 5 & 7 in the preamble to the EC Directive.
 K R Pinegar, "Privacy Protection Acts: Privacy Protectionism or Economic Protectionism?" (1984) 12 Int Business Lawyer, 183-188. See also, ia, Office of the US Special Trade Representative, "Trade Barriers to Telecommunications, Data and Information Services" (1981) 4 TDR, no 5, 53; G S Grossman, "Transborder Data Flow: Separating the Privacy Interests of Individuals and Corporations" (1982) 4 Northwestern J of Int Law and Business, no 1, 1-36; R P McGuire, "The Information Age: An Introduction to Transborder Data Flow" (1979-80) 20 Jurimetrics J, 1-7; C Rumbelow, "Privacy and Transborder Data Flow in the UK and Europe" (1984) 12 Int Business Lawyer, 153-157.
 See generally D P Farnsworth, "Data Privacy: an American's View of European Legislation" (1983) 6 TDR, no 5, 285-290.
 Joinet is quoted as remarking, ia, that "[i]nformation has an economic value and the ability to store and process certain types of data may well give one country political and technological advantage over other countries": Pinegar, supra n 514, 187. McGuire also quotes Joinet on this point: McGuire, supra n 514, 3. Stadler is quoted as remarking that the "economic-technical" situation of national governments "forces [them] ... to make a decision, if they want to promote the interests of national enterprises, or to trust in the inland information market to represent their national companies in foreign countries. It seems to be certain that these three fields of interest - and the aim of securing the privacy of the citizens - do not correspond, but are in competition ... There exists already the fear to be dependent on decisions outside the [sic] own sovereignty". See Pinegar, id. In addition, Pinegar seizes upon a study published in 1978 by the French Ministry of Justice. According to Pinegar, the study "urged legislative action and official policies that would encourage an increase in the French (and European) share of the computer and data processing market, foster decreasing dependence on US data networks, and increase and improve the quality of Europe's national networks": Pinegar, id.
 See the discussion on regulating transborder data flows in Ot prp 2 (1977-78), 9-10, 96. See also H Seip, "Unfair Competition in Computer Services?" (1981) 4 TDR, no 8, 33 (protectionist motives "are neither indicated by the legislative history of the Act nor by the way it has made itself felt in practice").
 See Ellger, supra n 130, 428-430 (concluding on the basis of an in-depth examination of the data protection regimes of Austria, Sweden, Denmark, Norway, France, the Federal Republic of Germany and the UK that, at least up until 1990, there is no solid evidence that rules for restricting TBDF under these regimes have operated as "non-tariff trade barriers"). Ellger points out also (ibid, 429 & 270) that only an extremely small percentage of cross-border transfers of personal data have been stopped. For example, in the period 1.1.1980 to 1.1.1985, approximately 1500 applications were made for permission to transfer personal data out of Austria; in less than one percent of cases was permission refused. The findings of an earlier, albeit narrower, study by Bing are in line with Ellger's findings: see J Bing, Data Protection in Practice - International Service Bureaux and Transnational Data Flows, CompLex 1/85 (Oslo: Universitetsforlaget, 1985).
 Blume, supra n 152, 129. Blume's claims are based on comments of the Danish Ministry of Justice during parliamentary debate in the lead-up to enactment of the Private Registers Act: see Folketingstidende 1977/78, Tillæg A sp. 614-15.
 Pinegar, supra n 514, 188; Grossman, supra n 514, 12, 20. See also McGuire, supra n 514, 4.
 See generally W J Kirsch, "The Protection of Privacy and Transborder Flows of Personal Data: the Work of the Council of Europe, the Organization for Economic Co-operation and Development and the European Economic Community" (1982) Legal Issues of European Integration, no 2, 21, 34-37; H Geiger, "Europäischer Informationsmarkt und Datenschutz" (1989) 5 RDV, 203-210; R Ellger, "Datenschutzgesetz und europäischer Binnenmarkt (Teil 1)" (1991) 7 RDV, 57, 59-61; Simitis, supra n 153, 7-8.
 See COM(90) 314 final, 13.9.1990, 4 ("The diversity of national approaches and the lack of a system of protection at Community level are an obstacle to completion of the internal market. [...] A Community approach towards the protection of individuals in relation to the processing of personal data is also essential to the development of the data processing industry and of value-added data communication services").
 Adopted 16.12.1966; in force 23.3.1976.
 These being the European Convention for the Protection of Human Rights and Fundamental Freedoms (ECHR) of 1950 (supra n 7), the American Declaration of the Rights and Duties of Man (ADRDM) of 1948, the American Convention on Human Rights (ACHR) of 1969 (in force 18.7.1978), and the African Charter on Human and Peoples' Rights (ACHPR) of 1981 (in force 21.10.1986).
 See Art 1 of the Convention and Art 1 and recital 10 of the Directive. All three provisions are set out in Chapter 2 (section 2.3).
 See Art 12 of the UDHR, Art 17 of the ICCPR (set out further below), Art 8 of the ECHR (set out further below), Art V of the ADRDM and Art 11 of the ACHR. Cf the ACHPR, which omits express protection for privacy or private life. This omission is not repeated in all human rights catalogues generated outside the Western, liberal-democratic sphere. See, eg, the Cairo Declaration on Human Rights in Islam of 5.8.1990 (UN Doc A/45/421/5/21797, 199), Art 18 of which expressly recognises a right to privacy for individuals.
 See Art 1 of both instruments set out in Chapter 2 (section 2.3). Note too the Preamble to Australia's federal Privacy Act (indicating that the Act is, in part, "necessary to give effect to the right of persons not to be subjected to arbitrary or unlawful interference with their privacy, family, home or correspondence" pursuant to Art 17 of the ICCPR).
 For a comprehensive analysis, see Bygrave, supra n 162.
 While the views reached by the Committee are not binding under international law, they carry a great deal of weight otherwise. See further D McGoldrick, The Human Rights Committee: Its Role in the Development of the International Covenant on Civil and Political Rights (Oxford: Clarendon Press, 1991), 151-152 and references cited therein; M Nowak, U.N. Covenant on Civil and Political Rights: CCPR Commentary (Kehl am Rhein/Strasbourg/Arlington: Engel, 1993), xix. The International Court of Justice does not have jurisdiction to hear complaints concerning breaches of the Covenant.
 General Comment 16, adopted 23.3.1988 (UN Doc A/43/40, 181-183), paras 7 & 10.
 Bygrave, supra n 162.
 As for the relevant case law pursuant to Art 8 of the ECHR, this is relatively extensive but has not yielded within the confines of one ruling a sweeping pronouncement on data protection principles along the lines of General Comment 16 of the Human Rights Committee. Instead, the Strasbourg organs have inched towards a recognition of various data protection guarantees in Art 8 on a case-by-case basis. See further Bygrave, supra n 162. Case law pursuant to other articles in the ECHR - notably Arts 6, 10 and 13 - has also on occasion touched upon data protection issues, but it is Art 8 case law that is of central importance here.
 See, eg, s 2(a) of the US federal Privacy Act (stating that the purpose of the Act is, ia, to safeguard citizens' "right to privacy", and describing the latter as "a personal and fundamental right protected by the Constitution of the United States" (s 2(a)(4))). See also the preamble to the Netherlands' data protection law of 1988.
 See, eg, Art 59(1) of the Hungarian Constitution of 1949: "Everyone in the Republic of Hungary shall have the right to good reputation, the inviolability of the privacy of his home and correspondence, and the protection of his personal data". See also, ia, Art 35 of the Portuguese Constitution of 1976; and Art 19(3) of the Slovak Constitution of 1992. Note that all Constitutional references here and in the following are taken from the comprehensive, regularly updated collection of national Constitutions available via URL <http://www.uni-wuerzburg.de/law/> (last visited 30.5.1999).
 See, eg, Art 8 of Chapt II of the Finnish Constitution of 1919: "Everyone's private life, honour and home shall be secured. More specific provisions on the protection of personal data shall be prescribed and specified by an Act of Parliament". See also, ia, Art 10(2) & (3) of the Netherlands' Constitution of 1983; and Art 18(4) of the Spanish Constitution of 1978. Cf the less stringent requirement in Art 3 of Chapt 2 of Sweden's Instrument of Government of 1975 (Regeringsformen, SFS 1974:152): "Citizens shall be protected to the extent determined in detail by law against any infringement of their personal integrity resulting from the registration of information about them by means of electronic data processing".
 65 BVerfGE, 1; also available at URL <http://www.uni-wuerzburg.de/glaw/bv065001.html> (last visited 30.5.1999). For an English translation of the Court's decision, see (1984) 5 HRLJ, no 1, 94ff. For detailed commentary, see, ia, S Simitis, "Die informationelle Selbstbestimmung - Grundbedingung einer verfassungskonformen Informationsordnung" (1984) Neue juristische Wochenschrift, 398-405; and E H Riedel, "New Bearings in German Data Protection: Census Act 1983 Partially Unconstitutional" (1984) 5 HRLJ, no 1, 67-75. For analysis of the decision in the light of the Court's subsequent case law, see J Aulehner, "10 Jahre `Volkszählungs'-Urteil: Rechtsgut und Schutzbereich des Rechts auf informationelle Selbstbestimmung in der Rechtsprechung" (1993) 7 CR, 446-455. For comparison of the decision with the equivalent case law of the US Supreme Court, see P M Schwartz, "The Computer in German and American Constitutional Law: Towards an American Right of Informational Self-Determination" (1989) 37 American J of Comparative Law, 675-701.
 65 BVerfGE, 43 ("Das Grundrecht gewährleistet ... die Befugnis des Einzelnen, grundsätzlich selbst über die Preisgabe und Verwendung seiner persönlichen Daten zu bestimmen").
 Ibid, 43-44.
 Ibid, 46ff.
 See supra n 534.
 See Hungary's Official Gazette (Magyar Közlöny), No 30, 13.4.1991, 805. An unofficial English translation of the decision is available via URL <http://snyside.sunnyside.com/cpsr//privacy/privacy_international/country_reports/hungary>. For commentary on the court's decision and its impact on Hungarian society, see I Székely, "Hungary Outlaws Personal Number" (1991) 14 TDR, no 5, 25-27.
 See part II of the judgment.
 See part III, point 3 of the judgment.
 See generally Simitis, supra n 102, paras 26ff.
 See, eg, M Allars, Introduction to Australian Administrative Law (Sydney: Butterworths, 1990), chapt 6, for an overview of these principles as found in Australian administrative law. For an overview of the equivalent principles as found in Norwegian administrative law, see, eg, Eckhoff & Smith, supra n 19, chapts 18, 23 & 24.
 For an overview of such provisions, see Chapter 3 (section 3.5).
 For an overview of such provisions, see Chapter 3 (section 3.6).
 See further the overview of doctrines on rule of law in E Boe, "Forholdet mellom rule of law og rettssikkerhet", in D R Doublet, K Krüger & A Strandbakken (eds), Stat, politikk og folkestyre: festskrift til Per Stavang på 70-årsdagen (Bergen: Alma Mater, 1998), 43-65; and Allars, supra n 545, 14ff.
 See further Chapter 7 (section 7.2.5).
 These and other aspects of the relationship between doctrines on rule of law and data protection are discussed more fully in, ia, D W Schartum, "Mot et helhetlig perspektiv på publikumsinteresser i offentlig forvaltning? - Rettssikkerhet, personvern og service" (1993) 16 Retfærd, no 63, 43, espec 51-52; D W Schartum, Rettssikkerhet og systemutvikling i offentlig forvaltning (Oslo: Universitetsforlaget, 1993), 58ff; and Pseudonyme helseregistre, NOU 1993:22, 46-47. See also Chapter 7 (section 7.2.5). Note that the above-cited works focus upon the concept of "rettssikkerhet", which is the Norwegian equivalent of the concept of rule of law. While the two concepts are broadly similar, they are not completely identical in ambit and concern: see further Boe, supra n 548. These differences, however, have little significance for the discussion above. As an aside, I see no real reason - apart from the weight of tradition - for maintaining the two limits, identified above, on the concerns of rule of law doctrines. I see such doctrines as logically capable of being applied to private sector practices and to the processing of personal information relatively independently of specific decision-making processes. This does not mean, however, that I am in favour of data protection laws being fully subsumed under the notion of rule of law: see further Chapter 8.
 For Norwegian examples, see infra n 672 et seq and accompanying text.
 This is exemplified in the following claims by Samuel Warren and Louis Brandeis in their seminal law review article on the right to privacy in Anglo-American common law: "The right of property in its widest sense, including all possession, including all rights and privileges, and hence embracing the right to an inviolate personality, affords alone that broad basis upon which the protection which the individual demands can be rested": S Warren & L Brandeis, "The Right to Privacy" (1890-91) 4 Harvard L Rev, 193, 211. In other parts of their article, however, Warren and Brandeis seem to view such a broad use of the notion of property rights as out of keeping with usual understanding of the notion: see, eg, ibid, 213 ("the principle which has been applied to protect these rights is in reality not the principle of private property, unless that word be used in an extended and unusual sense").
 See further D Elgesem, "Remarks on the Right of Data Protection", in J Bing & O Torvund (eds), 25 Years Anniversary Anthology in Computers and Law (Oslo: TANO, 1995), 83, 90ff (analysing the "property function" of data protection laws; ie, the way in which the latter help secure a data subject's "claim to ex ante agreement to the transfer of personal information").
 Alan Westin is the primary example. In Privacy and Freedom, Westin writes: "personal information, thought of as the right of decision over one's private personality, should be defined as a property right, with all the restraints on interference by public or private authorities and due-process guarantees that our law of property has been so skillful in devising. Along with this concept should go the idea that circulation of personal information by someone other than the owner or his trusted agent is handling a dangerous commodity in interstate commerce, and creates special duties and liabilities on the information utility or government system handling it": supra n 355, 324-325.
 See, eg, P Mell, "Seeking Shade in a Land of Perpetual Sunlight: Privacy as Property in the Electronic Wilderness" (1996) 11 Berkeley Technology LJ, 1, espec 74ff; R T Nimmer & P A Krauthaus, "Information as Property: Databases and Commercial Property" (1993) 1 Int J of Law and Information Technology, 3, espec 29ff; P Blume, "New Technologies and Human Rights: Data Protection, Privacy and the Information Society", Paper no 67, Institute of Legal Science, Section B, University of Copenhagen, 1998, 4. Note too that, in recent years, some Norwegian medical organisations have issued statements supporting the notion of patients "owning" data concerning their medical condition: see Ø Rasmussen, Kommunikasjonsrett og taushetsplikt i helsevesenet (Ålesund: AS Borgund, 1998), 53-54 and references cited therein. According to Rasmussen, a property rights perspective has also been embraced by Erik Boe in a review of developments in Norwegian administrative law: ibid, 53 (claiming that Boe "tar til orde for at datasubjektets rådighet bør sees i et eiendomsrettslig perspektiv"). However, Rasmussen overplays his claim. In the article concerned (see E Boe, "Utviklingslinjer i forvaltningsretten" (1989) LoR, 51, 69), Boe does not expressly advocate adopting a property rights perspective; rather, he claims that a "right to control over one's personality" based on such a perspective is an issue that must be resolved ("På sikt må det fundamentale spørsmålet om `retten til å råde over sin personlighet' ... - etter mer eller mindre `eiendomsrettslige' modeller - bli avklart"). In a subsequent article, Boe expresses scepticism over the use of "market-systems" whereby individuals price and sell their privacy, because these systems "individualise and commercialise privacy protection" ("individualiserer og kommersialiserer personvernet"): E Boe, "`The Right to Privacy' i USA" (1994) LoR, 577, 578.
 See, eg, Y Poullet, "Data Protection between Property and Liberties - A Civil Law Approach", in H W K Kaspersen & A Oskamp (eds), Amongst Friends in Computers and Law: A Collection of Essays in Remembrance of Guy Vandenberghe (Deventer/Boston: Kluwer Law & Taxation Publishers, 1990), 161-181; Miller, supra n 355, 211ff; R Wacks, Personal Information: Privacy and the Law (Oxford: Clarendon Press, 1989), 49; Selmer, supra n 38, 13. I am also sceptical of adopting a regulatory approach in the field of data protection based on property rights doctrines. This is for several reasons. First, adoption of such an approach could lead to a commodification of data protection rights and ideals which favours certain sectors of the population more than others. Secondly, it is questionable whether adoption of property rights approaches will assist arguments for providing increased levels of data protection, as such rights - like most other rights - are seldom applied in an absolute manner. Thirdly, the conceptual propriety and utility of the notion of "ownership" of personal data/information are doubtful. Fourthly, many of the challenges that data protection law and policy face cannot be adequately addressed under the property rights rubric. One such challenge, for instance, concerns the ability (or, rather, increasing inability) of data subjects to comprehend the logic of information systems.
 See further Chapter 7 (section 7.2.4) and references cited therein.
 There is, eg, a paucity of references to human rights in the travaux préparatoires to the PDRA. Cf Bing, supra n 86, 232 (claiming in 1981 that Norwegian and other European data protection laws are "more closely related to the law of public administration than to the law of individual liberties").
 See, ia, P Falck, Personvern som menneskerett. Den europeiske menneskerettighetskonvensjon artikkel 8 som skranke for innsamling, behandling og bruk av personopplysninger, Det juridiske fakultets skriftserie nr. 56 (Bergen: University of Bergen, 1995); J P Berg, "Offentlige skattelister - i strid med EMK?" (1998) Kritisk Juss, 203-204; Bygrave, supra n 162; Et bedre personvern - forslag til lov om behandling av personopplysninger, NOU 1997:19, 41-42. Cf Rasmussen, supra n 555, 50-52 (underplaying this importance somewhat).
 See, eg, H Burkert, "Access to Information and Data Protection Considerations", in C de Terwangne, H Burkert & Y Poullet (eds), Towards a Legal Framework for a Diffusion Policy for Data held by the Public Sector (Deventer/Boston: Kluwer Law & Taxation Publishers, 1995), 23, 49.
 See further Bygrave, supra n 162 with respect to case law pursuant to Art 17 of the ICCPR and Art 8 of the ECHR.
 See the Belgian case law referred to infra n 777.
 For Australia, see particularly ALRC, supra n 295, vol 1, part III, espec 476-477. For Denmark, see particularly Delbetænkning om private registre, Bet 687 (Copenhagen: Statens trykningskontor, 1973), espec 39-40; Delbetænkning om offentlige registre, Bet 767 (Copenhagen: Statens trykningskontor, 1976), espec 147-148. For Norway, see particularly Offentlige persondatasystem og personvern, NOU 1975:10, espec 43-50; Persondata og personvern, NOU 1974:22, espec 20-23. For Sweden, see particularly Data och integritet, SOU 1972:47, espec 61-64. For Switzerland, see espec Botschaft zum Bundesgesetz über den Datenschutz vom 23.3.1988, 8-9. For the USA, see particularly Westin, supra n 355, chapts 13-14; Miller, supra n 355, chapts V-VI.
 See Chapter 2, § 2 of the Freedom of the Press Act of 1949 (Tryckfrihetsförordningen, SFS 1949:105), which is part of the Swedish Constitution. This right of access was first established in the Freedom of the Press Act of 1766.
 See further Bennett, supra n 10, 62-65.
 Flaherty, supra n 298, 99.
 Resolution (73)22 on the Protection of the Privacy of Individuals vis-à-vis Electronic Data Banks in the Private Sector (adopted 26.09.1973), and Resolution (74)29 on the Protection of the Privacy of Individuals vis-à-vis Electronic Data Banks in the Public Sector (adopted 24.09.1974).
 See, eg, Hondius, supra n 101, 63ff and references cited therein.
 See generally Bygrave, supra n 162.