Chapter 3: Models For Self-regulation
A. Regulatory Models for Protecting Privacy in the Internet
Digital computer networking technologies threaten personal privacy because they make it easier to collect, store, retrieve, and disseminate data on individuals. While the norms for protecting personal privacy when these technologies are widely used are reasonably well agreed upon, the legal frameworks for applying those norms do not exist, and there is little agreement on the best regulatory approaches. This paper considers alternative regulatory models, emphasizing the preconditions for effective self-regulation.
The basic norms for protecting personal privacy are reasonably well agreed upon. Essentially the same norms are embodied in the Federal Privacy Act in the United States, in the Code of Good Practice for the industry, and in the European and British privacy protection schemes. All of these legal texts include the following norms:
No personal data record keeping system may be maintained in secret.
Individuals must have a means of determining what information about them is in a record and how it is used.
Individuals must have a means of preventing information about them obtained for one purpose from being used or made available for other purposes without their consent.
Individuals must have a means to correct or amend a record of identifiable information about themselves.
Limits should be placed on the disclosure of certain personal information to third parties.
The individual whose request for change is denied may file a statement of disagreement, which must be included in the record and disclosed along with it thereafter.
Organizations creating, maintaining, using, or disseminating records of identifiable personal data must assure the reliability of the data for their intended use and must take reasonable precautions to prevent misuses of the data.
An individual should have a means of seeking review of a denied request or an alleged violation of duty.
BASIC REGULATORY ALTERNATIVES
While the norms are agreed upon, there is little agreement on the best way for the law to enforce them. Privacy advocates have been highly effective in directing public attention to perceived threats against individual privacy resulting from commercial exploitation of transactional data and certain uses of health, demographic and vital records information, but they have been less coherent and articulate in defining legal proposals.
At one extreme is purely private enforcement of norms through civil actions for invasion of privacy. At the other extreme is the approach adopted in the European Commission directive and in Britain, which requires advance licensing of databases that might tend to give rise to privacy invasions. In between are more narrowly focused regimes, such as those contained in the recently enacted Health Care Portability Act, and proposals to establish a property right in personal data.
Until the early 1990s it would have been reasonable to project growing support in the United States for a European-style privacy regulatory regime, basically a federal legislative extension of federal Privacy Act duties to most of the private sector, which also would correspond to legally imposing the DMA's voluntary code of Fair Information Practices. Traders in personal information would be obligated to notify subjects of data of intended uses, to disclose data on request, to accept corrections, to limit dissemination to uses contemplated when the data were collected, and to meet duties of reasonable care with respect to accuracy. Part of this scenario very well could have included some kind of privacy regulatory agency. Licensing of private exploiters of personal data, on the British model, was less likely.
Now, enthusiasm for the European model has abated, possibly because initial implementation of the European privacy directive has raised a number of difficult implementation issues and has not, in fact, provided revolutionary new protections for individual privacy. The European system has proven both less effective and more difficult to administer than its advocates foresaw. But in the absence of a consensus among privacy advocates in support of the European model, what kind of response to growing political pressure for some kind of privacy protection is conceivable?
One possibility is the establishment of a privacy commission or privacy ombudsman. Such a new governmental institution would not have enforcement powers, and indeed the regulatory regime well might lack legally enforceable duties of any kind. Instead, there would be a governmental watchdog to direct public attention to firms that violate good privacy practice, such as that expressed by the DMA guidelines.
A second scenario would expand legal duties more or less corresponding to those contained in the FCRA to a broader class of traders in personal data. This would encompass--and probably preempt--the recent proposals for protection of health information and children's privacy, and would address concerns about competitive damage if the U.S. does not provide protection "equivalent" to the European standards. This scenario is not dramatically different in substance from adoption of the European model, but would be more incremental, relying to a greater extent on existing statutory language, existing case law, and perhaps on the Federal Trade Commission for interpretation and enforcement.
Under a third scenario, an individual property-type right would be created in personal data. Such a right might be created by legislatures, or by courts in response to lawsuits such as that brought by Avrahami against U.S. News and World Report. State legislation along these lines is more likely, and federal legislation less likely, than movement in the direction represented by scenario two, because of the superficial attraction in smaller and more volatile political communities of the notion that ordinary people should have some kind of veto over other people making money off information about them. The third scenario might include a legislatively mandated opt-out for public agencies, such as motor vehicle administrations, that possess personal information.
An individual (or institutional) property right in personal information is problematic for at least three reasons. First, any system to allow individuals a meaningful opt-out power would be cumbersome and expensive to administer, and prone to actual or claimed errors. Second, if even a few subjects opt not to have data about them acquired or disclosed, the commercial utility of the resulting data collections would be diminished for those interested in finding "skipped" persons, or in direct marketing uses of the data.
Third, possession of an opt-out power would encourage those possessing the power to demand compensation for not exercising the opt-out. Thus traders in personal data would be confronted with a growing need to pay not only governmental sources of data, but also individual subjects. This not only would increase the costs of data acquisition; it also would be a nightmare to administer, given the difficulty of tracing data sources, the small amounts of money likely to be involved, and the limitations of existing electronic payment systems when large numbers of individuals and commercial entities not bound by ongoing commercial relationships are involved. The possibility that some individuals would demand a high price for information about them is less serious than the possibility that many would demand a dollar.
Indeed the problems with a strong property-based approach are so great that legislatures probably could be dissuaded from pursuing this scenario in a strong form, although courts hearing individual lawsuits may be less competent to consider such systemic objections to recognition of property rights.
If significant political momentum develops in state legislatures behind a property-based approach, it likely can be deflected into actual legislative language that makes it relatively easy to infer waiver of the opt-out power, and to presume acceptance of standard waiver terms written into basic documents through which information is acquired. Still, this middle ground is difficult to fashion logically for data disclosure compelled by law, such as that involved in many vital records held by governmental authorities. There, some of the difficulties with even a limited number of individuals opting out of commercial exploitation of data about them would remain.
None of these approaches has received widespread support. Although the European directive is in force, it remains to be seen exactly how members of the European community will implement it and how effective implementation will work out in practice. One particular problem is that in order for such a regulatory regime to be effective, it must restrict movement of data outside the geographic limits of the regulatory regime, and it is far from clear how this will take place, except through contractual restrictions which are tantamount to delegating implementation to private regulation.
The Fair Credit Reporting Act and Health Care Portability Act approaches could be extended to personal databases more generally, but there does not appear to be political support in the United States to do that at present.
It is conceivable, however, that the growth of data enterprises, and growing public sensitivity to real and perceived abuses of personal data, could change the political balance and cause the Congress to enact legislation. At the same time the growing use and visibility of personal data for commercial purposes is stimulating more and more negative political reaction around the world. Even more likely is the growth of a patchwork of state statutes and regulations restricting access to and disclosure of personal data. Reportedly a 1991 Time/CNN poll found 93% of Americans favoring regulations prohibiting sale of information without permission of the subject.1
The European Commission's data privacy directive obligates national governments in Europe to restrict private sector database assembly and use. One important aspect of the directive restricts data transfer to other countries lacking equivalent regulation. In one recent article, Joel R. Reidenberg, a law professor and prominent commentator on privacy law, expressed the view that the political equation may be shifting in favor of comprehensive data privacy regulation in the United States, as the business community sees such a regulatory regime necessary to promote global business activity.2 Professor Reidenberg's view is not unusual.
The Canadian Information Highway Advisory Council recently released a report, "Building the Information Society: Moving Canada into the 21st Century," which, among other things, recommended new legal rights for subjects of electronic databases.
The principal problem with self-regulation is political: consumer groups and privacy advocates are concerned that self-regulation would not be sufficiently protective of the interests of data subjects. Assuming that problem can be overcome, there still are a number of legal and conceptual challenges to make self-regulation effective.
While regulatory programs succeed largely through voluntary compliance, they also must have some means of enforcement. Otherwise, recalcitrant members of an industry can not only avoid the regulatory norms, but also gain competitive advantage over those who do comply with the norms. Any regulatory program seeks to limit competition on the subject matter of the regulation, and like any cartel, a self-regulation framework tends to unravel because of "cheaters."
Any enforcement mechanism must have a way of defining those who are bound by the rules set by the rulemakers, and must have some defined procedures for making the rules and for adjudicating claims of noncompliance. Many private associations perform these functions, defining membership, making rules through governing bodies defined in by-laws, and adjudicating, with more or less formality, claims of noncompliance with the rules. A typical private association punishes noncompliance with expulsion. The effectiveness of expulsion depends on the association having some benefits desired by those subject to the sanction.
One of the problems in designing a self-regulation scheme is defining the benefits that would be cut off by expulsion. In the absence of such benefits, expulsion is an empty sanction, noncompliance cannot be punished, and the regulatory scheme tends to be ineffective.
One of the most interesting recent proposals for a self-regulatory mechanism is to build self-regulation around the administration of domain names. Promoted by David R. Johnson and David Post, this means of self-governance would require all those obtaining an Internet domain to comply with certain rules developed by one or more private associations that administer the domain registry. Failure to comply with the rules would result in cancellation of the domain and denial of participation in the Internet.
That takes care of the carrot and stick problem. It does not take care of another problem arising from application of competition (antitrust) law in the United States and elsewhere. It is a violation of competition law for competitors to combine to set the terms of competition and then to enforce those terms on their competitors. That is exactly what self-regulation entails, although it may be scrutinized less harshly than a cartel that would set the terms of competition on prices.
There are two ways to deal with the antitrust problem in the absence of a statutory privilege for self-regulation. One is to follow the model for standard setting bodies under antitrust law. The other is to follow the model of collective bargaining.
Because collective bargaining is not well understood by many people concerned with cyberspace and cyber law, it may be useful to review the basic concepts.
Collective bargaining functions for a "bargaining unit"--a group of employees defined in advance by the union and employer, or by the National Labor Relations Board through a certification process. People within the bargaining unit are bound by terms of employment negotiated through collective bargaining, and are legally prohibited from negotiating individual terms of employment in conflict with the collective ones. People outside the bargaining unit are not bound by the collectively bargained terms.
A duty of fair representation obligates the bargaining representative (usually a trade union) to represent everyone in the bargaining unit fairly when it negotiates terms of employment and when it handles grievances (individual claims that the collective agreement has been violated). No particular procedures are mandated by the duty of fair representation, and in fact the way in which bargaining power is delegated to negotiators varies considerably from union to union and from employer to employer.
As long as bargaining takes place in the context of employer/employee relations, the statutory and nonstatutory exemptions from the antitrust laws operate to shield unions and employers and their individual representatives from antitrust liability.
In extending the collective bargaining model to other forms of self-regulation, one needs an analogue of the bargaining unit, of the fair representation duty, and of the antitrust exemption. The bargaining unit concept easily can be developed privately. As noted earlier, all private associations define their membership. Although the bargaining unit is a distinct concept from union membership (someone can be in the bargaining unit and thus bound by the collective agreement without being a union member), association membership is a reasonable proxy for membership in a bargaining unit. The duty of fair representation probably exists as a matter of common law for any association that undertakes to represent the interests of its members or other defined constituents or beneficiaries. Trust and fiduciary concepts need not be extended very far to resemble the duty of fair representation.
The biggest problem is the antitrust exemption. The labor exemption does not extend beyond traditional employment relationships, and self-regulatory mechanisms to protect privacy in cyberspace do not ordinarily concern the employment relationship.
1 Epic Alert 3.11 (May 29, 1996), http://www.epic.org/.
2 Joel R. Reidenberg & Francoise Gamet-Pol, The Fundamental Role of Privacy and Confidence in the Network, 30 Wake Forest L. Rev. 105 (1995).
Self-regulation: Some Dutch Experiences
Self-regulation has become a hallmark of the "second generation" of data protection laws in Europe, which is characterized by an effort to reduce administrative burdens for personal data controllers. In the United Kingdom, the Data Protection Registrar is expressly charged with encouraging trade associations or other bodies of data users to provide their members with codes of practice. The European Union Data Protection Directive in Article 27 charges both the member-States and the European Commission to promote national and Union-wide codes of conduct for specific sectors.
The Netherlands provided an early example of self-regulation in this field. More than twenty years ago, in 1974, a Royal Commission was charged with studying the need for privacy legislation. In its interim report, the Commission invited social institutions to join in the data protection effort. Major elements of data protection are not solely dependent on government regulation, the Royal Commission argued, but may be put into place by the involved parties. Personnel registration could be the subject of collective labor agreements, universities could set limits on their systems of student records, the credit industry could adopt rules on bad-debtor files, and so on.1 So the element of self-regulation played a role early on during the protracted process leading to the adoption of data protection legislation. The Royal Commission eventually worked out a three-layered approach:
Notification of simple personal data systems of an internal character (such as payroll systems or customers files) to a special supervisory body, called the Registration Chamber, that was to maintain a national public register.
In addition to notification, controllers of non-routine systems would be required to draft privacy regulations subject to control by the Registration Chamber.2
Databases that store sensitive data or routinely transmit personal data to third parties were to be subjected to a full license by the Registration Chamber.
The first Bill based on these proposals ran into a newly inaugurated political campaign of deregulation. As a result, in the final Bill the element of self-regulation by personal record keepers was stressed even more. The licensing system was dropped, as was the public register, and other procedures were simplified. Sectoral self-regulation was introduced. This was in line with the practice that had evolved during the long-drawn debate on the proposals of the Royal Commission. A wide variety of responsible keepers of personal data had begun to issue privacy regulations of their own in anticipation that the law would be enacted some day.
Many years later the Data Protection Act of 1988 (DPA) was approved by parliament.3 It provides for two forms of self-regulation:
Compulsory self-regulation for individual users. The DPA provides for separate forms of self-regulation for private and public sector data banks:
Non-routine systems users in the private sector must notify the Registration Chamber, the supervisory body created by the law. Notification includes a description of the records system according to a questionnaire prescribed by the Registration Chamber.
For public sector data banks, privacy regulations must be issued by the responsible keeper along lines required by the DPA. Nineteen types of routine systems (such as payrolls, customer and student files) have been exempted from notification or regulation by Decree, provided they are kept within strict pre-set limits.
Optional self-regulation on sector level.
The law provides for the drafting of codes of conduct by business and professional sectoral organizations, preferably in consultation with groups of data subjects, that may be submitted to the Registration Chamber for approval. This provision is part of a carrot-and-stick approach: if self-regulation is found to be lacking in a particular field, the law authorizes the Government to issue sectoral privacy regulations by Decree. So far, because sectoral organizations have stepped in with their own codes, the Government has not used this authority.
A major overview of both forms of self-regulation practice in Holland was carried out in 1995 by Margriet Overkleeft-Verburg, a former member of the Dutch Registration Chamber, in a report to the Ministry of Justice that also served as her doctoral thesis.4
In the five-year period from January 1, 1990 to January 1, 1995 more than 50,000 personal data files were registered under the DPA, over two-thirds by public sector data controllers. Article 20 of the DPA sets forth eleven subjects that must be covered by public sector regulations:
1) The purpose of the record system.
2) Categories of persons about whom data are stored.
3) Categories of data that are stored and the way they are obtained.
4) Provisions for the removal of data (retention time limits).
5) Categories of persons or institutions to whom data may be disclosed.
6) Categories of data to be disclosed to third parties.
7) How direct access to the files is provided.
8) Linkage of the record system to any other data system.
9) Access by the data subject to and correction of the register.
10) Provisions for informing data subjects about disclosure of data.
11) The main elements of systems management.
These subjects also appear on the notification form for private sector databanks. As a result, the distinction between private sector notification requirements and public sector regulations has blurred. Both regulatory schemes leave the responsible keeper considerable flexibility. As an example, banks and insurance firms may prefer to "cluster" notification of their systems at the operating level instead of filing separate notifications for separate branches. Also, various modes of linkage (including front-end verification, matching and profiling) are left to the discretion of the data controller as long as they are compatible with the stated purpose of the system.
After an initial surge, the number of notifications has gradually declined. From a random sample Overkleeft concluded that the quality of notification and regulation differs widely. All in all, she bluntly termed the results of compulsory self-regulation "largely disappointing. It is ignored by many, or only taken as a token obligation." A major complication is the overspill of what Overkleeft terms "parallel information law" (on subjects such as freedom of information, secrecy, certain obligations under the Civil Code) to the field of data protection.
A positive result, however, is the willingness of professional and business organizations to develop model regulations and to provide information and support for self-regulation at the operational level. In general, self-regulation seems to have functioned best where:
- Self-regulation could build upon a tradition of secrecy pledges;
- Privacy rules also function as internal instructions or procedures; or
- Privacy rules can be of value as a marketing tool.
For self-regulation to be effective at the operational level, certain conditions have to be met. Overkleeft identifies five:
1) The information system is sufficiently stable over time;
2) An adequate survey has been made of existing and foreseeable information needs, both structural and incidental;
3) The effects of the law on concrete applications can be precisely defined;
4) The terms of the privacy regulations are framed with a sufficient degree of flexibility; and
5) The controller is willing and able to define and implement his own regulations.
A general side effect of notification and regulation schemes that emerges from this study is that of "bureaucracy and a fixation on formalities." This not only puts the normative function of operational self-regulation at risk but also its informational role in providing internal and external transparency. The reliance on notification and regulations has promoted an "administrative fallacy" of data protection as a formal mechanism.5 Data controllers tend to assume that they protect privacy as long as they have filled in the proper forms and have been properly registered. Data protection agencies tend to become--in Herbert Burkert's memorable phrase--"administrators of symbolism."6 A Dutch Minister of Justice openly warned that privacy protection "should not get lost in bureaucracy and red tape."7
A more dynamic, problem-oriented approach is called for, one of Holland's most distinguished jurists, Tymen Koopmans, suggests. Presently a Solicitor General to the Dutch Supreme Court, he is a former law professor and European Judge who more than twenty years ago chaired the Royal Commission on data protection. At the XVIth International Conference of Data Protection Commissioners in The Hague, Koopmans predicted that "informational privacy in the future will probably be assured by less emphasis on regulation and administrative systems of surveillance, and by more emphasis on individual rights and on protection by the courts."8
This calls not so much for a reliance on formal notification schemes as for more problem-oriented data protection policies, in which codes of good practice play a role.
CODES OF CONDUCT
The codes of conduct drafted by business and professional sectoral organizations under the provisions of the DPA have been characterized as "a bridge" between the substantive rules of the Data Protection Act and their implementation at the operational level. By mid-1995 sixteen codes based on the DPA had been realized, and by late-1996 most had been approved by the Registration Chamber. Lately it approved a privacy code for chipcard (smartcard) applications. The Chamber termed such a code a prerequisite for the massive introduction of chipcards in the Netherlands that is envisaged by the industry. The chipcard code represents the first major example of the use of a code of conduct to regulate new products and services. Any organization, having adopted a code of conduct, may apply for a declaration that in the Chamber's judgement the code conforms with the Act and meets reasonable requirements for the protection of the privacy of data subjects.
A formal application is usually submitted only "after extensive informal discussion," the President of the Dutch Registration Chamber, Peter Hustinx, noted at the International Conference on Data Protection and Privacy in 1991.9 The Chamber may consider the application if:
- the applicant is a representative of the sector concerned,
- the sector is precisely defined in the code, and
- the code has been drawn up with due care and notably in adequate consultation with interested parties.
The consultation requirement, in particular, may cause practical problems as it will not always be easy to find an adequate representative of data subjects, Hustinx noted.
A number of professional and business sectors have relied on a general privacy watchdog organization, called Stichting Waakzaamheid Persoonsregistratie (SWP) (Personal Registration Vigilance Foundation). SWP was formed in the wake of widespread popular unrest about the 1971 Census that also prompted government action on data protection legislation. But after almost twenty years of activity, SWP closed in 1994 after its newly formed privacy research center went bankrupt. Recently, the Registration Chamber seems to have somewhat relaxed its requirement of direct participation by interest groups and has accepted consultations of a more general nature.
To be approved, a code is expected to provide some "added value,"10 including detailed rules for the collection and use of personal data or a specific grievance procedure, and/or an in-house data protection officer (as distinguished from the general role of the Registration Chamber in receiving complaints). Provisions for systems security are a standard requirement.11
Approved codes are published in the Government Gazette. Declarations of the Registration Chamber are valid for a maximum period of five years. In a strictly legal sense, the effects of a declaration are rather complicated. Approval by the Registration Chamber is not binding on the courts in the event of a legal dispute.12 This opens the possibility that a court may find against a data controller even though he has followed an approved code of conduct. Decisions of the Registration Chamber on codes of conduct that have been submitted, in their turn, cannot be appealed in court.13 This may be an incentive for sectoral business and professional organizations to refrain from submitting a code of conduct for formal approval and to issue it on their own accord.14
Whether approved or not, a code may have legal effect along various lines:
Article 10 of the Data Protection Act provides for a data protection class action. Special interest groups may apply to the courts for an injunction in the event of damage, or threatened damage, resulting from data processing practices that overstep the boundaries set by the Act or "resulting from it." Accordingly, a group that has negotiated a code of conduct may expect that the code will at some point be considered by the courts.
A data protection code may be enforceable as part of the internal bylaws of a sectoral organization, or as professional standards, or contractual conditions for services.
A code of conduct also may contribute to the general standard of due care required by law in various situations.
TWELVE CODES APPROVED
As intended by parliament, codes of conduct have been introduced mainly in the private sector, particularly by professional services, e.g., direct marketing, market surveys, trade information, data processing, recruitment services, and in information-intensive sectors such as mail order, the pharmaceutical industry, publishing, banking and insurance. Codes have been prepared for the social and health research sectors. No codes of conduct have been drafted in the public sector, which relies on model regulations. These model regulations are subject to approval by the Registration Chamber. Significant examples of this category are local authorities and the police. A brief description and analysis of the 12 approved codes follows. A typical code consists of a short preamble and between six and 22 articles plus an explanatory note.
1) Organization of Consultants for Personnel Recruitment (OAWS).15 OAWS is an organization of some 50 bureaus for personnel recruitment and executive search. The code integrates data protection and professional ethics and includes a complaint procedure. It applies to both computerized and manual files. Data subjects may require the erasure of data. Express consent of the data subject is required for communication of particulars to the principal, but an exception is allowed if there is a risk that the intervention of the advisor may interfere with an individual's existing employment relation. The code is referenced in organizational publications (the so-called Bureau Guide) and in recruitment advertising by individual bureaus.
2) Association for Information Technology (COSSO).16 The COSSO code applies to data protection in consultancy and training, and in software services. Its main feature is a systems security checklist that has been added as an annex. The code is integral to a certification scheme for computer service bureaus and a general COSSO-code of good practice covering business relations. As the COSSO code expired in 1994, one of COSSO's founding organizations is preparing a new code.
3) Association of Research Institutes (VOI).17 This code covers a variety of social research by independent non-profit institutes, including universities. It differentiates according to type of research files. Research data may not be used for commercial purposes. Secondary use of stored data is allowed when it is compatible with the purpose for which the data originally were collected. A main motive for writing this code was the so-called "non-response problem," a growing reluctance among the public to cooperate with social research.
4) Association of Market Research Bureaus (VMO) and the Dutch Association of Market Research (NVvM).18 This code is a co-production of two closely linked associations. It is related to a certification scheme for market research bureaus. Relations with data subjects (individual respondents in market research projects) are based on the principle of informed consent (for children: by their parents). The same principle applies, to the extent possible, to the use of observation techniques as a research tool. Data subjects may request the removal of stored data. As a rule, basic identifying data (name, address and place of residence) will be separated from other personal details within a period of six months. The code provides for an in-house data protection officer. One of the motives for approval by the Registration Chamber was that the code may help to counter "mistrust and skepticism" towards professional market research among the public.
5) Direct Marketing Institute Netherlands (DMIN).19 This code covers direct marketing, distance selling and sales promotion, both business-to-business and business-to-consumer. It applies to non-profit as well as commercial purposes. The collection and storage of personal data must be weighed against undue harm to the privacy of data subjects. The code adopts the existing Robinson List systems, under which the industry pledges to honor individual requests to block names and addresses for direct marketing approaches (both by telephone and by mail). It also provides for in-house data protection officers and an external complaints procedure. [Note: separate codes exist for, e.g., telemarketing, list brokering and "house sampling."]
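At bottom, the Robinson List mechanism is a suppression-file lookup: before a mailing goes out, each addressee is checked against the opt-out registry and removed if present. A minimal sketch of the idea follows; the names and the normalization rule (lowercased name plus postcode) are illustrative assumptions, not drawn from the DMIN code itself.

```python
# Illustrative sketch of Robinson List suppression: addressees found on the
# opt-out registry are removed from a mailing list before dispatch.
# The comparison key (lowercase name + compact postcode) is an assumption.

def normalize(name: str, postcode: str) -> tuple:
    """Build a comparison key that tolerates case and spacing differences."""
    return (name.strip().lower(), postcode.replace(" ", "").upper())

def suppress(mailing_list, robinson_list):
    """Return only the addressees who have not opted out."""
    blocked = {normalize(n, p) for (n, p) in robinson_list}
    return [(n, p) for (n, p) in mailing_list if normalize(n, p) not in blocked]

mailing = [("J. de Vries", "1012 AB"), ("A. Jansen", "3511 CD")]
opt_out = [("j. de vries", "1012ab")]
print(suppress(mailing, opt_out))  # only A. Jansen remains
```

In practice the registry is maintained centrally by the industry and consulted by each mailer, but the per-mailing filtering step is essentially the lookup shown here.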
6) Association of the Pharmaceutical Industry (NEFARMA).20 The privacy code applies to both pharmaceutical research and commercial activities. It has been linked to a general Code of Practice for the pharmaceutical industry in the Netherlands (1983), including a complaint procedure. Eventually the privacy regulations may be incorporated into a revised general Code of Practice. Third-party record systems will only be used if the responsible keeper has accepted the DMIN-Code (supra no. 5). The privacy code is divided into four parts:
- Contract research on drugs
- Side-effects of drugs
- Registration of doctors, pharmacists and others
- Registration of researchers.
7) Association of Producers and Importers of Veterinary Products (FIDIN).21 Data on veterinary surgeons (both of a commercial nature and on the basis of veterinary drugs regulations) and researchers are covered. The code has similarities with the NEFARMA code (supra no. 6). There is already a tradition of self-regulation in this sector. The access rights of data subjects include information sources such as reports by field officers of the organization. Complaints are handled by the governing body of FIDIN, assisted by independent legal counsel.
8) Mail Order Union (NPB).22 This code partly overlaps with the direct mail code (supra no. 5) and the market research code (supra no. 4). It adopts the Robinson List principle and provides for an in-house data protection officer and a complaint procedure.
9) Association of Commercial Information Bureaus (NVH).23 This code in particular addresses the handling of sensitive data. Storage of most categories of sensitive data such as race, religion and medical data is excluded. [Only certain criminal antecedents, after a balancing test against privacy interests, may be stored.] As a rule, personal data will be erased eight years after the date of first storage or after the date of last verification.
10) Council for Health Research.24 Organizations of doctors and patients were consulted in drafting a Code for Health Research. The Registration Chamber has welcomed this code as "a justified balance between personal privacy and the progress of health research." The main issue confronted by the drafters of this code was the definition of personal data in the DPA, which includes any information that may be traced to an identifiable individual. In this field, research data increasingly are stripped of names and provided with an individual code number known only to the researcher. The code has a tripartite system of rules for anonymous, identifying and coded data.
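The "coded data" regime described here — names removed from the research file and replaced by a code number whose key is held only by the researcher — is a key-separation scheme that can be sketched briefly. The field names and the random-code format below are illustrative assumptions, not taken from the Health Research Code.

```python
import secrets

# Illustrative sketch of the "coded data" regime: identifying details are
# split off and replaced by a random code; the code-to-identity table is
# held separately, by the researcher alone. Field names are assumptions.

def pseudonymize(records):
    key_table = {}   # code -> identifying data; kept only by the researcher
    research = []    # coded research file, usable without direct identifiers
    for rec in records:
        code = secrets.token_hex(4)
        key_table[code] = {"name": rec["name"], "address": rec["address"]}
        research.append({"code": code, "diagnosis": rec["diagnosis"]})
    return key_table, research

patients = [{"name": "P. Bakker", "address": "Utrecht", "diagnosis": "asthma"}]
keys, coded = pseudonymize(patients)
# the coded file carries no direct identifiers; re-identification needs `keys`
```

The code's tripartite classification then turns on where a record sits in this scheme: fully anonymous data (no key table exists), coded data (a key table exists but is held apart), and identifying data (identifiers remain in the record itself).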
11) The Banking Association (NVB).25 The approval procedure in this case was rather stormy. An early draft was stalled because of controversy between the association and the Registration Chamber. A special feature is a list of statutory requirements for the disclosure of banking information to government agencies. Participants are pledged to balance the use of credit data against privacy considerations. [Note: the central bureau for credit records (BKR) has longstanding privacy regulations of its own, but as yet these do not constitute a sectoral code and have not been subject to approval by the Registration Chamber.] Commercial information will only be provided to third parties with the consent of the data subject. Complaints are handled by a foundation that mediates various kinds of consumer disputes. Because of rapid information technology developments in financial services, the Registration Chamber has insisted on periodic consultations with the banking association on the application of its Privacy Code and the possible need to amend it.
12) National Chipcard Platform (NCP).26 A Privacy Agreement is part of a wider set of agreements on an Open Infrastructure for Chipcard Applications (OIC). In the Platform a wide variety of interests are represented, ranging from the banking industry and the national government to retailers and consumer organizations. The code applies to both single-purpose and multi-purpose cards. For specific applications the chipcard code refers to applicable codes and regulations. The code covers both personal privacy and "card integrity." The basic principle is openness in the three phases of data processing:
a) In the collection phase this takes the form of the principle of "transparency" (referred to as "recognizability"), which provides that:
data should be obtained by lawful means; and
data should be obtained with the consent of the data subject.
b) In the storage phase the guiding principle is that of "due care." Due care requires that:
personal data should be verified and be relevant, accurate, complete and up-to-date;
the chipcard holder has a right of access and correction of stored data; and
personal data should be protected by technical and organizational means against unauthorized use, disclosure, modification or destruction.
c) Use of the chipcard is guided by the "purpose principle." Under the purpose principle:
the purpose for which data are collected and stored should be clearly stated to the cardholder;
use and disclosure of data will be limited to this purpose (unless required by law or with consent of the data subject); and
data may only be kept to the extent and for the period of time that is necessary for the purpose.
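The purpose principle lends itself to mechanical enforcement at the point of use: every read of cardholder data can be checked against the purpose declared at collection, and data past its retention period refused. A minimal sketch follows; the class and field names are invented for illustration and do not come from the NCP agreement.

```python
from datetime import date, timedelta

# Illustrative enforcement of the chipcard "purpose principle": data may be
# used only for its declared purpose and only within its retention period.
# Class and field names are invented for this sketch.

class PurposeBoundRecord:
    def __init__(self, value, purpose, collected, retention_days):
        self.value = value                          # the stored datum
        self.purpose = purpose                      # purpose stated to cardholder
        self.collected = collected                  # date of collection
        self.retention = timedelta(days=retention_days)

    def read(self, requested_purpose, today=None):
        today = today or date.today()
        if requested_purpose != self.purpose:
            raise PermissionError("use limited to declared purpose")
        if today - self.collected > self.retention:
            raise PermissionError("retention period expired")
        return self.value

rec = PurposeBoundRecord("balance:25.00", "payment", date(2024, 1, 1), 90)
rec.read("payment", today=date(2024, 2, 1))        # allowed
# rec.read("marketing", today=date(2024, 2, 1))    # raises PermissionError
```

The statutory exceptions noted in the code (disclosure required by law, or with the data subject's consent) would be additional branches in the same check; the sketch shows only the default rule.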
At present, talks on approval by the Registration Chamber are continuing with the Insurance Union and the Association of Magazine Publishers (NOTU). The formal insurance Privacy Code--a recommendation of the Union to its members--has been in force since 1989. The magazine publishers code was implemented in 1990.
PROS AND CONS
In her analysis of the approval procedure, Mrs. Overkleeft indicated that it has shown a marked "susceptibility to conflict," stemming from, among other factors, the ambiguity of Parliament's starting points and of some of the legal concepts embedded in the DPA. A deeper cause is what she terms "the tension between the top-down and bottom-up approaches." The former stresses the element of "law as enforced order"; the latter, the discretionary nature of self-regulation. This tension, it is argued, has been manifest mainly in the consumer sector. Is privacy regulation a consumer matter and as such fully the responsibility of the parties in the market? Or is it part of an interlinked set of regulations controlled by the Government?
After some initial haggling, the approval procedure lately seems to have become a negotiating process between the sector organizations and the Registration Chamber. Whatever misgivings Mrs. Overkleeft may have had about this development, she concluded on a markedly strong note: "The choice for primacy of self-regulation at the sectoral level has proved to be correct." Particularly in the consumer sector, the codes of conduct that have been realized "show the effect of public opinion as a factor of real commercial relevance--perhaps even stronger than the legal norms."
A prime example is the OAWS recruitment code, which has been advertised by the profession. Non-affiliated bureaus have voluntarily joined the code so as to be able to mention it in their marketing programs. The Association of Social Research Institutes (VOI) also has invited non-members to join its privacy code. In the public health sector, individual researchers are reported to have some trouble with the code, particularly the requirement of drafting regulations for each separate database, but privacy has been recognized in setting up a major research project on chronic diseases. Individual complaints under the Health Research Code have not been reported, suggesting a lack of public awareness. In the banking sector, specific privacy complaints under the code, according to knowledgeable observers, are rare. But the privacy aspect may be part of a complaint about, e.g., an adverse credit rating.
Although the exchange of customer data in the private sector has registered as a matter of concern in various public opinion polls, the introduction of the so-called Air Miles scheme was a great success. Under this scheme consumers obtain points for air travel when making purchases in, e.g., supermarkets and petrol stations. The Registration Chamber sounded an alarm because Loyalty Management Netherlands (LMN), the organization which runs the scheme, did not properly inform Air Miles card-holders about the use of their data for marketing purposes by participants such as a major supermarket chain, a bank and a department store chain. LMN thereupon adapted its regulations.27 This episode drew attention from the media, not so much because of consumer complaints, but because of the intervention by the Registration Chamber.
After neatly balancing five pros and five cons of a privacy code of conduct, Hustinx in 1991 concluded that "the advantages of codes of conduct prevail over the inconvenience that we have encountered." He confirmed this judgement at the XVIth International Conference in The Hague (1994).
As inconveniences, Hustinx mentioned:
Regulation by code of conduct may be limited by competitive conditions and other peculiarities of a certain sector.
The adequate consultation requirement may create problems of finding a sufficiently competent representative of data subjects.
Data subjects are not always aware of the status of a code of conduct, whether approved or not.
Codes of conduct may complicate the legal framework which applies to a certain sector.
The practical effect of a code may vary according to its scope and status, and due to other specific conditions.
These inconveniences, however, are outweighed by some marked advantages:
Codes of conduct prove to be very flexible instruments for the implementation of the law in specific sectors.
The relevant procedures have a very positive effect on the relationships of the Chamber with various sectors.
Both lead to increased understanding and awareness of the privacy issues which are specific to each sector.
Codes of conduct offer an opportunity to certain sectors to show a real concern for privacy issues.
Sectors with an approved code may serve as example and intermediary for others.
As the history of data protection law in Holland shows, self-regulation may serve various purposes in relation to the legislative process:
- Self-regulation may be intended to avoid legislation;
- Self-regulation may be used to anticipate legislation;
- Self-regulation may serve to implement legislation; and
- Self-regulation also may supplement legislation.
The first point refers to the old debate--still going on in some countries--whether data protection legislation is necessary. Several years ago Hustinx stated as his clear conviction: "only legislation can provide a sound basis for a well-developed data protection scheme." He stressed the second point, the Dutch Government's policy since 1975 to actively promote self-regulation as "a means to prepare for legislation."
An interesting form of self-regulation is found in transborder data transfers from countries with data protection legislation in place to countries without a comparable public law. This may be a barrier for international communications. The 1981 Council of Europe Convention on Data Protection (No. 108) requires "equivalent protection" of transborder flows of personal data to third countries. The more recent Data Protection Directive of the European Union allows data exports to third countries if an appropriate level of protection is ensured.
To prevent data flows from being hampered by a lack of data protection legislation in receiving countries, a contractual solution has been developed. A widely publicized example is the transfer of personal data within the FIAT car manufacturing company between its branch in France (which has a data protection law) and its headquarters in Italy (which does not). This approach also could serve unrelated companies (independent economic interests).
The Consultative Committee, which administers the Council of Europe Convention, in consultation with the Commission of the European Union and the International Chamber of Commerce, developed a Model Transborder Data Flow Contract that was adopted by the Council of Europe in October 1992.28 The EU Commission has accepted data protection contracts as a means of helping to protect offshore transfers of personal data.29
1 Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data (Data Protection Directive).
2 Privacy and Personal Data Files, Interim Report of the Commission on Protection of Personal Privacy and Personal Data Files (Interimrapport van de Commissie bescherming persoonlijke levenssfeer en persoonsregistratie, Den Haag 1974).
3 Rules for the Protection of Privacy in Connection with Personal Data Files (Data Protection Act) of Dec 28, 1988. (Wet van 28 december 1988 houdende regels ter bescherming van de persoonlijke levenssfeer in verband met persoonsregistraties Wet persoonsregistraties--WPR).
4 G.Overkleeft-Verburg, De Wet persoonsregistraties, Norm, toepassing en evaluatie (with a Summary in English), W.E.J.Tjeenk Willink publishers, Zwolle 1995, ISBN 90 271 4045 5.
5 Frank Kuitenbrouwer, Foreword, in: Victor Bekkers, Bert-Jaap Koops & Sjaak Nouwt (Eds), Emerging Electronic Highways, New Challenges for Politics and Law, Kluwer Law International, The Hague/London/Boston 1995 ISBN 90-411-0183-7.
6 Herbert Burkert, The Temptations of Data Protection (Paper presented at the celebration of the 20th anniversary of the Data Inspection Board, Stockholm, Oct 1993), Transnational Data and Communications Report XVI (May/ June 1994) 3 p 20.
7 Minister van justitie A.Kosto, Notitie over het ontwerp van de Europese richtlijn bescherming persoonsgegevens van 27.6.94 (Note on the European data protection directive, Parliamentary Documents) TK 23 900 VI no. 11.
8 Tymen Koopmans, Privacy and the dilemma of human rights protection, in: P.Ippel, G. de Heij & B.Crouwers (Eds), Privacy disputed, Sdu, Den Haag 1995 ISBN 9034631966.
9 Peter Hustinx, The role of self-regulation in the scheme of data protection, XIIIth Conference of Data Protection Commissioners, Strasbourg 2-4 October 1991, Council of Europe Doc CONF/XIII/ DPC (91) 35.
10 Jaarverslag Registratiekamer (Annual Report, Registration Chamber) 1994, p 24.
11 Jaarverslag Registratiekamer (Annual Report, Registration Chamber) 1989-1991, p 9.
12 Article 15,6 DPA: A declaration is not binding on the judge.
13 Article 15,7 DPA: Decisions of the Registration Chamber [on codes] will state the Chamber's reasoning. They cannot be appealed to the administrative courts.
14 Frits de Graaf, Haken en ogen aan de gedragscode voor persoonsregistraties (Squabbles on the code of conduct on personal record systems), Computerrecht 1990/3 p 119.
15 Summary published in Staatscourant (Government Gazette) 1990, 232.
16 COSSO gedragscode persoonsregistraties, Staatscourant (Government Gazette) 1991, 12.
17 VOI-gedragscode persoonsregistraties, Staatscourant (Government Gazette) 1991, 88.
18 Privacy-gedragscode markt-en opinieonderzoek, Staatscourant (Government Gazette) 1991, 111.
19 Gedragscode Direct Marketing Instituut Nederland, Staatscourant (Government Gazette) 1992, 194.
20 NEFARMA-Privacy-Gedragsregels, Staatscourant (Government Gazette) 1992, 198.
21 FIDIN-privacygedragsregels, Staatscourant (Government Gazette) 1992, 235.
22 Gedragscode Nederlandse Postorderbond, Staatscourant (Government Gazette) 1993, 60.
23 Gedragscode Nederlandse Vereniging van Handelsinformatiebureaus, Staatscourant (Government Gazette) 1993, 118.
24 Gedragscode Gezondheidsonderzoek, Staatscourant (Government Gazette) 1995, 140.
25 Privacy Gedragscode banken, Staatscourant (Government Gazette) 1995, 207.
26 Gedragsregels voor Privacy en Kaartintegriteit Open Chipkaarttoepassingen, Nationaal Chipcard Platform, Leidschendam 1996 (English Text available).
27 Jaarverslag Registratiekamer (Annual Report, Registration Chamber) 1995, p 29.
28 Frank Kuitenbrouwer, Protection Clauses for Data Flows, Transnational Data and Communications Report, XVI (Sept/Oct 1993) 5, p 9.
29 Article 26,2 of the Data Protection Directive (supra note 1).
Electronic Commerce... Its Regulation is not "Closely-Related to Banking"
Mary Clare Fitzgerald
Paul A. Seader(2)
As the Economist noted a number of years ago, "In banking, America is a special case: a long history of populist hostility to banks gave American banking a legal structure that no other countries quite share." As the debate begins on the appropriate regulatory regime for the electronic commerce industry, we would like to humbly suggest that the rest of the world may be on to something. America should eagerly embrace this opportunity to move away from the perceived oppressive regulatory structure still endured by its banking industry to a cooperative regulatory environment more appropriate to the rapidly developing markets of the 21st century.
The development of electronic commerce has been both revolutionary and evolutionary. Revolutionary since new technological applications can result in sudden and abrupt changes in the way business is conducted. Evolutionary in that individuals gradually modify their behavior as they adapt to technological refinements and improvements.
Fundamentally, what we call "electronic commerce" may be viewed as a method by which business is transacted in a more direct and efficient way. However, it is overly simplistic and incorrect to view electronic commerce merely as a new way of accomplishing time-worn tasks. The many new technological applications--smart cards, stored value cards, electronic money, biometric devices and the Internet--can establish an intimidating, paperless environment without "hard" currency, in which the ability to create entirely new products and services easily outpaces any legislative process. Technology makes possible what was heretofore impossible. As a result, competition has been enhanced; the consumer is given a greater choice of goods and services; the business person has greater control over the types and scope of products and services offered.
It has often been observed that the basic requirement for effective governmental regulation is a thorough knowledge of what is being regulated. However, in recent years the ability of regulators to remain knowledgeable has been severely tested by both the rapid development of technology as well as the increasingly diverse and complex types of activities in which financial institutions, technology companies and others are now engaged. The current legal and regulatory framework--with clearly defined permissible activities based upon the nature of a corporate charter--is ill equipped to accommodate the requirements of new electronic products, services and delivery and payment systems.
As a result, we believe that now is the time to examine and consider the creation of a new regulatory framework for electronic commerce. We can begin our examination by reviewing the origins of the current financial services regulatory structure, which was constructed in the 1930's against the backdrop of the greatest economic catastrophe of the 20th century--the Great Depression.
As a new regulatory regime was debated in the crisis atmosphere of the 1930's, not all the focus was on the banking industry. A regulatory regime for the securities industry was also being developed--one which we suggest could provide a paradigm for the ultimate development of a framework for the new electronic commerce industry. Perhaps even more importantly, such a framework could also be the vehicle for the transition from today's compartmentalized regulatory structure to a more fluid oversight of the new electronic world.
In a recent speech, Securities and Exchange Commissioner Steven M. H. Wallman observed that the terms "entity" and "functional regulation" may be concepts whose time and utility have passed. Commissioner Wallman made clear that he had not begun to reach any final conclusions, but had been looking at what he termed "goal-oriented" regulation, where the regulator could shed the traditional "command and control" approach, to simply establish the goals the industry would pursue, but leave the specific "ways and means" to the private sector.
For our purposes in this paper, there would seem to be two essential requirements for a new regulatory regime for the electronic commerce industry: (1) ease of entry to the market for new products and (2) a formalized oversight structure which facilitates problem-solving by institutionalizing continuous interaction and dialogue among experts in the private, public and academic sectors. There should also be reasonable and affordable access to these new products by both individual consumers and commercial entities.
In 1938, with the passage of the Maloney Act, a model was established to oversee the newly emerging over-the-counter markets in the U.S. A framework was set up to allow for an interaction among the formal government regulatory authority, the Securities and Exchange Commission ("SEC"), the stock exchanges and a self-regulatory organization, the National Association of Securities Dealers ("NASD"). Senator Maloney of Connecticut, then-Chairman of the Senate Committee on Banking and Currency and the primary author of the bill, described the solution as a "balanced political compromise, providing for an agency to oversee the industry, but also allowing for regulation within the industry, with substantial responsibilities being imposed upon both the stock exchanges and the NASD for the internal regulation of their members."1 Senator Maloney later said he would describe this not so much as self-regulation, but as cooperative regulation.
We would add one additional thought to Senator Maloney's apt statement: For the transition period through which the electronic commerce industry must proceed into the next century, cooperative regulation must also allow for the careful and constant management by government and the private sector, both domestically and worldwide, of what could well turn out to be a "bumpy" transition period of indefinite duration. Cooperative regulation may provide the flexibility, creativity and durability to navigate through this formative period. Such a structure must foster this transition in an orderly fashion, but it also must impose an appropriate level of regulation, to the extent necessary to enable broad, but properly conducted, participation by diverse entities on a global basis.
A self-regulatory organization (SRO) paradigm, or more appropriately, as Senator Maloney named it, a "cooperative regulatory" vehicle could provide both flexibility and comprehensive expertise. Privacy, fraud, encryption, interoperability, universal access, asset security and systemic integrity--we're all familiar with these issues. The question is, through what lens are we viewing these fundamental public policy concerns?
Before tackling the perceived problems, we need a framework, a problem-solving apparatus appropriate to the emerging industry, with which to view these issues--not from the parochial perspective of narrowly focused industry segments, but from the evolving new electronic commerce industry perspective.
The balance of this paper will be devoted to a brief discussion of the existing supervisory and regulatory framework governing financial services in the U.S. In the next section we examine how banking regulation evolved. We then discuss the changes in the markets and in technology which began to tear at the existing regulatory framework. Finally, we look more deeply at the SRO model and its adoption and application to electronic commerce.
BRIEF HISTORY OF BANKING SUPERVISION IN THE U.S.
For more than 130 years, Congress has been concerned with the soundness and stability of the nation's financial institutions. Since the late 19th century, federal law has sought to address the objectives of preserving the soundness of the banking system by separating the business of commercial banking from unrelated economic and investment activities. Financial compartmentalization in the United States has been reexamined repeatedly during the past thirty years; however, the barriers to affiliation have remained steadfast. This long-established tradition reflects U.S. historical experience. A major feature of this experience is the dual banking system of often competing state and national chartering authorities.
A discussion of the existing bank regulatory framework can only begin with an explanation of why and how the impenetrable "maze" known as bank regulation came to be. When attempting to explain the recent history of bank supervision in the U.S., the words of one of the senior statesmen of American banking, Carter H. Golembe, come to mind:
The simple and rather straightforward government guarantee of deposits adopted in 1933, during the Great Depression when over 9,000 banks failed, has grown into a massive financial stabilization system in which the reimbursement of depositors of failed banks is almost incidental. The inevitable consequence of this has been intensive and often intrusive government involvement in the management of banks by federal agencies.2
Mr. Golembe has further asserted over the years that the American public's deeply ingrained fear of bank power basically explained how easy it was to maintain the hijacking of market discipline for so long.
While economists generally come to the same conclusion as Carter H. Golembe, they tend to provide a more analytical explanation of why the circumstances demanded the "hijacking of market discipline." The explanation generally describes public policymakers' seeking a least-cost solution to containing risk. While SRO's were seen as appropriate for the securities industry, in the case of the banking industry policymakers believed the following other realities took precedence:
- Government finance imperatives;
- Natural asymmetries of information, and
- Systemic risk.
In other words, the substitution of governmental regulatory agency judgment for market discipline was planned. This transfer of responsibility from private to public sector appeared to work reasonably well until the 1960's when, as we discuss below, this arrangement became counterproductive.
This stands in marked contrast to the regulatory framework that governs the securities markets. In this regard, the fundamental principle on which securities oversight rests is full, fair and accurate disclosure to potential investors. Information disclosure requirements are intended to insure that investors can make informed decisions about investing in financial instruments sold in the public markets. The underlying philosophy inherent in the securities laws is that full disclosure will promote public confidence in the capital markets and encourage investments that spur economic growth.
Considerable doubt existed with respect to a mandatory disclosure system when the securities laws were first enacted. In the 1960's, academia made the first serious arguments against the mandatory disclosure system as (1) not beneficial to investors and (2) imposing unnecessary costs on corporations. Today the academic literature is still split for and against this proposition.
However, for the purpose of discussion, the utility of mandatory disclosure as a tool for the future regulation of the electronic commerce industry was underscored in recent remarks of the Comptroller of the Currency, Eugene A. Ludwig. Mr. Ludwig cited disclosure as a worthy topic of enhanced attention from policymakers. He said that while surveys consistently demonstrate that consumers often ignore written disclosures or simply fail to understand them, they nevertheless welcome and retain the substance of verbal disclosures. In an "electronic" world, full and understandable disclosure can be delivered in a far more user friendly mode, greatly enhancing its value in a cooperative regulatory model. One example cited by the Comptroller is the ability of technology to effect person-to-person "dialogue" through computer screens. Thus, the technology will enable the consumer to have control over the format and timing of disclosures.
As mentioned above, in the 1960's, there was a general recognition that anti-competitive banking markets had been created. Depository institutions--commercial banks and savings associations--were told how much they could charge borrowers (usury ceilings were applied to specific types of loans) and how much they could pay depositors (Regulation Q--which also allowed savings associations to pay 25 basis points more than commercial banks). In many states branching was severely restricted (in some cases, prohibited) and small banks often enjoyed "home office" protection. Savings associations could only make residential mortgage loans and could not offer checking accounts. The results of this stifling regulatory environment culminated in the massive thrift crisis of the early 1980's. If savings associations had been allowed to develop competitive products earlier, instead of being forced to remain in fixed and long-term offerings, some of the $150 billion in taxpayer costs might not have been incurred.
So--belatedly, but doggedly--there has been an ongoing effort to engage in "piecemeal dismantling of banking structures established in the 1930's."3 By pursuing piecemeal legislation over the last 15 years, Congress has failed to consider "comprehensive structural reform alternatives [that] would recognize and address the role of the banking industry as an integral part of the larger financial services industry."4
It has been observed that the failure to consider broad modernization of the bank supervisory structure "holds other dangers because measures taken under such an approach divert attention from the development of a more realistic structure for the entire financial services industry and a rational regulatory system applicable to both depository and non-depository financial service providers."5
Therefore, we firmly believe that the electronic commerce industry should not be brought under the bank supervisory umbrella, nor become embroiled in further "piecemeal" efforts to deregulate a fundamentally flawed system. Rather, it ideally should serve as a catalyst for the prompt enactment of comprehensive structural reform and modernization of the bank regulatory regime. Should there be any doubt of this, the market is daily providing clear and compelling evidence.
NEW MARKET REALITIES
The technological advances already introduced into the markets have changed consumer attitudes and shifted patterns of trade. These changes have caused many to question the viability of maintaining the existing bank industry "handcuffs" to competition, and have almost completely circumvented the basic foundations of the 1930's bank regulatory regime.6
Some of these recent changes have already (or are about to) dramatically transform the banking industry. For example, Booz-Allen & Hamilton recently estimated that on-line households will generate 30% of the banking industry's retail profits by the end of this decade.7 Currently over 300 banks have sites on the World Wide Web, many of them offering full-fledged banking packages.
An increasing number of bank customers have become more comfortable with new technology. These numbers will inevitably grow more rapidly in the future. Electronic fund transfer volume has increased significantly in recent years, from an annual aggregate of $3.5 billion in 1985 to more than $7.5 billion by 1992.8 This will increase exponentially when the U.S. government begins to pay all vendors and benefit recipients electronically in January 1999. Similarly, the use of debit cards has grown from a modest $84 million in 1985 to $324 million in 1992.9
At the same time, banking products have become disaggregated and are increasingly being offered by other (technologically knowledgeable) financial intermediaries. As a result, Ford Motor Credit, Chrysler Financial Corp. and General Electric Credit Corporation collectively have a 40% share of consumer installment debt.10
Clearly technology has not been gentle with the remnants of the 1930's bank regulatory scheme. Additionally, reviewing the original policy reasons for instituting such an approach to bank regulation reveals that two of them--government finance and asymmetries of information--no longer apply. As for the third policy imperative--systemic risk--many analysts believe, "the extent to which the reality of competitive pressures has increased the risk exposure of federally insured deposits in the nation's banking system is unclear. Whether banking industry reforms since 1980 have been successful in this important regard must be questioned in light of consequences, such as the thrift industry crisis."11
The appropriate regulatory requirements inherent in emerging electronic commerce are not government finance and asymmetries of information, but rather, as cited earlier:
- Ease of entry for new products; and
- Private/public partnership in problem solving.
We believe the SRO model described below can provide some of the answers, giving us a jumping-off point for a comprehensive structural framework for this new, evolving industry. We now turn to the SRO model and the parallels we believe exist between the original purposes for an SRO approach and the needs of the growing electronic commerce industry.
THE MALONEY ACT
The SEC's relationship to the industry it oversees is quite different from that of the other independent Federal administrative agencies. Unlike the bank regulatory agencies, the SEC does not directly supervise individual financial institutions, but rather oversees the securities industry self-regulatory organizations--the various stock exchanges and the over-the-counter markets. There are, to be sure, other instances in which the policy of empowering the private sector has been adopted, but securities industry regulation appears to be unique in its utilization of statutorily sanctioned, self-regulatory organizations.
In the 1930's, self-regulation was adopted on the ground of "practicality."12 Originally, attention was focused on the established stock exchanges-- like today's emerging electronic commerce industry, the over-the-counter market was too uncharted in 1934 to be capable of detailed regulation. Governmental oversight of the over-the-counter markets developed in several distinct stages which may serve as a model for the oversight of electronic commerce today.
While little was known of these markets in 1934 when the Securities Exchange Act was passed, it was nevertheless recognized that their regulation was necessary, since "to leave the over-the-counter markets out of a regulatory system would be to destroy the effects of regulating the organized exchanges."13 Because of the potential scope and complexity of over-the-counter regulation, the 1934 Act gave the SEC broad rulemaking powers over the markets instead of mandating a detailed system of regulation as was the case for the stock exchanges.14
During the first four years of the SEC's existence, the broker-dealer community engaged in concerted efforts to organize and oversee itself. Even before enactment of the 1934 Act, the Investment Bankers Code Committee was formed, under the authority of the National Recovery Administration for the express purpose of the promulgation and enforcement of an industry code of conduct. After the Code Committee's powers were invalidated by the Supreme Court in A.L.A. Schechter Poultry Corp. v. United States, 295 U.S. 495 (1935), the Investment Bankers Conference Committee was established under the SEC's supervision to draft a voluntary program of industry self-regulation. It was soon recognized, however, that enabling legislation was needed, particularly exempting the activities of an industry organization from the application of the antitrust laws.
In October, 1936, a new group was formed called the Investment Bankers Conference, Inc. ("IBC"). Working together with the SEC as well as key members of Congress (including Senate Banking Committee Chairman Francis Maloney), the IBC was successful in supporting the enactment of section 15A of the Securities Exchange Act--known as the Maloney Act of 1938. The Maloney Act provided for government and industry cooperative regulation of the over-the-counter securities markets.15
Under the Maloney Act, self-regulation is achieved by permitting qualified associations of broker-dealers to register with the SEC as "national securities associations." As an incentive to join, Section 15A of the Securities Exchange Act allows a registered association to prohibit its members from dealing with nonmembers "except at the same prices, for the same commissions or fees, and on the same terms and conditions as are by such member accorded to the general public."
In order to become registered, an association must comply with certain precise standards. For example, it must be appropriately organized by reason of the number and geographical distribution of its members and the scope of their transactions; it must admit to membership all qualified broker-dealers; and it must discipline members who violate its rules.
Shortly after enactment of this legislation, Senator Maloney described the division of regulatory responsibilities between the registered associations and the SEC as follows:
Congress has undertaken to provide a mechanism whereby the securities business of the country may deal with the problems of technical regulation, leaving to the Securities and Exchange Commission what it is hoped will be the residual position of policing the submarginal fringe which recognizes no sanctions save those of criminal law and of dealing with those problems of regulation with which the industry, as organized under the act, finds itself unsuited or unable to deal.16
After enactment of the Maloney Act, a joint committee including representatives of the Investment Bankers Association and the IBC recommended that a single national securities association be created. While the Act allowed for the registration of regional or functional based associations, the SEC endorsed the concept of a single national association. Thus, in the summer of 1939, a new corporate charter was issued to the National Association of Securities Dealers, Inc.
CONCLUSION: COOPERATIVE REGULATORY MODEL FOR ELECTRONIC COMMERCE
This cooperative regulatory relationship appears to serve as a viable model for the oversight of the emerging electronic commerce industry. In a rapidly evolving operating environment, it is most appropriate that those with technical knowledge and expertise--in this case the private sector--be charged with the responsibility for what Senator Maloney called "technical regulation" while an appropriate federal governmental regulatory agency--or group of agencies--be accorded overarching oversight to deal with "the submarginal fringe which recognizes no sanctions save those of criminal law" and other matters of regulation and enforcement with which the electronic commerce industry finds itself "unsuited or unable to deal."
Recent developments in electronic commerce warrant review of the current system of governmental supervision, regulation and oversight. Historically, regulators have been able to regulate by reason of their power to control entry to the market. Entities that were unwilling or unable to comply were simply denied entry or confined to very limited activities or functions.
Increasingly, it appears that the regulatory framework must be modified from one focusing on particular corporate charters and industry segments to one that facilitates entry to the market for new products and services. Barriers to affiliation--such as the Glass-Steagall Act and Bank Holding Company Act--have become irrelevant and are increasingly inhibiting the ability of U.S. firms to remain globally competitive. There should be a shift away from separately regulating activities of various components of the emerging electronic commerce industry toward a cooperative regulatory system that assures the diverse participants in electronic commerce can compete responsibly in a new electronic environment, transcending national boundaries.
1 Sheldon M. Jaffe, Broker-Dealers and Securities Markets, A Guide to the Regulatory Process, 8.
2 Carter H. Golembe, Interstate Branching--The End of a Long Road, 1994-95 The Golembe Reports 11.
3 John J. Kang, Comment, The Dangers of Piecemeal Reformation of the U.S. Banking Industry, 39 St. Louis L.J. 1099, 1127 (1995).
4 Id. at 1129.
7 American Banker, July 11, 1996.
8 Kang, supra note 3, at 1129.
12 Hearings on H.R. 7852 and H.R. 8720 before the House Committee on Interstate and Foreign Commerce, 73rd Cong., 2d Sess., 514 (1934).
13 U.S. Government Printing Office, Report of Special Study of Securities Markets, 604 (1963).
14 Hearings on S.3255, 75th Cong., 3d Sess., 7 (1938) (Testimony of Commissioner George Matthews).
15 See S. Rep. 1455 and H. Rep. 2307, 75th Cong., 3d Sess. (1938).
16 Address to California Security Dealers Association, Investment Bankers Association, and NASD, San Francisco, August 22, 1939, quoted in Report of Special Study of Securities Markets, p.606.
A Commercial Lawyer's Take on the Electronic Purse: An Analysis of Commercial Law Issues Associated with Stored Value Cards and Electronic Money, Uniform Commercial Code Committee, Subcommittee on Payments, Banking Law Committee, Subcommittee on Domestic and International Payments and EFT Transactions, Committee on Law of Commerce in Cyberspace, Business Law Section, American Bar Association, 1996 American Bar Association.
Book Review, Loss and Seligman on Securities Regulation, 78 Geo. L.J. 1753 (June, 1990).
Sheldon M. Jaffe, Broker-Dealers and Securities Markets, A Guide to the Regulatory Process, 8 (Sheppard's Inc., Colorado Springs, 1977).
John J. Kang, The Dangers of Piecemeal Reformation of the U.S. Banking Industry, 39 St. Louis L.J. 1099 (1995).
John M. Pachkowski, J.D., Fair Credit Reporting Act Reform Highlighted, 1677, Federal Banking Law Reports.
Carter H. Golembe, "Interstate Branching--The End of a Long Road," 1994-95 The Golembe Reports (CHG Consulting, Inc.).
H. R. 2307, 75th Congress, 3d Sess (1938).
Kalpak S. Gude, The Integration of Banking and Telecommunications: The Need for Regulatory Reform, 46 Federal Communications Law Journal 3 (June 1994).
Steven M. H. Wallman, Regulation for a New World, Business Law Today, Nov/ Dec 1996, (American Bar Association Section of Business Law).
U.S. Government Printing Office, Report of Special Study of Securities Markets of the Securities and Exchange Commission, (Washington, DC, 1963).
S. Rep. 1455, 75th Congress, 3d Sess. (1938).
Symposium: Functional Regulation: A Concept for Glass-Steagall Reform?, Board of Trustees of the Leland Stanford Junior University, Fall 1995, 2 Stan. J.L. Bus. & Fin. 89.
Dwight B. Crane and Zvi Bodie, The Transformation of Banking, Harvard Business Review, March-April 1996.
Duncan A. MacDonald
Vice President and Group Counsel
Citicorp Credit Services, Inc.
The information revolution of the late twentieth century has provided consumers with an almost dizzying choice among increasingly sophisticated information-based products and services. The services industries in general, and the financial services sector in particular, compete on the basis of their ability to link together in a seamless and largely invisible way the data systems maintained by a large number of diverse participants in our economy, including merchants, lenders, credit bureaus, depository institutions, and others. By making these linkages work, service providers are now able to offer consumers nearly instantaneous worldwide access to their credit and deposit accounts through on-line authorization systems, 24-hour customer service departments, omnipresent ATMs, and computerized shopping and banking services.
Financial service providers typically enter into continuing contractual relationships with their customers under which they are obligated to continue to provide services on request, while dissatisfied consumers can, at little cost or inconvenience to themselves, switch to a competing provider at virtually any time. On their side, service providers maintain a long-term interest in the goodwill of their customers. A one-time misuse of personal data in such an environment might provide a small profit on an individual transaction, but it could also result in an irate customer who abandons what might otherwise be a mutually beneficial long-term relationship.
In such a dynamic environment, participants, including both consumers and businesses, have come to rely on the flexibility provided by general principles of contract law to define their agreements with one another about how data should be used and shared. Consumers in effect agree to provide access to information about their private activities and preferences in exchange for convenient and efficient services. In such an environment, flexible mechanisms of disclosure and private contract ensure that rigid and often unnecessary public law restraints do not interfere with the ability of businesses to provide the services consumers want to buy, while giving consumers the right to "purchase" the kinds of privacy protection they truly value.
THE LIMITS OF PUBLIC LAW
Some countries, notably in Europe, have adopted omnibus legislation to define principles and procedures for the protection of personal data. Even there, the contractual relationship further defines the consumer's privacy choices and expectations in particular applications. In this country, no consensus has emerged around any universal conception of "informational privacy" that lends itself to codification. As noted in the NTIA's October 1995 report, Privacy and the NII, the privacy of personal information is treated differently in different contexts, in both federal and state statutes and common law jurisprudence, as well as in private contracts and industry codes of conduct, without a common set of definitions and procedures. Financial services companies, for example, are typically subject to bank secrecy laws, as well as quite specific regulations concerning credit reporting practices. With regard to the treatment of personal information in the private sector, the lack of uniformity across business sectors seems to reflect the diversity of public opinion about the tradeoffs between convenience and anonymity in different contexts, as well as the fact that technological and marketplace changes are constantly altering that equation.
In the United States, federal and state regulation is further constrained by the uncertain constitutionality of attempts to regulate the collection and use of non-fraudulent, non-defamatory information. The limits imposed by the First Amendment on the regulation of commercial speech would almost certainly condemn any omnibus data protection law for the private sector that contained vague or overly broad restrictions on the use of data. Privacy rights drawn from the Fourth Amendment ordinarily concern government, not private, intrusions on individual privacy. Compare Whalen v. Roe, 429 U.S. 589 (1977) (constitutionally protected privacy interest in state medical records) with Smith v. Maryland, 442 U.S. 735 (1979) (no constitutionally protected privacy interest in telephone company calling records). Although the private sector therefore does not face direct constitutional restraints on its use of personal data, government efforts to limit private uses of data acquired lawfully and with the consent of all parties to the underlying transactions could give rise to Fifth Amendment takings claims.
Self-regulatory measures, by contrast, avoid such constitutional issues and can be crafted to address particular privacy concerns while offering consumers a range of choices. Such measures range from professional ethics rules (in law and accounting, for example) that generally require consent before disclosure of confidential client information, to industry codes of conduct and joint implementation procedures such as the Direct Mail Association's privacy code and computerized mail preference list.1 Whatever professional or industry codes (or, for that matter, regulatory framework) may apply, each retail service provider has, in the end, a contractual relationship with its individual customers. It is Citicorp's experience that the contractual agreements between a company and its customers, as well as those with its suppliers, data processors, and other parties involved in the use of consumer data, can be used effectively to define and give effect to consumer privacy expectations. Such contractual arrangements can also be used to satisfy foreign privacy law standards for the adequate protection of personal data transmitted to the United States and other jurisdictions that do not have similar omnibus privacy laws and related supervisory authorities.
CONTRACTUAL PRIVACY PROTECTION
Citicorp and other consumer financial services providers, like other companies that necessarily obtain or use private customer information in the course of extending credit or processing payments, have several incentives of their own to maintain the confidentiality of customer information and to control its dissemination. They have an interest, for example, in avoiding losses due to fraud, building customer confidence so that customers will use their financial services products, and protecting competitively valuable customer service and marketing information. Thus, the prerequisites exist in this industry for a self-regulatory privacy regime that is mutually advantageous to companies and their customers. As consumers have become more sensitive to privacy issues, this topic increasingly appears in contracts and public statements of corporate policy, thus giving consumers the ability to choose among service providers if those practices are sufficiently important to the consumers in question.
It is true that privacy policies of this sort are not separately negotiated with each individual consumer. Most companies, even in a highly competitive market such as consumer financial services, must obtain and use certain data in relatively standard ways in order to provide the requested services efficiently, and it would be wholly impractical for such companies to collect and process data according to a large number of variable protocols, depending on variations in particular contractual arrangements reached with individual customers. But companies can inform their customers about their data handling practices and allow customers to make choices from a menu of standard options with respect to particular uses of customer data (such as marketing).
Thus, Citicorp's cardholder agreement summarizes the data handling policies of its credit card business and informs customers that they may ask to be excluded from particular marketing practices. So, too, do Citicorp's agreements with its private banking and branch banking customers, and its contracts with corporate customers. Moreover, Citicorp periodically reminds its customers of those policies and rights. Citicorp guarantees the security of all personal information that it obtains in providing services, whether processed by Citicorp or by a third party hired by Citicorp, and undertakes, on request, to make corrections and communicate those corrections to credit reporting agencies. Citicorp does not furnish transactional data to third parties (except in the context of credit reporting and debt collection, within the bounds established by the Fair Credit Reporting Act and the Fair Debt Collection Practices Act, respectively). It does, however, provide customer information to its own affiliates and also distributes third-party advertising in correspondence with cardholders. Citicorp allows any customer to "opt-out" of either or both of these marketing practices. Citicorp also uses market research, including surveys and focus groups, to test consumer perceptions of privacy interests and procedures as these evolve over time.
The contractual approach adopted by Citicorp satisfies the notice and consent principles identified in the NTIA's 1995 NII privacy report (at 20-27) for telecommunications-related personal information, where there is less opportunity than in most service industries to change service providers based on privacy practices. It is also consistent with the "transparency" (notice or disclosure) and consent principles found in international data protection instruments such as the OECD Guidelines, Council of Europe Convention 108,2 and the more recent European Union Directive 95/46/EC.
Broadly, the transparency or notice principle means that companies should inform their customers of their policies regarding personal data collection, processing, use, and transmission to third parties, while the consent principle suggests that consumers should be given (a) sufficient information to decline a transaction based on privacy concerns and (b) a choice as to the use of personal data for additional purposes beyond the requirements of that transaction. In addition, these international instruments and relevant foreign national privacy laws characteristically include provisions designed to ensure that these principles are given effect through meaningful remedies, including the opportunity to correct inaccurate data, rectify practices contrary to law or announced policies (or to the privacy options the customer has selected), and obtain monetary compensation where the consumer suffers pecuniary loss due to the improper use or dissemination of personal data.
Contractual approaches, such as Citicorp's, can include such procedures for correcting data and rectifying data practices, and these agreed procedures can be enforced under normal contract law principles, including monetary damages if they can be proven. In the case of consumer credit services, certain remedies, as well as related notice requirements, are also mandated by federal or state law, such as those provided under the federal Fair Credit Reporting Act as amended this year. In addition, noncompliance with announced privacy practices might serve as grounds for termination of a consumer contract, especially if the practices are outlined in the contract itself. Although actual damages for such defaults may be difficult to establish, consumers can, in appropriate cases, petition the courts to order companies to correct inaccurate data and enforce contractual privacy policies.
USING CONTRACTUAL APPROACHES TO SATISFY FOREIGN PRIVACY LAWS
Companies that operate internationally, as Citicorp does, are already typically subject to omnibus data protection legislation in Europe and in a growing number of jurisdictions in other regions. A common feature of those laws is that data protection authorities, or designated corporate data controllers, are directed to block the export of personal data to other countries lacking similar privacy law legislation, unless there are satisfactory assurances of "equivalent" or "adequate" privacy protection. Such assurances can take the form of internal security measures and contractual undertakings with consumers and with the affiliate or other third party handling data outside the consumer's home country. Some foreign privacy laws require specific notice to the data subjects when their data is to be processed abroad, and companies may wish to make such disclosure in any event in the interest of good customer relations.
Data collection and processing has long since ceased to be an activity confined to national borders, especially in the financial services sector. Commercial banking and credit card customers depend upon electronic data transfers to conduct individual transactions across the globe, and some service providers, such as Citicorp, actively market their services from one country to citizens of a number of countries. The institutions that facilitate credit and payment transactions exchange data as required to effect payments or transfers and detect fraud, and the individual companies typically find it necessary, or at least more economically efficient, to consolidate their account and transactions processing in a small number of data centers. For example, virtually all of the information associated with Citicorp's credit card accounts--whether collected domestically or abroad--is ultimately processed by affiliates in Nevada, South Dakota, or Maryland. For companies such as Citicorp, then, establishing "adequate" security measures and contractual privacy safeguards offers a means of processing efficiently--under a single standard--customer data gathered worldwide and otherwise subject to the informational privacy laws of a multitude of nations.
The data protection authorities in those countries that have adopted laws based on Council of Europe Convention 108 normally have experience authorizing data transfers to the US and other jurisdictions based on such contractual arrangements, sometimes backed by additional undertakings to the data protection authority itself. To facilitate the use of contractual arrangements in these cases, the Council of Europe, with input from the European Commission and the International Chamber of Commerce, adopted a set of model contract clauses on a trial basis. Citicorp and other companies have employed such clauses in their transborder data processing arrangements.
The EU privacy directive includes relevant "derogations" to the prohibition of data transfers to countries without similar legislation, suggesting that contractual solutions will still be viable after the Directive's implementation deadline in October 1998:
Consent: Data transfers are to be permitted where the individuals have given "unambiguous" informed consent to the processing of data abroad (Art. 26(1)(a)) (except that Member States may restrict such transfers in the case of defined categories of "sensitive data." See Art. 8(2)(a)).
Contractual or precontractual transactions: Data transfers necessary to the performance of a contract (or to the creation of a contract) with the data subject can be effected (Art. 26(1)(b) and (c)). It is not clear whether this necessarily entails notice to the customer or prospective customer under Arts. 10(c) and 11(1)(c), which require disclosure of "further information" relevant to fair processing by the company or a third party.
Contractual safeguards: EU Member States are explicitly permitted, in assessing the "adequacy" of privacy protection for a contemplated "transfer or set of transfers," to consider safeguards provided by "appropriate contractual clauses" (Art. 26(2)). Moreover, the European Commission may mandate transfer authorizations based on the use of "certain standard contractual clauses" (Art. 26(4)). (These approaches are subject to discussion and recommendations by the Committee of national and Commission privacy experts created under Article 31 of the Directive, and those bodies have not yet outlined the circumstances under which they consider contractual solutions adequate.)
Citicorp has used contractual solutions to meet the requirements of current European privacy laws. For example, in the case of data transfers to process applications and transactions for its German customers at data centers located in the United States, Citicorp has satisfied German data protection and banking authorities by (a) including appropriate disclosures in its cardholder agreements and (b) entering into an "Interterritorial Agreement" among the various Citicorp affiliates operating in Germany and processing data in the United States. Under the Interterritorial Agreement, the Citicorp processing entities agreed to follow the privacy principles laid down in German law, to subject themselves to German administrative and civil law remedies, and to indemnify the German affiliates for any damages paid by them to aggrieved customers. Citicorp also entered into an agreement directly with the German authorities to allow them to audit its data processing facilities in Nevada and South Dakota.3
The contractual model is not the entire solution to all informational privacy concerns. As noted above, in the case of consumer financial services contractual approaches operate in conjunction with federal, state, and foreign consumer protection and banking regulations, which establish certain minimum requirements. And as the NTIA's Privacy and the NII report observed, there are industries in which consumers have less choice among service providers and privacy practices. Thus, Congress and the FCC have acted to impose unique restrictions on the use of subscriber information by telephone and cable television operators, and have adopted measures restricting forms of marketing by telephone and fax that are considered particularly intrusive or costly from the consumer's perspective.
For most of the private sector, however, contractual approaches should be the preferred method of offering privacy protection. They provide the flexibility for companies to introduce operational efficiencies and explore new services and applications, as well as new privacy options. Thus, the contractual model is a form of self-regulation that allows innovation and efficiency while accommodating personal choices over personal privacy.
1 Congress mandated a similar self-regulatory mechanism in the Telephone Consumer Protection Act of 1991, 47 U.S.C. § 227, (requiring telemarketers to consult a list of those who have opted not to receive telephone sales solicitations and to abide by those stated preferences).
2 Council of Europe Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data.
3 In addition, U.S. and German banking authorities have exchanged letters agreeing on the conditions under which German affiliates of U.S. banks may have their data processed at U.S. facilities.
Content Ratings for the Internet and Recreational Software
Recreational Software Advisory Council
Self-regulation is a tough road. It requires considerable effort, time, resources, good judgment, honesty and fair execution on the part of all those within an industry who have opted (or have been persuaded) to regulate themselves. It is usually preceded by threats of or actual legislation by government and is often accompanied by a skeptical public who must be won over to the notion that the fox can, indeed, be trusted with the hen house. In this paper, I will outline two examples of self-regulation, one in the field of computer games, the other in the area of harmful material on the Internet, both of which illustrate the difficult, though not insurmountable, challenges of an industry policing itself.
Back in the winter of 1993, U.S. Senator Joe Lieberman of Connecticut was made aware of the killing move in the hit computer game, Mortal Kombat, in which the winning player severs his opponent's head and pulls out his spinal cord. This, together with similarly gruesome and sexually explicit material in other games, persuaded the Senator that something needed to be done. Children were being increasingly exposed to excessively violent material in what were supposed to be entertainment games. And there was very little that parents, concerned about what their children were being exposed to, could do to discern from the cover of a box what lurked within a CD-ROM or Nintendo video game.
Together with Senator Kohl of Wisconsin, Senator Lieberman held a series of hearings where senior executives of the computer and video game industry were called to testify. Legislation in the form of the Video Game Ratings Act of 1994 was drafted and held as a potent threat over the heads of the industry to get their houses in order. As with the recent V-Chip legislation, there was a get-out clause that allowed the industry a one year period to create a self-regulated rating system for computer games and be spared the new law. Otherwise Congress would create and administer a rating system itself.
This challenge was taken up by the Software Publishers Association (SPA), the largest trade association in the computer software sector. Together with five other associations, the SPA established a Computer Game Ratings Working Party to meet the elements that Lieberman said would serve as benchmarks "to measure the industry's efforts." A "good" rating board would have to:
- Be independent;
- Have members who reflect the public, not the industry;
- Have the power to penalize wrongdoers;
- Be able to keep pace with technological advances; and
- Advertise the ratings so that they become as well known to purchasers as movie ratings are today.
Further, Lieberman laid out the three aspects that were essential for the self-regulated ratings system to be seen as credible: it must be subject to sanctions; it must provide as much information about the reason for the rating as possible; and it must have "tough, conservative standards."
STATEMENT OF PRINCIPLES
To its credit, the SPA's Working Party addressed each of these areas in their Statement of Principles. This statement was agreed to by the fifty-plus members of the Working Party and it specifically addressed two key areas: what kind of a rating system was to be developed and, crucially, what kind of organization would oversee it. Of the seven principles listed, two stand out in this current debate. The first reads:
To ensure credibility amongst consumers, and to build broad participation by all segments of the software industry, the ratings program must be administered by a truly independent body organized outside of any industry trade association.
The other significant principle states:
The ratings program should strive to achieve a high degree of objectivity in order to assign consistent ratings independent of industry pressure.
These ambitious goals were later to be transformed into the non-profit organization, the Recreational Software Advisory Council and the ratings system for computer games (and later for the Internet) developed by Dr. Donald Roberts, Chairman of the Communications Department of Stanford University.
A NEW KIND OF RATING SYSTEM
Unlike a motion picture, which averages one and a half hours to view, a typical computer game can take one hundred hours of playing before you have uncovered all the material on a disk. This fact alone posed an enormous challenge to the Working Party when they began to design a rating system for interactive CD-ROMs. In addition, there was a growing chorus of criticism leveled at the Motion Picture Association of America (MPAA) system familiar to moviegoers for being too subjective, secretive in its criteria and decision-making processes, judgmental about who should or should not see a film, lenient on violence and unduly tough on sex. And it was obvious that a full, prior review system involving a panel of reviewers, interacting with the over 2,000 titles produced each year, was not going to be feasible for this market.
The Working Party turned to Dr. Roberts, an expert on the effect of media violence on children, to devise a content-based, rules-driven, objective, self-rating system that would give detailed, yet easy to understand information about the levels of Violence, Nudity/Sex and Language in a given product. These categories would be rated automatically through an ingenious questionnaire that branched the game maker through a series of highly detailed and carefully defined questions, the answers to which were either "yes" or "no." The internal algorithm allowed for immediate access to the category scores (from 0 to 4) and to descriptors that give further information about what a parent can expect to find in a title. Thus, Doom rated a Violence Level 3, with the descriptor, "blood and gore." The rating score is displayed on the front cover of the box in a clear and unequivocal way to alert parents to the content of the box, much like the FDA food labels for a can of soup.
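The mechanics of such a rules-driven system can be sketched in a few lines of code. This is an illustrative sketch only: the actual RSAC questionnaire, question wording, thresholds and descriptors are not reproduced here, and the question identifiers below are invented for demonstration.

```python
def rate_violence(answers):
    """Map yes/no questionnaire answers to a 0-4 violence level plus descriptors.

    `answers` is a dict of question-id -> bool, as a game maker might
    fill in the branching questionnaire. Question ids are hypothetical.
    """
    level = 0
    descriptors = []
    if answers.get("harm_to_living_beings"):
        level = max(level, 1)
    if answers.get("death_depicted"):
        level = max(level, 2)
        descriptors.append("characters killed")
    if answers.get("blood_and_gore"):
        level = max(level, 3)
        descriptors.append("blood and gore")
    if answers.get("gratuitous_violence"):
        level = max(level, 4)
        descriptors.append("wanton and gratuitous violence")
    return level, descriptors

# A Doom-like title: death and gore depicted, but not rated gratuitous.
level, desc = rate_violence({
    "harm_to_living_beings": True,
    "death_depicted": True,
    "blood_and_gore": True,
})
print(level, desc)  # 3 ['characters killed', 'blood and gore']
```

Because the score is a deterministic function of the answers, two different raters (or a rater and an auditor) who answer the questions the same way necessarily arrive at the same rating, which is the property that made the system objective and auditable.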
To fund the costs of administering and promoting the system, a fee scale was created which took into account the financial status of the software company taking part. Currently these charges are:
- $400/title for companies with a gross income in their previous financial year of over $1M
- $250/title for those between $100,000 and $1M
- $50/title for those under $100,000
FORMATION OF RSAC
To administer this new system, the Working Party helped to establish the Recreational Software Advisory Council (RSAC) in a way that met the Senators' demands for credibility. Written into the By-laws of the new organization was a requirement that there would always be an inbuilt majority of Board members from outside the industry. And, although not stated in the constitution of RSAC, the Board decided to appoint the Executive Director from a family and social-welfare field. (My previous position was Director of the National Stepfamily Association of Great Britain). Thus, from its inception, RSAC would keep an arm's-length distance from the industry that had helped to create it. It became a vital part of its early success that the organization could be seen to be fair, balanced and not unduly influenced by game makers and their distributors.
CHECKS, BALANCES AND AUDITS
One of the most critical aspects of any self-regulatory regime is the lengths to which it goes to ensure that people aren't cheating the system. Software makers using the RSAC self-rating scheme must enter into a legally binding contractual agreement for each game they rate. Section 7 of the contract states:
In the event that the Rated Software Title does not meet the standards and specifications of the Assigned Rating, or there is any material misrepresentation or violation of the Ratings Application, RSAC may, after written notice and an opportunity for Applicant to defend the basis for the Assigned Rating, take appropriate action, including but not limited to corrective labeling, consumer and press advisories, product recalls, and/or monetary fines.
In other words, if the makers of Doom tried to pass the game off as a Violence Level 1, they could face considerable legal action, bad press and a fine of $10,000. Inherent with that threat, would be the even greater threat that if the industry couldn't participate fairly in its own rating system, then Congress would dust off the shelved legislation and institute its own.
In addition to the legal contract, spot checks and audits of a random selection of the rated software titles were begun. A team of auditors was established in the Psychology Department at Yale University under the guidance of Dr. Dorothy Singer. Reviewers were asked to play through all levels of the games and to fill in the questionnaire themselves to see what score they got. (They were assisted by "cheat guides" and "god codes" provided by the software publisher to ensure they reached the end). These scores were then compared with those generated by the game maker and any discrepancies taken up with the game company. In some cases, the RSAC Appeals Committee was convened to adjudicate between what a game publisher thought the game should be rated and the results of the questionnaire and in-built scoring system. (Out of nearly 500 games, only two have been taken to a full Appeals Committee.)
Like the movie rating system, RSAC is a voluntary scheme. If you are a film maker, you don't have to rate your movie. If you are a computer game maker, you have a choice to rate or not. If, however, you wish to distribute either your movie or cd-rom, it is more likely that you will feel compelled to rate. In the movie industry, The National Association of Theater Owners (NATO) clearly states that its members will not show unrated movies. Retailers of recreational software have shown a similar attitude to unrated computer games. Wal*Mart was the first chain to declare that they would only stock rated games and that they would reserve the right to not carry product which, in their opinion, did not reflect the values of their company. Toys R Us, Sears, and NeoStar have either made similar announcements or made it known to software makers that if they want shelf space, they had better submit their titles for rating.
Again, Senator Lieberman was instrumental in bringing pressure to bear on this part of the industry. He and Senator Kohl wrote to the major retail outlets and held a press conference praising those who conformed, and criticizing those who did not. After nearly three years of entreaties, private and public pressure, drafted legislation and full-scale Senate hearings, virtually all software games that sell at retail have a rating label on them. After some initial resistance from within parts of the industry, rating labels have become so commonplace that they are hardly commented on by software publishers or retailers. Through a process of carrot and stick, the government has ensured that the industry has "voluntarily" imposed a regulatory rating scheme upon itself without the need of a dedicated government department and all the expenditure required to bring one into place.
CONTENT RATINGS FOR THE INTERNET
If Senators Lieberman and Kohl used a combination of carrot and stick on the software industry, the government has used nothing but stick on the internet industry. From the hopelessly ill-informed Senate Judiciary hearings in the summer of 1995 to the crude (and possibly unconstitutional) elements of the Communications Decency Act (CDA), Congress has badly managed the issue of how or even whether to regulate the Net. As this is being written, the CDA remains in legal limbo awaiting a Supreme Court ruling. The very mixed message given by government has created an anxious waiting period as content providers, civil libertarian groups, internet service providers (ISPs) and the major telecom companies all look to the Court for clarification. In the meantime, groups such as Enough is Enough are promising some kind of "Son of CDA" if the current law is overruled on constitutional grounds. Nevertheless, there are some clear signs that, once again, an industry will get behind a self-regulatory scheme, this time in the area of content filtering and parental controls on the internet.
RSACi--RSAC ON THE INTERNET
In August 1995, Senator Grassley chaired a Senate Judiciary Hearing into the issue of pornography on the internet. These hearings were held in an atmosphere of near hysteria following the cover article in Time magazine on the Rimm report suggesting that filth peddlers roamed the net unchecked and that merely switching on your computer would expose you and your children to an avalanche of smut, porn and bestiality.
In my testimony to these hearings (http://www.rsac.org/press/950724.html), I argued for restraint on the part of would-be government regulators and, instead, suggested that the industry be encouraged to get behind a self-regulatory scheme. At that time, I committed the organization to adapting and converting our highly regarded rating system for use on the web.
We had to move quickly. The Exon amendment was gaining momentum (and not a little notoriety). Fortunately for us, an important initiative was being formed at the World Wide Web Consortium (W3C) at MIT. This became known as The Platform for Internet Content Selection (PICS). PICS is a protocol or computer language that enables rating systems to be read and understood by browsers, webmasters and search engines. It is the HTML of rating systems.
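In practice, a PICS label is a short machine-readable statement embedded in a page's header that names the rating service and gives the scores for each category. The fragment below is an illustrative example of what an RSACi label under the PICS-1.1 syntax looks like; the site URL is a placeholder, and the scores (n = nudity, s = sex, v = violence, l = language) are invented for demonstration.

```html
<META http-equiv="PICS-Label" content='
 (PICS-1.1 "http://www.rsac.org/ratingsv01.html"
  l gen true
  for "http://www.example.com/"
  r (n 0 s 0 v 2 l 1))'>
```

Because the label is structured data rather than prose, any PICS-aware browser or filter can parse it and compare the scores against the thresholds a parent has set, without needing to understand anything about the page's actual content.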
RSACi WORKING PARTY
During the fall of 1995, we formed the RSACi Working Party to convert the rating and selection process from the physical world of software boxes and retail shelves to cyberspace. We had participation from Microsoft, Bell Atlantic, Time Warner, MIT, Stanford University and George Washington University. We decided early on that the system would have to be fully web-based with no paper transactions at any stage. It was agreed that it would be a free system to both content providers and to users (parents, teachers, etc.). And we would incorporate an element that became known as "granularity" or the ability for a content provider to rate their website with one rating, or to separate out various branches, directories, even individual pages with separate ratings. Thus the Playboy site could rate their Jimmy Carter interview of 1976 differently than the January Playmate of the Month.
Unlike the computer game ratings, RSACi was planned to be a free service. The Working Party wanted to ensure that there were as few inhibitors to using the voluntary scheme as possible, and cost was a big one. In addition, there was an issue of how secure cash transactions would be over the net and the resulting administrative headache of chasing non-payers and late payers.
Instead, the group opted for a sponsorship scheme to help raise the necessary financing to run the new ratings service. Three levels of sponsors were created:
- Corporate Partner - $100,000
- Corporate Sponsor - $50,000
- Corporate Donor - $10,000
and a category known as an RSACi Licensee for companies that wished to use the system for a particular application within their own product.
For their sponsorship, companies were promised a considerable PR and media blitz to announce their generous support. In addition, there would be a prominent link from the RSAC home page back to their web site and a place on the newly formed RSACi Advisory Committee. Sponsorship of RSACi was promoted as a way of showing leadership and corporate responsibility in this highly contentious and political issue.
A business plan was produced and sent out to many of the top internet companies with appeals for their support. A sum of $350,000 would have to be raised in the first year to get the system up and running. And herein lies a major problem area for all self-regulatory schemes--Who pays? In the case of a homogenized and coherent industry such as the movie or computer game industry, it is very clear that fees should be levied against the producers of the material to be rated. On the internet, it is less clear who makes up the industry and, therefore, who should be responsible for supporting such a scheme.
Of course, one of the greatest weapons in the arsenal of a self-regulatory body is the threat (real or imagined) of further or worse government legislation if the industry concerned doesn't get its act together and participate. This, however, can backfire, particularly in the arena of free speech and expression on the Internet. News sites and civil libertarian groups can argue (and some have) that, should further legislation come down the line, they will be at the forefront of challenging it on constitutional grounds.
LAUNCH OF RSACi
By mid-April all of the elements were in place. Sponsorship from Microsoft, the Software Publishers Association and Dell Computers was announced (later to be joined by CompuServe, US Web and UltraNet) and the service began at the RSAC home page. Every major newspaper in the United States covered the story and the resulting interest was impressive. Since the launch over 11,000 sites have been rated, at a rate of over 100 sites per day. The RSACi system has been incorporated into the Microsoft browser, Internet Explorer 3.0, and into the leading software blocking device, CyberPatrol. Netscape has publicly announced that they will incorporate PICS into a later version of their browser, but have been unspecific about when.
CHECKS, AUDITS AND CONTRACTUAL ARRANGEMENTS
Similar to the computer game rating system, RSACi incorporates a number of checks and balances to ensure fair play. Every site that is rated is checked to ensure that the rating labels are correctly placed within the header of the rated site. In addition, a random selection of sites is chosen each day for thorough evaluation to ensure the rating tags accurately reflect the content on the site. Further, a web crawler is being developed that will visit every RSACi tag on the web and check it against the RSACi database of registered users. There is also a proposal to incorporate digital signatures that will give further security and reliability to the labels.
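The label-checking step such a crawler performs can be sketched simply: extract the RSACi PICS label from a page's header and compare the declared scores against the rating on file. This is a hypothetical sketch; the real RSACi database, crawler and label-matching logic are not public, and the URL, registry and regular expression below are illustrative only.

```python
import re

# Match an RSACi PICS-1.1 label and capture the "r (n .. s .. v .. l ..)" scores.
LABEL_RE = re.compile(
    r'PICS-1\.1\s+"http://www\.rsac\.org/ratingsv01\.html".*?r\s*\((.*?)\)',
    re.S,
)

def extract_rsaci_rating(html):
    """Return {'n': .., 's': .., 'v': .., 'l': ..} from a PICS label, or None."""
    m = LABEL_RE.search(html)
    if not m:
        return None
    pairs = m.group(1).split()
    return {pairs[i]: int(pairs[i + 1]) for i in range(0, len(pairs), 2)}

def audit(url, html, registry):
    """True if the label declared on the page matches the registered rating."""
    declared = extract_rsaci_rating(html)
    return declared is not None and declared == registry.get(url)

# Hypothetical registry entry and page header for a fictitious site.
registry = {"http://www.example.com/": {"n": 0, "s": 0, "v": 2, "l": 1}}
page = '''<html><head><META http-equiv="PICS-Label" content='(PICS-1.1
 "http://www.rsac.org/ratingsv01.html" l gen true
 for "http://www.example.com/" r (n 0 s 0 v 2 l 1))'></head></html>'''
print(audit("http://www.example.com/", page, registry))  # True
```

A mismatch between the declared and registered scores is exactly the kind of discrepancy that would then be taken up with the content provider under the contractual agreement described below.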
A further inhibitor to cheating the system is the contractual agreement entered into when using the RSACi system. This appears at the beginning of the ratings process and alerts the content provider that they are entering into a legally-binding agreement with RSAC that they have not willfully misrepresented themselves in the ratings process. Lastly, we depend on the citizens of the net to alert us to sites that they have come across that they feel have been wrongly labeled. (To date, only three sites have been brought to our attention in this way. Two of them had misunderstood the process, and the third decided to unrate.)
What began as a response to threatened legislation in the U.S. has, unwittingly, become a major factor in the discussions regarding content on the internet within governments around the world. In addition, our system is widely used, as it is currently structured, by sites throughout Europe (over 1,000 sites rated in the UK alone), the Far East and Australia. In virtually every case, governments are struggling with the issue of whether to wade in with draft legislation, or to encourage, or even coerce, the internet industry to regulate itself.
In Britain, the association of Internet service providers has backed PICS and called upon all of their users to rate with RSACi by the end of the year. In the European Union, the Commission directorate dealing with telecom issues, DGXIII, has backed the notion of self-regulation and commended RSACi (with some reservations) in its official report. The Chairman of the Australian Broadcasting Authority, Peter Webb, recently backed both PICS and RSACi in a public address in Sydney, and interest has emerged from Norway, Canada, Spain, Singapore and China.
We are currently in discussions with a number of major ISPs in some of the countries listed above, to mirror the RSAC site in their country to allow for easier access for local users. Also, the system itself would be localized to ensure that the terms and questions were literally and culturally translated while still preserving the inherent logic of the rating scales. By its very nature, the Internet is an international phenomenon, albeit with a heavy American accent. Any effectual self-regulatory scheme will have to encompass the interests and concerns of a wide range of countries and cultures.
THE V-CHIP AND TELEVISION RATINGS
Running alongside the development of a self-regulatory scheme for both computer games and the Internet has been the equally political debate over content on television. Congressman Markey and Senator Conrad successfully steered the V-Chip amendment through Congress during the Telecom Bill passage, and a steering group headed by Jack Valenti has deliberated for the past year on what kind of rating system should be developed for TV. We have been a part of that discussion and made a presentation to the Working Party in the summer of this year. We have offered our experiences in the self-regulatory field of content rating and proposed that a modified version of the RSAC system be used with the V-Chip. We also recommended that an independent body oversee the ratings process, with representation from outside the industry to include child experts, psychologists and children's advocates.
Unfortunately, the Valenti group appears to be taking a very different stand. From all indications, the ratings process will be completely controlled by the industry with no outside involvement whatsoever (similar to the MPAA movie ratings system). Also, the concerted call from virtually all interested parties for detailed content advisories for violence, sex and language seems to have gone unheeded. It remains to be seen if the FCC will even approve the scheme as it now stands. If it is rejected, there are already signs that the TV industry will take the whole issue to the Supreme Court citing the First Amendment. It is possible that this example of (coerced) voluntary self-regulation will die a dramatic death or re-emerge in a federally mandated way. It would be a great pity indeed if this historic opportunity for the television industry to redeem itself after decades of criticism were lost in a Supreme Court standoff. Only time will tell if self-regulation can work in this highly politicized industry.
REFLECTIONS ON SELF-REGULATION AND THE ROLE OF GOVERNMENT
In some ways successful self-regulation comes down to carrot and stick. What are the positive incentives that will motivate an industry to regulate itself and how far can the government go to threaten, cajole or plead with an industry to get its house in order? And at what point does a legislature simply force legislation onto a reluctant and recalcitrant sector?
From my experiences in two different, though related industries--software and the Internet--and my involvement with a third, television, I would say that it is very rare for a group of companies to voluntarily (in the true sense of the word) and without prompting, decide to set up a rigorous, self-policing system that will cost its members time and money to administer, promote and develop. Further, it could be argued that to do this would run counter to the mission of most trade associations unless there was a very real and potent threat of similar if not worse legislation coming from central government. Only then can an industry association legitimately spend its member dues on rallying behind a self-regulatory regime.
In my view, it is the role of government to reflect the legitimate concerns of the public and to bring these issues to a wider audience through hearings, press briefings and, eventually, draft legislation. If this means that legislators embarrass, criticize or even humiliate an industry into recognizing its shortcomings (take the tobacco industry for example), then so be it. If it means that through legitimate pressure, Congress can persuade an industry to take action itself or suffer the consequences, then that seems like a perfectly reasonable role for them to take. And if, after months (and often years) of making their point, the government still cannot bring an industry group to act, then it is very much in the public's interest to legislate. It is then up to the courts to decide if the regulatory scheme created through central government legislation is constitutional.
I am a great believer in good self-regulation and in good government. With the right framework, checks and balances, oversight and controls, self-regulation is by far a more attractive route to take than central government mandate.
But self-regulation is a tough road and it takes time, money and resources to make it work. It also requires a healthy partnership between industry, government and the general public for it to succeed.
1. Frank Kuitenbrouwer is a legal commentator of the business newspaper NRC Handelsblad. He writes and lectures widely on data protection, computer crime, transborder data flows and the legal and policy aspects of informatics. He is on the editorial board of several specialized journals including Transnational Data and Communications Report (I-Ways). In 1990 he was a visiting professor in informatics and law (Belle van Zuijlen Chair) at Utrecht University.
3. This report was prepared in response to a request from the National Telecommunications and Information Administration, United States Department of Commerce. Citicorp acknowledges the contributions of Duncan A. MacDonald, General Counsel of Citicorp Credit Services, Inc.; P. Michael Nugent, Associate General Counsel of Citicorp Credit Services, Inc.; Peter J. Gray, Director of Domestic Government Relations, Citicorp; and W. Scott Blackmer and David G. Gray of Wilmer, Cutler & Pickering, Washington, DC.