The UK's Royal Academy of Engineering has published a very interesting report on privacy and technology: Dilemmas of Privacy and Surveillance. It is important that those who design technology have an appreciation of its privacy impact, and this report is an encouraging step in that direction.
1. Executive Summary
This study identifies likely developments in information technology in the near future, considers their impact on the
citizen, and makes recommendations on how to optimise their benefits to society. The report focuses on an area where
developments in IT have had a particularly significant impact on our everyday lives - the use of IT in surveillance,
data-capture, and identity management. It looks at the threats that these technologies may pose and at the role
engineering can play in avoiding and managing these risks. The following is a summary of the central concepts and
issues that the report investigates and the judgments the report makes about them.
Technological development: Technologies for the collection, storage, transmission and processing of data are
developing rapidly. These technological developments promise many benefits: improved means of storing and
analysing medical records and health data could lead to improvements in medical care and in management of public
health; electronic logging of journey details can promise improved provision of public transport and more logical
pricing for road use; and more details of people's everyday behaviour offer the possibility of developing better public
policy generally.
However, the development of these technologies also has the potential to impact significantly on privacy. How they
develop is to a large extent under the control of society. They can be allowed to develop in a way that means personal
data are open to the view of others - either centralised spies or local peeping toms. Or, they can be allowed to develop
so that personal data are collected and stored in an organised, controlled and secure manner. There is a choice
between a 'Big Brother' world where individual privacy is almost extinct and a world where the data are kept by
individual organisations or services, and kept secret and secure. The development of technology should be monitored
and managed so that its potential effects are understood and controlled. The possibility of failures of technologies
needs to be explored thoroughly, so that failures can be prepared for and, where possible, prevented.
Designing for privacy: There is a challenge to engineers to design products and services which can be enjoyed whilst
their users' privacy is protected. Just as security features have been incorporated into car design, privacy protecting
features should be incorporated into the design of products and services that rely on divulging personal information.
For example: means of charging road users for the journeys they make can be devised in such a way that an individual's
journeys are kept private; ID or 'rights' cards can be designed so that they can be used to verify essential information
without giving away superfluous personal information or creating a detailed audit trail of individuals' behaviour;
sensitive personal information stored electronically could potentially be protected from theft or misuse by using digital
rights management technology. Engineering ingenuity should be exploited to explore new ways of protecting privacy.
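As an illustration of the 'verify without divulging' idea, an entitlement card can carry an opaque, signed token instead of identity. A minimal sketch, assuming an HMAC-signing issuer (all names here are hypothetical; a real scheme would use asymmetric or blind signatures so verifiers cannot mint tokens):

```python
import hashlib
import hmac
import secrets

ISSUER_KEY = secrets.token_bytes(32)  # held by the card issuer

def issue_token(entitlement: str) -> tuple[str, str]:
    """Issue an opaque token proving an entitlement (e.g. 'adult-fare'),
    carrying no name, account number or other identifying field."""
    nonce = secrets.token_hex(16)
    tag = hmac.new(ISSUER_KEY, f"{entitlement}:{nonce}".encode(),
                   hashlib.sha256).hexdigest()
    return nonce, tag

def verify_token(entitlement: str, nonce: str, tag: str) -> bool:
    """A gate can check the entitlement is genuine without learning
    who holds it, and without building an audit trail of journeys."""
    expected = hmac.new(ISSUER_KEY, f"{entitlement}:{nonce}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

nonce, tag = issue_token("adult-fare")
assert verify_token("adult-fare", nonce, tag)
assert not verify_token("child-fare", nonce, tag)
```

Because the verifier in this sketch shares the issuer's key it could forge tokens; the sketch shows only the privacy property: the gate learns the entitlement, not the holder.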
Privacy and the law: British and European citizens have a right to privacy that is protected in law. The adequate
exercise of that right depends on what is understood by 'privacy'. This notion needs clarification, in order to aid the
application of the law, and to protect adequately those whose privacy is under threat. In particular, it is essential that
privacy laws keep up with the technological developments which impact on the right to and the expectation of privacy,
especially the development of the Internet as a networking space and a repository of personal information. The laws
protecting privacy need to be clarified in order to be more effective. As well as making the letter of the law more
perspicuous, the spirit must be made more powerful - the penalties for breaches of the Data Protection Act (1998) are
close to trivial. The report backs calls for greater penalties for misuse of data - including custodial sentences.
Surveillance: The level of surveillance of public spaces has increased rapidly over recent years, and continues to grow. Moreover, the development of digital surveillance technology means that the nature of surveillance has changed
dramatically. Digital surveillance means that there is no barrier to storing all footage indefinitely, and ever-improving
means of image-searching, in tandem with developments in face- and gait-recognition technologies, allow footage to
be searched for individual people. This will one day make it possible to 'Google spacetime': to find the location of a
specified individual at a particular time and date.
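In engineering terms, the 'Google spacetime' worry is an inverted index from recognised individuals to (time, place) sightings. A toy sketch of what indefinitely retained, searchable footage effectively builds (the identifiers and locations are illustrative):

```python
from collections import defaultdict
from datetime import datetime

# sightings[person] -> list of (timestamp, camera location) pairs,
# the structure recognition over retained footage would accumulate
sightings: dict[str, list[tuple[datetime, str]]] = defaultdict(list)

def record(person: str, when: datetime, where: str) -> None:
    """Log a recognition hit from a surveillance camera."""
    sightings[person].append((when, where))

def locate(person: str, start: datetime, end: datetime) -> list[tuple[datetime, str]]:
    """Find where a specified individual was within a time window."""
    return [(t, loc) for t, loc in sightings[person] if start <= t <= end]

record("alice", datetime(2007, 3, 26, 9, 15), "King's Cross")
record("alice", datetime(2007, 3, 26, 17, 40), "Waterloo")
afternoon = locate("alice", datetime(2007, 3, 26, 12, 0),
                   datetime(2007, 3, 26, 23, 59))
# only the afternoon sighting at Waterloo falls in the window
assert afternoon == [(datetime(2007, 3, 26, 17, 40), "Waterloo")]
```

The point of the sketch is that once footage is stored and indexed, the query itself is trivial; the privacy question is decided by what is retained.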
Methods of surveillance need to be explored which can offer the benefits of surveillance whilst being publicly
acceptable. This will involve frank discussion of the effectiveness of surveillance. There should also be investigation of
the possibility of designing surveillance systems that are successful in reducing crimes whilst reducing collateral
intrusion into the lives of law-abiding citizens.
Technology and trust: Trust in the government is essential to democracy. Government use of surveillance and data
collection technology, as well as the greater collection and storage of personal data by government, have the potential
to decrease the level of democratic trust significantly. The extent of citizens' trust in the government to procure and
manage new technologies successfully can be damaged if such projects fail. Essential to generating trust is action by
government to consider as wide a range of failure scenarios as possible, so that failures can be prevented where
possible, and government can be prepared for them where not. There also need to be new processes and agencies to
implement improvements. If a government is seen as implementing technologies wisely, then it will be considered
more trustworthy.
Protecting data: Loss or theft of personal data, or significant mistakes in personal data, can have catastrophic effects on
an individual, who may find themselves refused credit or services, placed under suspicion, or held liable for debts they
did not incur. There is a need for new thinking on how personal data are stored and processed. Trusted third parties
could act as data banks, holding data securely, ensuring it is correct and passing it on only when authorised. Citizens
could have their rights over the ownership, use and protection of their personal data clarified in a digital charter which
would specify just how electronic personal data can be used and how it should be protected.
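The data-bank idea can be sketched as a store that passes fields on only when the subject has authorised that requester for those fields. A minimal sketch under assumed, illustrative names, not a proposed design:

```python
# The data bank's records, held securely on the subject's behalf
records = {
    "alice": {"dob": "1970-01-01", "address": "1 High St", "nhs_no": "123 456 7890"},
}

# consents[subject][requester] -> fields the subject has authorised
consents = {
    "alice": {"gp-surgery": {"dob", "nhs_no"}},
}

def release(subject: str, requester: str, fields: set[str]) -> dict:
    """Pass data on only when authorised; refuse any unauthorised field."""
    allowed = consents.get(subject, {}).get(requester, set())
    refused = fields - allowed
    if refused:
        raise PermissionError(f"not authorised: {sorted(refused)}")
    return {f: records[subject][f] for f in fields}

print(release("alice", "gp-surgery", {"dob"}))  # {'dob': '1970-01-01'}
# release("alice", "marketing-co", {"address"}) would raise PermissionError
```

A real data bank would also have to guarantee accuracy and security of the records themselves; the sketch shows only the 'passing it on only when authorised' half of the role.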
Equality: Personal data are frequently used to construct profiles and the results used to make judgements about
individuals in terms of their creditworthiness, their value to a company and the level of customer service they should
receive. Although profiling will reveal significant differences between individuals, the results of profiling should not be
used for unjustifiable discrimination against individuals or groups. Profiling should also be executed with care, to avoid
individuals being mistakenly classified in a certain group and thus losing rights which are legitimately theirs.
Reciprocity: Reciprocity between subject and controller is essential to ensure that data collection and surveillance
technologies are used in a fair way. Reciprocity is the establishment of two-way communication and genuine dialogue,
and is key to making surveillance acceptable to citizens. An essential problem with the surveillance of public spaces is
that the individual citizen is in no position either to accept or reject surveillance. This heightens the sense that we may
be developing a 'Big Brother' society. This should be redressed by allowing citizens access to more information about
exactly when, where and why they are being watched, so that they can raise objections to surveillance if it is deemed
unnecessary or excessively intrusive.
Recommendations
R1 Systems that involve the collection, checking and processing of personal information should be designed in order to
diminish the risk of failure as far as reasonably practicable. Development of such systems should make the best use of
engineering expertise in assessing and managing vulnerabilities and risks. Public sector organisations should take the
lead in this area, as they collect and process a great deal of sensitive personal data, often on a non-voluntary basis.
R2 Many failures can be foreseen. It is essential to have procedures in place to deal with the consequences of failure in
systems used to collect, store or process personal information. These should include processes for aiding and
compensating individuals who are affected.
R3 Human rights law already requires that everyone should have their reasonable expectation of privacy respected and
protected. Clarification of what counts as a reasonable expectation of privacy is necessary in order to protect this right
and a public debate, including the legal, technical and political communities, should be encouraged in order to work
towards a consensus on the definition of what is a 'reasonable expectation'. This debate should take into account the
effect of an easily searchable Internet when deciding what counts as a reasonable expectation of privacy.
R4 The powers of the Information Commissioner should be extended. Significant penalties - including custodial
sentences - should be imposed on individuals or organisations that misuse data. The Information Commissioner should
also have the power to perform audits and to direct that audits be performed by approved auditors in order to
encourage organisations to always process data in accordance with the Data Protection Act. A public debate should be
held on whether the primary control should be on the collection of data, or whether it is the processing and use of data
that should be controlled, with penalties for improper use.
R5 Organisations should not seek to identify the individuals with whom they have dealings if all they require is
authentication of rightful access to goods or services. Systems that allow automated access to a service such as public
transport should be developed to use only the minimal authenticating information necessary. When organisations do
desire identification, they should be required to justify why identification, rather than authentication, is needed. In such
circumstances, a minimum of identifying information should be expected.
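The distinction R5 draws can be made concrete: an authenticating check answers a yes/no question about the holder, rather than handing over the record that identifies them. A minimal sketch, with assumed attribute names:

```python
from datetime import date

# Held by the card or its issuer, never shown to the service:
_holder = {"name": "A. Example", "dob": date(1990, 6, 1)}

def is_over(age: int, on: date) -> bool:
    """Answer an entitlement question (e.g. 'over 18?') without
    releasing the holder's name or date of birth."""
    dob = _holder["dob"]
    years = on.year - dob.year - ((on.month, on.day) < (dob.month, dob.day))
    return years >= age

print(is_over(18, date(2024, 1, 1)))  # True - and that is all the service learns
```

The service receives one bit of information instead of a full identity record, which is the 'minimal authenticating information' R5 asks for.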
R6 Research into the effectiveness of camera surveillance is necessary, to judge whether its potential intrusion into
people's privacy is outweighed by its benefits. Effort should be put into researching ways of monitoring public spaces
that minimise the impact on privacy - for example, pursuing engineering research into developing effective means of
automated surveillance which ignore law-abiding activities.
R7 Information technology services should be designed to maintain privacy. Research should be pursued into the
possibility of 'designing for privacy' and a concern for privacy should be encouraged amongst practising engineers and
engineering teachers. Possibilities include designing methods of payment for travel and other goods and services
without revealing identity and protecting electronic personal information by using similar methods to those used for
protecting copyrighted electronic material.
R8 There is need for clarity on the rights and expectations that individuals have over their personal information. A
digital charter outlining an individual's rights and expectations over how their data are managed, shared and protected
would deliver that clarity. Access by individuals to their personal data should also be made easier; for example, by
automatically providing free copies of credit reports annually.
There should be debate on how personal data are protected - how it can be ensured that the data are accurate, secure
and private. Companies, or other trusted third-party organisations, could have the role of data banks - trusted
guardians of personal data. Research into innovative business models for such companies should be encouraged.
R9 Commercial organisations that select their customers or vary their offers to individuals on the basis of profiling
should be required, on request, to divulge to the data subjects that profiling has been used. Profiling will always be
used to differentiate between customers, but unfair or excessively discriminatory profiling systems should not be
permitted.
R10 Data collection and use systems should be designed so that there is reciprocity between data subjects and owners
of the system. This includes transparency about the kinds of data collected and the uses intended for it; and data
subjects having the right to receive clear explanations and justifications for data requests. In the case of camera
surveillance, there should be debate on and research into ways to allow the public some level of access to the images
captured by surveillance cameras.