By Mary Prior
This paper takes as its starting point a contribution to the first ETHICOMP conference in 1995 that called for the equivalent of a ‘Hippocratic Oath’ for IS professionals. It considers whether such an oath is still required and reviews selected codes of conduct/ethics that have been developed or revised over the past decade to examine the extent to which they address the criticisms levelled at existing codes in the 1995 paper. Finally it discusses the means by which more socially responsible practices by those who design, develop and deploy Information and Communications Technologies (ICT) could be promoted.
A paper presented at the first ETHICOMP conference made the case for a Hippocratic Oath for Information Systems (IS) professionals [Prior, 1995]. The paper claimed that there had been little debate among members of the profession about potentially harmful applications of computer technology, such as the use of software to control weapons of mass destruction or the violation of human rights through the development of databases to track ‘dissidents’. It suggested that codes of conduct may have a role to play in stimulating debate and helping to encourage the growth of practitioners with a ‘well-developed conscience’, a term used by Thring [1980] in his arguments for a Hippocratic oath for engineers. However, the paper noted that, unlike the more established professions such as medicine and the law, computing had a number of professional bodies rather than a single one; that in general their codes contained flaws (such as providing no guidelines about the priority of obligations when there is a conflict of interest); that there were no sanctions for members who failed to abide by them; and that those working in the IS field were under no obligation even to belong to a professional body. It nevertheless proposed that codes of conduct include a Hippocratic oath, ‘committing the IS professional to ensure that the work s/he engages in is for the benefit of human society and the world it inhabits’.
This paper, firstly, argues that the need for a Hippocratic Oath for computing and IS professionals or its equivalent is at least as strong as it was a decade ago. Secondly, it considers the changes that have been made over the past decade to selected codes of conduct and evaluates the extent to which they now meet that need. Finally, it considers whether codes of conduct are in themselves sufficient to promote ethical behaviour and if not, what other measures might be required to help develop more socially responsible practices by those who design, develop and deploy Information and Communications Technology (ICT).
Over the past decade both new technologies and their applications have continued to grow at a tremendous rate, such that there is scarcely an aspect of human endeavour that remains unaffected by them. The ways in which commercial activity is undertaken, individuals spend their leisure time and communicate with each other, and education, medical care and government services are delivered are among the areas profoundly altered by the use of ICT. There are countless ways in which these changes enable and empower the individuals and societies affected by them; however, it is the purpose of this paper to highlight a few of the harmful ways in which ICT can be, or is being, used.
One of the more questionable applications referred to in the 1995 paper was military technology. According to an article in the New Scientist, the outcome of the 2003 invasion of Iraq depended ‘to an unprecedented degree on the success’ of ‘smart bombs’. Whereas just 10% of the weaponry deployed in the 1991 Gulf conflict consisted of guided bombs, the figure was 90% for the war that began in 2003. With all-weather guided bombs and good intelligence, the objective was to target the Iraqi military and avoid the killing of civilians and the destruction of infrastructure [Sample, 2003]. This objective was not achieved. A study published in The Lancet estimates the death toll associated with the invasion and occupation of Iraq to be ‘about 100,000 people, and may be much higher’, of whom more than half were women or children [Roberts et al, 2004]. Human Rights Watch attributes many unnecessary civilian deaths to the use of cluster munitions (which endanger civilians due to their broad dispersal) and to inadequate intelligence that meant that all fifty acknowledged attacks targeting Iraqi leadership failed in their objective but killed and injured dozens of civilians [Off Target, 2003].
Another dubious application alluded to in the 1995 paper was the use of databases by government agencies to track dissidents. Since the attacks of 11th September 2001 there has been a widespread increase in the quantity and intensity of surveillance systems. Concern for citizens’ security is essential, but as Lyon [2001] has pointed out, ‘under some circumstances, intensified surveillance may have socially negative effects which mean that proscription takes precedence over protection, control over care’. Biometric technologies such as iris scanning, face recognition and DNA profiling all rely on the use of large, searchable databases. Several countries have already implemented smart identity cards using biometric data, and others have plans to do so; ‘there does seem to be a global trend towards deploying a full-scale smart national ID card based on biometrics that offers multiple uses, but for varying reasons and with different outcomes’ [Lyon, 2004]. The databases holding citizen data provide the means for ‘social sorting’ by ethnicity or religion, or by bodily or behavioural characteristics such as skin colour, accent or attitude. As Lyon says, ‘the onus is on those who propose new ID card systems to demonstrate that they will not exhibit authoritarian tendencies in practice’ [Lyon, 2004]. The International Civil Liberties Monitoring Group has warned that governments are co-operating to build ‘a global registration and surveillance infrastructure’ with the aim of monitoring the movements and activities of whole populations in ‘an unprecedented project of social control’ [Norton-Taylor, 2005]. Thus it is no longer just ‘dissidents’, however that term may be defined, but every citizen who may come under state surveillance; privacy rights have been undermined in many ways since the events of September 2001 [Dempsey, 2002; Heymann, 2002].
On a different level, electronic surveillance in the workplace has become widespread, endangering the privacy of many employees every working day [Cripps, 2004] and the increasing use of CCTV in public places raises further concerns about privacy [Watching Them Watching Us, 2003; Anon, 2005].
A key feature of the past decade has been the phenomenal growth of the internet. With few families in the developed world lacking access from home, work or school, the potential for undesirable material to come within the reach of adults and children alike has grown too, whether this material is merely inconvenient (such as junk email and spam) or illegal (as with child pornography).
These few examples will suffice to suggest that with the expansion of ICT into most areas of our lives, the potential for it to be used to harm rather than enhance society has grown too. A software engineer who has doubts about the morality of conducting wars can of course choose not to work with military applications. However, many other more pervasive applications have the potential to be used in ways other than originally intended or are subject to ‘function creep’ (for example, any application that stores personal data about employee or consumer characteristics and behaviour). In addition, computer systems lie at the heart of contemporary commerce, communications, transport, medical and consumer products, where malfunction could lead to serious disruption of services, financial loss and human injury or death. There is still a need for those involved in the design, development and deployment of computer systems to consider the possible effects of their work on society; there is still a case to be made for something equivalent to a Hippocratic Oath for professionals working in the field of ICT.
The codes of conduct/ethics of professional associations have a variety of purposes. They can be seen as a means to serve the association by helping to establish its status as a profession, regulate its membership and thus persuade society at large that its members may be trusted. They can also be used as a means of educating the membership, providing guidance on ethical decision-making and support when making a stand against perceived unethical practice. A code is required for each profession because, in addition to the set of values that all human beings may share (such as integrity and justice) and the obligations that all professionals have as a result of their role and expertise, each profession carries obligations unique to its own practice [Gotterbarn, 1999].
The 1995 paper made several criticisms of codes of conduct. Firstly, that they tended to emphasize the computer professional’s obligations to employer and client at the expense of that to the public at large. Secondly, none provided guidelines about the priority of obligations when there is a conflict between them. Thirdly, there appeared to be no sanctions if a member did violate the code. Overall, they appeared to focus on the professional’s duty within a project rather than consider the much broader issue of the nature of the systems being developed and their contribution or otherwise to the social good:
‘Ethical considerations for the computer professional typically deal with what you do after you sit down at a terminal. However, an initial consideration should be why one is sitting down at the terminal in the first place’ [Summers & Markusen, 1992].
This section will examine selected codes that have been developed or revised over the past decade and the extent to which they address these criticisms. The codes to be considered are: the Software Engineering Code of Ethics and Professional Practice (1999), the British Computer Society Code of Conduct (2001), the Institute for the Management of Information Systems Code of Ethics (2001) and the Australian Computer Society Code of Ethics (2003).
The Software Engineering Code of Ethics and Professional Practice was adopted in 1999 by both the IEEE Computer Society and the Association for Computing Machinery (ACM). It comes in two versions: a short one that is inspirational and provides a summary of aspirations, and a full version that includes examples and details of how the aspirations should affect professional practice. The ‘Preamble’ states that the ‘safety and welfare of the public is primary; that is, the “Public Interest” is central to this Code’. The first principle is that ‘software engineers shall act consistently with the public interest’ and the full version adds, ‘the ultimate effect of the work should be to the public good’. These provisions address the criticism made of earlier codes that they do not address the public at large nor the ‘social good’ but focus on obligations to the employer or client within a particular project.
In terms of conflicts, the ‘Preamble’ notes that where ‘standards may be in tension with each other or with standards from other sources’, the software engineer should use their ‘ethical judgement to act in a manner which is most consistent with the spirit of the Code’. Guidance, therefore, is provided on this point too.
The British Computer Society (BCS) Code of Conduct is a short document and follows conventional practice in covering members’ obligations to the public interest, the ‘relevant authority’ (i.e. employer or client) and the profession. The public interest section includes having ‘regard for the public health, safety and environment’ and encouragement to ‘promote equal access to the benefits of IS by all groups in society’. The introduction, however, states that the Code governs the personal conduct of individual members and ‘not the nature of business or ethics of the relevant authority’. Before the publication of this revised (2001) Code, a BCS membership panel member had pointed out that the ‘public interest’ provision was not enforced: an application for membership was accepted from a candidate whose work supported the marketing of a product ‘admitted by most people – including its own customers and its own management – on good scientific evidence, to be bad for health’, even though this panel member concluded that the ‘regard for public health’ rule ‘made it impossible for us to admit the candidate’ [Race, 2000]. The revised Code makes explicit a principle that is probably implicitly applied by other professional associations: that no judgement is made about the compliance of various industry sectors with the Code when membership applications are considered.
The Code contains no guidance on what to do should obligations conflict.
The Institute for the Management of Information Systems (IMIS) Code of Ethics states six fundamental principles, with some detail provided to amplify each. Overall, members are expected ‘to ensure that the contribution made by the profession to society is both beneficial and respected’. The first principle commits members to ‘uphold the health, safety and welfare of wider society, future generations and the environment’.
The ‘Preamble’ to the Code notes that conflicts may arise between parts of the Code or with other codes; in such circumstances ‘the professional should reflect on the principles and the underlying spirit of the Code and strive to achieve a balance that is most in harmony with the aims of the Code’. Ultimately, ‘the public good shall at all times be held paramount’. This provides a stronger statement of the professional’s duty to society than the BCS Code and is more in line with the Software Engineering Code of Ethics and Professional Practice.
The Australian Computer Society (ACS) Code of Ethics forms a part of the Society’s Regulations. It expects members to ‘loyally serve the community’ and ‘use special knowledge and skill for the advancement of human welfare’. The ‘Values and Ideals’ listed include ‘Social Implications’, under which members must ‘strive to enhance the quality of life of those affected’ by their work. In terms of priorities, members must ‘place the interests of the community above those of personal or sectional interests’.
Thus, the Code emphasizes social responsibility issues and places the interests of ‘the community’ above those of employer or client.
The BCS Code of Conduct has the strongest statement concerning breaches; the introduction states that any breach of the Code ‘brought to the attention of the Society will be considered under the Society’s disciplinary procedures’. A search resulted in just one reported case of a breach, dealt with in September 2004, in which a student member was found to have breached the Code of Conduct (as well as Examination Regulations) by cheating during an examination [Disciplinary Case, 2004]. A reprimand and a three-year suspension were issued to the member concerned.
The ACS Code states that ‘compliance with the Code is mandatory for Members of the Society’ and that failure to observe it ‘could also lead to a disciplinary charge or complaint being made against the member by either another member of the Society or by any other person or client’. However, no reports of any charges could be found.
The IMIS and Software Engineering Codes do not have a regulatory function and thus do not refer to possible sanctions in the case of breaches of them. The IMIS Code ‘details an ethical basis for the practitioner’s professional commitment’, providing values and standards that are intended to guide the professional conduct of members. The Software Engineering Code explicitly draws attention to its educational function; ‘it is a means to educate both the public and aspiring professionals about the ethical obligations of all software engineers’.
Indeed, the overriding issue remains that those working in the area of computing are not required to belong to a professional society, so the imposition of sanctions by a given society has little relevance. The student member of the BCS who received a three-year suspension from the Society is at liberty to continue employment in the field of ICT. As Gotterbarn [1999] noted with respect to the Software Engineering Code, ‘sanctions will occur only when the Code is publicly adopted as a generally accepted standard of practice, and when both society and legislators view the failure to follow the Code as negligence, malpractice, or just poor workmanship’.
Only four codes have been considered here, but they do cover major computing professional associations from the USA, UK and Australia. The Software Engineering and IMIS Codes, in particular, emphasize the computing professional’s primary duty to the public interest, and thus do address the criticisms made of code content in the 1995 paper.
However, it remains the case that those working in the field of ICT are fragmented into many different specialisms and there is no requirement for any of them to join a professional society. It may seem curious that in many countries a licence is required to own a dog, yet no licence to practise is required of those designing and developing the computer systems on which contemporary society relies so heavily. In 1998 the Texas Board of Professional Engineers adopted software engineering as a distinct licensable engineering discipline; however, there have been few other moves towards licensing, and indeed in 2000 the ACM decided it could not support the licensing of software engineers [A Summary of the ACM Position on Software Engineering as a Licensed Engineering Profession, 2000]. Nevertheless, a longitudinal study of the ethical attitudes of members of IMIS has found an increasing level of support for licensing between 2000 and 2004, with more than two-thirds of respondents supporting the introduction of licensing for computer professionals in the latest survey [Prior et al, 2005].
It may be, therefore, that there is scope to pursue some form of licensing for the profession and as part of the process, ensure that ‘licensed practitioners’ commit themselves to working within an appropriate Code of Ethics and Professional Practice. Such an endeavour is, however, fraught with difficulty. While the ubiquitous nature of ICT is an argument for the urgency of considering its socially responsible use, it is also a phenomenon that renders any attempt to define a ‘computing profession’ more difficult. The ability to set up a website or construct a database has, with developments in software and the incorporation of IT skills in education, become available to a higher proportion of the population in the developed world than a decade ago. At the same time, therefore, as the argument for a form of licensing has become stronger, it has become even more difficult to contemplate how any licensing system could be introduced.
Any impetus to restrict who can practise as a software engineer or systems designer will not come from those working in the field; it can only come from society at large putting pressure on those who legislate. In some areas of scientific and technological advance, such as the use of genetically modified crops in food, or medical advances such as genetic engineering, there is a high level of public awareness of and interest in the issues, and this can result in pressure on policy-makers. Such involvement appears to be lacking with respect to ICT. There is a need to promote a similar level of public debate about the many social effects and ethical issues raised by the applications of ICT that permeate our societies, and to promote a much greater level of awareness of the imperative for professional practices such as the involvement of users and rigorous testing.
It is important for professional societies to develop appropriate codes of conduct that recognise the paramount responsibility of the professional to work in the public interest, to promote these principles and educate their members to help them develop a high level of professionalism in their work. However, this is not enough. The onus is on the professional societies, on the educators of future professionals and on researchers in the field of computing and social responsibility, to find ways of informing the wider public and policy makers of the ethical dimension of ICT and stimulating debate concerning the desirability of ‘professionalizing’ its practitioners.
Measures indicated in the previous section could help promote more responsible professional practice by ICT practitioners. This might help ensure that projects are undertaken with the highest possible level of professional endeavour, with the public interest clearly held in the foreground.
There will always be debate about the extent to which some applications (for example, in the defence or surveillance industries, or in industries that promote products harmful to human health) contribute to or detract from the public good. It would be a sign of a healthy profession to engage in robust debate not just among ourselves but with the wider public on these and other contentious issues.
It may be idealistic, but it remains a desirable objective, to hope that one day all scientists and engineers, including ICT practitioners, might subscribe to the sentiments of the Student Pugwash USA Pledge:
‘I promise to work for a better world, where science and technology are used in socially responsible ways. I will not use my education for any purpose intended to harm human beings or the environment. Throughout my career, I will consider the ethical implications of my work before I take action. While the demands placed upon me may be great, I sign this declaration because I recognize that individual responsibility is the first step on the path to peace’.
Anon (2005), London Congestion Charge CCTV Privacy Concerns, online at http://www.spy.org.uk, accessed 21.06.2005.
Australian Computer Society Code of Ethics (2003), online at http://www.acs.org.au/about_acs/acs131.htm, accessed 20.04.2005.
British Computer Society Code of Conduct (2001), online at http://www.bcs.org/BCS/AboutBCS/codes/conduct, accessed 20.04.2005.
Cripps, Alison (2004), Workplace Surveillance, New South Wales Council for Civil Liberties, November 2004, online at http://www.nswccl.org.au/docs/pdf/workplace%20surveillance.pdf, accessed 07.07.2005.
Dempsey, James X. (2002), Civil Liberties in a Time of Crisis, Human Rights, Winter, 8-10.
Disciplinary Case (2004), online at http://www.bcs.org/BCS/AboutBCS/codes/conduct/disciplinary/sep04.htm, accessed 20.04.2005.
Gotterbarn, Don (1999), How the New Software Engineering Code of Ethics Affects You, IEEE Software, November/December, 58-64.
Heymann, Philip B. (2002), Civil Liberties and Human Rights in the Aftermath of September 11, Human Rights, Winter, 18-19, 25.
Institute for the Management of Information Systems Code of Ethics (2001), online at http://www.imis.org.uk/30_professional_stds/60_ethics, accessed 20.04.2005.
Lyon, David (2001), Surveillance after September 11, Sociological Research Online, 6 (3), online at http://www.socresonline.org.uk/6/3/lyon.html, accessed 05.07.2005.
Lyon, David (2004), Identity cards: social sorting by database, Oxford Internet Institute, Internet Issue Brief No. 3, November 2004, online at http://www.oii.ox.ac.uk/resources/publications/IB3all.pdf, accessed 05.07.2005.
Norton-Taylor, Richard (2005), Warning on Spread of State Surveillance, The Guardian, 21 April 2005, online at http://www.guardian.co.uk/print/0,3858,5175745-103681,00.html, accessed 20.06.2005.
Off Target: the Conduct of the War and Civilian Casualties in Iraq (2003), Human Rights Watch, December 2003, online at http://hrw.org/reports/2003/usa1203/3.htm, accessed 05.07.2005.
Prior, Mary (1995), The Case for a Hippocratic Oath for Information Systems Professionals, ETHICOMP95, De Montfort University, Leicester, March 28-30 1995.
Prior, M., Fairweather, N.B., Rogerson, S. and Dave, K. (2005), IS IT Ethical? 2004 ETHICOMP Survey of Professional Practice, IMIS.
Race, John (2000), How High-Principled Should BCS Members Be?, The Computer Bulletin, November, 16-17.
Roberts, Les et al. (2004), Mortality Before and After the 2003 Invasion of Iraq: Cluster Sample Survey, The Lancet, 364, 1857-64.
Sample, Ian (2003), US Gambles On a ‘Smart’ War in Iraq, New Scientist, 19th March 2003, online at http://www.newscientist.com/article.ns?id=dn3518, accessed 05.07.2005.
Software Engineering Code of Ethics and Professional Practice (1999), online at http://www.computer.org/tab/seprof/code.htm, accessed 20.04.2005.
Student Pugwash USA Pledge, online at http://www.spusa.org/pledge/, accessed 08.07.2005.
A Summary of the ACM Position on Software Engineering as a Licensed Engineering Profession, Final Version (July 17 2000), online at http://www.acm.org/serving/se_policy/selep_main.html, accessed 05.07.2005.
Summers, C and Markusen, E. (1992), Computers, ethics and collective violence, Journal of Systems Software, 17 (1), 91-103.
Thring, M. (1980), The Engineer’s Conscience, Northgate Pub. Co. Ltd.
Watching Them, Watching Us (2003), online at http://www.spy.org.uk/wtwu.htm, accessed 07.07.2005.