IEEE Technology and Society Magazine, Spring 2002, p. 33

Technological Complexity and Ethical Control

Vincent di Norcia

"The western world has argued passionately about technology, whether it's good or bad for us, ...even while inventing it at a furious and accelerating rate."
Richard Rhodes, Visions of Technology [1].

The author is Professor of Philosophy at the University of Sudbury, Sudbury, Ontario, Canada P3E 2C6. He can be reached at: 4 Irish Lane, Barrie, Ontario L4M 6H8; email: [email protected].

0278-0079/02/$10.00 © 2002 IEEE

ETHICS, CONTROL, AND TECHNICAL SYSTEMS

Richard Rhodes' words touch a nerve, for the pace and complexity of technological change pose daunting challenges for our moral intelligence. The technical, social, and moral complexities of technical systems have profoundly affected their ethical control. Control is not restricted to the use or production of technical systems, but applies to a technology's full life cycle, from
invention, design, and development through manufacture and use to final disposal. Ethical control means that the ends and means involved in systems control are guided by core ethical values such as care for life (human and natural), socio-economic welfare, and communication [2].

Notwithstanding their complexity, most of the ethical problems that controlling technical systems pose should be solvable, assuming adequate resources are available [3].
However, technologies are so diverse that we should first clarify the concept of a technical system, e.g., by dividing it into four types based on their degree of complexity and scale. The four types of technical system we will refer to here are: tools, technologies, technology networks, and technology fields. As we move from relatively simple tools and technologies to more complex technology networks and fields, we are involved, I will suggest, in an increasingly difficult and uncertain social experiment [4]. Solving the technically, socially, and morally complex problems that technical systems pose therefore demands an informed and resourceful moral intelligence. To understand what this involves, let us begin with the simplest case, tools; more specifically, the knife.
TOOLS

Tools are simple technical systems like knives, hammers, pens, and clocks. Knives, for example, have few mechanisms if any, and are designed to facilitate individual ease of use in solving practical problems. There are different knives for different uses: from paper, butter, and meat knives to shaving razors, axes, and swords. One can usually assess the appropriateness of a knife to a specific use [5]. One does not, for instance, use a butter knife to chop wood. Also, related tools are often linked together to make up useful sets. Knives, forks, and dishes, for instance, are used together for eating, just as pen, ink, and paper are useful for writing.

As tools, knives are familiar and easy to use, usually without formal training. Since users can usually predict the effects of normal use, outcomes tend to match user intent. Thus the ethical control of a tool presents no great difficulties. But skilled use, such as slicing meat finely and evenly or competent sword play, has traditionally involved an apprenticeship, e.g., to a butcher or fencing master. Skillful performance also evokes the idea of virtue as excellence in a practice, for excellence, Aristotle argued, extends from technical practice to ethical practice (see [6]).

Traditionally knives were highly crafted, unique tools, but today most are mass produced in standard designs, facilitating their general availability and ease of use. Tool culture connotes traditional values. The designs of knives and swords, for example, are slow to change. Older knives can even be antiques or works of art, viz., a kitchen knife found in ancient Pompeii or a Samurai sword. In addition, many tools are organically connected with their users' habitat. Hunting knives are used to kill an animal and slice the meat; axes to cut local wood for burning or building. If tools like knives represent tradition, technologies like the automobile signify change.
TECHNOLOGIES

Over the last few centuries the old tool-based farm economy has been replaced by more efficient new industrial technologies: trains, factories, large buildings, elevators, and automobiles. While new technologies may compete with old for users, they do not always replace older systems. Automobiles, for instance, have not eliminated bicycles, and we still use knives to cut food and paper. But technologies are more complex than tools: numerically, systemically, socially, and morally. Numerically, technologies typically contain several component tools, and systemically most technologies operationally interconnect several subsystems (ultimately composed of tools) into a single integrated operating system. In operation, automobile technology, for instance, dynamically connects several subsystems: accelerating, gearing, steering, engine cooling, cabin heating, instrumentation, etc. All work together as one transportation system. Watches link springs, gears, hands, face, and case together into one mechanical system designed to tell the time. Each subsystem is composed of tools. Hitting the brake pedal sends brake fluid through the lines to make the pads slow down the wheels.

Abstractly, a technology might be reduced to its separate components, but not as a dynamic system. To drive a car you do not manipulate its components separately. You do not stop it by picking up a brake pad and holding it to the wheel. On the contrary, the normal operation of a technology like an automobile involves numerous high-speed interactions among its subsystems and component tools. But those subsystems are loosely coupled. Brake failure affects, but does not destroy, the steering. A driver can in consequence often avoid a serious accident. Technologies whose subsystems are tightly coupled, however, are prone to normal accidents [7]. In such systems, problems in one subsystem can affect others and quickly threaten to cause total system breakdown, with attendant risks to human health and the environment. Thus problems in the cooling system of a nuclear reactor can rapidly lead to a meltdown; and a short circuit in a plane's wiring can cause it to crash. One way of reducing such risks is to favor resilience in design, through loosely coupled subsystems and redundancy.
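The contrast between tight and loose coupling can be sketched as a toy simulation; the fully connected topology, propagation probabilities, and trial count below are illustrative assumptions of mine, not anything from the article or from Perrow's study:

```python
import random

def total_breakdown_rate(n_subsystems, coupling, trials=2000, seed=1):
    """Estimate how often a single subsystem fault cascades into a
    total system breakdown. `coupling` is the assumed probability
    that a fault propagates to each other subsystem."""
    rng = random.Random(seed)
    breakdowns = 0
    for _ in range(trials):
        failed = {0}           # one subsystem faults initially
        frontier = [0]
        while frontier:
            frontier.pop()     # each failure gives faults another chance to spread
            for other in range(n_subsystems):
                if other not in failed and rng.random() < coupling:
                    failed.add(other)
                    frontier.append(other)
        if len(failed) == n_subsystems:
            breakdowns += 1
    return breakdowns / trials

tight = total_breakdown_rate(6, coupling=0.9)   # tightly coupled subsystems
loose = total_breakdown_rate(6, coupling=0.05)  # loosely coupled subsystems
print(tight, loose)
```

Under these invented parameters, a single fault almost always brings down the tightly coupled system and almost never the loosely coupled one, which is the intuition behind favoring loose coupling and redundancy in design.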
Despite its systemic complexity, most people today can operate a car. Drivers must of course comply with the automobile's prescribed control techniques, viz., in manipulating the gear shift or turning the steering wheel, and staying on the right or left side of the road. The need for compliance typifies what Ursula Franklin terms prescriptive technologies, in contrast to the greater freedom users had with older holistic tools, like knives [8]. But prescriptive compliance and standardization are not necessarily unethical. Standardizing operational controls has facilitated safer driving. Indeed one of the most important achievements of industrial society was the development of mass production manufacturing technology, as with Ford's Model T early in the 20th century.

While many technological innovations solved user problems and enhanced performance, they also increased system complexity. The simple user-friendly controls of the automobile mask significant system complexity under the hood. Today many drivers do not understand how their automobile's mechanisms function or know how to repair their car. This often means that specialized technical expertise and training are needed for its operation and maintenance [9]. The same can be said for VCRs, computers, and other common technologies. The complexity of such systems reflects the growth of the modern sciences and technical professions, or what John Kenneth Galbraith termed the technostructure of modern societies [10]. It has in consequence evoked social complexity. Contemporary automobile manufacturing demands far more knowledge and resources than making a two-wheel buggy or a knife. The design and development of an automobile now involves numerous scientists, engineers, and technicians, usually working together in the R&D divisions of large, international corporations [11], [12]. No wonder the growth of the automobile industry reinforced that of the technically complex petrochemical industry and firms like Siemens and Daimler-Benz.
The technology production phase, from innovation and design to development and manufacture, is usually beyond the capacity and resources of most individuals. The era of the lone inventor creating a new tool is long gone. And old automobiles usually end up in junk yards. The growing volume and toxicity of the wastes automobiles produce constitute a major environmental problem. In response, the German government has required auto parts to be recyclable. This in turn has led to simpler designs and fewer, more standardized parts. The control of automobile technology throughout its life cycle, one can see, is also socially complex. Different groups compete to influence each phase of the cycle, from automobile production through traffic and use to disposal. Groups such as manufacturers, technicians, workers, distributors, owner/drivers, mechanics, and communities, including non-drivers whose health is at risk from traffic smog, are all stakeholders in automobile technology. In his classic study of Nuclear Regulatory Commission hearings in California, Richard Meehan showed how competing stakeholder interests clashed sharply in seeking to influence public decisions about reactor sites [13]. The ethical control of complex modern technologies is becoming a socially complex and even political process, one that should be open and democratic [14], [15].
Social complexity in turn evokes moral complexity, for diverse values (technical, economic, social, and ethical) compete to guide the control of technologies throughout their life cycle. In the industrial era, technological innovations were increasingly developed in business contexts. But market values often clash with other values, such as technical quality and efficiency, social and economic welfare, and environmental protection [16].

But, some have argued, the very possibility of ethical control has been blocked by a technological imperative [11]. Anything that is technologically possible, it dictates, must be done, usually by experts. But one has doubts about the imperative's force. Automobile firms, for instance, have been able to resist technological innovations in fuel efficiency, pollution reduction, and safety, alleging financial or competitive concerns; and other firms have suppressed new technologies, notably for competitive reasons [17]. So ethical values, too, can override any so-called imperative and influence the control of a technology.

Ethical control is itself morally complex [18]. The problems that arise in the development of automobiles typically involve what Caroline Whitbeck terms multiple constraints (cost, comfort, fuel efficiency, safety, and environmental protection) and they may not
be simultaneously satisfiable [19]. Accordingly, one should not expect a uniquely correct solution, but a range of acceptable solutions. Balancing these diverse social, economic, technical, and environmental values presents a significant challenge to many industry stakeholders: owners, management, technical professionals, mechanics, car owners, junk yard operators, and public authorities.
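The idea that multiple constraints yield a range of acceptable solutions rather than one correct answer can be illustrated in miniature; the candidate designs, figures, and thresholds below are all invented for illustration:

```python
# Candidate automobile designs scored on a few of the multiple
# constraints named above (all figures invented for illustration).
designs = {
    "A": {"cost": 20000, "mpg": 35, "safety": 4},
    "B": {"cost": 18000, "mpg": 28, "safety": 5},
    "C": {"cost": 30000, "mpg": 45, "safety": 5},  # too costly
    "D": {"cost": 15000, "mpg": 22, "safety": 2},  # inefficient, unsafe
}

# Each constraint must be satisfied; none is optimized in isolation.
constraints = {
    "cost":   lambda v: v <= 25000,
    "mpg":    lambda v: v >= 25,
    "safety": lambda v: v >= 4,
}

acceptable = [name for name, spec in designs.items()
              if all(ok(spec[key]) for key, ok in constraints.items())]
print(acceptable)  # -> ['A', 'B']
```

Both A and B survive every constraint, so the outcome is a feasible set of designs, not a unique optimum; the balancing among them remains a stakeholder judgment.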
It is interesting to note that the scientific professions like engineering have all developed multi-valued ethical codes. Those codes typically stipulate that professionals should use their technical knowledge for the benefit of clients, the profession, the community, and the natural environment [20], [21]. In this framework, professionalism itself involves a morally complex mandate: to balance the public welfare, economic growth, scientific knowledge, and technical efficiency. Living up to these competing obligations may be difficult; but they do imply that professional expertise, and the technostructure, is not morally neutral. Indeed, engineers contributed to the progressive social movements of the 1920s. And it is risky to downplay expert knowledge in favor of narrow business interests, ideologies, or social factions, for technical knowledge is not merely a social construct. A respect for professional knowledge has guided several notable scientists and engineers, from Rachel Carson to Jeff Wigand, Roger Boisjoly, and Nancy Olivieri. Each has taken a leading role in publicly advocating the ethical control of a technology, respectively: pesticides, cigarettes, manned rockets, and pharmaceuticals. The ethical control of a technology, professionalism suggests, requires one to minimize the socio-economic and environmental risks of a technology to acceptable levels, while realizing its technical and socio-economic benefits.
Technological innovation is frequently unpredictable. The newer a technology, and the faster the pace of innovation, the less anyone can foresee or control its development and impacts as it moves through its life cycle [22]. In 1900, for example, few if any foresaw the contraceptive pill, penicillin, TV, nuclear energy, genetics, or the computer [23]. A century ago one might predict that society would need to construct more roads as the number of automobiles increased, but few foresaw the automobile's contribution to urban sprawl, adolescent sex, and global warming. Indeed, the more radical an innovation, the less predictable its life cycle or its socio-economic and environmental impacts. What was originally a fertility pill, for example, became a contraceptive, and advanced the emancipation of women. Technology life cycles, furthermore, are becoming shorter as the pace of technological change speeds up. As the complexity, unpredictability, and pace of events, and the severity of global environmental stress soar, Thomas Homer-Dixon claims, modern societies face an ingenuity gap: a shortfall between the rapidly rising need for ingenuity and the inadequate supply [24]. And that ingenuity gap grows still wider as technologies themselves interact in ever larger technology networks.
TECHNOLOGY NETWORKS

Technology networks link different technologies serving a common function, such as transportation, communication, or energy. Networks are usually more complex and larger in scale than their component technologies. Specific tools and technologies may come and go, but the underlying networks persist, for networks have longer life cycles than do their component technologies and tools. This is partly for an ethical reason: network functions satisfy fundamental human needs, e.g., for communication, energy, transportation, etc. Large scale communication, transportation, and energy networks emerged in modern societies over the last century to serve the needs of their growing populations.

In a technology network one finds diverse technologies at different stages in their life cycle, all competing for users with different needs and preferences. In communications networks, users can choose among phones, regular mail, email, radios, TV, and computers to fulfill their communication needs. In energy networks, user options include electrical, natural gas, oil, coal, nuclear, wind, solar, and passive energy technologies. The competition among old and new technologies to serve a function has ethical implications. Intermediate technologies and simple user-friendly tools such as a village phone or small diesel engine, E. F. Schumacher claimed, may be more appropriate to the needs of communities in less developed societies than newer, more complex high tech systems such as the latest
Internet email system or a nuclear reactor [5]. Similarly, public energy policy in modern societies should explore all energy technology options, including conservation, rather than restrict itself to fossil fuels and nuclear.

Given their technical complexity, network operators need to be well educated and technically trained. The need for a technically informed moral intelligence was learned early in the history of engineering [21], [25]. That lesson was tragically reinforced in spring 2000 when the technically untrained manager of the Walkerton, Ontario, Public Utility ignored test lab data showing fatal E. coli bacteria in the town's water supply and did not respond to related inquiries from the area Medical Officer of Health. As a result seven townspeople died and hundreds more got sick, many seriously [26].

Networks are socially complex in their organization. Large public and private organizations like states and corporations alone tend to have the social resources and technical expertise to operate large communication and energy networks [27]. Thus the BBC and CBC are government owned broadcasting media, while ABC, Fox, CBS, NBC, and cable TV are private businesses. Bell Canada runs the central Canadian phone system, and private firms generate electricity, while governments own the distribution grids. Radio, TV, and energy networks are all regulated by government agencies, such as the FCC in the U.S. and the CRTC in Canada, and by numerous state and province energy commissions.
Network boundaries are moreover often vague and ill-defined, extending beyond buildings and towns to whole regions and continents. Both international communications and financial markets flow across organizational and political borders. Such networks call for decentralized organization and widely distributed control systems. Centralized control seems inappropriate for very high speed and high volume electronic communication networks. Efficient functioning is also affected by the way networks integrate channel bandwidth, routing, and nodes, as well as filter signals from noise. To ensure network resilience, their components should be loosely coupled and pathways designed to allow messages to flow around bottlenecks, as in the Internet and transportation networks.
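Flowing around bottlenecks over redundant pathways can be sketched as a toy routing search; the five-node topology and the breadth-first search are illustrative assumptions, not a model of any real network:

```python
from collections import deque

# A small loosely coupled network with redundant pathways (invented).
links = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}

def route(src, dst, failed=()):
    """Breadth-first search for any path from src to dst that avoids
    failed nodes, so traffic can flow around a bottleneck."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links[path[-1]]:
            if nxt not in seen and nxt not in failed:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no surviving pathway

print(route("A", "E"))                # -> ['A', 'B', 'D', 'E']
print(route("A", "E", failed={"B"}))  # -> ['A', 'C', 'D', 'E'] (reroutes)
```

Because A reaches D by two independent paths, the loss of one node leaves the message a way through; remove both B and C and the network partitions, which is the risk tight coupling and sparse pathways create.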
Network administration and transaction costs should be minimal so that message transmission, energy, and traffic flows are close to frictionless. Easy access, low transaction costs, and minimal administrative oversight are required to enable these dynamic networks to continue to operate efficiently. Almost free access and low-fee, subscription or rental based user rights seem to fit electronic communications networks such as the phone, email, and the Internet [28], [29]. This also affects network ownership. Forms of shared network ownership and joint control are even accepted in private broadcasting media. Where resources are mobile and flow through networks, as in the case of information, electricity, oil, gas, birds, and fish, ownership should be common and public rather than private [30]. The bundle of ownership rights, that is, should be distributed among network producers, distributors (or communicators), system operators, and service providers, both public and private.

Contemporary electronic media process information in nanosecond time slices. If we think of our high-speed communication, energy, and transportation networks as rivers flowing at high speeds and volumes, then you may not be able to step into the same river twice, as Heraclitus observed 2500 years ago. And, he might have added, the high speeds and volumes have affected our values [31]. Instant service and easy access are now expected in our social life. Many wrist watches have split-second stop watch functions. Millions of people daily drive autos and fly planes, where they are increasingly treated in high volume terms, not unlike human freight. And ease of access and openness in communication networks has its risks, to privacy and data security. So our fast-flowing, open access information networks should be guided by a communication ethic of respect for the integrity of information flows, and for data integrity, privacy, and copyright.
The great rivers of information and traffic flowing through today's communication, energy, and road traffic networks are in addition prone to sudden flooding. The high speed flows in communications networks and electricity grids leave little time to respond to emergencies. Small problems in one part of a network can rapidly multiply and spread through the network, especially if its subsystems are tightly interconnected. Gridlock can develop very quickly, and threaten to shut down whole regions. The failure of electricity, communications, or air traffic networks can moreover pose serious risks to welfare and public safety. Networks whose flows are reinforced by positive feedback loops are especially crisis prone, as shown by the 1995, 1997, and 2000 crises in electronic financial markets in Mexico, Asia, and Silicon Valley.
One way to reduce the risk of breakdown may be to be more responsive to negative feedback, so that early warnings of emerging problems receive a timely response. Keeping such networks distributed, loosely coupled, and resilient can reduce the ingenuity gap between the need to minimize risks of breakdown and the available supply of relevant technical and moral intelligence [24]. Computerized high speed sensors and flexible response technologies might be designed to
facilitate rapid response to early warnings of problems, for flows in some networks move near the speed of light, e.g., in financial trading.
Communication and collaboration among network users, operators, experts, and other stakeholders should also be enhanced, and supported by democratic, participatory approaches to control [8], [13], [14]. Technology fields, however, offer an even greater ethical challenge.
TECHNOLOGY FIELDS

A technology field is all the tools, technologies, and networks present in any defined space and time, e.g., in a room over a day. Technology networks commonly cut across field boundaries. While a technology field's boundaries are arbitrary, they are well defined. And distinct borders constitute the infrastructure of the sovereign state. They also open technology fields to regulation by that area's political authorities. But technology fields range widely in size, from mini-fields like rooms to regional and planetary mega-fields. Technology fields are even found in space, as the growing pile of space debris indicates. But few governments, corporations, or international organizations have adequate resources even to inventory a regional technology field, much less control it.
Within a technology field, different technology networks compete for dominance. Success in that competition defined the Stone, Bronze, and Iron Ages, and more recently the Industrial Age. In the Industrial Age, mechanical and chemical energy, production, and transportation networks prevailed over the organic networks based on agriculture and village life; but the older technologies, e.g., in agriculture, did not completely disappear. The pace of technological change has also sped up, as technology field life cycles have shortened. The Stone Age endured for millennia, but the industrial era lasted only about 250 years, and is now giving way to a new communications age.
The emergence of regional and global technology mega-fields over the last 50 years reflects an evolutionarily unprecedented level of competition between human populations and other species for ecosystem resources. The pace of species extinctions has ratcheted up across the globe. Unlike previous eras, Richard Rhodes comments, today we swim in technology as fish swim in the sea [1]. Technology fields shift the focus of ethical control to the sea more than the fish, that is, to our technological environments and their impacts on natural ecosystems [32].
The ambivalence of the international political community's response to the environmental crisis, like the timidity of the Kyoto treaty's requirements, suggests that few political leaders grasp the epochal significance of technology mega-fields. This reflects the ill-noted fact that the density of technology fields, or the number of technologies in relation to population, has increased significantly over the last two centuries. (The greater the technology-to-population ratio, the denser the technology field.) Contrast, for example, the U.S., France, or Japan with Kazakhstan, Romania, or an Aboriginal community. In these societies both economic development and environmental impact levels correlate with the technology density level. The rise in species extinctions, global warming, acid rain, and ozone layer depletion all reflect the emergence of highly dense technology mega-fields. The environmental crisis may represent ecological revenge effects as nature responds to technology field size and density [33]. The radical mitigation of these environmental impacts may in fact represent the major ethical challenge facing civilization in the coming century.
Density, furthermore, affects the ethical control of a technology field. The thinner a field, the more easily one can identify its technologies; but the impacts of new technologies on a thin field are at times as extensive as they are unpredictable. The invention of the spur, for example, transformed medieval society. The denser a field is, the less observable and predictable it also may be. At the field level of analysis, then, one discerns a kernel of truth in talk of technology out of control [34]. On the other hand, fields are mere aggregates of disparate technologies, not autonomous technical systems.

Which technologies in a field interact and which don't, which are loosely coupled and which tightly coupled, are often unclear. Which interactions in a technology field produce which social and environmental impacts is largely unknown. And technology fields contain what Homer-Dixon calls unknown unknowns, or situations where we often don't know what we don't know [24]. In them, then, 21st century civilization faces a critical ingenuity gap. While the supply of knowledge increases, so do the environmental and social problems created by the interactions between technology fields and human populations, locally and globally. The ratio of knowns to unknowns may not be changing in our favor. Since humans have in truth never been able to completely control their society or their future, attempts to control technology mega-fields represent daunting regulatory challenges for political and international authorities. The need for an informed and effective moral intelligence to control technical systems, it seems, is constantly growing.
COMPLEX CONCLUSIONS

The ethical control of technologies, we have seen, varies with the scale and complexity of the technical system; and those systems in turn vary in their systemic, social, and moral complexity. Individuals may control tools like knives and
technologies like automobiles, and do a tentative inventory of the technology mini-field in their residences, but the control of communication and energy networks requires large organizations. As technologies themselves become more complex, control requires additional knowledge, know-how, and, usually, formal education and training. And as the pace of technological change increases, old knowledge and norms become outdated and ineffective.

Ethical control starts with small baby steps, e.g., in handling tools and technologies. Small scale experiments can help us climb the relevant learning curve. In her critique of the unthinking overuse of highly toxic pesticides, for example, Rachel Carson argued for an ethical control strategy, viz., pre-testing pesticides, small pilot projects, and favoring biological over technological controls [35]. The ethical control project, especially of technology networks and fields, is complicated by the interplay of diverse norms, e.g., technical, economic, social, and environmental. And as we move from tools to technologies, networks, and fields, diverse stakeholders with often conflicting interests compete to affect their control. The result is calls for open but slow democratic control processes, and regional and global collaboration. But not all claims are equally credible, nor are all interests equally at risk. So we always need to exercise our moral intelligence. We need to solve the ethical problems at issue. We need to learn from our failures as well as build on our successes. In many cases we need to minimize the risk of disaster [6]. To that end we need to become more responsive to early warnings of problems, while they are still solvable, long before we face a full blown crisis. Technology can help here. Electronic communications networks can facilitate better detection and more rapid emergency response.

The ethical control of increasingly complex technologies, networks, and fields may affect the future path of civilization and, as the environmental crisis suggests, the planetary ecosystem itself. Hopefully, the challenges it involves do not transcend the limitations of our collective moral intelligence.
REFERENCES
[1] R. Rhodes, Ed., Visions of Technology. New York, NY: Simon and Schuster, 1998, pp. 21, 22, 329.
[2] V. di Norcia, Hard Like Water: Ethics in Business. Toronto, Ont.: Oxford Univ. Press, 1998.
[3] C. Whitbeck, Ethics in Engineering Practice and Research. Cambridge, U.K.: Cambridge Univ. Press, 1998, ch. 1 and 2.
[4] M. Martin and R. Schinzinger, Introduction to Engineering Ethics. New York, NY: McGraw-Hill, 1999.
[5] E. F. Schumacher, Small is Beautiful, 2nd ed. Vancouver, B.C.: Hartley and Marks, 1999, pp. 146f.
[6] Aristotle, Nicomachean Ethics, ch. I, sect. 7.
[7] C. Perrow, Normal Accidents: Living with High Risk Technologies. New York, NY: Harper, 1984.
[8] U. Franklin, The Real World of Technology, 2nd ed. Toronto, Ont.: Anansi, 1999, ch. 4, pp. 10f.
[9] E. J. Woodhouse and D. Nieusma, "When expert advice works, and when it does not," Technology and Society Mag., pp. 23-29, Spring 1997.
[10] J. K. Galbraith, The New Industrial State. New York, NY: New American, 1967, ch. VI.
[11] C. Freeman and L. Soete, The Economics of Industrial Innovation, 3rd ed. London, U.K.: Pinter, 1997.
[12] R. Buderi, Engines of Tomorrow. New York, NY: Simon and Schuster, 2000.
[13] R. Meehan, The Atom and the Fault. Cambridge, MA: M.I.T. Press, 1984, ch. 8.
[14] C. Mitcham, "Justifying public participation in technical decision making," Technology and Society Mag., pp. 40-46, Spring 1997.
[15] J. Herkert, "Ethical risk assessment: valuing public perceptions," Technology and Society Mag., pp. 4-10, Spring 1994.
[16] N. Balabanian, "Controlling technology: Should we rely on the marketplace?," Technology and Society Mag., pp. 23-30, Summer 2000.
[17] R. Dunford, "Suppressing technology," Administrative Science Quart., pp. 512-525, 1987.
[18] G. F. McLean, "Integrating ethics and design," Technology and Society Mag., pp. 19-30, Fall 1993.
[19] C. Whitbeck, "The trouble with dilemmas," Business and Professional Ethics, vol. 1, nos. 1 and 2, pp. 119-141, 1992.
[20] C. Morison and P. Hughes, Professional Engineering Practice: Ethical Aspects, 2nd ed. Toronto, Ont.: McGraw-Hill Ryerson, 1988.
[21] E. Layton, The Revolt of the Engineers. Baltimore, MD: Johns Hopkins Univ. Press, 1986, chs. 2 and 3.
[22] H. A. Linstone, "Technological slowdown or societal speedup: the price of system complexity?," Technological Forecasting and Social Change, vol. 51, pp. 195-205, 1996.
[23] M. Sullivan, "America in 1900," in Visions of Technology, R. Rhodes, Ed. New York, NY: Simon and Schuster, 1998, pp. 29f.
[24] T. Homer-Dixon, The Ingenuity Gap. New York, NY: Knopf, 2000, pp. 1f, 26f, ch. 1 and 7.
[25] W. D. Rifkin and B. Martin, "Negotiating expert status," Technology and Society Mag., pp. 30-39, Spring 1997.
[26] S. Oziewicz and P. Cheney, "This could have been prevented," The Globe and Mail, May 26, 2000.
[27] W. Rowland, The Spirit of the Web: The Age of Information from Telegraph to Internet. Toronto, Ont.: Key Porter, 1999.
[28] T. Berners-Lee, Weaving the Web. New York, NY: Harper, 1999.
[29] M. Stefik, Ed., The Internet Edge. Cambridge, MA: M.I.T. Press, 1999.
[30] E. Ostrom, Governing the Commons. Cambridge, U.K.: Cambridge Univ. Press, 1990.
[31] J. Gleick, Faster. New York, NY: Vintage, 2000.
[32] P. A. Vesilind and A. S. Gunn, Engineering Ethics and the Environment. Cambridge, U.K.: Cambridge Univ. Press, 1998.
[33] E. Tenner, Why Things Bite Back. New York, NY: Vintage, 1996, pp. 5f.
[34] L. Winner, Autonomous Technology. Cambridge, MA: M.I.T. Press, 1977, ch. 5, pp. 197f.
[35] R. Carson, Silent Spring. New York, NY: Houghton Mifflin, 1994.