TECHNOLOGICAL FORECASTING AND SOCIAL CHANGE 9, 309-334 (1976)

Mathematical Model Building and Public Policy:
The Games Some Bureaucrats Play
JACK N. SHUMAN

DR. SHUMAN is a Lecturer in Psychology at Georgetown University, where he also received his doctorate. His area of interest is the social psychology of planning. © American Elsevier Publishing Company, Inc., 1976

ABSTRACT
This paper analyzes the relationships between mathematical model building and decision-making. Special emphasis is placed on the problem of contextuality. The primary focus of this review is on the entity concept as it relates to this study. A major segment of the paper examines various quantitative procedures as they related to decision-making in the Vietnam War. Additionally, there is a comprehensive analysis of issue quantification as a cultural problem. Finally, several alternative system designs are presented, as well as an evaluation of the pros and cons of using mathematical models in public policy.

I. Introduction
The purpose of this study is to show that while mathematical models of social and
economic structures of the real world can be constructed, the computations derived from
these models are of extremely limited use for analyzing the current economic and social
situation, and are of even more limited use for predicting the future state of social and
economic conditions. This premise is based on an examination of the nature of mathematical modeling and the ability or lack of ability of a model to handle more than a very
limited number of variables. Further, the accuracy of measurements of the variables used as input to computation is examined. And, perhaps most important of all, before analysis of the output of the model, there is an examination of the reality (and limitations) of our theoretical knowledge of how social and economic variables interact.
The thesis of this analysis is that model building around systems with the many
variables of social and economic conditions is an exercise in unreality, particularly when
the inability to obtain meaningful measurements of many of these variables is considered.
The author suggests that other approaches to solutions in these complex areas must be
developed.
This article (borrowing from the comments of one of the reviewers) is written for those individuals who experience the world as informational and statistical qualities, and in terms of the variables of economics or scientific disciplines, rather than as overwhelming sensations and events which are hard to analyze and describe without losing the essence of experience. It is this group who experience surprises to their professional consciousness from reading this kind of material or, even more importantly, experience the agony of seeing their hypotheses more often than not failing to have any validity for predicting the real world. Undoubtedly,
all professions, including the humanities, have
people of limited perception and even less vision. However, this paper focuses largely on
the narrowly disciplined economists, econometricians,
mathematicians,
and statisticians
and those, particularly in public policy, who have an unquestioned high respect for them.
II. Some Realities of Mathematical Modeling
For our purposes, models are representations of processes describing, in simplified form, some aspects of the real world. Mathematical models, quoting the Organization for Economic Cooperation and Development, consist of statistical magnitudes articulated into an organic system by means of a number of quantifiable relationships between these magnitudes.
Mathematical modeling does afford us a unique way of gaining insight into the behavior of limited numbers of variables under rigidly specified conditions. Model analysis is based on Newton's Law of Similarity. In general, this law states that models must be similar to the structures or situations they represent.
Mathematical modeling has been introduced into public policy largely through aerospace and defense technology, as a sort of universal panacea, a problem-solving tool which plays its most significant role in those areas of public policy involving man's interactions with the natural or physical world. Fields such as medicine, aerospace, and environmental quality obviously depend heavily on an expanding technological base that includes quantification techniques. Most of our pressing problems, including race relations, income redistribution, public service access, and metropolitan development, largely defy quantification, simply because they involve the enormously complex and unpredictable patterns of human relationships.
A mathematical model confirms for us, in a highly refined and simplified language,
what we already know or should know. The model improves on the accuracy, precision,
predictability,
and reliability of our knowledge. Consequently,
a mathematical model
may be much more important for telling us what we do not know about a problem, than
for reconfirming our established notions.
To attain rigor and specificity when we employ mathematical models, we tend to
sacrifice variables which are not particularly amenable to any form of measurement or
quantification.
For example, the book, The Limits to Growth, has been criticized as being
so oversimplified as to be fundamentally
misleading. The authors base their model on
purely physical parameters such as technology, natural resources and population. They
largely ignore the price system, the political system, the evolution in values and customs,
and other processes of social and economic adjustment. The authors measured only what
could be measured. Although we might disagree with their conclusions, we do not yet
have any means for conducting the required causal analysis to gain understanding into the
complicated forces which mediate the process of existence. There does not exist at present a mathematical language sufficiently complex to effectively describe or predict social processes and relationships, or which can capture the randomness of human behavior.
Most mathematical modeling, even in the social sciences, is based on the calculus. The
calculus, as Peters and van Voorhis point out, offers the distinct advantage of dealing with
the relation of infinitesimal
increments of one variable to infinitesimal increments of
another. However, meaningful measurement or quantification
is always related to some
sort of physical variable or mechanistic process. An especially interesting type of mathematics for the analysis of social problems, which offers greater flexibility than the calculus, is the algebra of sets or classes, which is closely connected with symbolic logic.
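As a purely illustrative aside, not drawn from the article itself, the following minimal Python sketch shows the kind of class relationships that the algebra of sets can express without any pretence of numerical measurement; the groups and their members are invented for the example.

```python
# Hypothetical illustration: set algebra applied to social categories.
# The groups and members below are invented.
urban_poor = {"household_1", "household_2", "household_3"}
recent_migrants = {"household_2", "household_3", "household_4"}
receiving_services = {"household_3", "household_5"}

# Union, intersection, and difference express class relationships
# without pretending that the underlying attributes are measurable.
both = urban_poor & recent_migrants                      # members of both classes
either = urban_poor | recent_migrants                    # members of at least one class
unserved = (urban_poor | recent_migrants) - receiving_services

print(sorted(both))      # ['household_2', 'household_3']
print(sorted(either))
print(sorted(unserved))  # ['household_1', 'household_2', 'household_4']
```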
The term productivity in economics offers an excellent illustration of a purely quantitative approach to problem analysis. Productivity is partly defined as physical efficiency, or the ratio of useful output to total output. Admittedly, the concept is subjective; someone has to define useful. But because of the difficulty in measuring accurately the inputs of capital and natural resources to the process of wealth creation, productivity, as a convenience, is measured most often in labor input terms, i.e., as output per man-hour. The difficulty with measuring productivity in output-per-man-hour terms is the tendency to draw false inferences from the measure to the conclusion that productivity gains are attributable to improvements in the productivity of labor itself. Gains in productivity occur in roundabout and largely non-measurable and non-quantifiable processes ranging from cultural behavior to basic science and educational policy.
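To make the inferential trap concrete, here is a small hypothetical calculation (the figures are invented, not taken from any source): measured output per man-hour rises even though labor itself has contributed nothing new.

```python
# Hypothetical figures: output rises because of new capital equipment,
# while labor hours stay constant.
output_before, hours_before = 1000, 500   # units produced, man-hours
output_after,  hours_after  = 1300, 500   # same labor, better machinery

productivity_before = output_before / hours_before   # 2.0 units per man-hour
productivity_after  = output_after  / hours_after    # 2.6 units per man-hour

gain = (productivity_after - productivity_before) / productivity_before
print(f"Measured labor productivity gain: {gain:.0%}")  # 30%
# The measured 'labor productivity' gain says nothing about whether labor,
# capital, management, or basic research produced the improvement.
```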
The processes of evaluation, interpretation, and judgment (in sum, understanding) are at least as important, if not more important, than the actual utilization of a specific mathematical procedure. One specific instance is the regression equation. When computed, a correlation coefficient can express the degree of interrelation or commonality among systems of variables.
It is, of course, a necessary, but far too often not obvious precondition
that the
relationships between systems of variables be realistic and reasonable. For example, in the
summer of 1972, the Department of Health, Education and Welfare was asked to spend
$100,000 to develop what amounted to a series of regression equations that would
indicate a high coefficient of correlation between housing abandonment
in ghettos and
social services. Clearly, the requisite mathematical relationships could have been developed but they would have meant nothing. Any reasoned social analysis would show that
housing abandonment
is caused largely by owners who feel that their investments can no
longer be profitable, and is unrelated to the quantity, much less the quality, of social
services.
Regression analysis is an effective method for validating hypotheses. For example, one
case examines the factors behind the near-eradication
of tuberculosis in the Western
world. The National Institutes of Health would undoubtedly
measure the diminution of
tuberculosis against medical research. The correlation coefficient for medical research does show a high degree of interrelation among diagnosis, treatment, and, ultimately, cure. A
more thorough study indicates that the eradication of tuberculosis correlates much more
closely with the revolutionary redistribution of income over the past forty years.
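A hedged numerical sketch of this point, using synthetic series rather than historical data: any two variables that both trend steadily over time will yield a high correlation coefficient, whether or not either one causes the other.

```python
import random
random.seed(0)

# Synthetic, invented series: one falls and one rises smoothly over 40 'years'.
years = range(40)
tb_rate = [100 - 2.0 * t + random.gauss(0, 3) for t in years]         # falling disease rate
research_spend = [10 + 1.5 * t + random.gauss(0, 2) for t in years]   # rising spending

def corr(x, y):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# A strongly negative coefficient emerges, but the shared time trend, not any
# demonstrated causal mechanism, is doing the work.
print(round(corr(tb_rate, research_spend), 2))
```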
III. Entities
An entity is a modern systems concept. The concept of an entity presupposes the
coherence of a number of parts (constituent
activities) of an organized analytical unit.
These parts are characterized by a set of relationships which are thought to exhibit a
degree of self-regulation, or to be capable of being made coherent through regulation.
Entities are the consequence of statistical abstractions. When we speak of entities, we
are talking about the design of information systems (statistical or otherwise) with a set of
social relationships conceived as a social system or a behavioral entity. This is important
because the concept of real world behavioral entities is matched with empirical
representatives
(however incomplete)
of integrated statistical information
systems. In
Kuhn's term, the paradigm that emerges is not so much concerned with the design of a statistical or other informational product, as with the design and operation of a statistical
or informational
production process. The informational
difficulties in the entity concept
must not be minimized. This is particularly true for the collection of statistical data.
One prototype case in point is the routine and systematic collection by the National
Science Foundation
of data on the Federal Government's research and development
efforts for every three Fiscal Years. These data are submitted in response to questionnaires to the Federal agencies. At the outset, this system is in difficulty. Agencies may
and, usually, for a variety of reasons, do selectively filter whatever data and information
they choose to report. In other words, there is no independent means for evaluating the
accuracy or veracity of the submissions. Another serious difficulty with the use of this
particular statistical compilation is the matter of interpretation.
In the National Science
Foundation procedures, each agency, in effect, interprets its own activities by classifying
them. For example, in Fiscal Year 1972, the National Institutes of Health reported that
their allocation for both internal and external basic research amounted to somewhat over
$420,000,000.
Yet, in February 1972, the then Director of the National Institutes of
Health, Dr. Robert Marsten, stated quite categorically and explicitly that the National Institutes of Health conducted no basic research whatsoever. The National Institutes of Health, by statute, are forbidden to conduct research which is not somehow mission-oriented. Thus the need to resort to some vagaries in statements.
The necessity for this dissembling does, however, pose serious problems for applying
the entity concept. For example, if a decision-maker were concerned with a concrete
biomedical research and development policy problem he would obviously want to know
how much is basic and how much is mission-oriented.
He would derive little useful data
and information from the National Science Foundation statistical compilations.
IV. The Validation of Mathematical Models
The meaning of the term validation in social systems is concerned with assessing the
capability of the system to somehow describe and predict behavior. In physical systems
the criterion of validity means that the laboratory system is an accurate representation of
a natural system. In social systems, validation refers to a reliance on statistical probability
augmented by large overlays of analysis and interpretation.
The problem of accuracy and
meaning is a far more complex matter in social systems than it is in physical systems.
One of the more interesting and important examples of the use of proper validation
procedures in mathematical modeling is in the area of water resources management. This
research represents a significant recent trend in research management by combining,
explicitly, natural science, engineering, and economic variables.
Robert A. Young and John D. Bredehoeft (Water Resources Research, June 1972)
have developed one of the better modeling procedures in water resources management.
The essential elements of the simulation model include a set of instruments or controllable variables representing
the alternative courses of action, an objective function
permitting the ranking of these alternatives, a set of factors considered uncontrollable
within the scope of the problem as it is defined, and an empirical model expressing as a
set of equations the relations between the central variables and the consequences of their
manipulation.
An analysis of this study provides several useful insights concerning the application of
mathematical modeling procedures:
1. Simulation models do not directly provide an optimal solution to a problem. They can, however, be designed to produce a numerical measure for each of a number of alternative courses of action.
2. Where there are a large number of decision variables, a complete explanation of the response of the system must involve a large number of simulation runs. Sampling techniques can then be applied to select efficiently a probable optimal policy, but one expressed wholly in terms of the variables and data incorporated into the models.
3. Simulation studies such as this generate results that always pertain exclusively to
the system or systems investigated.
One cannot extend these results or even their
implications to other systems without extensive reworking. As every situation we encounter is always by definition unique, so correspondingly
are any models, simulations, or
other attempts at explanation.
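The following minimal Python sketch is offered only as an illustration, not as a reconstruction of the Young and Bredehoeft model: it shows the structure described above, with controllable instruments, an uncontrollable factor, a toy empirical relation, and an objective function that ranks the enumerated alternatives. All functions, coefficients, and option values are invented.

```python
# Illustrative-only simulation skeleton; every relationship here is assumed.
import itertools

def empirical_model(pumping_rate, canal_release, rainfall):
    """Toy relation between the controls, an uncontrollable factor, and water yield."""
    water_available = 0.6 * pumping_rate + 0.9 * canal_release + rainfall
    drawdown_penalty = 0.002 * pumping_rate ** 2        # stand-in for aquifer depletion
    return water_available - drawdown_penalty

def objective(pumping_rate, canal_release, rainfall):
    """Net benefit used to rank alternatives: crop value minus operating cost."""
    yield_value = 40 * empirical_model(pumping_rate, canal_release, rainfall)
    operating_cost = 15 * pumping_rate + 10 * canal_release
    return yield_value - operating_cost

# Controllable instruments: a small grid of alternative courses of action.
pumping_options = [0, 50, 100, 150]
release_options = [0, 20, 40]
assumed_rainfall = 30   # uncontrollable factor, fixed for this run

ranked = sorted(
    itertools.product(pumping_options, release_options),
    key=lambda alt: objective(alt[0], alt[1], assumed_rainfall),
    reverse=True,
)
for pumping, release in ranked[:3]:
    print(pumping, release, round(objective(pumping, release, assumed_rainfall), 1))
```

Consistent with the first insight above, the routine only ranks the alternatives it is given; it produces no optimum outside them, and its results say nothing about any other system.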
The first matter of concern, from the standpoint of validity, is the internal validity of
the model itself. Are there inconsistencies of such a magnitude that they might markedly
diminish the effectiveness of the model?
Next, the model refers to uncontrollable
factors. If these factors are uncontrollable,
are they equally unpredictable? We really cannot determine, from either the numerical or
the verbal expressions of the model, the probable, much less the possible interactions
between known and unknown factors. Thus, uncontrollable
factors can really be treated
in the model only in a hypothetical deductive fashion.
Professor Julius Kane of the University of British Columbia makes an interesting and
important distinction concerning this type of model. He defines it as being arithmetic,
since it is concerned with the manipulation of a few select parameters. Without elaboration, arithmetic models are explicit, staff-oriented, present-oriented
and insular.
Professor Kane's converse concept of geometric models is that they are intended to
gain insight into structural relationships, rather than precise numerical specifications, or
predictions and control. Geometric models are implicit, policy-oriented, future-oriented,
and global. The geometric model is non-linear, relational, and concerned with communications and imaging.
There is an additional crucial element in this discussion, which concerns the conceptualization model. Max Weber, the founder of social modeling, originated the concept of the ideal type, the precursor of the present numerical model. The ideal type in Weber's methodology refers to the construction of certain elements of society into a logically precise conception.
Weber's ideal type offered social scientists the use of logically controlled and unambiguous conceptions directly removed from historical reality. For the optimum use of these ideal types, Weber added two predominant qualities, a feeling of personal accountability or responsibility, and a sense of proportion. These elements are combined into an ethic of responsibility which requires that one account for the foreseeable results of one's actions. The ethic of responsibility is crucial to any individual involved in political or policy analysis, particularly in the development or use of procedures which are little more than sophisticated mental abstractions.
The foregoing discussion might be taken to imply that models by themselves can never directly provide an optimal solution to any public policy problem. The misapplication of numerical models (as expressly stated by the designers) by decision-makers for political purposes is becoming a far too frequent occurrence. There is undoubtedly much abuse by political leaders of what is, after all, only a limited and specific technique. But we cannot
merely inveigh against this process, and leave the blame to unscrupulous politicians. To understand the misuses or abuses of mathematical modeling in public policy, we must also see that model designers (in many instances with honorable motives) themselves are
quite capable of leading a political leader into potentially
dangerous policy actions,
whether drastically curtailed population growth or unwise limits on economic growth.
Both rely on our normal psychological tendency to think in terms of limits and our
fascination with exponential curves. The basis of such a decision is, of course, data
and information (in sum, knowledge), partly in numerical form, partly in verbal form.
All of us, whether political leaders or private citizens, are fascinated by exponential
curves even if we never agree precisely on what they mean. For decision makers, the
elegant simplicity of these curves such as those in the Limits to Growth, offers a
deceptively straight-forward way of explaining a way out of complexity by extrapolation.
All of us, whether we are conscious of it or not, set limits to what are, in effect,
exponential curves. Only, we never place a limit to a particular curve in exactly the same
place.
A limit is in actuality based on our interpretation
and judgment of various types of
data and information.
We aggregate knowledge (behavioral, cultural, historical, social, economic, scientific, technological, etc.) according to certain patterns on a curve.
It is essential that any mathematical model be valid in relation to its environment. The
model must be isomorphic.
It must be identical, similar, or possess some sort of
contiguity with other shapes or structures. A model, to possess any intrinsic meaning,
must be perceived as it relates to other analogous forms. These points are based on J. Stafford Beer's definition of systems as coexisting and mutually supporting levels of models.
V. The Role of Mathematical Models in Social Problem Solving
Any model which begins with the supposition that social problems can somehow be
forced to analytically tractable shapes and then solved automatically consigns itself to
failure. Quoting Ida Hoos, the solution of social problems is never achieved. One does
not solve the problems of health or transportation.
Consequently, where we start or
stop is somewhat arbitrary and usually a reflection of resource availability and commitment. Public policy does not require that problems must always be solved.
Model building does not in itself add to our existing knowledge base by providing new insights. It only provides us with varying configurations and perceptions of what we already know about a particular problem and limited views of the consequences of alternative intervention strategies.
Modeling simplifies problems. It simultaneously
raises the level of predictability
and
reduces the level of uncertainty about a limited problem. All too often, this is only in our
own minds and not in reality. Models cannot always clarify. They can also lead to
confused intervention strategies.
As a case in point, there is little that we can do now, or in the future, meaningfully to
model synergistic terror. Here we are dealing with unpredictable
groups and individuals. Moreover, even while we might develop some reasonably accurate models of known
terrorist organizations and individuals, these models would be useless in terms of those we
know nothing about.
The type of models we are discussing presupposes that an individual who commits a
criminal act is rational in the sense that he has a plan or gestalt, which interrelates
him, his act, and the rest of the world. What is unfortunately
all too often developed is a
set or sets of psychological
profiles developed through psychometric
techniques of
varying credibility, with limited variables and data of dubious quality.


VI. Modeling as History


Professor Martin Shubik of Yale University, in his acerbic but penetrating review of Jay Forrester's World Dynamics, alludes to a commonality between mathematical model building and historical analysis. Responsible practitioners of both disciplines know that only certain elements or variables dominate events. Social scientists know that it is important to locate the few variables to which the system is sensitive and to measure where practicable the few relationships that count. Erich Jantsch suggests that even relatively small numbers of variables and few alternatives lead to problems of unmanageable size: n variables with k alternatives each yield k^n combinations.
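As a worked illustration of that arithmetic: a model with only ten variables, each allowed three alternative values, already defines 3^10 = 59,049 distinct combinations, and doubling the number of variables to twenty pushes the total past 3.4 billion.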
Mathematical models are actually built upon bits and fragments of historical knowledge, in the form of data and information. They are constituted on the basis of past magnitudes and relationships. Next, they are checked in terms of how well they work as representations of what has been happening.
Mathematical models preferred by most engineers and scientists are static, since they rely on historical data by definition. Used in a policy context, however, they are increasingly being transformed into dynamic models, and, as with the recent profusion of growth models, these models only project narrow spectra of history into the future, all too often as self-fulfilling expectations.
A major ingredient in the development of either historically-based conceptual models
or mathematical
models is what Alexander Christakis calls the process of defactualization. Our perceptions of problems are always grounded on a data and information base,
in other words, a knowledge bank. The danger occurs when this knowledge is perceived
through the filter of semi-analysis, analogies, and examples. We tend to think in familiar but not discontinuous terms. From this discussion we must see that social
problems themselves are thoroughly and inextricably woven into a set of world problems
whose solutions are totally beyond our current concepts.
The English anthropologist John Burton rightly cautions us that there is a significant
distinction between an analogy and a model. A model, whether historical, intellectual or mathematical, is a simplification of reality and draws attention to those features in which the observer is interested. An analogy is a means by which some features of reality can be better understood. Consider the analogy, "The behavior of states is like the aggressive behavior of individuals." There may be some relation to reality in that states include individuals and these individuals may be aggressive. But this association between states and individuals has to be demonstrated.
VII. Responsibility: Some Further Thoughts
Weber's ethic of responsibility assumes an extremely important meaning within the context of this discussion: an ethic of responsibility in politics or policy analysis has nothing in common with the Judaeo-Christian ethic. The Judaeo-Christian ethic is purely a matter of an individual's view of the effect of his personal conduct on his relationship with the Almighty. The ethic of responsibility, on the other hand, is purely concerned with the domination of men by other men. Responsibility in policy entails simultaneously an ethic of responsible ends and an ethic of responsible means. One must assume, of course, that a decision-maker who formulates the latter is sane and of decent instincts.
Both ethics refer to the optimal use of responsibility. The ethic of responsible means is
optimal decision-making.
On the other hand, the ethic of responsible ends involves
optimal analysis, particularly the selection of both appropriate and relevant methodologies and reasonable alternative courses of action.


The essential basis for both ethics is mental or situational models. These are basically reasoned sets of assumptions about the real world and one's capability to change this
world in a desirable way, and conversely to minimize actions that might cause unwanted
situations to arise.
The decisions that mental models are concerned with are the articulations and detailed
specifications of realizable situations, e.g., desirable social conditions or states. These
situations can be goals or objectives. However, our preoccupation
with goals and
objectives over the past several years has assumed obsessional proportions in many areas
of national life.
The mental model of the policy analyst is a match or translation of the decision-maker's mental model. The analyst's model must be more complex since it is largely
technological.
A hypothetical
illustration
of these ethics might revolve around strategic defense
planning. Any President of the United States, by himself and through his Secretaries of
Defense and State, constantly calculates and revises assumptions concerning the United
States interest and capability for strategic intervention in any present or potential conflict
situation. To an extent, these assumptions are mathematically
derived scenarios. These
scenarios can only be useful to a decision-maker if he endows them with his own mental
model controls. His own strategic information
picture must always be greater than
that of his staff. It must reflect domestic as well as foreign considerations, and economic and social as well as military factors.
The late President Johnson's decision to intervene in South Vietnam is an interesting illustration of the interaction of these two ethics. His decision is not arguable on moral grounds. From an analytic standpoint, President Johnson's decision was simultaneously responsible and irresponsible. He did articulate what he considered to be an achievable social state, a stable democratic South Vietnam, in the context of real-world relationships as he perceived them to be, namely, the foreseeable strategic balance between the United
States, China, and the Soviet Union. Additionally, this decision was clearly a product of
the prevailing theories of the time, e.g., containment,
agrarian reform, strategic hamlet
location, limited gradual response, etc.
President Johnson failed to be responsible on two critical points. The first was most definitely within his control. It was his obligation to ensure that his adversaries were at all times fully apprised of the lengths to which he would go to attain his ends. By not doing so, he allowed doubt to creep into their minds and prolong the war.
Second, and more tragic, was President Johnson's selection of some of the key civilian individuals to conceptualize the war: Robert McNamara and Alain Enthoven. What happened, of course, is that there was not a suitable supporting ethic of responsible means, which these two individuals should have developed. As a result, the war degenerated inevitably into an ethic of responsible ends.
Alain Enthoven stands out as the éminence grise in the Vietnam entanglement, perhaps even more than Robert McNamara. Enthoven developed his version of the sophisticated methodology known as systems analysis. By 1967, Enthoven had come to realize the limited state-of-the-art in systems analysis, as indicated in testimony to a Subcommittee of the Senate Committee on Government Operations concerning the Planning, Programming and Budgeting System (Oct. 18, 1967). Enthoven views systems analysis as a quantitative approach to decision-making. Furthermore, he claims that systems analysis is an application of the scientific method, using the term in the broadest sense. The final ingredient to
this methodology is a broadly based inter-disciplinary
research program.


There are a number of inconsistencies in Enthoven's thinking. First, many parameters of national policy can be articulated quite specifically, but if they are quantified to any extent, too much meaning is lost. One excellent case in point is morale, an individual's willingness to give his life for his country. At best, all that one gets in this context is a limited surrogate indicator.
Second, Enthoven's claim about the scientific validity of his methodology cannot be substantiated. The true scientific method checks hypotheses that are either freely accepted or rejected in the light of analysis and evidence. Enthoven's hypotheses are only objectives derived through the Planning, Programming and Budgeting System. As such,
his analytical exercise is far more Thomistic than it is scientific. One must initially accept
his objectives and then only question the selection of the best alternative for their
attainment.
Third, Enthoven does not really delve into interdisciplinary research with any intellectualism. Interdisciplinary research in itself is not a science; it is merely an act. The selection of individuals with diverse backgrounds and personalities is a difficult task. Properly selected and guided, the group can deliver a useful product. Otherwise it becomes
a Tower of Babel.
Enthoven's methodology is particularly dangerous in one crucial aspect. His insistence on quantification obscured the difference between the functions of systematic analysis and of judgment, the latter being the function of the decision or policy-maker. Judgment, of course, is based to a large extent on imponderables, e.g., values, the likelihood of future events, when risks should be taken, etc. Enthoven's disservice was his application of a good deal of ingenuity that would often yield questionable ways of measuring and making comparisons and clarifications that were not available at the outset.
Unfortunately, the cost for these benefits of clarification and comparison is meaning, or, perhaps a better term, context. Enthoven's insistence upon placing the decision-making process into a highly quantified and structured framework, particularly priority determinations and the concomitant allocation of resources, disguised the real nature of the budgetary process. As Professor Aaron Wildavsky persuasively argues in The Politics of the Budgetary Process, the formulation of any budget, hence any policy, is always a political process; it is never an economic or technical process. Implementation, on the other hand, may be economic, technical, political, or any combination or variation of these three processes.
The highly sophisticated planning methodologies developed by McNamara and Enthoven, as well as by Peck and Scherer in The Weapons Acquisition Process, and Charles Hitch and McKean in The Economics of Defense in the Nuclear Age, attempted to achieve one thing: the reconciliation of an abstract notion of control with its real-world counterpart (in effect making the two identical).
According to modern psychoanalysis, the gross discrepancy between perceived abstractions and the real world leads to schizophrenia. In policy analysis and planning, assuming normal ranges of behavior, this conduct does lead to certain dangerous illusions. The principal actors in this situation had as their raison d'être the complete civilian control of the military establishment. Unfortunately, the best way to achieve this end is to keep the military establishment out of war.
Similarly, the entire weapons systems acquisition process has little validity in a purely
economic context. The military establishment is requirement-oriented.
It requires those
resources it needs to carry out a mission or policy, when it wants them, irrespective of
cost. Clearly, what was established was the myth, not the reality, of control. Effective control in any organization entails an understanding of the communications process in complex organizations. It is, furthermore, based on the clear recognition of dual patterns of authority and legitimacy, and not bureaucratized and institutionalized modern systems management.
This discussion may also reflect a somewhat accurate representation
of what actually
occurred in Vietnam. Rather than one war there were at least eight different wars. First,
of course, there was President Johnson's war. Then there were the wars waged by Robert McNamara, the Department of Defense's Office of Systems Analysis, the Joint Chiefs of Staff (not including the separate services), Walt Rostow, Dean Rusk and the State Department, the Central Intelligence Agency, and the separate ventures of the Department of Defense's intelligence services. Thus what appeared to the public as
one war, was really a plethora of wars. Analytically and mathematically, communications
between eight separate elements are impossible to network.
VIII. Vietnam as an Externality
The concept of external economies and diseconomies (externalities) treats the subject
of how the particular costs and benefits that constrain and motivate a decision-maker
may deviate from the costs and benefits of a larger organization. If this organization
should happen to be a part of the public sector, these costs and benefits will obviously
have a psycho-social connotation.
Externalities in the public sector are as much concerned with political ecology as economic considerations.
In fact, both go hand in hand.
This factor is dominant whether the issue is domestic or international, war or peace.
A social system, with its economic, political, psychological, and sociological subsystems, works in such a way that the wide diffusion of decision-making which is permitted is necessary if complex systems are to operate at all. The vital mechanism (and
social institution)
that facilitates such specialization is the price system. The price system
is an information
system that provides users with signals to guide their behavior.
If David Halberstam's arguments are at all valid, this highly specialized system broke down in at least two ways as far as Vietnam is concerned. First, the policy-makers prevented accurate cost information on the Vietnam war from being communicated through the price system. As a result, in the words of Thomas Schelling and Roberta Wohlstetter, one had, by definition, a less-than-perfect communication system, communicating both signals and noise. These were impossible to separate, with the most evident dangers for an interdependent system.
Second, there is strong evidence to believe that defense analysts may have used an
externalities approach in attempting to fathom the corruption problem in South Vietnam. It would not be difficult to visualize them conducting extensive and sophisticated
analyses of the South Vietnamese price structure in an attempt to discern if the system
was capable of communicating
corruption
signals with concomitant
strategic opportunities to which they as policy-makers could respond.
The externalities approach to Vietnam poses the problem of determining those externalities which are politically or socially undesirable and should be counteracted by public policy measures. We usually think of externalities that operate in a fashion that thwarts the full attainment of broad social objectives and which justifies government intervention. Extensive lists of unwanted by-products may be drawn up in modern societies, from air
and water pollution to traffic congestion associated with the automobile. Perhaps a war
that was imperfectly thought out and naively fought can also be considered a technical
diseconomy of sorts.


External economies and diseconomies are a manifestation of the fact that, in complex
systems, one man's decision or behavior can often have an undesigned impact upon
others. The trick in systems design is to establish arrangements by which the mutually
interacting and dependent behavior of all decision-makers harmonizes so that the larger
system operates in an optimal way.
The kinds of problems and phenomena we have discussed are not unique to the
operation of a private enterprise economy, although they have been most extensively
treated by economists in such a context. A large multiproduct corporation, a government agency, a military service, or a university (organizations that may be characterized as closed systems and may be centrally managed to a high degree) have identical
problems. Decision-making and authority are necessarily diffused (governments consist of
departments and bureaus, armies consist of divisions and squads, etc.). The decisions of
many must nevertheless result in some coordinated and mutually consistent behavior.
Decision-makers must be constrained as well as motivated. Finally, they must be able to
obtain knowledge about their constraints, their opportunities,
and the impact of the
behavior of others. In varying degrees, discussions of externalities focus on these fundamental aspects of system organization design and management.
The problems are basically those of specifying over-all system objectives, measuring
effectiveness criteria, identifying and measuring the relevant cost concept, determining
the relative merits of alternative information
systems (with particular reference to the
cost and worth of obtaining and communicating
information),
and specifying the appropriate decision rules that should guide individual decision-makers.
IX. Opportunity Costs: Vietnam
War is, by definition, an act of, if not irrationalism, at least nonrationalism. Even though war may be senseless, it still must be viewed as one aspect of public policy. If war is
governed by the phenomenon of what amounts to external diseconomies, it is also no less
governed by the phenomenon of opportunity costs.
Policy-makers and analysts concerned with the conduct of war are also concerned with
the attainment of certain strategic objectives.
These objectives are the main social benefits of war. The costs of war, like those of any other public policy, are calculated according to the concept of opportunity costs: the social value foregone when the resources in question are moved away from alternative activities into the specific project.
According to Professor William Baumol of Princeton University, the actual opportunity cost rate can be calculated for most public programs, usually according to the rate of return on long-term trends. In a narrow sense, Professor Baumol is correct in arguing that a decision to utilize a low discount rate, or to achieve the same results by grossly understating inflationary pressures, can lead to wasteful employment of these resources in terms of low yield on investment. However, opportunity costs, from a public policy context, have more than an economic meaning. Public policy decisions are based on excess of benefits over costs, with the benefits being expressed as value or, in this instance, strategic judgments.
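A brief, hedged sketch of the discount-rate point (all cash flows and rates below are invented): the same hypothetical project can appear worthwhile or wasteful depending solely on the opportunity cost rate applied.

```python
# Hypothetical project: an up-front cost followed by equal annual benefits.
# All figures are invented for illustration.
initial_cost = 1_000_000
annual_benefit = 90_000
years = 20

def net_present_value(rate):
    """Discount each year's benefit back to the present and subtract the cost."""
    pv_benefits = sum(annual_benefit / (1 + rate) ** t for t in range(1, years + 1))
    return pv_benefits - initial_cost

for rate in (0.03, 0.07, 0.10):
    print(f"discount rate {rate:.0%}: NPV = {net_present_value(rate):,.0f}")
# At the low 3% rate the project appears to pay for itself; at 7% or 10%
# (closer to plausible opportunity cost rates) the same project shows a loss.
```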
Opportunity
costs really must be judged in terms of their contextuality.
They are
relativistic. In any venture, there are costs associated with any benefit. It is a matter of
determining,
through the perceptual filter of one's own values, a proper cost-benefit
relationship in terms that may be largely non-monetary.
The Aswan Dam is an excellent case in point. It is poorly engineered; it has created biomedical and ecological problems including significant crises in the schistosomiasis fluke; and its effects may cause Egypt to have serious political difficulties with her
African and Mediterranean
neighbors. Moreover, the rapid rate of Egypt's population
growth has already outstripped the irrigation and power output potential of the Dam.
All these factors could have been visualized to a large extent when the Dam was first
conceptualized.
This analysis would have predictably indicated a low rate of social return
over costs. Yet, into this equation, Gamal Abdul Nasser added the factors of his own personal prestige and what he believed to be the enhancement of Egypt's national prestige. Thus, in Nasser's view as the ultimate decision-maker, these two value judgments justified the transfer of resources into what was a dubious project. Clearly, investments, particularly in the public sector, are not made exclusively according to economic rationality.
These same factors are also evident in the Vietnam war. On a purely rational basis
there was little benefit in relation to cost. Yet, if a decision-maker calculated factors such
as national honor, commitment to an alliance, credibility in the face of one's adversaries,
and the confidence of his own population, the opportunity
cost relationships begin to
assume a new meaning.
Finally, there is a case where, in the conduct of war, opportunity
costs have no
meaning whatsoever. This issue arises when a nation is fighting for its sheer physical
survival. The most obvious case in point is Israel. When a nation fights for its very
existence, costs are extraneous. The national leadership can only be concerned with the
availability or accessibility of resources. For Israel, existence and survival constitute the
ultimate social benefits.
X. Vietnam: The Social Indicator
Because war is a political process, decision-makers must possess accurate and reliable
information
on the conditions of the friendly society, which is the object of their
intervention strategy, as well as on conditions within the adversary society. In the case of
Vietnam, the information solicited was known, in large measure, as the social indicator. Even in domestic use, the term is not yet clearly defined. It refers to some measure of overall well-being or a good quality of life. It represents an attempt, in Christakis' words, to describe with some precision and detail the condition(s) of a society in terms of
particular activities and social groups.
Robert McNamara depended on social indicators or their variations as one of the quantitative guides for his Vietnamese intervention strategy beginning in the early 1960s. Those indicators were most noteworthy in their failure as meaningful data collection and reporting systems. For example, as early as June 1962, former Secretary McNamara visited South Vietnam and reported that "every quantitative measure we have shows that we are winning the war." There is, of course, the problem of defining "winning." Did winning mean the total or at least the effective destruction of the Viet Cong, or did it mean the greater economic, political, and social stability of South Vietnam, based on national evolution toward some sort of participatory constitutional system?
Winning was and always is a mental model rather than a formal mathematical
model. In any case, our adversaries did not play the same game. Western rules of war, including confrontations with the Soviet Union, are based on chess, with a checkmate and, thus, an eventual loser and winner. The North Vietnamese and the Viet Cong war games, conversely, were based on Wei Chi, the Chinese counterpart of the Japanese game of Go.
Wei Chi constitutes the basis of the Maoist strategy. Unlike chess, which is based on an end-state situation attained as quickly as possible, Wei Chi is based on a protracted struggle, with no dramatic win-loss but rather shifts in relative advantages and disadvantages, particularly in psychological terms. The obvious lesson to be derived from this
knowledge is that intracultural
wars are much more manageable. Most certainly, the
communications
gap concerning the rules is markedly lessened. It is simply not good
public policy to wage wars where the combatants cannot understand each other.
By themselves these indicators did give a partial but dangerously incomplete picture. These measures could have, and probably did, include the ratio of South Vietnamese to Viet Cong (and consequently North Vietnamese) desertions, equipment drop or loss ratios, monthly changes in the Viet Cong and North Vietnamese infiltration rate, equipment and supply destruction over replacement by category, comparative South Vietnamese-Viet Cong-North Vietnamese casualty ratios, etc.
These ratios, of course, all assume an analytically acceptable ratio. We also know that
these measures contained other various but critical social indicators, such as the number
of hamlets shifting under the control of either side, changes in the numbers of villagers
who assisted or cooperated with either side (loyalty), the success of the Viet Cong
psychological penetration in terms of villagers being given individual tracts of land, and probably the conduct of popularity, opinion, attitudinal, or support polls of sorts.
It is also clear that the defense strategists developed an elaborate mathematical
sampling system. This system probably included sampling rates of junks searched for
contraband with appropriate margins of error as percentages of the estimated total junk
population
in both North and South Vietnam. These inferences are probably quite
accurate in view of certain statements by Walt Rostow on the subject. Moreover, the
parameters and boundary conditions of this sort of model can be easily deduced. This
system probably also measured highly controlled samplings of hamlets and villages
supposedly pacified by region as a fraction of the total. This system never answered two
key questions. First, obviously, the quality of the data base itself. Second, the capability
of the enemy to thwart or even use the data base for their own purposes.
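The following is a hedged sketch of the elementary sampling arithmetic such a system would rest on; the counts are invented and this is not a reconstruction of the actual reporting system.

```python
import math

# Invented figures: junks searched in a month and the number found carrying contraband.
sampled = 400
positives = 12

p_hat = positives / sampled                        # estimated contraband rate
std_err = math.sqrt(p_hat * (1 - p_hat) / sampled)
margin = 1.96 * std_err                            # approximate 95% margin of error

print(f"estimated rate: {p_hat:.1%} +/- {margin:.1%}")
# The formula assumes a random sample from a stable population; it says nothing
# about the quality of the underlying reports or about an adversary who learns
# which channels are being watched.
```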
This criticism of McNamara and Enthoven should not be construed as denigrating
them for being judicious or even fanatical collectors of social statistics. These data were
vitally needed. However, unlike purely physical science statistical data, social statistics are
not universal. By definition, they are inextricably bound within the matrix of a particular
culture and its historical consciousness. Their tragic mistake was in neglecting the extreme
conceptual and methodological
difficulties in the cross-cultural interpretations
of social
statistics. In effect, they did not know what data they had.
These abused statistical data were incorporated into the entire policy process, both in its analytical and decision-making processes, with what are, at least in hindsight, predictable results. First, as already noted, these data were taken and used out of natural
context. Second, they were used in a predictive mode, particularly for trend extrapolations. The use of any form of social statistics to discern the future is fraught with hazards,
because these data, by definition, are always historical. When a user is not even aware of
what his data are, the danger is compounded.
Physical systems, unlike social systems, are governed by certain universally accepted
laws, which regularize experimental behavior and results. Thus, if a metallurgist designs
an experiment on the superconductivity
of metals and specifies rigid boundary conditions
as well as the critical parameters, his experiment can be universally repeated with at most
only very minor changes. F = ma is immutable in Newtonian mechanics. The data
obtained in this experiment are unchallengeable.
A totally different situation prevails in social systems. For example, an experiment might hypothetically be designed to sample in real time the factors governing reading
achievement in children. If brain measurements could be taken while the children were
reading, psychologists might examine in partially quantitative
form parameters such as
auditory and visual response, perception, and motor skills. Presumably this data could be
fed into a computer for instantaneous analysis.
What useful data would really result? First of all, the experiment is not really repeatable; even minor changes in the boundary conditions or variables would entail different results. Secondly, the notion of real-time in this context is totally specious. Each participant in the experiment is a human being. He is bound by historical circumstances, owing in part to his unique genetic characteristics, which partially shape the cognitive and emotive aspects of his personality. The other aspects of his personality are
conditioned by his sense of values, or his information
coding matrix derived from his
external environment
in terms of culture and history. Therefore, unless those investigators were prepared to somehow account for these factors, their experiment would be
questionable.
What Secretary McNamara and Alain Enthoven did was to place these indicators in a
PPB model. Without going into further discussion, each of these indicators was included
in the PPB model as a variable. Furthermore,
estimates of cost-outlays and proposed
benefits were probably attached to these variables which were then, in turn, related to the
strategic goals and programs enunciated by President Johnson.
The fundamental difficulty with the PPB model is that it is based on an econometric
type of thinking. Therefore, as a purely economic methodology,
it is explicitly and
intrinsically grounded in maximization. The problem in social systems is that one must assume that the entire population tends to maximize its perceived benefits on an individual rather than a group basis. Maximization thus becomes a series of individual strategies. The result is the inevitable social trap, brilliantly depicted by Garrett Hardin and John Platt. PPB, unfortunately,
offers both a superb closed methodology and the
distinct disadvantage of facilitating the accelerated achievement of situational social traps.
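A minimal, purely illustrative payoff calculation of the social-trap logic (the payoffs are invented and are not taken from Hardin or Platt): each actor gains by maximizing individually, yet the aggregate result is worse than collective restraint.

```python
# Invented commons-style payoffs: each of N actors chooses to 'exploit' or 'restrain'.
N = 10

def individual_payoff(my_exploit, others_exploiting):
    """Exploiting always pays the individual a bit more, but every exploiter
    imposes a shared cost on everyone."""
    private_gain = 5 if my_exploit else 3
    shared_cost = 0.6 * (others_exploiting + (1 if my_exploit else 0))
    return private_gain - shared_cost

# Individually 'rational' play: everyone exploits.
all_exploit = sum(individual_payoff(True, N - 1) for _ in range(N))
# Collectively restrained play: no one exploits.
all_restrain = sum(individual_payoff(False, 0) for _ in range(N))

print(f"everyone maximizing individually: total payoff {all_exploit:.1f}")
print(f"everyone restraining:             total payoff {all_restrain:.1f}")
# For any single actor, switching to 'exploit' raises his own payoff by
# 5 - 3 - 0.6 = 1.4, yet when all do so the group total falls from 30.0 to -10.0.
```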
The entire PPB rationale offers a strong inducement
for these entrapments.
Its
Jesuitical
structure, in reality a hierarchy of goals and objectives provides an
initiate with a surprisingly broad range of options once he accepts the truth and veracity
of very general premises. This situation is actually what occurred during the course of the
Vietnam intervention strategy. An individual in a decision-making capactiy at a fairly high
level thought that because he interpreted the truth correctly, he could apply his
policies in any way he saw fit.
The PPB system, or any other quantitative
system for that matter, was and still is
incapable of directly measuring the all-important
intangibles of national life. These
systems could not measure the élan of the South Vietnamese population, their national consciousness, their inner-felt loyalty to their government, their willingness for self-sacrifice in a common cause, in other words, the total national infrastructure.
The best that could be obtained were dubious surrogates. For example, how could one
measure the problem of corruption?
In brief, elaborating some of the discussion in an earlier section of this paper, the following conceptual-mathematical model was probably developed. First of all, the term corruption presents cultural-linguistic difficulties.
Corruption to an American may not necessarily have the same connotation as it does to a
South Vietnamese. Therefore, some sort of corruption correlation coefficient had to be
devised. Next, there is the problem of the data, which are suspect. They came by definition from the South Vietnamese. Next, from what sources? If the price system of any national economy does function as an information system, it might be theoretically possible to gauge corruption pressures such as inflationary pressures in the price system
as functions of changes in the price system. That is, if one knows how much of the price
level is affected by corruption, a suitable methodology might conceivably be some sort
of innovative one-dimensional
Phillips analysis. It is easy to see how one can conceptualize a PPB approach to the problem of corruption if its diminution is given the status of a sub-goal.
The problem here is one of incorrectly
applying multiplexing
procedures. It is
necessary to simulate various channels, honest and corrupt, and sample the data travelling within each channel. We must assume that a major deterrent for the American
situation was, of course, that the Americans and the South Vietnamese may not have
been acting either in concert or as allies in this instance. Rather, they probably perceived
each other in terms of an adversary relationship.
How could we test the data being
transmitted through such channels?
Undoubtedly,
some indicators, such as the ratio of arrests to convictions for corruption, were developed. The increased ratio between equipment and supplies off-loaded and
deliveries was another. Changes in the ratio of corrupt officials to honest officials
may have also been used. It was also probably at least theoretically possible to simulate,
using the PPB model, the degree of black market activity as a function of price structure.
These various and sundry indicators were (or in reality always are) surrogates of many
phenomena. Giving them meaning was and always will constitute the other two phases of the evaluation process: interpretation and judgment. One can only wonder what might
have happened if the Spartans at Thermopylae
had had a computer processing their
situational models into PPB models.
XI. Responsibility: Domestic Style
The need for synchronous situational models on the part of the decision-maker and
the analyst is also quite evident in several of the Federal agencies having R & D contract
and grant programs. Many of the contracts and grants given out by these agencies have
been of rather dubious quality. The reasons for these awards can be explained in several
ways, e.g., cybernetics (the quality of information in organizational communications channels) or statistics (poor awards may just be random samplings).
If random samplings in grant awards are accepted as a viable explanation, it may only
prove the existence of some sort of higher or lower limits of acceptable competence on
the part of some of these organizations. That is, even if they do give out sub-standard
contracts and grants, the ratio may be lower than in some other organizations. Accordingly, they may actually be more cost-effective in terms of a lower competence ratio.
Conversely, a random sampling approach could be used by an effective organization, to
identify a reasonable ratio of high-quality contract or grant awards. However, whether
one is concerned with establishing limits of competence or quality, by any method
whatsoever, the key factor is still judgment, or the lack thereof.
This paper, on the other hand, ventures to offer the suggestion that one should look at
how these organizations view their roles. If their intent and practice are some sort of
beneficial change or remedial improvement,
then they are searching for large-scale,
dynamic system-wide improvement, in one word, reform. If, conversely, their intent is
more limited, one can rightly accuse them of doing nothing more than of having an
elaborate series of policies all of which are designed to assure their self-perpetuation.
A contract or grant organization may decide that it is, in fact, some sort of problem-solving organization (whatever the term means; usually Ida Hoos' content of the term suffices). In this instance, it would endeavor to develop what one might call a First-Rate
Information System (FRIS) for short.
This system might conceivably attempt to articulate simultaneously
achievable and
desirable future social states. Certainly, the attainment of these states would have to be
dependent on technology in its broadest context. Mental models would essentially be sets of assumptions about the scope of, and need for, technology. Technical models of various sorts would establish key variables, structural relationships, and intervention strategies. In between these two modeling stages, other conceptual models could be used to develop
simulated projections of the relevant state-of-the-art,
and profile matches with individuals and organizations
against particular technology areas. Most importantly,
this
roughly sketched system could provide scales of performance expressed linearly or
logarithmically.
Judgment is necessary both for the array and presentation of quantitative data and its
interpretation.
A computer simulation model cannot determine what is first-rate. This
determination is the province of the analyst and, equally or even more, of the decision-maker, who alone can really determine what is first-rate, as an arbitrary but, hopefully, analytically substantiated value judgment.
The process of establishing, if one desires to have one, a second-rate or even a third-rate information system (SRIS or TRIS) is identical with that for a first-rate information system. The only variables are the quality and basis of the analysis and the quality of the individuals
making the appropriate value judgments.
Unfortunately, information systems of the latter types abound in public policy. They may occur at times as deliberate machinations. However, more often than not, they are just managerial manifestations of John Platt's networks of systems-wise social traps, which are essentially the end result of man's preoccupation with self-generating chains of
short-term rationality.
It is clear from Platt's analysis that supposedly new innovations such as the National Institute of Education in the Office of Education will be ineffective in optimizing American education, in the sense that whatever the Institute does will not achieve long-term basic structural reforms. If for no other reason, NIE, like most organizations, takes a macrosystems approach without first understanding microsystems interactions. Rather, NIE will probably pursue mainly short-term, quick-fix routines and
standardized policies. Some of its grants and policy decisions are already based on SRIS
and TRIS systems.
Of course, NIE does suffer from two difficulties. First, it is part of the Office of
Education. It has had to assume many of the existing R & D programs, contracts, and
grants of the Office of Education as well as personnel from that office. These factors
alone could ruin any organization. However, to compound these difficulties, the National
Institute of Education is modeled on the National Institutes of Health, making its tasks
even more difficult. Whatever NIE does, it is not science.
XIII. A Behavioral Overview
The intent of this section is to give a behavioral overview of the type of individual who both possesses and puts to practical, policy-relevant use the narrow, restricted outlook described in other sections of this paper. It is obviously difficult to characterize this personality in terms of a single trait or even various traits of character. However, for the purposes of this paper the term "policy Philistine" will suffice.
Because policy Philistines exist, whether as individuals or as large aggregates, from the standpoint of cultural anthropology they can be visualized as forming a cultural sub-group, as do members of any ethnic group. Members of any sub-group always adhere to the following common features:
1. Myths
2. Symbols
3. Initiation rites
4. Kinship relations
Furthermore, being a separate sub-group, policy Philistines can and do clash with other sub-groups. Obviously, it is these clashes between sub-groups which cause internal strife and instability, which in their most exaggerated form produce civil war, the most
destructive type of conflict. It is quite probable that differing sub-groups can have so
much cultural and historical variance that true communication
between them is no more
possible than it was between the Americans and the Vietnamese.
Professor Richard Gambino of Queens College observes that the exaggerated, excessive, and extra-systematic
or systemic-destructive
behavior of any sub-group is a form of
tribalism. In his study, Professor Gambino was alluding to the phenomenon
of ethnic
tribalism. Our rather peculiar sub-group may be said to indulge in quantitative
tribalism.
The great pity is that few cultural anthropologists pay any attention to contemporary
urbanized society. Rather, they seem to spend most of their time studying what they
regard as primitive societies. In their own narrow, distorted way, policy Philistines,
who collectively are quantitative
tribalists, are at least as primitive as the most remote
tribe in New Guinea.
Whether or not these people possess abnormal character traits or personality disorders
is beyond the scope of this paper. Clinical psychologists or psychiatrists such as Freud,
Binswanger, and Strauss would, of course, look for signs of neurotic obsessive-compulsive behavior. Similarly, social psychologists might look at this over-attentiveness to quantification as a form of territoriality. What this paper attempts to point out is that a policy
Philistine does not need to exhibit pathological behavior. It is only necessary that the
results of his endeavors be pathological in terms of their social disbenefits, especially
when one thinks of these persons as being a collectivity.
Thought and action are based on three perceptions. These are the perception of
knowledge, the perception of reality, and finally the perception of the relationship that
occurs when knowledge is applied to reality. In any individual, there is a certain natural
degree of distortion among these three perceptions. This sort of distortion is perhaps
more accurately a divergence between the mental image or the ideal and the true.
Recently, perhaps in the past 20 years, a second element of distortion can also be
observed: the computer. The present state-of-the-art
of computer technology is not
readily adaptable to any natural language, whether linguistic or symbolic. For example, a
mathematical model must be placed into the context of an artificial computer language
before it can be tested on a computer. This procedure clearly results in additional
distortion.
Fortunately,
we can (or at least we should) recognize and compensate for these
distortions. In this sense, however, they are not pathological. As an hypothesis, but one
possessed of considerable validity, it is probable that the most severe distortion is within
the mental processes of this rather odd collectivity: the policy Philistines or quantitative
tribalists. Their analyses have been inaccurate and misleading, perhaps because of an
abnormal divergence between their perceptions of both knowledge and reality, and by
extension, the application of this knowledge to real world problems. It is this extreme
divergence which makes their collective behavior pathological in terms of its results.
The difficulty of meaningful policy analysis is largely cultural-linguistic.
We have not
yet devised a suitable language for performing this task. A new language, of course, can be
either verbal or numerical; mathematics is as much a natural language as English.
The way we attempt to comprehend social systems offers an excellent illustration of
this linguistic deficiency. We derive our conceptualization
from the successful results of
the laboratory control of experimental variables. And, the discovery of the laws of
classical mechanics has given rise to the belief that physical science methods can help
discover the basic laws of social systems. In physical sciences, the criterion of system
validity is that it works, which means that the laboratory system is an accurate
representation
of a natural system. In other words, the system possesses the capability of
describing and predicting behavior, or validation in some other sense. The complex
interrelationships
of politics, history, economics, psychology, and sociology have nothing
like the empirical justifications
of the phenomenology
of the classical atoms and molecules of physics.
Let us take two concepts of systems analysis to demonstrate this conceptual difficulty.
First is the concept of boundary conditions, or the limits of the system. In social systems these concepts have little substantive meaning. One can say that the domains of any social
system are characterized by complexity, diversity, entropy, and resonance (or its converse, dissonance). These dynamics condition the behavior of a social system but they do
not circumscribe
its behavior, as we are conditioned
to think about the meaning of
limits in physical systems.
The second term is the parameter. In physical systems the parameter is a constant
since these systems are bounded. In social systems, on the other hand, there are many
boundary conditions. Therefore we are concerned with the variables which are amenable
to at least some quantification.
Social systems may have parameters. However, the presence of constants in a phenomenon
as chaotic, dynamic and unpredictable
as a social system is a matter of
question. Most certainly, there are variables present in social systems, and these can be
partially quantified. The process of meaningful quantification
in social systems, which are
by definition unstable, is extremely difficult. It calls for creativity, imagination, patience,
diligence, and above all, the most profound skepticism on the part of a policy analyst.
Here, the analyst acts in an intuitive fashion, as did most of the great figures in the
history of science.
Individuals accustomed to studying physical systems are used to two basic procedures.
First, determine the boundary conditions or limits of the system. Second, identify the
major parameters.
This second procedure is quite difficult as far as social systems are concerned. For
example, in education, an obvious quantifiable parameter is the school age population. A
non-quantifiable
parameter would be pupil attitudes. It is difficult at best to know which
parameters are quantifiable and which are not.
Unfortunately
these parameters do not exist in an ordered way in social systems.
Physical scientists are accustomed to visualizing neat, well-bounded systems, possessing
highly structured systems of parameters. Such is not the case in social systems.
The ferment in the Arab world offers an excellent case in point of the difficulty of isolating
parameters. The Arab world is presently going through an extraordinary social revolution
which has essentially two interrelated aspects: modernization and secularization. The former can be quantified to a large extent by including variables such as increases in per capita income, greater educational attainment, improved longevity, industrialization, etc. Secularization, on the other hand, is virtually impossible to quantify. It represents a radical transition from the value structure of a 1,000-year-old scholastic and traditional society
to the value structure of a technological society. This aspect of the Arab social revolution
is the most wrenching since these two value systems are totally incompatible.
Social systems do, of course, possess parameters, some of which are quantifiable, and
some of which are not. However, in social systems these parameters are linked almost as
closely as the sections of the double helix of the DNA and RNA molecules. The point of
this discussion is not to draw useless analogies. Clearly, there is no comparison at all
between these two phenomena. Rather, the intent of this discussion is, in fact, sensory. If
innovative policy thinking is to be predicated on Thomas Kuhn's notion of the paradigm
shift, then a sense of vivid but disciplined mental and visual imagery is critical to
devising an innovative focus for intersensitive and intercultural problems.
The mental and visual image of a double helix may also be important in another
respect. If social systems in any way resemble double helices, then their self-organizing
characteristics may in fact be templates. This conceptualization
might well go far in
explaining social inertia and cultural lag.
Again, we are faced with a severe semantic difficulty. We know that part of this double
helix comes under the general nomenclature
of a parameter. How then do we meaningfully describe the other parts of this double helix? They are certainly not moral
imperatives. More than likely they fall between behavioral dynamics and institutional
vestments. These factors include inherent powers, futurism, populism, and moral
urgency. To these we can also add diffusion of power, pragmatism, the relativism of
values, gradualism, institutionalism,
process, procedure, legality, and technicalities. These
vestments may and obviously do vary in nature depending on a particular social system.
However, their existence in whatever form is about the nearest thing there is to a
constant in social systems analysis.
John Warfield of the Battelle Laboratories wisely suggests that in the search for any
kind of methodology, it is valuable to seek historical perspective. One of the more useful
works, in terms of its impact rather than its intellectual ability, is Thomas Kuhn's The
Structure of Scientific Revolutions. This work, as Warfield points out, shows how at any
given point in the development of science there is a set of paradigms that has temporary
acceptance. Most of science, including its application, is carried out under the prevailing
set of paradigms.
These paradigms are also two-faced. On the one hand, they furnish the knowledge and
hence the freedom to conduct scientific research and science-based problem-solving in a
professional way. On the other hand, they constrain us because most of us have to live
with the limitations of the paradigms.
Kuhn is correct as far as his analysis goes. Most of the physical scientists, such as Maxwell and Einstein, effected only one revolutionary paradigm shift in their careers. Unfortunately, paradigms in social policy shift radically. In a creative and innovative policy analysis several paradigm shifts take place simultaneously, namely the continual recontextualization of knowledge. Using Kuhn's ideas we can gain some insight into certain of the relatively simplistic quantitative modes of policy analysis now in vogue. The most well-known, or at least most advertised, of these procedures is the growth exponential.
As Daniel Bell points out, any growth that is exponential must at some point level off
or we would reach a point of absurdity. In the measurement of the growth of any
phenomenon
that shows patterns of saturation, the questions revolve around the definition of that saturated state and the estimation of its arrival date.
The exponential pattern described is a sigmoid or S-shaped curve in which the rate of growth below and above the mid-point is often quite symmetrical. Because this is so, it lends itself easily to predictions, since one assumes that the rate above the mid-point will match that below and then level off. In fact the beauty of this curve, as Bell characterizes it, has seduced many statisticians into believing that it is the philosopher's stone for the charting of human behavior. The two most recent examples of this behavior pattern are Forrester and Meadows' The Limits to Growth and the Zero Energy Growth (ZEG) scenario in Exploring Energy Choices of the Ford Foundation's Energy Policy Project. No curve of any type ever fully takes into account the growth of the human spirit, the development of technology, social complexity, evolution, and man's ever-present capability to control his own destiny.
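For reference, the sigmoid Bell describes is the standard logistic form (a textbook formulation, not an equation drawn from the works cited here):

    P(t) = \frac{K}{1 + e^{-r\,(t - t_{0})}}

where K is the assumed saturation level, r the growth rate, and t_0 the mid-point about which the curve is symmetric. The seduction lies in fixing K and t_0 in advance for a social variable; every prediction above the mid-point is then forced to mirror the history below it.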
The difficulty with any proposed saturation of social change is that such curves are
plotted for, at best, a few variables (and usually only one) and presume a saturation. But
what may be true of beanstalks, or yeast, or fruit flies, or similar organisms where logistic
growths have been neatly plotted in fixed ecological environments,
may not hold for
social situations where decisions can be postponed (as in the case of births) or where
substitutions
are possible (as in the case of bus or subway transit for passengers) so that
the growths do not develop in some fixed way. As Bell argues, it is for this reason that the
use of logistic curves may be deceptive.
It was suggested earlier that meaningful paradigm shifts in social policy are the result
of a process of continual recontextualization
of one's internal knowledge base. The
purpose of the paradigm shift is to gain perspective, imagery, proportion and above all
validation. Another way of stating this purpose is that a paradigm shift, in essence,
controls a shift in our perceptual filters and hence our value structures.
A policy Philistine can belong to any intellectual discipline: the humanities, the
physical sciences, or the social sciences. Some of the more interesting policy Philistines
can be found in economics and its offspring econometrics. Economists and econometricians have evolved some complex and intricate mathematical
systems for explaining
economic change.
The difficulty occurs when some of these economists and econometricians lose sight of
the limitations of their methodologies. They attempt to extend their generally accurate
short-term models into long-term models of economic growth. There are a number of
difficulties in this transition.
First, it is difficult, if not impossible, to assign quantitative
relationships to the
impact that education, science, and technology have on economic growth. Second, many
economic theories in current vogue are static in the sense that the technological and social
framework within which fluctuations
in quantities, prices, and incomes take place is
taken for granted, or regarded as a residual factor.
Penetrating the facade of residualism, we can see that education, training, innovation, research, and development
are even by themselves only part of an infinitely
diversified and complicated process, which is at best only partially quantifiable. To arrive at even a partial explanation or interpretation of economic growth, we are forced to combine the standard methods of economic research with those of psychology, sociology, anthropology, history, philosophy, decision and organization theory, and political science.
The term residual does allow the economic-type Philistine the opportunity
to play
out his scenario. Where he can identify or isolate a quantifiable variable, he can develop a
differential equation. Physical scientists are aware that even a limited physical model is so
complicated and diverse that they can never hope to develop the full range of differential
equations to characterize their system. By definition, social systems can only be expressed by a virtual infinity of differential equations. Hence, the policy Philistine resorts
to the cul-de-sac of the exogenous or residual factor.
Economic or econometric Philistines and allied model builders are often accused of
having no sense of humanity in their calculations. This accusation is only partly true. These Philistines can easily handle humanity. It is just individuals and people with whom
they cannot deal. Their abuse of the Phillips curve is one example.
There are relationships
among prices, wages, and employment
at least as far as
demand-pull inflation is concerned.
There are also serious definitional questions inherent in the Phillips curve, as well as questions about the availability of data. In large measure, the Phillips curve is a specification problem in the econometric sense of the word. That is, the act of defining the variables to be considered in the Phillips curve also has implications for how the equation in question is to be specified, including the appropriate sets of weights to be assigned to each variable.
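As a point of reference only, a common textbook specification of the expectations-augmented Phillips relation (not a formulation proposed in this paper, nor attributed to any author cited here) is

    \pi_t = \pi_t^{e} - \beta\,(u_t - u^{*}) + \varepsilon_t

where \pi_t is the inflation rate, \pi_t^{e} expected inflation, u_t the unemployment rate, u^{*} the so-called natural rate, \beta a weight to be estimated, and \varepsilon_t an error term. Every choice embedded here, which price index, which expectations series, which unemployment measure, which weights, is precisely the specification problem just described.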
The Phillips curve, if used by itself in policy planning, can lead to serious complications,
since it tends to give a simplified view of inflation and unemployment.
In human and
political terms, higher rates of unemployment
never constitute an acceptable tradeoff.
While the Phillips curve offers an explanation for demand-pull inflation, it does not
provide an explanation for cost-push inflation. Hence, the Phillips curve cannot account
for the present situation of high inflation and high unemployment.
Most importantly, the
Phillips curve does not take into account the changing nature of the labor force. A
Phillips analysis would only treat this sort of factor as a shift in a curve.
As an alternative approach, it would be far preferable to marry the Phillips relationships with a broad and expanded manpower development policy, which would reach the entire work force. In the long run, there is only one strategy offering a truly comprehensive view of coping with unemployment. According to this strategy, the surest way to reduce the natural rate of unemployment is to eliminate the reasons for it, which are primarily workers' uncertainty about alternative job opportunities and their
inability to retrain for new jobs.
There is no cheap, non-complex, or simple way of increasing any country's production
or employment. The inflation side of the Phillips curve seems to offer an answer, but that
is illusory. At best, the extra output is temporary. It is much more likely to be offset by
less than normal output and unemployment
sometime in the future.
There are most certainly quite a few economists who have recognized the need to
build models of varying levels of aggregation out of the information
processed by
individuals. These models reflect both diversity and heterogeneity.
Diversity is unfortunately suppressed in most econometric models as a price for data consistency.
Edgar Dunn's Resources for the Future statistical entity approach, built on cybernetic techniques, is one such innovative approach. Similarly, Kenneth Boulding's classification of one form of knowledge as "folk knowledge" is quite useful, as he relates these categories to specific social problems. John Warfield suggests (using Boulding as well as
Dunn and John Burton) that it is helpful to think about the international system as being
dominated by folk knowledge processed by individuals acting out multiple and simultaneous roles. To paraphrase Warfield, the chief merit of these thinkers is that they
challenge man to enlarge the dimensions of the social sciences.
Some outstanding
applications of model building have recently been made by Dr.
Alexander Christakis of the Academy for Contemporary
Problems. Using Warfield's
mathematical matrices and sets, Dr. Christakis has developed a system that analyzes the
linkages of a contemporary problem set. While his structure is hierarchical in terms of its
heuristics, it can also be used in a system of non-hierarchical heuristics. Problems affect
each other and are linked to each other as complex communication
or feedback chains
which we do not always know to be hierarchical. All we really know is that they are
interdependent,
sometimes in hierarchical aggregates and sometimes not.
Warfield has pioneered in the creative application of mathematics as a language in
policy analysis and planning. His set and matrix form is built on hierarchical principles, reflecting his belief that there are subordinal relationships in social systems. One does not necessarily have to accede fully to this opinion to see that Warfield's use of what he terms a system matrix offers a means of visualizing complexity in social systems. At first glance, the approach of Warfield and Christakis seems a deceptively simple use of mathematics. They give a meaningful and, for a policy-maker, useful perspective on social complexity. The simplicity lies in their avoidance of the calculus, which would only give a mechanistic and deterministic model, something they have assiduously sought to avoid.
Symbolic logic can be, and is, an equally useful mathematical procedure for establishing
complex and systemic configurations.
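By way of illustration only, the kind of system matrix described above can be represented as a binary relation among problems, and the indirect linkages derived without any calculus; the matrix and problem labels below are hypothetical, not drawn from Warfield's or Christakis' published work.

    # A hypothetical system matrix: entry [i][j] = 1 means problem i directly aggravates problem j.
    problems = ["inflation", "unemployment", "urban decay", "school funding"]
    direct = [
        [0, 1, 0, 0],   # inflation -> unemployment
        [0, 0, 1, 0],   # unemployment -> urban decay
        [0, 0, 0, 1],   # urban decay -> school funding
        [0, 0, 0, 0],
    ]

    # Warshall-style transitive closure: which problems are linked, directly or indirectly.
    n = len(problems)
    reach = [row[:] for row in direct]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])

    for i in range(n):
        for j in range(n):
            if reach[i][j]:
                print(f"{problems[i]} is linked, directly or indirectly, to {problems[j]}")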
XIV. Econometrics: A Form of Cultural Destruction
Several years ago, the Organization
for Economic Cooperation and Development
requested a number of leading economists to develop econometric models for the educational systems of some less developed countries, including Greece, Italy, Spain, Portugal, Turkey, and Yugoslavia. The group was headed by Dr. Jan Tinbergen of the Netherlands, who was later to become a Nobel Laureate in economics.
The group used a Keynesian multiplier model. As its study pointed out, Keynesian
models have the advantage of the clarification they bring to some basic properties of the
mechanisms in the economic system, particularly supply and demand. In essence, the
approach was an attempt to apply the method of input-output
analysis to educational
planning.
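For orientation, the input-output relation underlying such an approach can be written in its standard Leontief form (a textbook identity, not the study's own system of equations):

    x = A\,x + d \quad\Longrightarrow\quad x = (I - A)^{-1} d

where x is the vector of sector outputs (here, outputs of the educational levels), d the vector of final demands, A the matrix of fixed technical coefficients, and I the identity matrix. The entire apparatus stands or falls on the assumption that the coefficients in A are stable.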
To be fair, Tinbergen himself noted that erroneous results are obtained unless one
clearly understands
that macromodels only provide a first approximation
to reality.
Unfortunately, this caveat is not only lost in too many instances by too many econometricians but, more importantly, decision-makers are often oblivious to its implications, particularly for long-term planning.
1 The author is thoroughly familiar with the work of the Organization for Economic Cooperation and Development in educational planning. His doctoral dissertation, An Analysis of the Science Policy Programs of the Organization for Economic Cooperation and Development (OECD), focused on OECD's ability to deal with the relationships among economics, education, and science and technology. In view of his research, as well as his continued issue-tracking of OECD, the author is qualified to comment on OECD in this critical fashion.
Tinbergen's model suffers from the key conceptual and methodological flaw common to mathematical models. Only those parameters which can be estimated statistically have
been introduced into the various models in this study. Even certain theoretically refined
concepts and relationships, e.g., teacher-pupil feedback relationships which bear directly
on the models, have not been included because they cannot be translated into numerical
estimates.
The problem is not that these models do not work. On the contrary, in their own quite
limited way, they work far too well, with results that are often unfortunate. For all of its
supposed clarity and sophistication, this sort of thinking is unrealistic.
Perhaps one of the best ways to illustrate this sort of muddle-headedness is to translate one of these models into a so-called word problem. This word problem obviously is simplistic, but it does demonstrate the logic, or illogic, of Professor Tinbergen's model.
Question: If it takes one painter eight hours to paint a room, how long will it take thirty-two painters to do the same job?* Answer accepted as correct within the context of Professor Tinbergen's model: fifteen minutes. Simple algebra. The true answer is that thirty-two painters are going to get in each other's way. Professor Tinbergen's models foster a kind of thinking which seriously holds that if technology in any form, including econometric models, causes problems, then more technology will solve these problems.
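A minimal sketch of the linear-scaling arithmetic the word problem parodies; the crowding adjustment is purely an invented illustration, not a claim about any actual model.

    # Naive linear scaling: total work is fixed, so the time divides evenly among painters.
    work_hours = 8.0          # one painter, one room
    painters = 32
    naive_time = work_hours / painters
    print(f"Naive answer: {naive_time * 60:.0f} minutes")          # prints 15 minutes

    # A hypothetical crowding penalty: effective painters grow less than linearly.
    effective_painters = painters ** 0.5                           # purely an assumption
    crowded_time = work_hours / effective_painters
    print(f"With the crowding assumption: {crowded_time:.1f} hours")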
Professor Tinbergen claims that these models are a large part of the basic long-term
decisions in educational planning in terms of labor-market policies. His models give the
impression of being dehumanizing. No one denies that educational planning must have a
sound economic basis. However, in their approach, Professor Tinbergen and his colleagues
are oblivious to the fact that education throughout history has had an intrinsic, inner value to an individual, apart from but somehow linked to economic gain.
The work of Professor Tinbergen and his colleagues in Greece offers ample illustration
not only of the lack of realistic applicability of these models, but equally, the supreme
benefits of having a receptive audience. Professor Tinbergen's group began much of its in-depth work in Greece about the time of the coup d'etat and continued throughout the rule of the military junta. Moreover, Professor Tinbergen's work in this phase shifted from a limited concern with education to a broader involvement in Greek economic and social
planning and development.
This change of emphasis is crucial in terms of Professor Tinbergen's approach.
Paraphrasing the Swiss economist Professor Eugen Beuler, economists such as Professor
Tinbergen are grounded in linearity.
Being extrapolative, linearity is regular, predictable, rational, and mechanistic.
The product of this form of thought is a plan
which isolates and quantifies the major aspects of reality according to the scientific
method. Later deviations of real processes from planned ones are either not distinguished or not noticed.
Psychologically or methodologically,
Dr. Tinbergen would have little to offer. A tough
but open-minded
social planner or thinker would reject this approach outright. Conversely, a young, nationalistic but limited Greek colonel, who nevertheless regards himself
as a social planner or thinker of sorts, would find Professor Tinbergen's planning methodology to be highly acceptable for what he considers to be its cardinal virtue, its
simplicity, without any regard to its deceptiveness.
* This word problem was suggested by Dr. Arnold Mysior, Georgetown University.
Professor Tinbergen and his colleagues themselves admit that their proposed development model for Greece depends only on the validity of the economic postulates and the
accuracy of the equations which characterize the Greek educational system. They do not
answer the question of how this system can be characterized in ways other than by
equations.
Basically, this group developed a capital/output
ratio growth model. The main assumption in the model is that there is a systematic relationship, which they do not adequately
define, between certain aspects of manpower stock and national income. They did
mention, but glossed over, the difficulty of obtaining reliable statistical data, which
dominate the validity of any model.3
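For orientation, a capital/output ratio growth model of the general kind described, written here as a minimal Harrod-Domar-style sketch with a manpower coefficient attached (an illustration, not the group's actual equations), is

    \Delta Y_t = \frac{1}{v}\, I_t, \qquad N_t = a\, Y_t

where Y_t is national income, I_t investment, v the capital/output ratio, N_t the required manpower stock, and a a fixed manpower coefficient. The systematic relationship criticized above is carried entirely by the assumed constants v and a.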
Governments too should be able to face up to the Tinbergen model's discrepancies, particularly the role of qualitative, non-measurable factors, if they are at all
concerned with second-order
effects. First-order effects are by definition economic.
The economy always first feels the effects of change. Second-order effects, which are
psychological
and sociological in nature, appear later on, but are usually of longer
duration. Clearly, ill-conceived economic planning can and will produce the most disruptive second-order effects.
XV. Conclusion
This paper has attempted to demonstrate the complexity of determining cost-benefit
relationships in public policy and some of the conceptual methods used to elucidate these
relationships.
Clearly, one of the most widely used conceptual methods, but rather
primitive and rigid in view of our present state of intellectual
development,
is the
mathematical model.
While the mathematical model is not as useful in policy analysis as one might wish, neither is it thoroughly useless. To put the issue in simple terms, one does not throw out the baby with the bathwater. This rationale already exists for too many policy decisions, with unfortunate results. We are inevitably driven to the laborious task of assessing the utility or nonutility of a particular model on a case-by-case basis. What do we
expect of the model? Are our parameters and variables valid? Are our data reliable?
Energy offers an ample illustration
of the gravity of the problems involved in
numerical procedures. For example, according to the General Accounting Office, the
newest Project Independence scenario will be based on input from 47 data and information sources. The dangers in preparing a plan with this approach can be severe, particularly
from the standpoint of data and information distortion and the unpredictability
of the
data and information
interactions;
that is, even if the variables selected are the crucial
ones. We have at least the capability of developing mathematical
models of these
distortions, although the predictability problem is less amenable to these techniques.
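One way to picture the distortion problem is the standard first-order error-propagation identity (a general statistical formula, not anything attributed to the General Accounting Office or to Project Independence):

    \sigma_{F}^{2} \;\approx\; \sum_{i} \left( \frac{\partial F}{\partial x_i} \right)^{2} \sigma_{x_i}^{2} \;+\; 2 \sum_{i<j} \frac{\partial F}{\partial x_i}\, \frac{\partial F}{\partial x_j}\, \mathrm{Cov}(x_i, x_j)

where F is the planning output, the x_i are the individual data and information sources, the \sigma_{x_i} their measurement errors, and the covariance terms stand for the unpredictable interactions among sources warned of above.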
For the future, however, Alexander Christakis' concept of the policy science paradigm
does offer a potentially
useful way of better integrating mathematical modeling as a
fundamental
methodology
of public policy. Essentially, the paradigm shift as a phenomenon represents a shift in personal values, and relatedly, cultures and ideologies. If we
are shifting our perception and our perspective, we are in effect saying two related things.
First, our knowledge base not only grows, but shifts into wholly new, unfamiliar and
perhaps even frightening contexts. When we speak of the phenomenology
of a paradigm shift, we are speaking of an evolutionary
process of comparing and contrasting
mental and visual images.
3 These observations are corroborated by Dr. Alexander N. Christakis. Earlier this year Dr. Christakis was invited to OECD to participate in a planning session with some of Professor Tinbergen's colleagues. Dr. Christakis found these sessions to be unproductive with respect to their ability to deal with complex problems, and fully supports, from his own direct experience, the issues raised here.
Second, the paradigm shift has, as an obvious corollary, a concomitant emotional growth on the part of an individual equal to, if not perhaps somewhat greater than, his
intellectual growth. This type of growth is probably even harder to attain in many of us.
The policy process, finally, is always the expression of choices. These choices are based
on intellect and consciousness,
or far more accurately, the awareness of deep, inner
feelings. A policy based solely on either factor cannot succeed. For example, in a
mathematical
model we not only decide intellectually that a particular parameter or variable must be incorporated into the model; as human beings we must equally feel the decision, in an emotional sense, for it to have any meaning.
The paradigm shift is not really unique. The term itself is. What the paradigm shift
implies is the continued and even inexorable growth of the human mind and spirit: feeling, perceiving, thinking, and doing with a particular vision of reality.
The paradigm shift is not a panacea. It cannot cure the ills of humankind. It is not in
itself a problem-solving
tool. It does not explain human action. However, the paradigm
shift does allow society to develop the means to permit understanding
of how to allow
heterogeneous, but not antagonistic, cultures of systems of interpersonal relationships to
develop. The other aspect of the problem, constructive participation, is concerned with
behavior and motivation, which no social theory of development has yet explained.
Unfortunately,
the decision-making
process and its supporting theories and the human
action process far too often bear no resemblance to each other.
Mathematical modeling will remain highly limited in social planning, and will expand its effectiveness only when our own consciousness expands. Presently, mathematical modeling can describe the relationships between physical systems. Physical states are continuous, discrete, and, as a result, at least fairly predictable. Conversely, social
systems are discontinuous,
and certainly not discrete in terms of being well-grounded
states.4
There is no conceptual way, therefore, whether it be the paradigm shift or the
mathematical model, that can in any way account for the phenomenon of the discontinuous function. For example, a linear regression curve can show over a period of several
years slight, but gradual and perceptible, increases in the population above the poverty
line. Suddenly, and dramatically, there is a totally new curve which represents, if not
affluence, at least a major income redistribution
among the poor. There are many
hypotheses advanced for this shift, but it still remains inexplicable. We have to accept the
quantum jump even if we cannot explain it.
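A purely hypothetical sketch of the point (the numbers are invented for illustration): a linear trend fitted to the gradual years says nothing about the jump.

    # Hypothetical series: share of the population above the poverty line (percent).
    years = [1965, 1966, 1967, 1968, 1969, 1970]
    share = [70.0, 70.4, 70.9, 71.3, 71.8, 78.5]   # gradual rise, then a sudden jump

    # Ordinary least-squares slope fitted to the gradual years only (1965-1969).
    xs, ys = years[:5], share[:5]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    slope = num / den
    predicted_1970 = mean_y + slope * (1970 - mean_x)

    print(f"Trend predicts {predicted_1970:.1f}% for 1970; the observed value is {share[-1]}%")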
No matter what we do, we are inevitably forced to resume the policies and politics of what Burke called the "computing principle": adding, subtracting, dividing, and multiplying. The systems relationships to be computed are not in themselves mathematical variables. Rather, these relationships are concerned with the distributions of power. It is doubtful that they will involve love relationships. Perhaps love relationships, as expressed in interpersonal relationships, being the ultimate humanism, are not possible or even
desirable in social systems with all their attendant complexities. However, in our existence we should hope for the relative humanism of responsibility as defined in the scope
of this paper, as a combination of the best of both worlds.
4 Dr. Christakis, in his excellent paper on the policy science paradigm, appears to support quantitative approaches in policy analysis, since one whole section of his paper is devoted to the discussion of mathematical models. However, the discussion is pointless and the models are ludicrous, which was Dr. Christakis' intent in the first place. Setting up a ridiculous strawman is an effective technique in demonstrating the fallacy of taking any analytical technique too seriously. In this instance, the mathematical calculations are even performed without the benefit of data.
References
I. OFFICIAL SOURCES

United States
Stafford Beer. Managing Modern Complexity. The Management of Information and Knowledge. Committee on Science and Technology, House Panel on Science and Technology. Washington, D.C.: Government Printing Office, 1970.
Edgar S. Dunn. The National Economic Accounts: A Case Study of the Evolution Toward Integrated Statistical Information Systems. United States Department of Commerce, Survey of Current Business, 50th Anniversary Issue, 1971, pp. 45-64.
National Science Foundation. Federal Funds for Research and Development and Other Scientific Activities, 1970-1973. Washington, D.C.: Government Printing Office, 1973.

International
Organization for Economic Cooperation and Development. Econometric Models of Education: Some Applications. Paris: OECD, 1965.
-. Social Change and Economic Growth. Paris: OECD, 1967.

II. ARTICLES AND REPORTS
William J. Baumol. On the Appropriate Discount Rate for Evaluation of Public Projects. Program Budgeting and Cost Benefit Analysis. Pacific Palisades, California: Goodyear Publishing, 1970, pp. 200-211.
Alexander Christakis and David W. Malone. A Systematic Approach to Human Settlement Planning. Progress Report I of the Science-Based Planning Project. Columbus, Ohio: The Academy for Contemporary Problems and Battelle Laboratories, 1973.
Alexander N. Christakis. A New Policy-Science Paradigm for Emerging Population Trends and Issues. Presented to the Research Symposium on Alternative Futures and Environmental Quality, United States Environmental Protection Agency, March 7-8, 1973.
-. The Limits of Systems Analysis in Economic and Social Development Planning. Conference on Information Technology, Jerusalem, Israel, 1971.
The Club of Rome. The Predicament of Mankind. Unpublished manuscript.
Alain C. Enthoven. The Systems Analysis Approach. Program Budgeting and Cost Benefit Analysis. Pacific Palisades, California: Goodyear Publishing Company, 1970, pp. 159-168.
Julius Kane. Intuition, Policy and Mathematical Speculation. International Symposium on Uncertainties in Hydrologic and Water Resources Systems. Unpublished.
Charles C. Peters and Walter Van Voorhis. Statistical Procedures and Their Mathematical Bases. New York: McGraw-Hill, 1940.
Martin Shubik. Modeling on a Grand Scale (review of World Dynamics by Jay W. Forrester). Science 174 (December 1971).
John N. Warfield. An Assault on Complexity. A Battelle Monograph. Columbus, Ohio, 1973.
Robert A. Young and John D. Bredehoeft. Digital Computer Simulation for Solving Management Problems of Conjunctive Groundwater and Surface Water Systems. Water Resources Research 8, 533-556 (1972).

III. BOOKS
Daniel Bell. The Coming of Post-Industrial Society: A Venture in Social Forecasting. New York: Basic Books, 1973.
John W. Burton. World Society. London: Cambridge University Press, 1972.
Jay W. Forrester. World Dynamics. Cambridge, Massachusetts: Wright-Allen, 1971.
H. H. Gerth and C. W. Mills. Max Weber: Essays in Sociology. New York: Galaxy Books, 1958.
David Halberstam. The Best and the Brightest. Greenwich, Connecticut: Fawcett, 1973.
Ida R. Hoos. Systems Analysis in Public Policy: A Critique. Berkeley, California: University of California Press, 1972.
Thomas S. Kuhn. The Structure of Scientific Revolutions. Chicago: University of Chicago Press, 1970.
Donella H. Meadows, Dennis L. Meadows, et al. The Limits to Growth. New York: Universe Books, 1972.
E. J. Mishan. Economics for Social Decisions: Elements of Cost Benefit Analysis. New York: Praeger, 1973.

Received 28 June 1974; revised 7 February 1975