PETER M. DEMARZO
DIMITRI VAYANOS
JEFFREY ZWIEBEL
I. INTRODUCTION
In this paper we propose a model of opinion formation in which individuals are subject to persuasion bias, failing to adjust properly for possible repetitions of information they receive. We argue that persuasion bias provides a simple explanation for several important phenomena that are otherwise hard to rationalize, such as propaganda, censorship, marketing, and the importance of airtime. We show that persuasion bias implies two additional phenomena. First, that of social influence, whereby one's influence on group opinions depends not only on accuracy,
2003 by the President and Fellows of Harvard College and the Massachusetts Institute of
Technology.
The Quarterly Journal of Economics, August 2003
not only to information coming from one source over time, but also to information coming from multiple sources connected through a social network. Suppose, for example, that two individuals speak to one another about an issue after having both spoken to a common third party on the issue. Then, if the two conferring individuals do not account for the fact that their counterpart's opinion is based on some of the same (third-party) information as their own opinion, they will double-count the third party's opinion.
Our notion of persuasion bias can be viewed as a simple, boundedly rational heuristic for dealing with a very complicated inference problem. Correctly adjusting for repetitions would require individuals to recount not only the source of all the information that has played a role in forming their beliefs, but also the source of the information that led to the beliefs of those they listen to, of those whom they listen to listen to, and so on. This would become extremely complicated with just a few individuals and a few rounds of updating, to say nothing of a large population of individuals, interacting according to a social network, where beliefs are derived from multiple sources over an extended time period. Instead, under persuasion bias, individuals update as Bayesians, except that they do not accurately account for which information they receive is new and which is repetition. This notion corresponds with social psychology theories of political opinion formation. In particular, Lodge [1995] develops an "on-line" model of information processing, where individuals receive messages from campaigns, and integrate their interpretation of the message into a "running tally" for the candidate, while forgetting the details (e.g., source) of the message. We elaborate on the running-tally interpretation of persuasion bias in Section II.
Persuasion bias is consistent with psychological evidence. Several studies document that repetition of statements increases the subjects' belief in their validity. The interpretation given by these studies is that repetition makes the statements more familiar, and familiarity serves as a cue to validity (that is, subjects are more likely to believe a statement if it "rings a bell").3 Closely related to familiarity are the notions of salience
8. For a classic case study, see Festinger, Schachter, and Back [1950], who study the evolution of beliefs among residents of MIT dormitories. This study argues that the nature of group influence is related to the underlying social network in the group, for which frequency of contact plays an important role (see Chapter 3 on the "Spatial Ecology of Group Formation").
9. We are thus ignoring issues of strategic communication. Ignoring these issues is reasonable in many settings (e.g., sharing of political opinions or stock market views among friends and colleagues). And as will be seen, our notion of persuasion will yield interesting biases in the evolution of beliefs, absent any strategic behavior. Nonetheless, in many persuasive settings (e.g., political campaigns and court trials) agents clearly do have incentives to strategically misreport their beliefs. We discuss the interaction of strategic behavior with our notion of persuasion in Section V.
13. More recent treatments include Wasserman and Faust [1994] and Bonacich and Lloyd [2001]. Dynamics similar to those in French and Harary are obtained in a statistics literature starting with DeGroot [1974]. See also Press [1978] and Berger [1981].
14. See, for example, Banerjee [1992], Bikhchandani, Hirshleifer, and Welch
[1992], and Ellison and Fudenberg [1993], where agents receive information from
the entire population, and Ellison and Fudenberg [1995], where agents receive
information from their neighbors. See also Bala and Goyal [1998] for a general
neighborhood structure, analogous to our network structure.
15. See, for example, Case and Katz [1991], Evans, Oates, and Schwab [1992], Borjas [1995], Glaeser, Sacerdote, and Scheinkman [1996], Katz, Kling, and Liebman [1999], Bertrand, Luttmer, and Mullainathan [2000], Duflo and Saez [2000], Madrian and Shea [2000], and Hong and Stein [2001].
knew the entire social network. If, instead, the agent did not know the entire network (as is likely to be the case in large networks), the agent would have to try to infer the sources of his sources' information from the reports he received. Even for a simple network, this appears to be an extremely complicated problem that does not beget an obvious solution.16
Persuasion bias can be viewed as a simple, boundedly rational heuristic for dealing with the above complicated inference problem, when agents cannot determine (or recall) the source of all the information that has played a role in forming their beliefs. Agents might instead simply maintain a running tally summarizing their current beliefs, as in Lodge's [1995] conception of on-line information processing. Furthermore, upon hearing such a summary from someone else, agents would be unable to distinguish which part is new information, and which part is information already incorporated into their own beliefs. Agents might then simply integrate the information they hear into their running tally, using an updating rule that would be optimal if the information were new. This is precisely what agents do under persuasion bias.
Formally, communication and updating in our model are as follows. In the first communication round, agent i learns the signals of the agents in S(i). Given normality and agents' fixed assessment of the precision of others' information, a sufficient statistic for these signals is their weighted average, with weights given by the precisions. We denote this statistic by $x_i^1$, and refer to it as agent i's beliefs after one round of updating:

(1)   $x_i^1 = \sum_j \frac{q_{ij}\,\pi_{ij}^0}{\pi_{ii}^1}\, x_j^0,$
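The updating rule in equation (1) can be sketched numerically. This is an illustrative toy, not the paper's code: the three-agent network and precision numbers are invented, and we take the round-1 precision $\pi_{ii}^1$ to be the sum of the listened-to precisions, so that the weights $q_{ij}\pi_{ij}^0/\pi_{ii}^1$ sum to one.

```python
# Sketch of one round of updating under equation (1).
def update_once(x0, listen, prec):
    """x0: initial beliefs; listen[i]: the set S(i); prec[i][j]: pi_ij^0."""
    x1 = []
    for i in range(len(x0)):
        pi1 = sum(prec[i][j] for j in listen[i])  # pi_ii^1, so weights sum to 1
        x1.append(sum(prec[i][j] * x0[j] for j in listen[i]) / pi1)
    return x1

# With equal assessed precisions, agent i's new belief is simply the
# average of the initial beliefs of the agents he listens to.
x0 = [1.0, 2.0, 4.0]
listen = [{0, 1}, {0, 1, 2}, {1, 2}]
prec = [[1.0] * 3 for _ in range(3)]
print(update_once(x0, listen, prec))  # -> [1.5, 2.333..., 3.0]
```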
16. Instead of requiring agents to know the entire social network, agents
could alternatively communicate not only their beliefs, but also the sources of
their information, the sources of their sources, etc. Such communication, however,
would be extremely complicated: the information agents would need to recall and
communicate would increase exponentially with the number of rounds.
17. To be precise, this equation determines a sufficient statistic for agent i's information, rather than i's posterior estimate of θ. The posterior estimate also depends upon i's prior on θ, and coincides with the sufficient statistic if the prior is diffuse. By assuming that agents communicate raw data rather than posteriors, we focus on how data is aggregated by the social network, absent any effect of the priors.
$$T = \begin{pmatrix} \frac{1}{2} & \frac{1}{2} & 0 & 0 \\ \frac{1}{3} & \frac{1}{3} & \frac{1}{3} & 0 \\ \frac{1}{4} & \frac{1}{4} & \frac{1}{4} & \frac{1}{4} \\ \frac{1}{3} & 0 & \frac{1}{3} & \frac{1}{3} \end{pmatrix}.$$
The directed graph corresponding to the listening structure S is
depicted in Figure I. We use this listening structure in Section III
as a running example.
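As a numerical sanity check on the running example, the short sketch below iterates the listening matrix of Example 1 (as reconstructed above; rows sum to one and the listening sets have cardinality (2,3,4,3)) and verifies that every row of $T^t$ converges to the influence-weight vector $w \propto (16,15,8,3)$ reported in the tree-counting discussion of Appendix 1:

```python
# Power iteration on the Example 1 listening matrix: rows of T^t converge
# to the social influence weights w.
T = [
    [1/2, 1/2, 0,   0  ],
    [1/3, 1/3, 1/3, 0  ],
    [1/4, 1/4, 1/4, 1/4],
    [1/3, 0,   1/3, 1/3],
]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = T
for _ in range(100):       # T^101, far more rounds than needed
    P = matmul(P, T)

w = [16/42, 15/42, 8/42, 3/42]   # influence weights, normalized
for row in P:
    assert all(abs(row[j] - w[j]) < 1e-9 for j in range(4))

# w is stationary: w T = w.
wT = [sum(w[i] * T[i][j] for i in range(4)) for j in range(4)]
assert all(abs(wT[j] - w[j]) < 1e-12 for j in range(4))
print([round(v, 4) for v in w])
```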
FIGURE I
Listening Structure for Example 1
where $\lambda_t \in (0,1]$. These belief dynamics, given by (4) and (5), are the focus of our analysis.19 Iterating, beliefs after round t are

$$x^t = \Big[\prod_{s=0}^{t-1} T(\lambda_s)\Big] x^0.$$

Since each row of the matrix T sums to one, the same is true for $T(\lambda)$ and hence for $\prod_{s=0}^{t-1} T(\lambda_s)$.20 Therefore, the beliefs of an agent i after any communication round are a weighted average of all agents' initial beliefs. The weight of agent j can naturally be interpreted as j's influence on i:

$$w_{ij}^t \equiv \Big[\prod_{s=0}^{t-1} T(\lambda_s)\Big]_{ij}.$$

ASSUMPTION 1. $\sum_{t=0}^{\infty} \lambda_t = \infty$.
21. See, for example, Aldous and Fill [1999, Ch. 2] and the references therein.
THEOREM 3. Suppose that agents are fully rational (i.e., not subject to persuasion bias), the social network is common knowledge, N is strongly connected, and $\pi_{ij}^0 = \pi_j^0$ for all i, j. Then, agents converge to the correct beliefs x after at most $N^2$ rounds of updating.

$$d_{il}^t \equiv \frac{x_{il}^t - \bar{x}_l^t}{(1/N)\sum_j \|x_j^t - \bar{x}^t\|},$$

where $\|x\|$ denotes the Euclidean norm of the vector x.24 The relative difference of opinion $d_{il}^t$ will be positive or negative, depending upon whether agent i's beliefs are to the right or left of average on issue l.
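The statistic above is straightforward to compute. A minimal sketch (the function name and the toy numbers are ours; the formula is the one displayed above):

```python
import math

# Relative difference of opinion d_il: deviation of agent i's belief on
# issue l from the cross-agent average, scaled by the average Euclidean
# distance of agents' belief vectors from the average belief vector.
def relative_positions(x):
    N, L = len(x), len(x[0])
    mean = [sum(x[i][l] for i in range(N)) / N for l in range(L)]
    scale = sum(math.sqrt(sum((x[i][l] - mean[l]) ** 2 for l in range(L)))
                for i in range(N)) / N
    return [[(x[i][l] - mean[l]) / scale for l in range(L)] for i in range(N)]

# Four agents at the corners of a cross: the average belief is (0,0) and
# the average distance from it is 1, so d_il equals the raw deviation.
d = relative_positions([[1, 0], [-1, 0], [0, 1], [0, -1]])
print(d[0])  # -> [1.0, 0.0]
```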
FIGURE II
Dynamics of Beliefs for Example 1

Initially, an agent might be to the right on some issues, and to the left on others. Our main result in this section is that with
sufficient communication, and for a general class of networks, an agent's relative position on any issue will be the same. That is, in the long run, if agent i is moderately to the right on one issue, then agent i will be moderately to the right on all issues. This implies that long-run differences of opinion across all issues can be summarized by identifying agents on a simple "left-right" spectrum. In this case, we say that long-run differences of opinion are unidimensional.
To illustrate the unidimensionality result, consider the listening structure of Example 1. Figure II illustrates the dynamics of beliefs in this example for two distinct issues represented by the two axes. Initially (t = 1), beliefs are arbitrary. For example, agents 1 and 4 are similar with regard to the vertical issue, but at opposite extremes with regard to the horizontal issue. However, after sixteen rounds of communication, beliefs appear to lie along a single line. Agent 1 is now at one extreme on both issues, agents 3 and 4 are at the other extreme on both issues, and agent 2 holds moderate views.25
We will show that the convergence to unidimensional beliefs is a quite typical result of communication under persuasion bias. Moreover, the nature of the long-run disagreement (i.e., the line along which beliefs differ) and the long-run positions of individual agents are determined from the listening structure.
Recall that the dynamics of beliefs are determined by the listening matrix T through equations (4) and (5). Since T can be interpreted as the transition matrix of a finite, irreducible, and aperiodic Markov chain, it has a unique eigenvalue equal to one, and all other eigenvalues with modulus smaller than one. The row eigenvector corresponding to the largest eigenvalue coincides with the social influence weights (since wT = w), and thus characterizes the beliefs to which agents ultimately converge. As we show below, the row and column eigenvectors corresponding to the second largest eigenvalue characterize the long-run differences of opinion.
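The role of the largest eigenvalue can be checked numerically. In the sketch below (the 3x3 matrix and the initial beliefs are arbitrary illustrative choices, not from the paper), repeated updating drives every agent to the consensus $wx^0$, where w is the stationary row vector with wT = w:

```python
# For a strongly connected, aperiodic listening matrix T, beliefs converge
# to the consensus w x^0, with w the row eigenvector satisfying w T = w.
T = [
    [0.5, 0.5, 0.0],
    [0.2, 0.5, 0.3],
    [0.0, 0.4, 0.6],
]
x0 = [3.0, -1.0, 5.0]

# Iterate the belief dynamics x^{t+1} = T x^t (here with lambda_t = 1).
x = x0[:]
for _ in range(500):
    x = [sum(T[i][j] * x[j] for j in range(3)) for i in range(3)]

# Find w by iterating a probability row vector from the left.
w = [1/3, 1/3, 1/3]
for _ in range(500):
    w = [sum(w[i] * T[i][j] for i in range(3)) for j in range(3)]

consensus = sum(w[j] * x0[j] for j in range(3))
assert all(abs(xi - consensus) < 1e-9 for xi in x)
print(round(consensus, 6))
```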
Defining the second largest eigenvalue is complicated because eigenvalues might be complex, and because they must be ranked according to the ranking of their counterparts in $T(\lambda_t)$ for t large. The ranking criterion is, in fact,

$$f_{\lambda^*}(a) \equiv \|a(\lambda^*)\|^{1/\lambda^*},$$

where $a(\lambda) \equiv (1-\lambda) + \lambda a$, $\lambda^*$ is the limit of $\lambda_t$ when t goes to $\infty$, and $\|a\|$ the modulus of the complex number a.26 Based on this ranking, we let $a_n$ be the nth largest eigenvalue, and $V_n^r$ and $V_n^c$ the nth row and column eigenvectors, respectively. Of course, ties in the ranking are possible. We assume, however, that except for complex conjugate pairs, there are no ties. This is true for generic matrices, and thus our result below applies to generic listening structures.27
We can now state our result regarding the unidimensionality
of long-run differences of opinion.
macro that in each application randomly locates ten agents in geographic neighborhoods (which define the listening structure as in Section IV), and independent of location also randomly assigns two-dimensional beliefs. The application then animates the evolution of differences of opinion, demonstrating the convergence to a line for any initial beliefs and neighborhood network.
30. Other updating processes generally will not yield long-run linear differences of opinion. For example, we have already seen in Theorem 3 that if agents know the network and are fully rational, they converge in a finite number of rounds to the same beliefs, and therefore there are no long-run differences of opinion.
31. This result is available upon request. For example, if agents are on a circle and listen to their clockwise neighbors, and if $\lambda_t = 1$, then the cycle has period length $2\pi/\arctan(\mathrm{Im}(a_2)/\mathrm{Re}(a_2)) = 2N$. Since each agent moves halfway toward his neighbor each period, relative positions will cycle after 2N communication rounds.
32. The symmetric precision case also corresponds to the random matching model, sketched in subsection III.D. Indeed, in that model, the assessments of precision are the matching probabilities, which are by definition symmetric.
FIGURE III
Two-Dimensional Neighborhood Example of Subsection IV.B. First Number Gives Influence Weight (Percent), Second Number Gives the Long-Run Position, $V_{i2}^c$.
$$M\theta_2 + \arctan\Big[\frac{1}{3}\tan\Big(\frac{\theta_2}{2}\Big)\Big] = \frac{\pi}{2}.$$

(In particular, $\pi/(2M+1) < \theta_2 < \pi/2M$.) Then, the ith component of $V_2^c$ is proportional to $v_i^c = \sin[i\theta_2]$, while the ith component of $V_2^r$ is proportional to $v_i^r$, defined by $v_i^r = 2\sin[i\theta_2]$ for $i = \pm M$, and $v_i^r = 3\sin[i\theta_2]$ otherwise. When $M \ge 2$, the extreme values of $v_i^r$ are for $i = \pm(M-1)$.
FIGURE IV
Long-Run Positions for the Linear Neighborhood Example of Subsection IV.C (M = 1000)
are pulled mainly in one direction, since the beliefs of their more extreme neighbors are close to theirs.
The result that distance is smallest at the extremes can also be expressed in terms of the density of agents holding a given position. We plot this density in Figure V. As shown in the figure, the density is largest at the extremes, since agents' positions are the closest. Thus, independent of the initial density, the long-run density exhibits a form of polarization.
According to Theorem 8, the agents whose difference in initial beliefs matters the most for long-run disagreement are $M-1$ and $-(M-1)$, i.e., the agents next closest to the extremes. This is because these agents not only are close to the extremes, but also are the only agents that those at the extremes are listening to.
Finally, it is interesting to examine the speed of convergence in this example. Recall from Theorem 5 that the speed at which beliefs converge to consensus is of order $f_{\lambda^*}(a_2)^{\sum_{s=0}^{t-1}\lambda_s}$, while the speed at which differences of opinion become unidimensional is of order $(f_{\lambda^*}(a_3)/f_{\lambda^*}(a_2))^{\sum_{s=0}^{t-1}\lambda_s}$, where $a_2$ and $a_3$ are the second and third largest eigenvalues of T. For large M, these eigenvalues are given by

(8)   $a_2 \approx 1 - \pi^2/12M^2 \quad\text{and}\quad a_3 \approx 1 - \pi^2/3M^2.$
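The approximations in (8) can be checked without an eigenvalue library. The sketch below is our construction: agents sit at positions $-M,\ldots,M$ and each listens to himself and his immediate neighbors with equal weights (1/2 at the two endpoints, 1/3 in the interior). Successive differences of $T^t y$ shrink by the largest eigenvalue present in y besides 1; a generic (odd) start vector picks up $a_2$, while a reflection-symmetric start vector has no $a_2$ component (the second mode is odd, $\sin[i\theta_2]$), so its ratio converges to $a_3$:

```python
import math

# Power-ratio estimates of a_2 and a_3 for the linear neighborhood model.
M = 20
N = 2 * M + 1

# Row-stochastic listening matrix: self plus immediate neighbors.
T = []
for i in range(N):
    nbrs = [j for j in (i - 1, i, i + 1) if 0 <= j < N]
    row = [0.0] * N
    for j in nbrs:
        row[j] = 1.0 / len(nbrs)
    T.append(row)

def step(y):
    return [sum(T[i][j] * y[j] for j in range(N)) for i in range(N)]

def leading_nontrivial_eigenvalue(y, rounds=600):
    # T^t y = c1*1 + sum_n a_n^t c_n u_n, so successive differences
    # shrink by the dominant eigenvalue (other than 1) present in y.
    prev = step(y)
    cur = step(prev)
    for _ in range(rounds):
        nxt = step(cur)
        num = math.sqrt(sum((nxt[i] - cur[i]) ** 2 for i in range(N)))
        den = math.sqrt(sum((cur[i] - prev[i]) ** 2 for i in range(N)))
        prev, cur, ratio = cur, nxt, num / den
    return ratio

a2 = leading_nontrivial_eigenvalue([float(i - M) for i in range(N)])      # odd start
a3 = leading_nontrivial_eigenvalue([float((i - M) ** 2) for i in range(N)])  # even start

gap2, gap3 = math.pi ** 2 / (12 * M ** 2), math.pi ** 2 / (3 * M ** 2)
print(round(1 - a2, 6), round(gap2, 6))  # should be close
print(round(1 - a3, 6), round(gap3, 6))
```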
FIGURE V
Density of Long-Run Disagreement Positions for the Linear Neighborhood
Example of Subsection IV.C.
V. ADDITIONAL APPLICATIONS
In this section we explore more informally a number of other applications of our model. For concreteness we pick and discuss in detail one area where persuasive activity is particularly notable: that of political discourse. We then briefly discuss applications to a number of other areas.
One important caveat is in order here. Our model presumes that agents report their information truthfully. This assumption
34. This follows from equations (23) and (28) in the Appendix, by setting m = 1 and m = 2.
35. The interaction between strategic behavior and persuasion bias yields a
number of interesting questions for future research. One question, for example,
concerns the competition between strategic political parties when voters are
subject to persuasion bias.
message; i.e., the intent is for listeners to hear the message many times. Similarly, censorship often seems motivated by the notion that frequent exposure to an opposing opinion will sway beliefs in that direction, and therefore such exposure should be restricted.36
The phenomenon of political "spin" seems closely related. Political strategists often seem to be concerned with pushing a common message repeatedly. Political spokespersons coordinate on common lists of "talking points" which they repeat in the media, seemingly under the belief that such repetition will give their viewpoint the most effective spin.37
More generally, airtime is considered to have an important
effect on beliefs. For example, a political debate without equal
time for both sides, or a criminal trial in which the defense was
given less time to present its case than the prosecution, would
generally be considered biased and unfair. Indeed, common con-
cerns about inequitable political fund-raising seem to be moti-
vated by concerns about the unequal airtime that the funds will
subsequently provide. Note that this appears to be true even
when the message is simple, and could seemingly be conveyed
rather quickly, and when it has already been widely heard by the
electorate. Here as well, such concerns can be easily understood
under the presumption that repetition unduly affects beliefs.
Unidimensional Opinions. Perhaps the most notable implication of our model for political science is that according to Theorem 4, long-run differences of opinion should be unidimensional.38 In particular, representing an agent's beliefs over L
36. A distinction should be drawn here between the censor who believes that the censored opinions are inaccurate, and the censor who believes that they are accurate (but personally undesirable). Without a notion of persuasion, it is hard to understand censorship of the former type. Indeed, absent persuasion bias, arguments for incorrect opinions should, on average, lessen the appeal of these opinions. While, in contrast, persuasion bias is not necessary to understand the activity of a censor trying to silence opinions he believes are in fact correct, it seems clear that many censors do believe the views they are silencing are harmful and incorrect.
37. Closely related are social psychology literatures on the efficacy of priming and agenda setting through the media in political campaigns. (See, for example, Iyengar and Kinder [1987] on priming, and Cohen [1963] and Erbring, Goldenberg, and Miller [1980] on agenda setting.) Related to our model, Huckfeldt and Sprague [1995] find evidence that the specific form of social networks plays an important role in influencing political beliefs. In particular, they find that frequent political conversation partners have an important effect on an individual's political beliefs after controlling for natural demographic and personal characteristics.
38. We believe the focus on long-run differences of opinion, rather than on
convergent beliefs, is more appropriate for political applications, in light of our
discussion in footnote 23 above. That is, lacking a natural metric to measure the
47. In line with our notion of persuasion, juries frequently deliberate for an
extended period before reaching a consensus. Judges often instruct deadlocked
juries to continue deliberating, despite the absence of new information, and such
juries frequently do reach a consensus.
VI. CONCLUSION
We propose a boundedly rational model of opinion formation in which individuals are subject to persuasion bias. Specifically, individuals fail to adjust for possible repetitions of (or common sources in) information they receive. Persuasion bias can be viewed as a simple, boundedly rational heuristic, arising from the complexity of recounting the source of all the information that has played a role in forming one's beliefs. We argue that it provides an explanation for several important phenomena, such as the effectiveness of airtime, propaganda, censorship, political spin, marketing, etc. These phenomena suggest that persuasion bias is ubiquitous, and plays an important role in the process of social opinion formation.
We explore the implications of persuasion bias in a model where individuals communicate according to a social network. We show that persuasion bias implies the phenomenon of social influence, whereby an individual's influence on group opinions depends not only on accuracy, but also on how well connected the individual is in the network. Persuasion bias implies the additional phenomenon of unidimensional opinions, because individuals' opinions over a multidimensional set of issues converge to a single left-right spectrum.
We apply our model to several natural settings, including neighborhoods with bilateral communication, political discourse, court trials, and marketing. We argue that our model provides a useful way for thinking about opinion formation in these settings, and has a number of novel empirically testable predictions. For example, our model predicts that political beliefs should be unidimensional, with conformity to this pattern being greater for issues that receive media coverage. Additionally, the importance of different demographic factors in explaining political beliefs should depend on the role these factors play in social interaction.
An important extension of our model is to endogenize the
listening structure. In some settings, for example, agents might
believe that those expressing opinions similar to their own are
more accurate, and might choose to listen only to such individu-
$$r(G) = \prod_{(i,j)\in G} T_{ji};$$

that is, the product of the direct influence that flows from the root of the tree. Let $\mathcal{G}_i$ be the set of spanning trees with agent i at the root and $r(G) > 0$. Then, the social influence weights can be characterized as follows.
48. See, for example, Aldous and Fill [1999, Ch. 9] and the references therein.
FIGURE VI
Spanning Trees for Agents 3 and 4 in Example 1
satisfy

$$\frac{w_i}{w_j} = \frac{\#\mathcal{G}_i\, \#S(i)}{\#\mathcal{G}_j\, \#S(j)}.$$

That is, to determine the relative social influence of agent i, we simply need to count the number of spanning trees with i at the root and the property that each agent listens to his predecessor, and multiply this by the number of agents that i listens to.
To illustrate the tree-counting method, consider the listening structure of Example 1. The vector corresponding to the number of spanning trees is (8,5,2,1). (The spanning trees for agents 3 and 4 are shown in Figure VI.) The vector corresponding to the cardinality of an agent's listening set is (2,3,4,3). Corollary 2 implies that the relative influence weights are (16,15,8,3). These are equal to the weights found in subsection III.A, after normalization.
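The tree counts above can be reproduced by brute force. In this sketch (our code, using the Example 1 listening sets), for each candidate root we enumerate every assignment of a parent, drawn from the child's listening set, to the remaining agents, and keep the assignments that form a tree into the root:

```python
from itertools import product

# Brute-force tree counting for Example 1.
S = {1: {1, 2}, 2: {1, 2, 3}, 3: {1, 2, 3, 4}, 4: {1, 3, 4}}

def count_trees(root):
    others = [a for a in S if a != root]
    choices = [sorted(S[a] - {a}) for a in others]   # possible parents
    count = 0
    for parents in product(*choices):
        par = dict(zip(others, parents))
        ok = True
        for a in others:               # every agent must reach the root
            seen, cur = set(), a
            while cur != root:
                if cur in seen:        # cycle: not a tree
                    ok = False
                    break
                seen.add(cur)
                cur = par[cur]
            if not ok:
                break
        count += ok
    return count

trees = [count_trees(i) for i in (1, 2, 3, 4)]
weights = [t * len(S[i]) for t, i in zip(trees, (1, 2, 3, 4))]
print(trees, weights)  # -> [8, 5, 2, 1] [16, 15, 8, 3]
```

Multiplying each tree count by the cardinality of the root's listening set recovers the relative influence weights (16,15,8,3), as Corollary 2 states.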
APPENDIX 2: PROOFS
Proof of Theorem 1. The matrix T is the transition matrix of a finite, irreducible, and aperiodic Markov chain. It is well-known (see, for example, Aldous and Fill [1999, Ch. 2] and the references therein) that (i) such a Markov chain has a unique stationary distribution, i.e., a unique probability distribution w satisfying wT = w, (ii) $w_i > 0$ for any state i, and (iii) starting from any state i, the probability of being in state j at date t converges to $w_j$ as $t \to \infty$. Property (iii) implies that the matrix $T^t$ converges to a limit $T^\infty$, each row of which is equal to w.
To prove the theorem, it remains to show that the matrix $\prod_{s=0}^{t-1} T(\lambda_s)$ converges to $T^\infty$. Define the random variable $L_t$ to be equal to 1 with probability $\lambda_t$ and 0 otherwise. Assume also that the $L_t$ are independent over time. Define the (random) matrix $Z_t$ by

$$Z_t = \prod_{s=0}^{t-1} \big[(1-L_s)I + L_s T\big] = T^{\sum_{s=0}^{t-1} L_s}.$$

Then

$$E(Z_t) = \prod_{s=0}^{t-1} T(\lambda_s).$$

By Assumption 1 and the Borel–Cantelli lemma, $\Pr\big[\sum_{t=0}^{\infty} L_t = \infty\big] = 1$. Therefore,

$$\lim_{t\to\infty} E(Z_t) = \lim_{t\to\infty} E\big(T^{\sum_{s=0}^{t-1} L_s}\big) = T^\infty. \qquad \blacksquare$$
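The identity $E(Z_t) = \prod_s T(\lambda_s)$ used in the proof can be verified exactly by enumerating all Bernoulli outcomes. This is a numerical sanity check only, not part of the proof; the matrix T and the $\lambda_s$ sequence are arbitrary illustrative choices:

```python
from itertools import product

# Check: E(T^{sum L_s}) equals the product of T(lambda_s) = (1-l)I + l T
# when the L_s are independent Bernoulli(lambda_s) draws.
T = [[0.5, 0.5, 0.0],
     [0.2, 0.5, 0.3],
     [0.0, 0.4, 0.6]]
lams = [1.0, 0.5, 0.25, 0.8]
n = 3

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matpow(A, p):
    R = [[float(i == j) for j in range(n)] for i in range(n)]
    for _ in range(p):
        R = matmul(R, A)
    return R

# Left side: exact expectation over all 2^len(lams) outcomes.
E = [[0.0] * n for _ in range(n)]
for draws in product([0, 1], repeat=len(lams)):
    p = 1.0
    for L, lam in zip(draws, lams):
        p *= lam if L else 1 - lam
    Tp = matpow(T, sum(draws))
    for i in range(n):
        for j in range(n):
            E[i][j] += p * Tp[i][j]

# Right side: the deterministic product of the T(lambda_s).
P = [[float(i == j) for j in range(n)] for i in range(n)]
for lam in lams:
    Tlam = [[(1 - lam) * float(i == j) + lam * T[i][j] for j in range(n)]
            for i in range(n)]
    P = matmul(P, Tlam)

assert all(abs(E[i][j] - P[i][j]) < 1e-9 for i in range(n) for j in range(n))
print("E(Z_t) matches the product of T(lambda_s)")
```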
Proof of Corollary 1. The corollary follows immediately from wT = w. ∎
Proof of Theorem 2. The consensus beliefs are correct if and only if

$$w_i = \frac{\pi_i^0}{\sum_j \pi_j^0};$$

i.e., if and only if the vector $\pi^0 \equiv (\pi_1^0, \ldots, \pi_N^0)$ solves the equation $\pi^0 T = \pi^0$. The ith component of this vector equation is

(9)   $\sum_j \pi_j^0 T_{ji} = \pi_i^0.$

Since

$$T_{ji} = \frac{q_{ji}\pi_{ji}^0}{\pi_{jj}^1} = \frac{q_{ji}\pi_{ji}^0}{\pi_{jj}^0}\, T_{jj} = \frac{q_{ji}\pi_i^0}{\pi_j^0}\, T_{jj},$$

equation (9) is equivalent to equation (6). ∎
Proof of Theorem 3. Given normality, a sufficient statistic of agent i's information after communication round t is a weighted average $v_i^t x^0$ of all agents' signals, for some vector $v_i^t$ of weights. (This can easily be proved by induction.)
Suppose that after communication round t, agents have not converged to the correct beliefs; i.e., there exists i such that $v_i^t$ differs from the vector w of weights corresponding to the correct beliefs x. Then, we will show that the beliefs of at least one agent must change in round t; i.e., there exists i such that $v_i^t \neq v_i^{t-1}$. Suppose by contradiction that $v_i^t = v_i^{t-1}$ for all i. Agent i does not change his beliefs in round t only if $v_i^{t-1} x^0$ is a sufficient statistic for $\{v_j^{t-1} x^0\}_{j\in S(i)}$. The same applies to the agents in S(i), and so on. Therefore, by strong connectedness, $v_i^{t-1} x^0$ is a sufficient statistic for $\{v_j^{t-1} x^0\}_{j\in N}$, and thus for $\{v_j^{t'} x^0\}_{j\in N,\, t'=0,\ldots,t-1}$. This implies that $v_i^{t-1} = w$ for all i.
To show that agents converge to the correct beliefs in at most $N^2$ rounds, we will show that each agent can change his beliefs in at most N rounds. For this, we observe that if agent i changes his beliefs in round t, then $v_i^t$ must be linearly independent of
$$T = V^c A V^r, \qquad T(\lambda) = V^c A(\lambda) V^r.$$

Therefore,

$$\prod_{s=0}^{t-1} T(\lambda_s) = V^c \Big[\prod_{s=0}^{t-1} A(\lambda_s)\Big] V^r = \sum_{n=1}^{N} \Big[\prod_{s=0}^{t-1} a_n(\lambda_s)\Big] V_n^c V_n^r.$$

Setting

$$k_n(t) \equiv \prod_{s=0}^{t-1} a_n(\lambda_s),$$

we have

$$\sum_{n=1}^{N} \Big[\prod_{s=0}^{t-1} a_n(\lambda_s)\Big] V_n^c V_n^r = \mathbf{1}w + \sum_{n=2}^{N} k_n(t)\, V_n^c V_n^r.$$

Therefore,

(10)   $x^t = \Big[\prod_{s=0}^{t-1} T(\lambda_s)\Big] x^0 = \mathbf{1}wx^0 + \sum_{n=2}^{N} k_n(t)\, V_n^c V_n^r x^0.$

Multiplying equation (10) from the left by w, we have

(11)   $wx^t = w\mathbf{1}wx^0 + \sum_{n=2}^{N} k_n(t)\, wV_n^c V_n^r x^0 = wx^0 + \sum_{n=2}^{N} k_n(t)\, wV_n^c V_n^r x^0.$

Multiplying equation (10) from the left by the row vector $e_i$ whose ith component is one and all others are zero, we similarly have

(12)   $x_i^t = e_i x^t = e_i\mathbf{1}wx^0 + \sum_{n=2}^{N} k_n(t)\, e_i V_n^c V_n^r x^0 = wx^0 + \sum_{n=2}^{N} k_n(t)\, V_{in}^c V_n^r x^0.$

Moreover,

(13)   $\log\Big\|\frac{k_n(t)}{k_2(t)}\Big\| = \sum_{s=0}^{t-1} \log\Big\|\frac{a_n(\lambda_s)}{a_2(\lambda_s)}\Big\| \equiv \sum_{s=0}^{t-1} y_s,$

and

$$\lim_{s\to\infty} \frac{y_s}{\lambda_s} = \log\Big[\frac{f_{\lambda^*}(a_n)}{f_{\lambda^*}(a_2)}\Big] \equiv z_n < 0.$$

Therefore,

$$\lim_{s\to\infty} \frac{y_s}{\lambda_s z_n} = 1,$$

and thus the series (13) has the same behavior as the series

$$\sum_{s=0}^{t-1} \lambda_s z_n = z_n \sum_{s=0}^{t-1} \lambda_s,$$

and similarly with

$$z_3 \sum_{s=0}^{t-1} \lambda_s.$$

Finally,

$$\exp\Big[z_3 \sum_{s=0}^{t-1} \lambda_s\Big] = \Big[\frac{f_{\lambda^*}(a_3)}{f_{\lambda^*}(a_2)}\Big]^{\sum_{s=0}^{t-1} \lambda_s}.$$
The vector

$$\Pi^c \equiv \Big(\pi_1^0 \sum_k q_{k1}\pi_k^0,\; \ldots,\; \pi_N^0 \sum_k q_{kN}\pi_k^0\Big)$$

satisfies the equation $\Pi^c T = \Pi^c$. The ith component of this vector equation is

(14)   $\sum_j \Big[\pi_j^0 \sum_k q_{kj}\pi_k^0\Big] T_{ji} = \pi_i^0 \sum_k q_{ki}\pi_k^0.$

Since

$$T_{ji} = \frac{q_{ji}\pi_{ji}^0}{\pi_{jj}^1} = \frac{q_{ji}\pi_{ji}^0}{\sum_k q_{jk}\pi_{jk}^0} = \frac{q_{ji}\pi_i^0}{\sum_k q_{jk}\pi_k^0} = \frac{q_{ji}\pi_i^0}{\sum_k q_{kj}\pi_k^0},$$

equation (14) is equivalent to

$$\sum_j \pi_j^0\, q_{ji}\, \pi_i^0 = \pi_i^0 \sum_k q_{ki}\pi_k^0,$$

which holds. Similarly, the vector

$$\Pi^s \equiv \Big(\sum_k q_{k1}\pi_{k1}^0,\; \ldots,\; \sum_k q_{kN}\pi_{kN}^0\Big)$$

satisfies the equation $\Pi^s T = \Pi^s$. The ith component of this vector equation is

(15)   $\sum_j \Big[\sum_k q_{kj}\pi_{kj}^0\Big] T_{ji} = \sum_k q_{ki}\pi_{ki}^0.$

Since

$$T_{ji} = \frac{q_{ji}\pi_{ji}^0}{\pi_{jj}^1} = \frac{q_{ji}\pi_{ji}^0}{\sum_k q_{jk}\pi_{jk}^0} = \frac{q_{ji}\pi_{ji}^0}{\sum_k q_{kj}\pi_{kj}^0},$$

equation (15) is equivalent to

$$\sum_j q_{ji}\pi_{ji}^0 = \sum_k q_{ki}\pi_{ki}^0,$$

which holds. Moreover,

$$\Pi_j^c T_{ji} = \Big[\pi_j^0 \sum_k q_{kj}\pi_k^0\Big] T_{ji} = \Big[\pi_j^0 \sum_k q_{kj}\pi_k^0\Big] \frac{q_{ji}\pi_i^0}{\sum_k q_{kj}\pi_k^0} = q_{ji}\pi_j^0\pi_i^0 = q_{ij}\pi_i^0\pi_j^0 = \Pi_i^c T_{ij},$$

and

$$\Pi_j^s T_{ji} = \Big[\sum_k q_{kj}\pi_{kj}^0\Big] T_{ji} = \Big[\sum_k q_{kj}\pi_{kj}^0\Big] \frac{q_{ji}\pi_{ji}^0}{\sum_k q_{kj}\pi_{kj}^0} = q_{ji}\pi_{ji}^0 = q_{ij}\pi_{ij}^0 = \Pi_i^s T_{ij}.$$
$$WTW^{-1} = V^{-1}AV.$$

Therefore,

$$T = (VW)^{-1} A\, (VW);$$

i.e.,

$$V^r = VW, \qquad V^c = (V^r)^{-1} = W^{-1}V'.$$
(18)   $\tfrac{1}{2}(v_{M-1} + v_M) = a v_M.$

(22)   $b_1 (r_1)^M\Big[(1-2a) + \frac{1}{r_1}\Big] + b_2 (r_2)^M\Big[(1-2a) + \frac{1}{r_2}\Big] = 0.$

Equations (21) and (22) form a 2×2 linear system in $b_1$ and $b_2$. For a to be an eigenvalue of T, this system must have a nonzero solution, and thus its determinant has to be zero. This will imply an equation which will determine a.
To derive this equation, we assume that $a \in (-1/3, 1]$. (Under this assumption, we will obtain $2M+1 = N$ distinct eigenvalues, which is obviously the maximum number that T can have.) Setting

(23)   $\cos\theta = \frac{3a-1}{2} \in (-1, 1],$

we have $r_1 = e^{i\theta}$ and $r_2 = e^{-i\theta}$. The determinant of the system of (21) and (22) is

$$e^{-2iM\theta}\big[(1-2a) + e^{i\theta}\big]^2 - e^{2iM\theta}\big[(1-2a) + e^{-i\theta}\big]^2.$$

Setting this to zero, we get

(24)   $e^{-iM\theta}\big[(1-2a) + e^{i\theta}\big] = \pm\, e^{iM\theta}\big[(1-2a) + e^{-i\theta}\big],$

where we first used the addition-of-sines rule, then equation (23), and finally the function $g(\theta) \in [-\pi/2, \pi/2]$ defined by

$$\tan[g(\theta)] = \frac{1-\cos\theta}{3\sin\theta} = \frac{1}{3}\tan\Big[\frac{\theta}{2}\Big].$$

$$0 < g(\theta_2) < \frac{\theta_2}{2}$$

and

$$M\theta_2 - \frac{\pi}{2} < M\theta_2 + g(\theta_2) - \frac{\pi}{2} = G(\theta_2) = 0 < M\theta_2 + \frac{\theta_2}{2} - \frac{\pi}{2}$$

imply that

$$\frac{\pi}{2M+1} < \theta_2 < \frac{\pi}{2M}.$$

$$v_i = 2b_1 \sin(i\theta).$$

$$(6a_2 - 5)\sin[M\theta_2] \ge 0.$$
Proof of Theorem 9. See Aldous and Fill [1999, Ch. 9] and the references therein. A sketch of the proof is as follows. We need to show that the vector

$$\Gamma \equiv \Big(\sum_{G\in\mathcal{G}_1} r(G),\; \ldots,\; \sum_{G\in\mathcal{G}_N} r(G)\Big)$$

satisfies, for each i,

$$\sum_j \Big[\sum_{G\in\mathcal{G}_j} r(G)\Big] T_{ji} = \sum_{G\in\mathcal{G}_i} r(G),$$

which, since $\sum_j T_{ij} = 1$, can be written as

(29)   $\sum_j \Big[\sum_{G\in\mathcal{G}_j} r(G)\Big] T_{ji} = \sum_{G\in\mathcal{G}_i} r(G)\Big(\sum_j T_{ij}\Big).$

Finally,

$$r(G) = \prod_{k\neq i} \frac{1}{\#S(k)},$$

and thus

$$\sum_{G\in\mathcal{G}_i} r(G) = \#\mathcal{G}_i \prod_{k\neq i} \frac{1}{\#S(k)}. \qquad \blacksquare$$
ki
1 2
T1 0
T2
z ,
0 TM
R1 R2 z RM Q
where the mth set of rows, m 5 1, . . . , M, corresponds to the
agents in the mth isolated set, I m , and the M 1 1th set of rows
corresponds to the agents in the nonisolated sets. The matrix Tt
has the same form as T, except that Tm is replaced by Ttm , Q by
Q t , and R m by
OQR T
t21
s t212s
R m,t ; m m .
s50
Letting $q_{j,t}$ denote the maximum over i of $(Q^t)_{ij}$, we have

$$(Q^{t+1})_{ij} = \sum_k Q_{ik}(Q^t)_{kj} \;\Rightarrow\; (Q^{t+1})_{ij} \le \Big(\sum_k Q_{ik}\Big) q_{j,t} \;\Rightarrow\; q_{j,t+1} \le \Big(\sum_k Q_{ik}\Big) q_{j,t} \;\Rightarrow\; q_{j,t+1} \le q_{j,t}.$$

There exists $t^*$ such that all rows of $Q^{t^*}$ sum to strictly less than 1, and thus less than $1-\epsilon$ for some $\epsilon > 0$. Using the equation $Q^{t+t^*} = Q^{t^*} Q^t$, and proceeding as before, we get

$$q_{j,t+t^*} \le (1-\epsilon)\, q_{j,t}.$$

Therefore, $q_j = 0$, and thus the matrix $Q^t$ converges to zero. Since $T_m^t$ converges to $T_m^\infty$ and $Q^t$ converges to zero, it is easy to show that $R_{m,t}$ converges to

$$R_{m,\infty} \equiv \sum_{s=0}^{\infty} Q^s R_m T_m^\infty.$$
$$\sum_{m=1}^{M} \big[R_{m,\infty}\, x^0(I_m)\big]_i = \sum_{m=1}^{M} \Big[\sum_{s=0}^{\infty} Q^s R_m T_m^\infty\, x^0(I_m)\Big]_i = \sum_{m=1}^{M} \Big[\sum_{s=0}^{\infty} Q^s R_m \mathbf{1}\Big]_i w(I_m)\, x^0(I_m).$$
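The block structure above can be illustrated with the smallest possible example. This is our toy construction, not from the paper: one isolated set {1,2} with internal matrix $T_1$, plus a single nonisolated agent 3 who puts half his weight on agent 1 (the block R) and half on himself (the substochastic block Q). Powers of Q vanish, and agent 3's limiting weights put all mass on the isolated set's consensus weights $w(I_1)$:

```python
# Nonisolated agents inherit the isolated set's consensus in the limit.
T = [
    [0.5, 0.5, 0.0],
    [0.5, 0.5, 0.0],
    [0.5, 0.0, 0.5],   # R = [0.5, 0.0], Q = [0.5]
]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = T
for _ in range(200):
    P = matmul(P, T)   # P = T^201

# The isolated set's influence weights are w(I_1) = (1/2, 1/2); agent 3's
# limiting row is (1/2, 1/2, 0): zero weight on his own signal.
assert abs(P[2][0] - 0.5) < 1e-9
assert abs(P[2][1] - 0.5) < 1e-9
assert abs(P[2][2]) < 1e-9
print([round(v, 6) for v in P[2]])  # -> [0.5, 0.5, 0.0]
```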
REFERENCES
Aldous, David, and Jim Fill, Reversible Markov Chains and Random Walks on
Graphs, monograph in preparation, University of California at Berkeley,
1999.
PERSUASION BIAS 967
Bala, Venkatesh, and Sanjeev Goyal, "Learning from Neighbors," Review of Economic Studies, LXV (1998), 595–621.
Banerjee, Abhijit, "A Simple Model of Herd Behavior," Quarterly Journal of Economics, CVII (1992), 797–817.
Berger, Roger, "A Necessary and Sufficient Condition for Reaching a Consensus Using DeGroot's Method," Journal of the American Statistical Association, LXXVI (1981), 415–418.
Bertrand, Marianne, Erzo Luttmer, and Sendhil Mullainathan, "Network Effects and Welfare Cultures," Quarterly Journal of Economics, CXV (2000), 1019–1055.
Bikhchandani, Sushil, David Hirshleifer, and Ivo Welch, "A Theory of Fads, Fashion, Custom, and Cultural Change as Informational Cascades," Journal of Political Economy, C (1992), 992–1026.
Bonacich, Phillip, and Paulette Lloyd, "Eigenvector-Like Measures of Centrality for Asymmetric Relations," working paper, University of California, Los Angeles, 2001.
Borjas, George, "Ethnicity, Neighborhoods, and Human-Capital Externalities," American Economic Review, LXXXV (1995), 365–390.
Caplin, Andrew, and Barry Nalebuff, "Aggregation and Social Choice: A Mean Voter Theorem," Econometrica, LIX (1991), 1–23.
Case, Anne, and Lawrence Katz, "The Company You Keep: The Effects of Family and Neighborhood on Disadvantaged Youths," NBER Working Paper, 1991.
Cohen, Bernard, The Press and Foreign Policy (Princeton, NJ: Princeton University Press, 1963).
Converse, Philip, "The Nature of Belief Systems in Mass Publics," in D. Apter, ed., Ideology and Discontent (New York: Free Press, 1964).
DeGroot, Morris, "Reaching a Consensus," Journal of the American Statistical Association, LXIX (1974), 118–121.
DeMarzo, Peter, Dimitri Vayanos, and Jeffrey Zwiebel, "Social Networks and Financial Markets," working paper, Massachusetts Institute of Technology and Stanford University, 2001.
Duflo, Esther, and Emmanuel Saez, "Participation and Investment Decisions in a Retirement Plan: The Influence of Colleagues' Choices," Journal of Public Economics, forthcoming.
Ellison, Glenn, and Drew Fudenberg, "Rules of Thumb for Social Learning," Journal of Political Economy, CI (1993), 612–643.
Ellison, Glenn, and Drew Fudenberg, "Word-of-Mouth Communication and Social Learning," Quarterly Journal of Economics, CX (1995), 93–125.
Erbring, Lutz, Edie Goldenberg, and Arthur Miller, "Front-Page News and Real-World Cues: A New Look at Agenda-Setting by the Media," American Journal of Political Science, XXIV (1980), 16–49.
Evans, William, Wallace Oates, and Robert Schwab, "Measuring Peer Group Effects: A Study of Teenage Behavior," Journal of Political Economy, C (1992), 966–991.
Festinger, Leon, Stanley Schachter, and Kurt Back, Social Pressures in Informal Groups: A Study of Human Factors in Housing (Stanford, CA: Stanford University Press, 1950).
Fiske, Susan, and Shelley Taylor, Social Cognition (Boston, MA: Addison-Wesley, 1984).
French, John, "A Formal Theory of Social Power," Psychological Review, LXIII (1956), 181–194.
Geanakoplos, John, and Heracles Polemarchakis, "We Can't Disagree Forever," Journal of Economic Theory, XXVIII (1982), 192–200.
Gilbert, Daniel, "How Mental Systems Believe," American Psychologist, XLVI (1991), 107–119.
Glaeser, Edward, Bruce Sacerdote, and Jose Scheinkman, "Crime and Social Interactions," Quarterly Journal of Economics, CXI (1996), 507–548.
Harary, Frank, "A Criterion for Unanimity in French's Theory of Social Power," in D. Cartwright, ed., Studies in Social Power (Ann Arbor: University of Michigan Press, 1959), 168–182.
Hawkins, Scott, and Stephen Hoch, "Low-Involvement Learning: Memory without Evaluation," Journal of Consumer Research, XIX (1992), 212–225.
Hong, Harrison, and Jeremy Stein, "Social Interaction and Stock-Market Participation," NBER Working Paper, 2001.
Huckfeldt, Robert, and John Sprague, Citizens, Politics, and Social Communication: Information and Influence in an Election Campaign (Cambridge, UK: Cambridge University Press, 1995).
Iyengar, Shanto, and Donald Kinder, News That Matters: Television and American Opinion (Chicago, IL: University of Chicago Press, 1987).
Katz, Lawrence, Jeffrey Kling, and Jeffrey Liebman, "Moving to Opportunity in Boston: Early Results of a Randomized Mobility Experiment," Quarterly Journal of Economics, CXVI (2001), 607–654.
Lodge, Milton, "Toward a Procedural Model of Candidate Evaluation," in M. Lodge and K. M. McGraw, eds., Political Judgment: Structure and Process (Ann Arbor: University of Michigan Press, 1995).
Madrian, Brigitte, and Dennis Shea, "The Power of Suggestion: Inertia in 401(k) Participation and Savings Behavior," Quarterly Journal of Economics, CXVI (2001), 1149–1187.
Nisbett, Richard, and Lee Ross, Human Inference: Strategies and Shortcomings of Social Judgment (Englewood Cliffs, NJ: Prentice-Hall, 1980).
Parikh, Rohit, and Paul Krasucki, "Communication, Consensus and Knowledge," Journal of Economic Theory, LII (1990), 178–189.
Poole, Keith, and Howard Rosenthal, "Patterns of Congressional Voting," American Journal of Political Science, XXXV (1991), 228–278.
Poole, Keith, and Howard Rosenthal, Congress: A Political-Economic History of Roll Call Voting (Oxford, UK: Oxford University Press, 1997).
Press, James, "Qualitative Controlled Feedback for Forming Group Judgments and Making Decisions," Journal of the American Statistical Association, LXXIII (1978), 526–535.
Spector, David, "Rational Debate and One-Dimensional Conflict," Quarterly Journal of Economics, CXV (2000), 181–200.
Tversky, Amos, and Daniel Kahneman, "Availability: A Heuristic for Judging Frequency and Probability," Cognitive Psychology, V (1973), 207–232.
Varian, Hal, Microeconomic Analysis (New York: Norton, 1992).
Wasserman, Stanley, and Katherine Faust, Social Network Analysis: Methods and Applications (Cambridge, UK: Cambridge University Press, 1994).
Zaller, John, The Nature and Origins of Mass Opinion (Cambridge, UK: Cambridge University Press, 1992).