The Small-World Phenomenon: An Algorithmic Perspective*

Jon Kleinberg†

Abstract

Long a matter of folklore, the "small-world phenomenon" -- the principle that we are all linked by short chains of acquaintances -- was inaugurated as an area of experimental study in the social sciences through the pioneering work of Stanley Milgram in the 1960s. This work was among the first to make the phenomenon quantitative, allowing people to speak of the "six degrees of separation" between any two people in the United States. Since then, a number of network models have been proposed as frameworks in which to study the problem analytically. One of the most refined of these models was formulated in recent work of Watts and Strogatz; their framework provided compelling evidence that the small-world phenomenon is pervasive in a range of networks arising in nature and technology, and a fundamental ingredient in the evolution of the World Wide Web. But existing models are insufficient to explain the striking algorithmic component of Milgram's original findings: that individuals using local information are collectively very effective at actually constructing short paths between two points in a social network. Although recently proposed network models are rich in short paths, we prove that no decentralized algorithm, operating with local information only, can construct short paths in these networks with non-negligible probability. We then define an infinite family of network models that naturally generalizes the Watts-Strogatz model, and show that for one of these models, there is a decentralized algorithm capable of finding short paths with high probability. More generally, we provide a strong characterization of this family of network models, showing that there is in fact a unique model within the family for which decentralized algorithms are effective.

*A version of this work appears as Cornell Computer Science Technical Report 99-1776 (October 1999).

†Department of Computer Science, Cornell University, Ithaca NY 14853. Email: kleinber@cs.cornell.edu. Supported in part by a David and Lucile Packard Foundation Fellowship, an Alfred P. Sloan Research Fellowship, an ONR Young Investigator Award, and NSF Faculty Early Career Development Award CCR-9701399.

1 Introduction

The Small-World Phenomenon. A social network exhibits the small-world phenomenon if, roughly speaking, any two individuals in the network are likely to be connected through a short sequence of intermediate acquaintances. This has long been the subject of anecdotal observation and folklore; often we meet a stranger and discover that we have an acquaintance in common. It has since grown into a significant area of study in the social sciences, in large part through a series of striking experiments conducted by Stanley Milgram and his co-workers in the 1960s [13, 18, 12]. Recent work has suggested that the phenomenon is pervasive in networks arising in nature and technology, and a fundamental ingredient in the structural evolution of the World Wide Web [17, 19, 2].
Milgram's basic small-world experiment remains one of the most compelling ways to think about the problem. The goal of the experiment was to find short chains of acquaintances linking pairs of people in the United States who did not know one another. In a typical instance of the experiment, a source person in Nebraska would be given a letter to deliver to a target person in Massachusetts. The source would initially be told basic information about the target, including his address and occupation; the source would then be instructed to send the letter to someone she knew on a first-name basis in an effort to transmit the letter to the target as efficaciously as possible. Anyone subsequently receiving the letter would be given the same instructions, and the chain of communication would continue until the target was reached. Over many trials, the average number of intermediate steps in a successful chain was found to lie between five and six, a quantity that has since entered popular culture as the "six degrees of separation" principle [7].
Modeling the Phenomenon. Naturally, the empirical validation of the phenomenon has led to a rash of analytical work aimed at answering the following general question:

(*) Why should there exist short chains of acquaintances linking together arbitrary pairs of strangers?

Most of the early work on this issue, beginning with analysis of Pool and Kochen that pre-dated Milgram's experiments [16], was based on versions of the following explanation: random networks have low diameter. (See for example the book of surveys edited by Kochen [11].) That is, if every individual in the United States were to have a small number of acquaintances selected uniformly at random from the population -- and if acquaintanceship were symmetric -- then two random individuals would be linked by a short chain with high probability. Even this early work recognized the limitations of a uniform random model; if A and B are two individuals with a common friend, it is much more likely that they themselves are friends. But at the same time, a network of acquaintanceships that is too "clustered" will not have the low diameter that Milgram's experiments indicated.

Recently, Watts and Strogatz proposed a model for the small-world phenomenon based on a class of random networks that interpolates between these two extremes, in which the edges of the network are divided into "local" and "long-range" contacts [19]. The paradigmatic example they studied was a "re-wired ring lattice," constructed roughly as follows. One starts with a set V of n points spaced uniformly on a circle, and joins each point by an edge to each of its k nearest neighbors, for a small constant k. These are the "local contacts" in the network. One then introduces a small number of edges in which the endpoints are chosen uniformly at random from V -- the "long-range contacts".

Watts and Strogatz argued that such a model captures two crucial parameters of social networks: there is a simple underlying structure that explains the presence of most edges, but a few edges are produced by a random process that does not respect this structure. Their networks thus have low diameter (like uniform random networks), but also have the property that many of the neighbors of a node u are themselves neighbors (unlike uniform random networks). They showed that a number of naturally arising networks exhibit this pair of properties (including the connections among neurons in the nematode species C. elegans, and the power grid of the Western U.S.); and their approach has been applied to the analysis of the hyperlink graph of the World Wide Web as well [1].

Networks that are formed from a superposition of a "structured subgraph" and a "random subgraph" have been investigated in the area of probabilistic combinatorics. In a fundamental instance of such an approach, Bollobás and Chung [5] gave bounds on the diameter of the random graph obtained by adding a random matching to the nodes of a cycle. (See also [6].)
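To make the preceding construction concrete, here is a minimal sketch, not taken from the paper, of one way to generate such a superposition of a structured ring lattice and a random subgraph. The function name rewired_ring_lattice and its parameters are illustrative assumptions; edges are treated as undirected and k is assumed even.

import random

def rewired_ring_lattice(n, k, num_long_range, seed=None):
    """Sketch of a Watts-Strogatz-style network: n points on a circle, each
    joined to its k nearest neighbors (k/2 on each side; k assumed even),
    plus a few long-range edges with endpoints chosen uniformly at random."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    # Structured subgraph: the ring lattice of local contacts.
    for v in range(n):
        for step in range(1, k // 2 + 1):
            adj[v].add((v + step) % n)
            adj[v].add((v - step) % n)
    # Random subgraph: long-range contacts chosen uniformly at random from V.
    for _ in range(num_long_range):
        u, w = rng.sample(range(n), 2)
        adj[u].add(w)
        adj[w].add(u)
    return adj

For example, rewired_ring_lattice(1000, 4, 20) yields a network consisting mostly of local structure with a handful of random shortcuts, in the spirit of the regime Watts and Strogatz studied.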
The Present Work. Let us return to Milgram's experiment. We claim that it really contains two fundamentally surprising discoveries: first, that such short chains should exist in the network of acquaintanceships; and second, that people should be able to find these chains knowing so little about the target individual. From an analytical point of view, the first of these discoveries is existential in nature, the second algorithmic -- it reveals that individuals who know only the locations of their direct acquaintances can still, collectively, construct a short path between two points in the network. We therefore propose to study the following natural companion to Question (*) above:

(**) Why should arbitrary pairs of strangers be able to find short chains of acquaintances that link them together?

It is important to note that Question (**) raises issues that lie truly beyond the scope of Question (*): one can imagine networks in which short chains exist, but no mechanism based on purely local information is able to find them. The success of Milgram's experiment suggests a source of latent navigational "cues" embedded in the underlying social network, by which a message could implicitly be guided quickly from source to target. It is natural to ask what properties a social network must possess in order for it to exhibit such cues, and enable its members to find short chains through it.

In this work, we study "decentralized" algorithms by which individuals, knowing only the locations of their direct acquaintances, attempt to transmit a message from a source to a target along a short path. Our central findings are the following.

• First, we show that existing models are insufficient to explain the success of such decentralized algorithms in finding short paths through a social network. In a class of networks generated according to the model of Watts and Strogatz, we prove that there is no decentralized algorithm capable of constructing paths of small expected length (relative to the diameter of the underlying network).

• We then define an infinite family of random network models that naturally generalizes the Watts-Strogatz model. We show that for one of these models, there is a decentralized algorithm capable of finding short paths with high probability.

• Finally, we prove the stronger statement that there is in fact a unique model within the family for which decentralized algorithms are effective.
The Model: Networks and Decentralized Algorithms. We now give precise definitions for our network model and our notion of a decentralized algorithm; we then provide formal statements of the main results.

In designing our network model, we seek a simple framework that encapsulates the paradigm of Watts and Strogatz -- rich in local connections, with a few long-range connections. Rather than using a ring as the basic structure, however, we begin from a two-dimensional grid and allow for edges to be directed. Thus, we begin with a set of nodes (representing individuals in the social network) that are identified with the set of lattice points in an n × n square, {(i,j) : i ∈ {1, 2, ..., n}, j ∈ {1, 2, ..., n}}, and we define the lattice distance between two nodes (i,j) and (k,ℓ) to be the number of "lattice steps" separating them: d((i,j), (k,ℓ)) = |k - i| + |ℓ - j|. For a universal constant p ≥ 1, the node u has a directed edge to every other node within lattice distance p -- these are its local contacts. For universal constants q ≥ 0 and r ≥ 0, we also construct directed edges from u to q other nodes (the long-range contacts) using independent random trials; the i-th directed edge from u has endpoint v with probability proportional to [d(u,v)]^{-r}. (To obtain a probability distribution, we divide this quantity by the appropriate normalizing constant Σ_v [d(u,v)]^{-r}; we will call this the inverse r-th power distribution.)

This model has a simple "geographic" interpretation: individuals live on a grid and know their neighbors for some number of steps in all directions; they also have some number of acquaintances distributed more broadly across the grid. Viewing p and q as fixed constants, we obtain a one-parameter family of network models by tuning the value of the exponent r. When r = 0, we have the uniform distribution over long-range contacts, the distribution used in the basic network model of Watts and Strogatz -- one's long-range contacts are chosen independently of their position on the grid. As r increases, the long-range contacts of a node become more and more clustered in its vicinity on the grid. Thus, r serves as a basic structural parameter measuring how widely "networked" the underlying society of nodes is.

The algorithmic component of the model is based on Milgram's experiment. We start with two arbitrary nodes in the network, denoted s and t; the goal is to transmit a message from s to t in as few steps as possible. We study decentralized algorithms, mechanisms whereby the message is passed sequentially from a current message holder to one of its (local or long-range) contacts, using only local information. In particular, the message holder u in a given step has knowledge of

(i) the set of local contacts among all nodes (i.e. the underlying grid structure);

(ii) the location, on the lattice, of the target t; and

(iii) the locations and long-range contacts of all nodes that have come in contact with the message.

Crucially, u does not have knowledge of the long-range contacts of nodes that have not touched the message. Given this, u must choose one of its contacts v, and forward the message to this contact. The expected delivery time of a decentralized algorithm -- a primary figure of merit in our analysis -- is the expected number of steps taken by the algorithm to deliver the message over a network generated according to an inverse r-th power distribution, from a source to a target chosen uniformly at random from the set of nodes.
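For illustration only (this is not the paper's code), the following sketch samples a single long-range contact for a node u on the n × n grid from the inverse r-th power distribution; a node's q long-range contacts would be q independent such draws, and the function names are assumptions.

import random

def lattice_distance(u, v):
    """Lattice distance d(u, v) = |u1 - v1| + |u2 - v2|."""
    return abs(u[0] - v[0]) + abs(u[1] - v[1])

def sample_long_range_contact(u, n, r, rng=random):
    """Choose one long-range contact for node u = (i, j): each node v != u is
    selected with probability proportional to d(u, v)**(-r). random.choices
    divides by the sum of the weights, which plays the role of the
    normalizing constant in the text."""
    others = [(i, j) for i in range(1, n + 1) for j in range(1, n + 1) if (i, j) != u]
    weights = [lattice_distance(u, v) ** (-r) for v in others]
    return rng.choices(others, weights=weights, k=1)[0]

With r = 0 the weights are all equal, recovering the uniform distribution; as r increases, the draws concentrate near u on the grid, matching the description above.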
Of course, constraining the algorithm to use only local information is crucial to our model; if one had full global knowledge of the local and long-range contacts of all nodes in the network, the shortest chain between two nodes could be computed simply by breadth-first search.

The reader may worry that assumption (iii) above gives a decentralized algorithm too much power. However, it only strengthens our results: our lower bounds will hold even for algorithms that are given this knowledge, while our upper bounds make use of decentralized algorithms that only require assumptions (i) and (ii).

Statement of Results. Our results explore the way in which the structure of the network affects the ability of a decentralized algorithm to construct a short path. When r = 0 -- the uniform distribution over long-range contacts -- standard results from random graph theory can be used to show that with high probability there exist paths between every pair of nodes whose lengths are bounded by a polynomial in log n, exponentially smaller than the total number of nodes. However, there is no way for a decentralized algorithm to find these chains:

Theorem 1 There is a constant α_0, depending on p and q but independent of n, so that when r = 0, the expected delivery time of any decentralized algorithm is at least α_0 n^{2/3}. (Hence exponential in the expected minimum path length.)

As the parameter r increases, a decentralized algorithm can take more advantage of the "geographic structure" implicit in the long-range contacts; at the same time, long-range contacts become less useful in moving the message a large distance. There is a value of r where this trade-off can be best exploited algorithmically; this is r = 2, the inverse-square distribution.

Theorem 2 There is a decentralized algorithm A and a constant α_2, independent of n, so that when r = 2 and p = q = 1, the expected delivery time of A is at most α_2 (log n)^2.

This pair of theorems reflects a fundamental consequence of our model.
When long-range contacts are formed independently of the geometry of the grid, short chains will exist but the nodes, operating at a local level, will not be able to find them. When long-range contacts are formed by a process that is related to the geometry of the grid in a specific way, however, then short chains will still form and nodes operating with local knowledge will be able to construct them.

Figure 1: (A) A two-dimensional grid network with n = 6, p = 1, and q = 0. (B) The contacts of a node u with p = 1 and q = 2; v and w are the two long-range contacts.

We now comment on the ideas underlying the proofs of these results; the full details are given in the subsequent sections. The decentralized algorithm A that achieves the bound of Theorem 2 is the following simple rule: in each step, the current message-holder u chooses a contact that is as close to the target t as possible, in the sense of lattice distance. Note that algorithm A makes use of even less information than is allowed by our general model: the current message holder does not need to know anything about the set of previous message holders. To analyze an execution of algorithm A, we say that it is in phase j if the lattice distance from the current message holder to the target is between 2^j and 2^{j+1}. We show that in phase j, the expected time before the current message holder has a long-range contact within lattice distance 2^j of t is bounded proportionally to log n; at this point, phase j will come to an end. As there are at most 1 + log n phases, a bound proportional to (log n)^2 follows. Interestingly, the analysis matches our intuition, and Milgram's description, of how a short chain is found in real life: "The geographic movement of the [message] from Nebraska to Massachusetts is striking. There is a progressive closing in on the target area as each new person is added to the chain" [13].
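As a concrete rendering of this rule (my own sketch, not code from the paper), the routine below forwards the message greedily by lattice distance. It reuses the lattice_distance helper from the earlier sketch and assumes dictionaries local_contacts and long_range_contacts mapping each node to its contacts; these names are illustrative.

def greedy_route(source, target, local_contacts, long_range_contacts):
    """Algorithm A: in each step, pass the message to the contact (local or
    long-range) closest to the target in lattice distance. With p >= 1 some
    local contact is always strictly closer, so the loop terminates; the
    return value is the number of steps, i.e. the delivery time."""
    current, steps = source, 0
    while current != target:
        contacts = list(local_contacts[current]) + list(long_range_contacts[current])
        current = min(contacts, key=lambda v: lattice_distance(v, target))
        steps += 1
    return steps

As noted above, this rule never consults the history of previous message holders, so it uses even less information than the general model permits.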
The impossibility result of Theorem 1 is based, fundamentally, on the fact that the uniform distribution prevents a decentralized algorithm from using any "clues" provided by the geometry of the grid. Roughly, we consider the set U of all nodes within lattice distance n^{2/3} of t. With high probability, the source s will lie outside of U, and if the message is never passed from a node to a long-range contact in U, the number of steps needed to reach t will be at least proportional to n^{2/3}. But the probability that any message holder has a long-range contact in U is roughly n^{-2/3}, so the expected number of steps before a long-range contact in U is found is at least proportional to n^{2/3} as well.

More generally, we can show a strong characterization theorem for this family of models: r = 2 is the only value for which there is a decentralized algorithm capable of producing chains whose length is a polynomial in log n:

Theorem 3 (a) Let 0 ≤ r < 2. There is a constant α_r, depending on p, q, r, but independent of n, so that the expected delivery time of any decentralized algorithm is at least α_r n^{(2-r)/3}.
(b) Let r > 2. There is a constant α_r, depending on p, q, r, but independent of n, so that the expected delivery time of any decentralized algorithm is at least α_r n^{(r-2)/(r-1)}.

The complete proof of this theorem is given in Section 3. The proof of (a) is analogous to that of Theorem 1. The proof of (b), on the other hand, exposes a "dual" obstacle for decentralized algorithms: with a large value of r, it takes a significant amount of time before the message reaches a node with a long-range contact that is far away in lattice distance. This effectively limits the "speed" at which the message can travel from s to t.

Although we have focused on the two-dimensional grid, our analysis can be applied more broadly. We can generalize our results to k-dimensional lattice networks, for constant values of k, as well as less structured graphs with analogous scaling properties. In the k-dimensional case, a decentralized algorithm can construct paths of length polynomial in log n if and only if r = k.
Figure 2: The lower bound implied by Theorem 3. The x-axis is the value of the clustering exponent r; the y-axis is the resulting exponent on n, i.e. the lower bound T on delivery time given as log_n T.

The results suggest a fundamental network property, distinct from diameter, that helps to explain the success of small-world experiments. One could think of it as the "transmission rate" of a class of networks: the minimum expected delivery time of any decentralized algorithm operating in a random network drawn from this class. Thus we see that minimizing the transmission rate of a network is not necessarily the same as minimizing its diameter. This may seem counter-intuitive at first, but in fact it formalizes a notion raised initially -- in addition to having short paths, a network should contain latent structural cues that can be used to guide a message towards a target. The dependence of long-range connections on the geometry of the lattice is providing precisely such implicit information.

Other Related Work. There has been work aimed at modeling the way in which individuals in Milgram's experiments chose recipients for their letters. Some of this work is related in spirit to what we do here, though using very different perspectives and models. Killworth and Bernard [10], in their "reverse small-world experiments," asked a set of respondents to explain how they chose to send letters in a run of the small-world experiment, and used this information to look for common principles at an empirical level. At an analytical level, White [20] investigated the probability that a chain would "die out" through an individual's failure to participate, and Hunter and Shotland [8] studied the passage of a chain through different social "categories." In the context of a "referral system" for the World Wide Web, Kautz, Selman, and Shah [17] ran simulations of communication in an abstract social network in which each individual was given pre-defined accuracy and responsiveness parameters. The distinction between the mere existence of short paths linking points on the World Wide Web, and the ability of agents to find them, has also been raised recently in work of Albert, Jeong, and Barabási [2, 9].

2 Upper Bound for the Inverse-Square Distribution

We now present proofs of the theorems discussed in the previous section. When we analyze a decentralized algorithm, we can adopt the following equivalent formulation of the model, which will make the exposition easier. Although our model considers all long-range contacts as being generated initially, at random, we invoke the "Principle of Deferred Decisions" -- a common mechanism for analyzing randomized algorithms [14] -- and assume that the long-range contacts of a node v are generated only when the message first reaches v. Since a decentralized algorithm does not learn the long-range contacts of v until the message reaches v, this formulation is equivalent for the purposes of analysis.

A comment on the notation: log n denotes the logarithm base 2, while ln n denotes the natural logarithm, base e.

Proof of Theorem 2. Since p = q = 1, we have a network in which each node u is connected to its four nearest neighbors in the lattice (two or three neighbors in the case of nodes on the boundary), and has a single long-range contact v. The probability that u chooses v as its long-range contact is d(u, v)^{-2} / Σ_{v≠u} d(u, v)^{-2},
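The proof is cut off here at a page boundary. As an illustrative aside (a sketch of a standard counting bound, not necessarily the exact derivation the paper continues with), the normalizing constant can be bounded by noting that there are at most 4j nodes at lattice distance exactly j from u, and that the largest possible lattice distance in the n × n grid is 2n - 2:

Σ_{v≠u} d(u, v)^{-2} ≤ Σ_{j=1}^{2n-2} (4j) · j^{-2} = 4 Σ_{j=1}^{2n-2} 1/j ≤ 4 + 4 ln(2n - 2),

so u chooses any particular node v as its long-range contact with probability at least d(u, v)^{-2} / (4 + 4 ln(2n - 2)), i.e. only a logarithmic factor below the unnormalized weight.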