Preface

“We see that the theory of probability is at bottom only common sense reduced to calculation; it makes us appreciate with exactitude what reasonable minds feel by a sort of instinct, often without being able to account for it. … It is remarkable that this science, which originated in the consideration of games of chance, should have become the most important object of human knowledge. … The most important questions of life are, for the most part, really only problems of probability.”

So said the famous French mathematician and astronomer (the “Newton of France”) Pierre-Simon, Marquis de Laplace. Although many people feel that the famous marquis, who was also one of the great contributors to the development of probability, might have exaggerated somewhat, it is nevertheless true that probability theory has become a tool of fundamental importance to nearly all scientists, engineers, medical practitioners, jurists, and industrialists. In fact, the enlightened individual has learned to ask not “Is it so?” but rather “What is the probability that it is so?”

This book is intended as an elementary introduction to the theory of probability for students in mathematics, statistics, engineering, and the sciences (including computer science, biology, the social sciences, and management science) who possess the prerequisite knowledge of elementary calculus. It attempts to present not only the mathematics of probability theory, but also, through numerous examples, the many diverse possible applications of this subject.

Chapter 1 presents the basic principles of combinatorial analysis, which are most useful in computing probabilities. Chapter 2 handles the axioms of probability theory and shows how they can be applied to compute various probabilities of interest. Chapter 3 deals with the extremely important subjects of conditional probability and independence of events.
By a series of examples, we illustrate how conditional probabilities come into play not only when some partial information is available, but also as a tool to enable us to compute probabilities more easily, even when no partial information is present. This extremely important technique of obtaining probabilities by “conditioning” reappears in Chapter 7, where we use it to obtain expectations.

The concept of random variables is introduced in Chapters 4, 5, and 6. Discrete random variables are dealt with in Chapter 4, continuous random variables in Chapter 5, and jointly distributed random variables in Chapter 6. The important concepts of the expected value and the variance of a random variable are introduced in Chapters 4 and 5, and these quantities are then determined for many of the common types of random variables.

Additional properties of the expected value are considered in Chapter 7. Many examples illustrating the usefulness of the result that the expected value of a sum of random variables is equal to the sum of their expected values are presented. Sections on conditional expectation, including its use in prediction, and on moment-generating functions are contained in this chapter. In addition, the final section introduces the multivariate normal distribution and presents a simple proof concerning the joint distribution of the sample mean and sample variance of a sample from a normal distribution.
Chapter 8 presents the major theoretical results of probability theory. In particular, we prove the strong law of large numbers and the central limit theorem. Our proof of the strong law is a relatively simple one which assumes that the random variables have a finite fourth moment, and our proof of the central limit theorem assumes Levy’s continuity theorem. This chapter also presents such probability inequalities as Markov’s inequality, Chebyshev’s inequality, and Chernoff bounds. The final section of Chapter 8 gives a bound on the error involved when a probability concerning a sum of independent Bernoulli random variables is approximated by the corresponding probability of a Poisson random variable having the same expected value.

Chapter 9 presents some additional topics, such as Markov chains, the Poisson process, and an introduction to information and coding theory, and Chapter 10 considers simulation.

As in the previous edition, three sets of exercises are given at the end of each chapter. They are designated as Problems, Theoretical Exercises, and Self-Test Problems and Exercises. This last set of exercises, for which complete solutions appear in Solutions to Self-Test Problems and Exercises, is designed to help students test their comprehension and study for exams.

CHANGES IN THE NEW EDITION

The eighth edition continues the evolution and fine tuning of the text. It includes new problems, exercises, and text material chosen both for its inherent interest and for its use in building student intuition about probability. Illustrative of these goals are Example 5d of Chapter 1 on knockout tournaments, and Examples 4k and 5i of Chapter 7 on multiple player gambler’s ruin problems.

A key change in the current edition is that the important result that the expectation of a sum of random variables is equal to the sum of the expectations is now first presented in Chapter 4 (rather than Chapter 7 as in previous editions).
A new and elementary proof of this result when the sample space of the probability experiment is finite is given in this chapter.

Another change is the expansion of Section 6.3, which deals with the sum of independent random variables. Section 6.3.1 is a new section in which we derive the distribution of the sum of independent and identically distributed uniform random variables, and then use our results to show that the expected number of random numbers that needs to be added for their sum to exceed 1 is equal to e. Section 6.3.5 is a new section in which we derive the distribution of the sum of independent geometric random variables with different means.

ACKNOWLEDGMENTS

I am grateful for the careful work of Hossein Hamedani in checking the text for accuracy. I also appreciate the thoughtfulness of the following people who have taken the time to contact me with comments for improving the text: Amir Ardestani, Polytechnic University of Teheran; Joe Blitzstein, Harvard University; Peter Nuesch, University of Lausanne; Joseph Mitchell, SUNY, Stony Brook; Alan Chambless, actuary; Robert Kriner; Israel David, Ben-Gurion University; T. Lim, George Mason University; Wei Chen, Rutgers; D. Monrad, University of Illinois; W. Rosenberger, George Mason University; E. Ionides, University of Michigan; J. Corvino, Lafayette College; T. Seppalainen, University of Wisconsin.

Finally, I would like to thank the following reviewers for their many helpful comments. Reviewers of the eighth edition are marked with an asterisk.
K. B. Athreya, Iowa State University
Richard Bass, University of Connecticut
Robert Bauer, University of Illinois at Urbana-Champaign
Phillip Beckwith, Michigan Tech
Arthur Benjamin, Harvey Mudd College
Geoffrey Berresford, Long Island University
Baidurya Bhattacharya, University of Delaware
Howard Bird, St. Cloud State University
Shahar Boneh, Metropolitan State College of Denver
Jean Cadet, State University of New York at Stony Brook
Steven Chiappari, Santa Clara University
Nicolas Christou, University of California, Los Angeles
James Clay, University of Arizona at Tucson
Francis Conlan, University of Santa Clara
*Justin Corvino, Lafayette College
Jay DeVore, California Polytechnic University, San Luis Obispo
Scott Emerson, University of Washington
Thomas R. Fischer, Texas A & M University
Anant Godbole, Michigan Technical University
Zakkula Govindarajulu, University of Kentucky
Richard Groeneveld, Iowa State University
Mike Hardy, Massachusetts Institute of Technology
Bernard Harris, University of Wisconsin
Larry Harris, University of Kentucky
David Heath, Cornell University
Stephen Herschkorn, Rutgers University
Julia L. Higle, University of Arizona
Mark Huber, Duke University
*Edward Ionides, University of Michigan
Anastasia Ivanova, University of North Carolina
Hamid Jafarkhani, University of California, Irvine
Chuanshu Ji, University of North Carolina, Chapel Hill
Robert Keener, University of Michigan
Fred Leysieffer, Florida State University
Thomas Liggett, University of California, Los Angeles
Helmut Mayer, University of Georgia
Bill McCormick, University of Georgia
Ian McKeague, Florida State University
R. Miller, Stanford University
*Ditlev Monrad, University of Illinois
Robb J. Muirhead, University of Michigan
Joe Naus, Rutgers University
Nhu Nguyen, New Mexico State University
Ellen O’Brien, George Mason University
N. U. Prabhu, Cornell University
Kathryn Prewitt, Arizona State University
Jim Propp, University of Wisconsin
*William F. Rosenberger, George Mason University
Myra Samuels, Purdue University
I. R. Savage, Yale University
Art Schwartz, University of Michigan at Ann Arbor
Therese Shelton, Southwestern University
Malcolm Sherman, State University of New York at Albany
Murad Taqqu, Boston University
Eli Upfal, Brown University
Ed Wheeler, University of Tennessee
Allen Webster, Bradley University

S. R.
smross@usc.edu
CHAPTER 1

Combinatorial Analysis

1.1 INTRODUCTION
1.2 THE BASIC PRINCIPLE OF COUNTING
1.3 PERMUTATIONS
1.4 COMBINATIONS
1.5 MULTINOMIAL COEFFICIENTS
1.6 THE NUMBER OF INTEGER SOLUTIONS OF EQUATIONS

1.1 INTRODUCTION

Here is a typical problem of interest involving probability: A communication system is to consist of n seemingly identical antennas that are to be lined up in a linear order. The resulting system will then be able to receive all incoming signals—and will be called functional—as long as no two consecutive antennas are defective. If it turns out that exactly m of the n antennas are defective, what is the probability that the resulting system will be functional? For instance, in the special case where n = 4 and m = 2, there are 6 possible system configurations, namely,

0110  0101  1010  0011  1001  1100

where 1 means that the antenna is working and 0 that it is defective. Because the resulting system will be functional in the first 3 arrangements and not functional in the remaining 3, it seems reasonable to take 3/6 = 1/2 as the desired probability. In the case of general n and m, we could compute the probability that the system is functional in a similar fashion. That is, we could count the number of configurations that result in the system’s being functional and then divide by the total number of all possible configurations.

From the preceding discussion, we see that it would be useful to have an effective method for counting the number of ways that things can occur. In fact, many problems in probability theory can be solved simply by counting the number of different ways that a certain event can occur. The mathematical theory of counting is formally known as combinatorial analysis.

1.2 THE BASIC PRINCIPLE OF COUNTING

The basic principle of counting will be fundamental to all our work.
Loosely put, it states that if one experiment can result in any of m possible outcomes and if another experiment can result in any of n possible outcomes, then there are mn possible outcomes of the two experiments.
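This principle can be checked by direct enumeration. The following is a minimal sketch (not part of the text); the particular values of m and n are illustrative only.

```python
from itertools import product

# Basic principle of counting: if experiment 1 has m possible outcomes
# and experiment 2 has n possible outcomes, the pair of experiments has
# m * n possible outcomes.  (m = 4 and n = 7 are arbitrary choices.)
m, n = 4, 7

# Enumerate every outcome pair (i, j), mirroring the m-by-n array of
# pairs used in the proof of the principle.
outcomes = list(product(range(1, m + 1), range(1, n + 1)))

print(len(outcomes))  # 28, i.e., m * n
```

Listing the pairs row by row, as `product` does, is exactly the enumeration argument used to prove the principle.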
The basic principle of counting

Suppose that two experiments are to be performed. Then if experiment 1 can result in any one of m possible outcomes and if, for each outcome of experiment 1, there are n possible outcomes of experiment 2, then together there are mn possible outcomes of the two experiments.

Proof of the Basic Principle: The basic principle may be proven by enumerating all the possible outcomes of the two experiments; that is,

(1, 1), (1, 2), ..., (1, n)
(2, 1), (2, 2), ..., (2, n)
...
(m, 1), (m, 2), ..., (m, n)

where we say that the outcome is (i, j) if experiment 1 results in its ith possible outcome and experiment 2 then results in its jth possible outcome. Hence, the set of possible outcomes consists of m rows, each containing n elements. This proves the result.

EXAMPLE 2a
A small community consists of 10 women, each of whom has 3 children. If one woman and one of her children are to be chosen as mother and child of the year, how many different choices are possible?

Solution. By regarding the choice of the woman as the outcome of the first experiment and the subsequent choice of one of her children as the outcome of the second experiment, we see from the basic principle that there are 10 · 3 = 30 possible choices.

When there are more than two experiments to be performed, the basic principle can be generalized.

The generalized basic principle of counting

If r experiments that are to be performed are such that the first one may result in any of n1 possible outcomes; and if, for each of these n1 possible outcomes, there are n2 possible outcomes of the second experiment; and if, for each of the possible outcomes of the first two experiments, there are n3 possible outcomes of the third experiment; and if ..., then there is a total of n1 · n2 · · · nr possible outcomes of the r experiments.

EXAMPLE 2b
A college planning committee consists of 3 freshmen, 4 sophomores, 5 juniors, and 2 seniors.
A subcommittee of 4, consisting of 1 person from each class, is to be chosen. How many different subcommittees are possible?
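By the generalized basic principle, the count is 3 · 4 · 5 · 2 = 120. The enumeration behind that product can be sketched as follows; the rosters below are hypothetical stand-ins, with only the class sizes taken from the example.

```python
from itertools import product

# Hypothetical rosters; only the sizes (3, 4, 5, 2) come from the example.
freshmen   = ["F1", "F2", "F3"]
sophomores = ["S1", "S2", "S3", "S4"]
juniors    = ["J1", "J2", "J3", "J4", "J5"]
seniors    = ["Sr1", "Sr2"]

# A subcommittee picks one person from each class, so by the generalized
# basic principle there are 3 * 4 * 5 * 2 possible subcommittees.
subcommittees = list(product(freshmen, sophomores, juniors, seniors))

print(len(subcommittees))  # 120
```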