Contents

10 Simulation 438
10.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 438
10.2 General Techniques for Simulating Continuous Random Variables . . 440
10.2.1 The Inverse Transformation Method . . . . . . . . . . . . . . . 441
10.2.2 The Rejection Method . . . . . . . . . . . . . . . . . . . . . . . 442
10.3 Simulating from Discrete Distributions . . . . . . . . . . . . . . . . . . 447
10.4 Variance Reduction Techniques . . . . . . . . . . . . . . . . . . . . . . 449
10.4.1 Use of Antithetic Variables . . . . . . . . . . . . . . . . . . . . 450
10.4.2 Variance Reduction by Conditioning . . . . . . . . . . . . . . . 451
10.4.3 Control Variates . . . . . . . . . . . . . . . . . . . . . . . . . . 452
Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 453
Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 453
Self-Test Problems and Exercises . . . . . . . . . . . . . . . . . . . . . 455
Reference . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 455
Answers to Selected Problems 457
Solutions to Self-Test Problems and Exercises 461
Index 521
Preface "We see that the theory of probability is at bottom only common sense reduced to calculation;it makes us appreciate with exactitude what reasonable minds feel by a sort of instinct,often without being able to account for it....It is remarkable that this science,which originated in the consideration of games of chance,should have become the most important object of human knowledge.. I he most importan So said tons of e are.tor the most part,really only problems of probabilty. n and astronomer (the wton of France")Pierre rquis de aplace.A who was alsc hough many people eel tha ne la of the great contr hty,migh what,it is n that proba strialists.In fact. ask not "Is it so2”but rather“What is the obability that it is so? This hook is intended a introduction to the the of for students in mathematics statistics e ering and the sciences (including c puter science,biology,the social sciences,and management science)who possess the prerequisite knowledge of elementary calculus.It attempts to present not only the mathematics of probability theory,but also,through numerous examples,the many diverse possible applications of this subject. Chapter 1 presents the basic principles of combinatorial analysis,which are most useful in com nputing probabilities. Chapter 2 handles the axioms of probability theory and shows how they can be applied to compute various probabilities of interest. Chapter 3 deals with the extremely important subjects of conditional probability pendence of events.By a se ries of examples.we il ustrate how conditional ial informati as a too more casily,even bta s by "co reappears in Chapter where we use it to obt expectations crete Chapter 5.and iointly distributed random variables in Ch er 6.Thei Chapters 4 and 5,and these quantities are then determined for many of the common types of random variables. Additional properties of the expected value are considered in Chapter 7.Many examples illustrating the usefulness of the result that the expected value of a sum of random variables is equal to the sum of their expected values are presented.Sections on conditional expectation,including its use in prediction,and on moment-generating functions are contained in this chapter.In addition,the final section introduces the multivariate normal distribution and presents a simple proof concerning the joint distribution of the sample mean and sample variance of a sample from a normal distribution
Preface

“We see that the theory of probability is at bottom only common sense reduced to calculation; it makes us appreciate with exactitude what reasonable minds feel by a sort of instinct, often without being able to account for it. ... It is remarkable that this science, which originated in the consideration of games of chance, should have become the most important object of human knowledge. ... The most important questions of life are, for the most part, really only problems of probability.”

So said the famous French mathematician and astronomer (the “Newton of France”) Pierre-Simon, Marquis de Laplace. Although many people feel that the famous marquis, who was also one of the great contributors to the development of probability, might have exaggerated somewhat, it is nevertheless true that probability theory has become a tool of fundamental importance to nearly all scientists, engineers, medical practitioners, jurists, and industrialists. In fact, the enlightened individual has learned to ask not “Is it so?” but rather “What is the probability that it is so?”

This book is intended as an elementary introduction to the theory of probability for students in mathematics, statistics, engineering, and the sciences (including computer science, biology, the social sciences, and management science) who possess the prerequisite knowledge of elementary calculus. It attempts to present not only the mathematics of probability theory, but also, through numerous examples, the many diverse possible applications of this subject.

Chapter 1 presents the basic principles of combinatorial analysis, which are most useful in computing probabilities.

Chapter 2 handles the axioms of probability theory and shows how they can be applied to compute various probabilities of interest.

Chapter 3 deals with the extremely important subjects of conditional probability and independence of events. By a series of examples, we illustrate how conditional probabilities come into play not only when some partial information is available, but also as a tool to enable us to compute probabilities more easily, even when no partial information is present. This extremely important technique of obtaining probabilities by “conditioning” reappears in Chapter 7, where we use it to obtain expectations.

The concept of random variables is introduced in Chapters 4, 5, and 6. Discrete random variables are dealt with in Chapter 4, continuous random variables in Chapter 5, and jointly distributed random variables in Chapter 6. The important concepts of the expected value and the variance of a random variable are introduced in Chapters 4 and 5, and these quantities are then determined for many of the common types of random variables.

Additional properties of the expected value are considered in Chapter 7. Many examples illustrating the usefulness of the result that the expected value of a sum of random variables is equal to the sum of their expected values are presented. Sections on conditional expectation, including its use in prediction, and on moment-generating functions are contained in this chapter. In addition, the final section introduces the multivariate normal distribution and presents a simple proof concerning the joint distribution of the sample mean and sample variance of a sample from a normal distribution.
Chapter 8 presents the major theoretical results of probability theory. In particular, we prove the strong law of large numbers and the central limit theorem. Our proof of the strong law is a relatively simple one which assumes that the random variables have a finite fourth moment, and our proof of the central limit theorem assumes Lévy’s continuity theorem. This chapter also presents such probability inequalities as Markov’s inequality, Chebyshev’s inequality, and Chernoff bounds. The final section of Chapter 8 gives a bound on the error involved when a probability concerning a sum of independent Bernoulli random variables is approximated by the corresponding probability of a Poisson random variable having the same expected value.

Chapter 9 presents some additional topics, such as Markov chains, the Poisson process, and an introduction to information and coding theory, and Chapter 10 considers simulation.

As in the previous edition, three sets of exercises are given at the end of each chapter. They are designated as Problems, Theoretical Exercises, and Self-Test Problems and Exercises. This last set of exercises, for which complete solutions appear in Solutions to Self-Test Problems and Exercises, is designed to help students test their comprehension and study for exams.

CHANGES IN THE NEW EDITION

The eighth edition continues the evolution and fine-tuning of the text. It includes new problems, exercises, and text material chosen both for its inherent interest and for its use in building student intuition about probability. Illustrative of these goals are Example 5d of Chapter 1 on knockout tournaments, and Examples 4k and 5i of Chapter 7 on multiple-player gambler’s ruin problems.

A key change in the current edition is that the important result that the expectation of a sum of random variables is equal to the sum of the expectations is now first presented in Chapter 4 (rather than in Chapter 7 as in previous editions). A new and elementary proof of this result when the sample space of the probability experiment is finite is given in this chapter.

Another change is the expansion of Section 6.3, which deals with the sum of independent random variables. Section 6.3.1 is a new section in which we derive the distribution of the sum of independent and identically distributed uniform random variables, and then use our results to show that the expected number of random numbers that need to be added for their sum to exceed 1 is equal to e. Section 6.3.5 is a new section in which we derive the distribution of the sum of independent geometric random variables with different means.
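The result just quoted, that on average e uniform random numbers must be added before their running sum exceeds 1, is easy to check empirically. The following Monte Carlo sketch in Python is illustrative only (it is not part of the text, and the function names are our own):

import random

def draws_until_sum_exceeds_one():
    # Add independent uniform(0, 1) numbers until the running total
    # exceeds 1; return how many draws were required.
    total = 0.0
    draws = 0
    while total <= 1.0:
        total += random.random()
        draws += 1
    return draws

trials = 100_000
average = sum(draws_until_sum_exceeds_one() for _ in range(trials)) / trials
print(average)  # converges to e = 2.71828... as the number of trials grows

With 100,000 trials the printed average typically agrees with e to about two decimal places.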
ACKNOWLEDGMENTS

I am grateful for the careful work of Hossein Hamedani in checking the text for accuracy. I also appreciate the thoughtfulness of the following people who have taken the time to contact me with comments for improving the text: Amir Ardestani, Polytechnic University of Teheran; Joe Blitzstein, Harvard University; Peter Nuesch, University of Lausanne; Joseph Mitchell, SUNY, Stony Brook; Alan Chambless, actuary; Robert Kriner; Israel David, Ben-Gurion University; T. Lim, George Mason University; Wei Chen, Rutgers; D. Monrad, University of Illinois; W. Rosenberger, George Mason University; E. Ionides, University of Michigan; J. Corvino, Lafayette College; T. Seppalainen, University of Wisconsin.

Finally, I would like to thank the following reviewers for their many helpful comments. Reviewers of the eighth edition are marked with an asterisk.
K. B. Athreya, Iowa State University
Richard Bass, University of Connecticut
Robert Bauer, University of Illinois at Urbana-Champaign
Phillip Beckwith, Michigan Tech
Arthur Benjamin, Harvey Mudd College
Geoffrey Berresford, Long Island University
Baidurya Bhattacharya, University of Delaware
Howard Bird, St. Cloud State University
Shahar Boneh, Metropolitan State College of Denver
Jean Cadet, State University of New York at Stony Brook
Steven Chiappari, Santa Clara University
Nicolas Christou, University of California, Los Angeles
James Clay, University of Arizona at Tucson
Francis Conlan, University of Santa Clara
*Justin Corvino, Lafayette College
Jay DeVore, California Polytechnic University, San Luis Obispo
Scott Emerson, University of Washington
Thomas R. Fischer, Texas A & M University
Anant Godbole, Michigan Technical University
Zakkula Govindarajulu, University of Kentucky
Richard Groeneveld, Iowa State University
Mike Hardy, Massachusetts Institute of Technology
Bernard Harris, University of Wisconsin
Larry Harris, University of Kentucky
David Heath, Cornell University
Stephen Herschkorn, Rutgers University
Julia L. Higle, University of Arizona
Mark Huber, Duke University
*Edward Ionides, University of Michigan
Anastasia Ivanova, University of North Carolina
Hamid Jafarkhani, University of California, Irvine
Chuanshu Ji, University of North Carolina, Chapel Hill
Robert Keener, University of Michigan
Fred Leysieffer, Florida State University
Thomas Liggett, University of California, Los Angeles
Helmut Mayer, University of Georgia
Bill McCormick, University of Georgia
Ian McKeague, Florida State University
R. Miller, Stanford University
*Ditlev Monrad, University of Illinois
Robb J. Muirhead, University of Michigan
Joe Naus, Rutgers University
Nhu Nguyen, New Mexico State University
Ellen O’Brien, George Mason University
N. U. Prabhu, Cornell University
Kathryn Prewitt, Arizona State University
Jim Propp, University of Wisconsin
*William F. Rosenberger, George Mason University
Myra Samuels, Purdue University
I. R. Savage, Yale University
Art Schwartz, University of Michigan at Ann Arbor
Therese Shelton, Southwestern University
Malcolm Sherman, State University of New York at Albany
Murad Taqqu, Boston University
Eli Upfal, Brown University
Ed Wheeler, University of Tennessee
Allen Webster, Bradley University

S. R.
smross@usc.edu
CHAPTER 1

Combinatorial Analysis

1.1 INTRODUCTION
1.2 THE BASIC PRINCIPLE OF COUNTING
1.3 PERMUTATIONS
1.4 COMBINATIONS
1.5 MULTINOMIAL COEFFICIENTS
1.6 THE NUMBER OF INTEGER SOLUTIONS OF EQUATIONS

1.1 INTRODUCTION

Here is a typical problem of interest involving probability: A communication system is to consist of n seemingly identical antennas that are to be lined up in a linear order. The resulting system will then be able to receive all incoming signals—and will be called functional—as long as no two consecutive antennas are defective. If it turns out that exactly m of the n antennas are defective, what is the probability that the resulting system will be functional? For instance, in the special case where n = 4 and m = 2, there are 6 possible system configurations, namely,

0110  0101  1010  0011  1001  1100

where 1 means that the antenna is working and 0 that it is defective. Because the resulting system will be functional in the first 3 arrangements and not functional in the remaining 3, it seems reasonable to take 3/6 = 1/2 as the desired probability. In the case of general n and m, we could compute the probability that the system is functional in a similar fashion. That is, we could count the number of configurations that result in the system's being functional and then divide by the total number of all possible configurations.

From the preceding discussion, we see that it would be useful to have an effective method for counting the number of ways that things can occur. In fact, many problems in probability theory can be solved simply by counting the number of different ways that a certain event can occur. The mathematical theory of counting is formally known as combinatorial analysis.

1.2 THE BASIC PRINCIPLE OF COUNTING

The basic principle of counting will be fundamental to all our work. Loosely put, it states that if one experiment can result in any of m possible outcomes and if another experiment can result in any of n possible outcomes, then there are mn possible outcomes of the two experiments.
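As a complement to the counting approach described above, the antenna problem from Section 1.1 can be checked by brute force. The following short Python sketch (an illustration of the counting argument, not part of the text) enumerates every placement of the m defective antennas among n positions and counts the configurations in which no two consecutive antennas are defective:

from itertools import combinations

def is_functional(config):
    # A configuration is functional when no two consecutive antennas
    # are defective, i.e., the 0-1 string contains no "00".
    return "00" not in config

def functional_count(n, m):
    # Enumerate all placements of m defective antennas (0s) among n
    # positions and count how many configurations stay functional.
    total = 0
    functional = 0
    for defective in combinations(range(n), m):
        config = "".join("0" if i in defective else "1" for i in range(n))
        total += 1
        functional += is_functional(config)
    return functional, total

good, total = functional_count(4, 2)
print(good, total)  # prints 3 6, giving the probability 3/6 = 1/2

For n = 4 and m = 2 this reproduces the 3 functional arrangements out of 6 listed above; for larger n and m it carries out exactly the count-and-divide procedure suggested in the text.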