WACKERLY · MENDENHALL · SCHEAFFER
Mathematical Statistics with Applications, 7th Edition
CONTENTS

Preface xiii
Note to the Student xxi

1 What Is Statistics? 1
1.1 Introduction 1
1.2 Characterizing a Set of Measurements: Graphical Methods 3
1.3 Characterizing a Set of Measurements: Numerical Methods 8
1.4 How Inferences Are Made 13
1.5 Theory and Reality 14
1.6 Summary 15

2 Probability 20
2.1 Introduction 20
2.2 Probability and Inference 21
2.3 A Review of Set Notation 23
2.4 A Probabilistic Model for an Experiment: The Discrete Case 26
2.5 Calculating the Probability of an Event: The Sample-Point Method 35
2.6 Tools for Counting Sample Points 40
2.7 Conditional Probability and the Independence of Events 51
2.8 Two Laws of Probability 57
2.9 Calculating the Probability of an Event: The Event-Composition Method 62
2.10 The Law of Total Probability and Bayes' Rule 70
2.11 Numerical Events and Random Variables 75
2.12 Random Sampling 77
2.13 Summary 79

3 Discrete Random Variables and Their Probability Distributions 86
3.1 Basic Definition 86
3.2 The Probability Distribution for a Discrete Random Variable 87
3.3 The Expected Value of a Random Variable or a Function of a Random Variable 91
3.4 The Binomial Probability Distribution 100
3.5 The Geometric Probability Distribution 114
3.6 The Negative Binomial Probability Distribution (Optional) 121
3.7 The Hypergeometric Probability Distribution 125
3.8 The Poisson Probability Distribution 131
3.9 Moments and Moment-Generating Functions 138
3.10 Probability-Generating Functions (Optional) 143
3.11 Tchebysheff's Theorem 146
3.12 Summary 149

4 Continuous Variables and Their Probability Distributions 157
4.1 Introduction 157
4.2 The Probability Distribution for a Continuous Random Variable 158
4.3 Expected Values for Continuous Random Variables 170
4.4 The Uniform Probability Distribution 174
4.5 The Normal Probability Distribution 178
4.6 The Gamma Probability Distribution 185
4.7 The Beta Probability Distribution 194
4.8 Some General Comments 201
4.9 Other Expected Values 202
4.10 Tchebysheff's Theorem 207
4.11 Expectations of Discontinuous Functions and Mixed Probability Distributions (Optional) 210
4.12 Summary 214

5 Multivariate Probability Distributions 223
5.1 Introduction 223
5.2 Bivariate and Multivariate Probability Distributions 224
5.3 Marginal and Conditional Probability Distributions 235
5.4 Independent Random Variables 247
5.5 The Expected Value of a Function of Random Variables 255
5.6 Special Theorems 258
5.7 The Covariance of Two Random Variables 264
5.8 The Expected Value and Variance of Linear Functions of Random Variables 270
5.9 The Multinomial Probability Distribution 279
5.10 The Bivariate Normal Distribution (Optional) 283
5.11 Conditional Expectations 285
5.12 Summary 290

6 Functions of Random Variables 296
6.1 Introduction 296
6.2 Finding the Probability Distribution of a Function of Random Variables 297
6.3 The Method of Distribution Functions 298
6.4 The Method of Transformations 310
6.5 The Method of Moment-Generating Functions 318
6.6 Multivariable Transformations Using Jacobians (Optional) 325
6.7 Order Statistics 333
6.8 Summary 341
7 Sampling Distributions and the Central Limit Theorem 346
7.1 Introduction 346
7.2 Sampling Distributions Related to the Normal Distribution 353
7.3 The Central Limit Theorem 370
7.4 A Proof of the Central Limit Theorem (Optional) 377
7.5 The Normal Approximation to the Binomial Distribution 378
7.6 Summary 385

8 Estimation 390
8.1 Introduction 390
8.2 The Bias and Mean Square Error of Point Estimators 392
8.3 Some Common Unbiased Point Estimators 396
8.4 Evaluating the Goodness of a Point Estimator 399
8.5 Confidence Intervals 406
8.6 Large-Sample Confidence Intervals 411
8.7 Selecting the Sample Size 421
8.8 Small-Sample Confidence Intervals for μ and μ₁ − μ₂ 425
8.9 Confidence Intervals for σ² 434
8.10 Summary 437

9 Properties of Point Estimators and Methods of Estimation 444
9.1 Introduction 444
9.2 Relative Efficiency 445
9.3 Consistency 448
9.4 Sufficiency 459
9.5 The Rao–Blackwell Theorem and Minimum-Variance Unbiased Estimation 464
9.6 The Method of Moments 472
9.7 The Method of Maximum Likelihood 476
9.8 Some Large-Sample Properties of Maximum-Likelihood Estimators (Optional) 483
9.9 Summary 485