
Applied statistical decision theory

Author: Raiffa, Howard; Schlaifer, Robert
Series: Wiley classics library
Publisher: Wiley, 2000
Language: English
Description: 356 p. : Graphs/Ill. ; 23 cm.
ISBN: 047138349X
Type of document: Book
Item type: Book
Current location: Europe Campus, Main Collection
Call number: QA279.4 .R35 2000
Status: Available
Barcode: 001229503
Total holds: 0

Digitized

Applied Statistical Decision Theory

Contents

Foreword
Preface and Introduction

Part I: Experimentation and Decision: General Theory

1. The Problem and the Two Basic Modes of Analysis
   1. Description of the Decision Problem: 1: The basic data; 2: Assessment of probability measures; 3: Example; 4: The general decision problem as a game.
   2. Analysis in Extensive Form: 1: Backwards induction; 2: Example.
   3. Analysis in Normal Form: 1: Decision rules; 2: Performance, error, and utility characteristics; 3: Example; 4: Equivalence of the extensive and normal form; 5: Bayesian decision theory as a completion of classical theory; 6: Informal choice of a decision rule.
   4. Combination of Formal and Informal Analysis: 1: Unknown costs; cutting the decision tree; 2: Incomplete analysis of the decision tree; 3: Example.
   5. Prior Weights and Consistent Behavior

2. Sufficient Statistics and Noninformative Stopping
   1. Introduction: 1: Simplifying assumptions; 2: Bayes' theorem; kernels.
   2. Sufficiency: 1: Bayesian definition of sufficiency; 2: Identification of sufficient statistics; 3: Equivalence of the Bayesian and classical definitions of sufficiency; 4: Nuisance parameters and marginal sufficiency.
   3. Noninformative Stopping: 1: Data-generating processes and stopping processes; 2: Likelihood of a sample; 3: Noninformative stopping processes; 4: Contrast between the Bayesian and classical treatments of stopping; 5: Summary.

3. Conjugate Prior Distributions
   1. Introduction; Assumptions and Definitions: 1: Desiderata for a family of prior distributions; 2: Sufficient statistics of fixed dimensionality.
   2. Conjugate Prior Distributions: 1: Use of the sample kernel as a prior kernel; 2: The posterior distribution when the prior distribution is natural-conjugate; 3: Extension of the domain of the parameter; 4: Extension by introduction of a new parameter; 5: Conspectus of natural-conjugate densities.
   3. Choice and Interpretation of a Prior Distribution: 1: Distributions fitted to historical relative frequencies; 2: Distributions fitted to subjective betting odds; 3: Comparison of the weights of prior and sample evidence; 4: "Quantity of information" and "vague" opinions; 5: Sensitivity analysis; 6: Scientific reporting.
   4. Analysis in Extensive Form when the Prior Distribution and Sample Likelihood are Conjugate: 1: Definitions of terminal and preposterior analysis; 2: Terminal analysis; 3: Preposterior analysis.

Part II: Extensive-Form Analysis When Sampling and Terminal Utilities Are Additive

4. Additive Utility, Opportunity Loss, and the Value of Information: Introduction to Part II
   1. Basic Assumptions
   2. Applicability of Additive Utilities
   3. Computation of Expected Utility
   4. Opportunity Loss: 1: Definition of opportunity loss; 2: Extensive-form analysis using opportunity loss instead of utility; 3: Opportunity loss when terminal and sampling utilities are additive; 4: Direct assessment of terminal opportunity losses; 5: Upper bounds on optimal sample size.
   5. The Value of Information: 1: The value of perfect information; 2: The value of sample information and the net gain of sampling; 3: Summary of relations among utilities, opportunity losses, and value of information.

5A. Linear Terminal Analysis
   1. Introduction: 1: The transformed state description ω; 2: Terminal analysis.
   2. Expected Value of Perfect Information when ω is Scalar: 1: Two-action problems; 2: Finite-action problems; 3: Evaluation of linear-loss integrals; 4: Examples.
   3. Preposterior Analysis: 1: The posterior mean as a random variable; 2: The expected value of sample information.
   4. The Prior Distribution of the Posterior Mean ω̄″ for Given e: 1: Mean and variance of ω̄″; 2: Limiting behavior of the distribution; 3: Limiting behavior of integrals when ω is scalar; 4: Exact distributions of ω̄″; 5: Approximations to the distribution of ω̄″; 6: Examples.
   5. Optimal Sample Size in Two-Action Problems when the Sample Observations are Normal and Their Variance is Known: 1: Definitions and notation; 2: Behavior of net gain as a function of sample size; 3: Optimal sample size; 4: Asymptotic behavior of optimal sample size; 5: Asymptotic behavior of opportunity loss; 6: Fixed element in sampling cost.
   6. Optimal Sample Size in Two-Action Problems when the Sample Observations are Binomial: 1: Definitions and notation; 2: Behavior of the EVSI as a function of n; 3: Behavior of the net gain of sampling; optimal sample size; 4: A normal approximation to optimal sample size.

5B. Selection of the Best of Several Processes
   7. Introduction; Basic Assumptions
   8. Analysis in Terms of Differential Utility: 1: Notation: the random variables v and δ; 2: Analysis in terms of v and δ; 3: The usefulness of differential utility.
   9. Distribution of δ′ and δ″ when the Processes are Independent Normal and Utility is Linear in v: 1: Basic assumptions; notation; 2: Conjugate distribution of v; 3: Distribution of δ′; 4: Distribution of δ″ when all processes are to be sampled; 5: Distribution of δ″ when some processes are not to be sampled.
   10. Value of Information and Optimal Size when There are Two Independent Normal Processes: 1: EVPI; 2: EVSI; 3: Optimal allocation of a fixed experimental budget; 4: Optimal sample size when h is known and only one process is to be sampled; 5: Optimal sample size when h is known and both processes are to be sampled according to l+ or l-; 6: The general problem of optimal sample size when h is known.
   11. Value of Information when There are Three Independent Normal Processes: 1: The basic integral in the nondegenerate case; 2: Transformation to a unit-spherical distribution; 3: Evaluation of the EVI by numerical integration; 4: Evaluation of the EVI by bivariate Normal tables when h is known; 5: Bounds on the EVI; 6: Example; 7: EVI when the prior expected utilities are equal; 8: EVSI when only one process is to be sampled.
   12. Value of Information when There are More than Three Independent Normal Processes: 1: The nondegenerate case; 2: The degenerate case; 3: Choice of the optimal experiment.

6. Problems in Which the Act and State Spaces Coincide
   1. Introduction: 1: Basic assumptions; 2: Example.
   2. Certainty Equivalents and Point Estimation: 1: Certainty equivalents; 2: Example; 3: General theory of certainty equivalents; 4: Approximation of Xi; 5: Subjective evaluation of Xi; 6: Rough and ready estimation; 7: Multipurpose estimation.
   3. Quadratic Terminal Opportunity Loss: 1: Terminal analysis; 2: Preposterior analysis; 3: Optimal sample size.
   4. Linear Terminal Opportunity Loss: 1: Terminal analysis; 2: Preposterior analysis; 3: Optimal sample size.
   5. Modified Linear and Quadratic Loss Structures

Part III: Distribution Theory

7. Univariate Normalized Mass and Density Functions
   0. Introduction: 1: Normalized mass and density functions; 2: Cumulative functions; 3: Moments; 4: Expectations and variances; 5: Integrand transformations; 6: Effect of linear transformations on moments.
   A. Natural Univariate Mass and Density Functions
      1. Binomial Function
      2. Pascal Function
      3. Beta Functions: 1: Standardized beta function; 2: Beta function in alternate notation.
      4. Inverted Beta Functions: 1: Inverted-beta-1 function; 2: Inverted-beta-2 function; 3: F function.
      5. Poisson Function
      6. Gamma Functions: 1: Standardized gamma function; 2: Gamma-1 function; 3: Chi-square function; 4: Gamma-2 function.
      7. Inverted Gamma Functions: 1: Inverted-gamma-1 function; 2: Inverted-gamma-2 function.
      8. Normal Functions: 1: Standardized Normal functions; 2: General Normal functions.
   B. Compound Univariate Mass and Density Functions
      9. Student Functions: 1: Standardized Student function; 2: General Student function.
      10. Negative-Binomial Function
      11. Beta-Binomial and Beta-Pascal Functions: 1: Relations with the hypergeometric function; 2: Computation of the cumulative beta-binomial and beta-Pascal functions.

8. Multivariate Normalized Density Functions
   0. Introduction: 1: Matrix and vector notation; 2: Inverses of matrices; 3: Positive-definite and positive-semidefinite matrices; 4: Projections; 5: Notation for multivariate densities and integrals; 6: Moments; expectations and variances.
   1. Unit-Spherical Normal Function: 1: Conditional and marginal densities; 2: Tables.
   2. General Normal Function: 1: Conditional and marginal densities; 2: Tables; 3: Linear combinations of normal random variables.
   3. Student Function: 1: Conditional and marginal densities; 2: Linear combinations of Student random variables.
   4. Inverted-Student Function

9. Bernoulli Process
   1. Prior and Posterior Analysis: 1: Definition of a Bernoulli process; 2: Likelihood of a sample; 3: Conjugate distribution of p; 4: Conjugate distribution of 1/p = ρ.
   2. Sampling Distributions and Preposterior Analysis: Binomial Sampling: 1: Definition of binomial sampling; 2: Conditional distribution of (r|p); 3: Unconditional distribution of r; 4: Distribution of p̄″; 5: Distribution of p″.
   3. Sampling Distributions and Preposterior Analysis: Pascal Sampling: 1: Definition of Pascal sampling; 2: Conditional distribution of (n|p); 3: Unconditional distribution of n; 4: Distribution of p̄″; 5: Distribution of p″.

10. Poisson Process
   1. Prior and Posterior Analysis: 1: Definition of a Poisson process; 2: Likelihood of a sample; 3: Conjugate distribution of λ; 4: Conjugate distribution of 1/λ = μ.
   2. Sampling Distributions and Preposterior Analysis: Gamma Sampling: 1: Definition of gamma sampling; 2: Conditional distribution of (t|λ); 3: Unconditional distribution of t; 4: Distribution of λ̄″; 5: Distribution of λ″.
   3. Sampling Distributions and Preposterior Analysis: Poisson Sampling: 1: Definition of Poisson sampling; 2: Conditional distribution of (r|λ); 3: Unconditional distribution of r; 4: Distribution of λ̄″; 5: Distribution of λ″.

11. Independent Normal Process
   A. Mean Known
      1. Prior and Posterior Analysis: 1: Definition of an independent Normal process; 2: Likelihood of a sample when μ is known; 3: Conjugate distribution of h; 4: Conjugate distribution of σ.
      2. Sampling Distributions and Preposterior Analysis with Fixed ν: 1: Conditional distribution of (v|h); 2: Unconditional distribution of v; 3: Distribution of v″.
   B. Precision Known
      3. Prior and Posterior Analysis: 1: Likelihood of a sample when h is known; 2: Conjugate distribution of μ.
      4. Sampling Distributions and Preposterior Analysis with Fixed n: 1: Conditional distribution of (m|μ); 2: Unconditional distribution of m; 3: Distributions of m″ and μ″.
   C. Neither Parameter Known
      5. Prior and Posterior Analysis: 1: Likelihood of a sample when neither parameter is known; 2: Likelihood of the incomplete statistics (m, n) and (v, ν); 3: Distribution of (μ, h); 4: Marginal distribution of h; 5: Marginal distribution of μ; 6: Limiting behavior of the prior distribution.
      6. Sampling Distributions with Fixed n: 1: Conditional joint distribution of (m, v|μ, h); 2: Unconditional joint distribution of (m, v); 3: Unconditional distributions of m and v.
      7. Preposterior Analysis with Fixed n: 1: Joint distribution of (m″, v″); 2: Distributions of m″ and v″; 3: Distribution of μ̄″; 4: Distribution of μ″.

12. Independent Multinormal Process
   A. Precision Known
      1. Prior and Posterior Analysis: 1: Definition of the independent multinormal process; 2: Likelihood of a sample when η is known; 3: Likelihood of a sample when both h and η are known; 4: Conjugate distribution of μ.
      2. Sampling Distributions with Fixed n: 1: Conditional distribution of (m|μ); 2: Unconditional distribution of m.
      3. Preposterior Analysis with Fixed n: 1: Distribution of m″; 2: Distribution of μ″.
   B. Relative Precision Known
      4. Prior and Posterior Analysis: 1: Likelihood of a sample when only η is known; 2: Likelihood of the statistics (m, n) and (v, ν); 3: Conjugate distribution of (μ, h); 4: Distributions of h; 5: Distributions of μ.
      5. Sampling Distributions with Fixed n: 1: Conditional joint distribution of (m, v|μ, h); 2: Unconditional joint distribution of (m, v); 3: Unconditional distributions of m and v.
      6. Preposterior Analysis with Fixed n: 1: Joint distribution of (m″, v″); 2: Distributions of m″ and v″; 3: Distributions of μ̄″ and μ″.
   C. Interrelated Univariate Normal Processes
      7. Introduction
      8. Analysis When All Processes Are Sampled
      9. Analysis When Only p < r Processes Are Sampled: 1: Notation; 2: Posterior analysis; 3: Conditional sampling distributions with fixed n; 4: Marginal distributions of (μi|h) and (μi, h); 5: Unconditional distributions of mi and v; 6: Distributions of mi″ and v″; 7: Preposterior analysis.

13. Normal Regression Process
   1. Introduction: 1: Definition of the normal regression process; 2: Likelihood of a sample; 3: Analogy with the multinormal process.
   A. Precision Known
      2. Prior and Posterior Analysis: 1: Likelihood of a sample when h is known; 2: Distribution of β.
      3. Sampling Distributions with Fixed X: 1: Conditional distribution of (y|β); 2: Unconditional distribution of y; 3: Distributions of b when X is of rank r.
      4. Preposterior Analysis with Fixed X of Rank r: 1: Distribution of b″; 2: Distribution of β″.
   B. Precision Unknown
      5. Prior and Posterior Analysis: 1: Likelihood of a sample when neither β nor h is known; 2: Distribution of (β, h); 3: Marginal and conditional distributions of h; 4: Marginal and conditional distributions of β.
      6. Sampling Distributions with Fixed X: 1: Unconditional distribution of y; 2: Conditional joint distribution of (b, v|β, h); 3: Unconditional distributions of b and v.
      7. Preposterior Analysis with Fixed X of Rank r: 1: Joint distribution of (b″, v″); 2: Distributions of b″ and v″; 3: Distributions of β̄″ and β″.
   C. X′X Singular
      8. Introduction: 1: Definitions and notation.
      9. Distributions of b* and v: 1: Conditional distributions of b* and v; 2: Distributions of (β*|h) and (β*, h); 3: Unconditional distributions of b* and v.
      10. Preposterior Analysis: 1: Utilities dependent on b″ alone; 2: Utilities dependent on (b″, v″) jointly; 3: Distribution of
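The conjugate-prior machinery that organizes Part III can be illustrated with the simplest case catalogued above, the Bernoulli process of Chapter 9: a beta prior on p combined with a binomial sample yields a beta posterior by adding the sample statistics to the prior parameters. The sketch below is illustrative only; it uses the (r, n) beta parametrization, in which the density is proportional to p^(r-1)(1-p)^(n-r-1) and the mean is r/n, and the function names are hypothetical.

```python
# Illustrative sketch (not from the book's text): conjugate beta-binomial
# updating for a Bernoulli process, in the (r, n) beta parametrization.

def beta_binomial_update(r_prior, n_prior, r, n):
    """Posterior beta parameters after observing r successes in n trials:
    the prior-to-posterior update is simple parameter addition."""
    return r_prior + r, n_prior + n

def beta_mean(r_param, n_param):
    """Mean of a beta distribution in the (r, n) parametrization: r / n."""
    return r_param / n_param

# Prior Beta with r' = 2, n' = 10 (prior mean 0.2);
# sample: 7 successes in 20 trials.
r_post, n_post = beta_binomial_update(2, 10, 7, 20)
print(r_post, n_post)             # posterior parameters: 9 30
print(beta_mean(r_post, n_post))  # posterior mean of p: 0.3
```

The posterior mean 0.3 lies between the prior mean 0.2 and the sample frequency 0.35, weighted by the prior and sample "quantities of information" n' = 10 and n = 20.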

Koha 18.11 - INSEAD Catalogue