Tools for Statistical Inference : Methods for the Exploration of Posterior Distributions and Likelihood Functions (Springer Series in Statistics) (3 SUB)

  • Web store price: ¥25,205 (base ¥22,914)
  • Springer Verlag (published July 1996)
  • List price: US$129.99
  • Binding: Hardcover / 207 p.
  • Language: ENG
  • Product code (ISBN-13): 9780387946887
  • DDC classification: 519.542

Full Description

This book provides a unified introduction to a variety of computational algorithms for Bayesian and likelihood inference. In this third edition, I have attempted to expand the treatment of many of the techniques discussed. I have added some new examples, as well as included recent results. Exercises have been added at the end of each chapter. Prerequisites for this book include an understanding of mathematical statistics at the level of Bickel and Doksum (1977), some understanding of the Bayesian approach as in Box and Tiao (1973), some exposure to statistical models as found in McCullagh and Nelder (1989), and for Section 6.6 some experience with conditional inference at the level of Cox and Snell (1989). I have chosen not to present proofs of convergence or rates of convergence for the Metropolis algorithm or the Gibbs sampler, since these may require substantial background in Markov chain theory that is beyond the scope of this book. However, references to these proofs are given. There has been an explosion of papers in the area of Markov chain Monte Carlo in the past ten years. I have attempted to identify key references, though due to the volatility of the field some work may have been missed.
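The Gibbs sampler mentioned in the description draws from a joint distribution by cycling through its full conditionals. As a minimal illustration (not code from the book), the sketch below samples a standard bivariate normal with correlation `rho`, where each full conditional is the univariate normal `N(rho * other, 1 - rho**2)`; the function name and target are assumptions chosen for demonstration.

```python
import math
import random

def gibbs_bivariate_normal(rho, n_iter, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Full conditionals:
        x | y ~ N(rho * y, 1 - rho**2)
        y | x ~ N(rho * x, 1 - rho**2)
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)  # conditional standard deviation
    x, y = 0.0, 0.0                  # arbitrary starting point
    draws = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)   # update x from its full conditional
        y = rng.gauss(rho * x, sd)   # update y from its full conditional
        draws.append((x, y))
    return draws

draws = gibbs_bivariate_normal(rho=0.8, n_iter=5000)

# Discard burn-in, then estimate the sample correlation (should be near 0.8).
tail = draws[500:]
mx = sum(x for x, _ in tail) / len(tail)
my = sum(y for _, y in tail) / len(tail)
cov = sum((x - mx) * (y - my) for x, y in tail) / len(tail)
vx = sum((x - mx) ** 2 for x, _ in tail) / len(tail)
vy = sum((y - my) ** 2 for _, y in tail) / len(tail)
corr = cov / math.sqrt(vx * vy)
```

Because the successive draws are a Markov chain rather than independent samples, early iterations are discarded as burn-in before computing summaries.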

Contents

1. Introduction. Exercises.
2. Normal Approximations to Likelihoods and to Posteriors.
   2.1. Likelihood/Posterior Density.
   2.2. Specification of the Prior.
   2.3. Maximum Likelihood.
   2.4. Normal-Based Inference.
   2.5. The δ-Method (Propagation of Errors).
   2.6. Highest Posterior Density Regions.
   Exercises.
3. Nonnormal Approximations to Likelihoods and Posteriors.
   3.1. Numerical Integration.
   3.2. Posterior Moments and Marginalization Based on Laplace's Method.
   3.3. Monte Carlo Methods.
   Exercises.
4. The EM Algorithm.
   4.1. Introduction.
   4.2. Theory.
   4.3. EM in the Exponential Family.
   4.4. Standard Errors in the Context of EM.
   4.5. Monte Carlo Implementation of the E-Step.
   4.6. Acceleration of EM (Louis' Turbo EM).
   4.7. Facilitating the M-Step.
   Exercises.
5. The Data Augmentation Algorithm.
   5.1. Introduction and Motivation.
   5.2. Computing and Sampling from the Predictive Distribution.
   5.3. Calculating the Content and Boundary of the HPD Region.
   5.4. Remarks on the General Implementation of the Data Augmentation Algorithm.
   5.5. Overview of the Convergence Theory of Data Augmentation.
   5.6. Poor Man's Data Augmentation Algorithms.
   5.7. Sampling/Importance Resampling (SIR).
   5.8. General Imputation Methods.
   5.9. Further Importance Sampling Ideas.
   5.10. Sampling in the Context of Multinomial Data.
   Exercises.
6. Markov Chain Monte Carlo: The Gibbs Sampler and the Metropolis Algorithm.
   6.1. Introduction to the Gibbs Sampler.
   6.2. Examples.
   6.3. Assessing Convergence of the Chain.
   6.4. The Griddy Gibbs Sampler.
   6.5. The Metropolis Algorithm.
   6.6. Conditional Inference via the Gibbs Sampler.
   Exercises.
References.
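The Metropolis algorithm covered in Chapter 6 explores a posterior by proposing local moves and accepting or rejecting them based on the target density ratio. The sketch below is a generic random-walk Metropolis sampler applied to a standard normal target; it is an illustrative assumption, not the book's own implementation, and the function names and step size are chosen for demonstration.

```python
import math
import random

def metropolis(log_target, x0, n_iter, step=1.0, seed=0):
    """Random-walk Metropolis sampler.

    Proposes x' = x + step * Normal(0, 1) and accepts with probability
    min(1, target(x') / target(x)), computed on the log scale for stability.
    """
    rng = random.Random(seed)
    x = x0
    lp = log_target(x)
    draws = []
    accepted = 0
    for _ in range(n_iter):
        xp = x + step * rng.gauss(0.0, 1.0)    # symmetric proposal
        lpp = log_target(xp)
        if math.log(rng.random()) < lpp - lp:  # Metropolis acceptance test
            x, lp = xp, lpp
            accepted += 1
        draws.append(x)                        # rejected moves repeat x
    return draws, accepted / n_iter

# Target: unnormalized log density of a standard normal.
draws, acc_rate = metropolis(lambda x: -0.5 * x * x,
                             x0=5.0, n_iter=10000, step=2.0)

# Discard burn-in, then check the chain's mean and variance.
tail = draws[1000:]
mean = sum(tail) / len(tail)
var = sum((d - mean) ** 2 for d in tail) / len(tail)
```

Note that only an unnormalized density is needed, since the normalizing constant cancels in the acceptance ratio; this is what makes the method useful for posteriors known only up to proportionality.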