<eBook>
E. T. Jaynes: Papers on Probability, Statistics and Statistical Physics / edited by R.D. Rosenkrantz
(Synthese Library, Studies in Epistemology, Logic, Methodology, and Philosophy of Science. ISSN: 2542-8292 ; 158)

1st ed. 1989.
Publisher  Dordrecht : Springer Netherlands : Imprint: Springer
Publication year  1989
Language  English
Physical description  458 p., 1 illus. : online resource
Author heading  Rosenkrantz, R. D., editor
SpringerLink (Online service)
Subject headings  LCSH:Probabilities
LCSH:System theory
LCSH:Science -- Philosophy
LCSH:Mathematical physics
FREE:Probability Theory
FREE:Complex Systems
FREE:Philosophy of Science
FREE:Theoretical, Mathematical and Computational Physics
General note  1. Introductory Remarks -- 2. Information Theory and Statistical Mechanics, I (1957) -- 3. Information Theory and Statistical Mechanics, II (1957) -- 4. Brandeis Lectures (1963) -- 5. Gibbs vs Boltzmann Entropies (1965) -- 6. Delaware Lecture (1967) -- 7. Prior Probabilities (1968) -- 8. The Well-Posed Problem (1973) -- 9. Confidence Intervals vs Bayesian Intervals (1976) -- 10. Where Do We Stand on Maximum Entropy? (1978) -- 11. Concentration of Distributions at Entropy Maxima (1979) -- 12. Marginalization and Prior Probabilities (1980) -- 13. What is the Question? (1981) -- 14. The Minimum Entropy Production Principle (1980) -- Supplementary Bibliography
The first six chapters of this volume present the author's 'predictive' or 'information theoretic' approach to statistical mechanics, in which the basic probability distributions over microstates are obtained as distributions of maximum entropy (i.e., as distributions that are most non-committal with regard to missing information among all those satisfying the macroscopically given constraints). There is then no need to make additional assumptions of ergodicity or metric transitivity; the theory proceeds entirely by inference from macroscopic measurements and the underlying dynamical assumptions. Moreover, the method of maximizing the entropy is completely general and applies, in particular, to irreversible processes as well as to reversible ones. The next three chapters provide a broader framework - at once Bayesian and objective - for maximum entropy inference. The basic principles of inference, including the usual axioms of probability, are seen to rest on nothing more than requirements of consistency, above all, the requirement that in two problems where we have the same information we must assign the same probabilities. Thus, statistical mechanics is viewed as a branch of a general theory of inference, and the latter as an extension of the ordinary logic of consistency. Those who are familiar with the literature of statistics and statistical mechanics will recognize in both of these steps a genuine 'scientific revolution' - a complete reversal of earlier conceptions - and one of no small significance.
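As a compact illustration of the maximum entropy construction described in the note above (a standard formulation, not quoted from this volume; the microstates x_i, constraint functions f_k, and expectations F_k are generic placeholders), the distribution is obtained by maximizing the Shannon entropy subject to the macroscopic constraints:

% Maximize S = -\sum_i p_i \ln p_i subject to \sum_i p_i = 1
% and \sum_i p_i f_k(x_i) = F_k for k = 1, \dots, m.
% Lagrange multipliers \lambda_k give the exponential (Gibbs) form:
\begin{align*}
  p_i &= \frac{1}{Z(\lambda_1,\dots,\lambda_m)}
        \exp\!\Big(-\sum_{k=1}^{m}\lambda_k f_k(x_i)\Big),
  &
  Z(\lambda_1,\dots,\lambda_m) &= \sum_i \exp\!\Big(-\sum_{k=1}^{m}\lambda_k f_k(x_i)\Big),
\end{align*}
% with the multipliers fixed by the constraints via
% F_k = -\partial \ln Z / \partial \lambda_k .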
HTTP:URL=https://doi.org/10.1007/978-94-009-6581-2
Holdings information

eBook  Online  eBook

Springer eBooks 9789400965812
Electronic resource
EB00235049

Bibliographic details

Material type  eBook
Classification  LCC:QA273.A1-274.9
DC23:519.2
Bibliographic ID  4000135640
ISBN 9789400965812