<eBook>
Regression Analysis : Theory, Methods and Applications / by Ashish K. Sen, Muni S. Srivastava
(Springer Texts in Statistics, ISSN: 2197-4136)
Edition | 1st ed. 1990. |
---|---|
Publisher | (Berlin, Heidelberg : Springer Berlin Heidelberg : Imprint: Springer) |
Publication year | 1990 |
Physical description | XV, 348 p. 5 illus. : online resource |
Author headings | Sen, Ashish K. (author); Srivastava, Muni S. (author); SpringerLink (Online service) |
Subject headings | LCSH: Statistics; LCSH: Mathematical analysis; FREE: Statistical Theory and Methods; FREE: Analysis |
General note | 1 Introduction -- 2 Multiple Regression -- 3 Tests and Confidence Regions -- 4 Indicator Variables -- 5 The Normality Assumption -- 6 Unequal Variances -- 7 Correlated Errors -- 8 Outliers and Influential Observations -- 9 Transformations -- 10 Multicollinearity -- 11 Variable Selection -- 12 Biased Estimation -- A Matrices -- A.1 Addition and Multiplication -- A.2 The Transpose of a Matrix -- A.3 Null and Identity Matrices -- A.4 Vectors -- A.5 Rank of a Matrix -- A.6 Trace of a Matrix -- A.7 Partitioned Matrices -- A.8 Determinants -- A.9 Inverses -- A.10 Characteristic Roots and Vectors -- A.11 Idempotent Matrices -- A.12 The Generalized Inverse -- A.13 Quadratic Forms -- A.14 Vector Spaces -- Problems -- B Random Variables and Random Vectors -- B.1 Random Variables -- B.1.1 Independent Random Variables -- B.1.2 Correlated Random Variables -- B.1.3 Sample Statistics -- B.1.4 Linear Combinations of Random Variables -- B.2 Random Vectors -- B.3 The Multivariate Normal Distribution -- B.4 The Chi-Square Distributions -- B.5 The F and t Distributions -- B.6 Jacobian of Transformations -- B.7 Multiple Correlation -- Problems -- C Nonlinear Least Squares -- C.1 Gauss-Newton Type Algorithms -- C.1.1 The Gauss-Newton Procedure -- C.1.2 Step Halving -- C.1.3 Starting Values and Derivatives -- C.1.4 Marquardt Procedure -- C.2 Some Other Algorithms -- C.2.1 Steepest Descent Method -- C.2.2 Quasi-Newton Algorithms -- C.2.3 The Simplex Method -- C.2.4 Weighting -- C.3 Pitfalls -- C.4 Bias, Confidence Regions and Measures of Fit -- C.5 Examples -- Problems -- Tables -- References -- Author Index |
General note | Any method of fitting equations to data may be called regression. Such equations are valuable for at least two purposes: making predictions and judging the strength of relationships. Because they provide a way of empirically identifying how a variable is affected by other variables, regression methods have become essential in a wide range of fields, including the social sciences, engineering, medical research and business. Of the various methods of performing regression, least squares is the most widely used. In fact, linear least squares regression is by far the most widely used of any statistical technique. Although nonlinear least squares is covered in an appendix, this book is mainly about linear least squares applied to fit a single equation (as opposed to a system of equations). The writing of this book started in 1982. Since then, various drafts have been used at the University of Toronto for teaching a semester-long course to juniors, seniors and graduate students in a number of fields, including statistics, pharmacology, engineering, economics, forestry and the behavioral sciences. Parts of the book have also been used in a quarter-long course given to Master's and Ph.D. students in public administration, urban planning and engineering at the University of Illinois at Chicago (UIC). This experience and the comments and criticisms from students helped forge the final version. HTTP:URL=https://doi.org/10.1007/978-3-662-25092-1 |
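The abstract above singles out linear least squares as the book's main subject. As a minimal illustrative sketch (not taken from the book; NumPy is assumed), fitting a straight line by ordinary least squares amounts to solving the normal equations for the design matrix:

```python
# Illustrative sketch: ordinary linear least squares for y = b0 + b1*x.
# np.linalg.lstsq solves min ||Xb - y||^2, equivalent to the normal
# equations (X'X)b = X'y but computed via a numerically stable factorization.
import numpy as np

def fit_least_squares(x, y):
    """Return [intercept, slope] minimizing the sum of squared residuals."""
    X = np.column_stack([np.ones_like(x), x])  # design matrix with intercept column
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Example: points lying exactly on the line y = 1 + 2x
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 1.0 + 2.0 * x
b0, b1 = fit_least_squares(x, y)
```

With noise-free data on a line, the recovered intercept and slope reproduce the generating coefficients; with real data the same call returns the best-fitting line in the squared-error sense.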
eBook | Location | Material type | Volume | Call number | Status | Reservation | Comment | ISBN | Printing year | Use note | Designated book | Registration number |
---|---|---|---|---|---|---|---|---|---|---|---|---|
eBook | Online | eBook | | | | | Springer eBooks | 9783662250921 | | Electronic resource | | EB00207042 |