Introductory Econometrics: A Modern Approach, 7th Edition, by Jeffrey M. Wooldridge, ISBN-13: 978-1337558860

[PDF eBook eTextbook]

Publisher: Cengage Learning; 7th edition (January 4, 2019)
Language: English
816 pages
ISBN-10: 1337558869
ISBN-13: 978-1337558860

Gain an understanding of how econometrics can answer today’s questions in business, policy evaluation, and forecasting with Wooldridge’s INTRODUCTORY ECONOMETRICS: A MODERN APPROACH, 7E. Unlike traditional texts, this book’s practical yet professional approach demonstrates how econometrics has moved beyond a set of abstract tools to become genuinely useful for answering questions across a variety of disciplines. The author organizes the book’s presentation around the type of data being analyzed, using a systematic approach that introduces assumptions only as they are needed. This makes the material easier to understand and, ultimately, leads to better econometric practice. Packed with relevant applications, the text incorporates more than 100 data sets in different formats. Updates introduce the latest developments in the field, including recent advances in so-called “causal effects” or “treatment effects,” to provide a complete understanding of the impact and importance of econometrics today.

Jeffrey M. Wooldridge is University Distinguished Professor of Economics at Michigan State University, where he has taught since 1991. From 1986 to 1991, he was an assistant professor of economics at the Massachusetts Institute of Technology. He received his bachelor of arts, with majors in computer science and economics, from the University of California, Berkeley, in 1982, and his doctorate in economics from the University of California, San Diego, in 1986. He has published more than 60 articles in internationally recognized journals, as well as several book chapters. He is also the author of Econometric Analysis of Cross Section and Panel Data, second edition. His awards include an Alfred P. Sloan Research Fellowship, the Plura Scripsit award from Econometric Theory, the Sir Richard Stone Prize from the Journal of Applied Econometrics, and three graduate teacher-of-the-year awards from MIT. He is a fellow of the Econometric Society and of the Journal of Econometrics. He is a past editor of the Journal of Business and Economic Statistics and a past econometrics coeditor of Economics Letters. He has served on the editorial boards of Econometric Theory, the Journal of Economic Literature, the Journal of Econometrics, the Review of Economics and Statistics, and the Stata Journal. He has also acted as an occasional econometrics consultant for Arthur Andersen, Charles River Associates, the Washington State Institute for Public Policy, Stratus Consulting, and Industrial Economics, Incorporated.

What makes us different?

• Instant Download
• Always Competitive Pricing
• 100% Privacy
• FREE Sample Available
• 24/7 LIVE Customer Support

Table of Contents:
Contents
Chapter 1: The Nature of Econometrics and Economic Data
1-1 What Is Econometrics?
1-2 Steps in Empirical Economic Analysis
1-3 The Structure of Economic Data
1-3a Cross-Sectional Data
1-3b Time Series Data
1-3c Pooled Cross Sections
1-3d Panel or Longitudinal Data
1-3e A Comment on Data Structures
1-4 Causality, Ceteris Paribus, and Counterfactual Reasoning
Summary
Key Terms
Problems
Computer Exercises
Part 1: Regression Analysis with Cross-Sectional Data
Chapter 2: The Simple Regression Model
2-1 Definition of the Simple Regression Model
2-2 Deriving the Ordinary Least Squares Estimates
2-2a A Note on Terminology
2-3 Properties of OLS on Any Sample of Data
2-3a Fitted Values and Residuals
2-3b Algebraic Properties of OLS Statistics
2-3c Goodness-of-Fit
2-4 Units of Measurement and Functional Form
2-4a The Effects of Changing Units of Measurement on OLS Statistics
2-4b Incorporating Nonlinearities in Simple Regression
2-4c The Meaning of “Linear” Regression
2-5 Expected Values and Variances of the OLS Estimators
2-5a Unbiasedness of OLS
2-5b Variances of the OLS Estimators
2-5c Estimating the Error Variance
2-6 Regression through the Origin and Regression on a Constant
2-7 Regression on a Binary Explanatory Variable
2-7a Counterfactual Outcomes, Causality, and Policy Analysis
Summary
Key Terms
Problems
Computer Exercises
Chapter 3: Multiple Regression Analysis: Estimation
3-1 Motivation for Multiple Regression
3-1a The Model with Two Independent Variables
3-1b The Model with k Independent Variables
3-2 Mechanics and Interpretation of Ordinary Least Squares
3-2a Obtaining the OLS Estimates
3-2b Interpreting the OLS Regression Equation
3-2c On the Meaning of “Holding Other Factors Fixed” in Multiple Regression
3-2d Changing More Than One Independent Variable Simultaneously
3-2e OLS Fitted Values and Residuals
3-2f A “Partialling Out” Interpretation of Multiple Regression
3-2g Comparison of Simple and Multiple Regression Estimates
3-2h Goodness-of-Fit
3-2i Regression through the Origin
3-3 The Expected Value of the OLS Estimators
3-3a Including Irrelevant Variables in a Regression Model
3-3b Omitted Variable Bias: The Simple Case
3-3c Omitted Variable Bias: More General Cases
3-4 The Variance of the OLS Estimators
3-4a The Components of the OLS Variances: Multicollinearity
3-4b Variances in Misspecified Models
3-4c Estimating σ²: Standard Errors of the OLS Estimators
3-5 Efficiency of OLS: The Gauss-Markov Theorem
3-6 Some Comments on the Language of Multiple Regression Analysis
3-7 Several Scenarios for Applying Multiple Regression
3-7a Prediction
3-7b Efficient Markets
3-7c Measuring the Tradeoff between Two Variables
3-7d Testing for Ceteris Paribus Group Differences
3-7e Potential Outcomes, Treatment Effects, and Policy Analysis
Summary
Key Terms
Problems
Computer Exercises
Chapter 4: Multiple Regression Analysis: Inference
4-1 Sampling Distributions of the OLS Estimators
4-2 Testing Hypotheses about a Single Population Parameter: The t Test
4-2a Testing against One-Sided Alternatives
4-2b Two-Sided Alternatives
4-2c Testing Other Hypotheses about βj
4-2d Computing p-Values for t Tests
4-2e A Reminder on the Language of Classical Hypothesis Testing
4-2f Economic, or Practical, versus Statistical Significance
4-3 Confidence Intervals
4-4 Testing Hypotheses about a Single Linear Combination of the Parameters
4-5 Testing Multiple Linear Restrictions: The F Test
4-5a Testing Exclusion Restrictions
4-5b Relationship between F and t Statistics
4-5c The R-Squared Form of the F Statistic
4-5d Computing p-Values for F Tests
4-5e The F Statistic for Overall Significance of a Regression
4-5f Testing General Linear Restrictions
4-6 Reporting Regression Results
4-7 Revisiting Causal Effects and Policy Analysis
Summary
Key Terms
Problems
Computer Exercises
Chapter 5: Multiple Regression Analysis: OLS Asymptotics
5-1 Consistency
5-1a Deriving the Inconsistency in OLS
5-2 Asymptotic Normality and Large Sample Inference
5-2a Other Large Sample Tests: The Lagrange Multiplier Statistic
5-3 Asymptotic Efficiency of OLS
Summary
Key Terms
Problems
Computer Exercises
Chapter 6: Multiple Regression Analysis: Further Issues
6-1 Effects of Data Scaling on OLS Statistics
6-1a Beta Coefficients
6-2 More on Functional Form
6-2a More on Using Logarithmic Functional Forms
6-2b Models with Quadratics
6-2c Models with Interaction Terms
6-2d Computing Average Partial Effects
6-3 More on Goodness-of-Fit and Selection of Regressors
6-3a Adjusted R-Squared
6-3b Using Adjusted R-Squared to Choose between Nonnested Models
6-3c Controlling for Too Many Factors in Regression Analysis
6-3d Adding Regressors to Reduce the Error Variance
6-4 Prediction and Residual Analysis
6-4a Confidence Intervals for Predictions
6-4b Residual Analysis
6-4c Predicting y When log(y) Is the Dependent Variable
6-4d Predicting y When the Dependent Variable Is log(y)
Summary
Key Terms
Problems
Computer Exercises
Chapter 7: Multiple Regression Analysis with Qualitative Information
7-1 Describing Qualitative Information
7-2 A Single Dummy Independent Variable
7-2a Interpreting Coefficients on Dummy Explanatory Variables When the Dependent Variable Is log(y)
7-3 Using Dummy Variables for Multiple Categories
7-3a Incorporating Ordinal Information by Using Dummy Variables
7-4 Interactions Involving Dummy Variables
7-4a Interactions among Dummy Variables
7-4b Allowing for Different Slopes
7-4c Testing for Differences in Regression Functions across Groups
7-5 A Binary Dependent Variable: The Linear Probability Model
7-6 More on Policy Analysis and Program Evaluation
7-6a Program Evaluation and Unrestricted Regression Adjustment
7-7 Interpreting Regression Results with Discrete Dependent Variables
Summary
Key Terms
Problems
Computer Exercises
Chapter 8: Heteroskedasticity
8-1 Consequences of Heteroskedasticity for OLS
8-2 Heteroskedasticity-Robust Inference after OLS Estimation
8-2a Computing Heteroskedasticity-Robust LM Tests
8-3 Testing for Heteroskedasticity
8-3a The White Test for Heteroskedasticity
8-4 Weighted Least Squares Estimation
8-4a The Heteroskedasticity Is Known up to a Multiplicative Constant
8-4b The Heteroskedasticity Function Must Be Estimated: Feasible GLS
8-4c What If the Assumed Heteroskedasticity Function Is Wrong?
8-4d Prediction and Prediction Intervals with Heteroskedasticity
8-5 The Linear Probability Model Revisited
Summary
Key Terms
Problems
Computer Exercises
Chapter 9: More on Specification and Data Issues
9-1 Functional Form Misspecification
9-1a RESET as a General Test for Functional Form Misspecification
9-1b Tests against Nonnested Alternatives
9-2 Using Proxy Variables for Unobserved Explanatory Variables
9-2a Using Lagged Dependent Variables as Proxy Variables
9-2b A Different Slant on Multiple Regression
9-2c Potential Outcomes and Proxy Variables
9-3 Models with Random Slopes
9-4 Properties of OLS under Measurement Error
9-4a Measurement Error in the Dependent Variable
9-4b Measurement Error in an Explanatory Variable
9-5 Missing Data, Nonrandom Samples, and Outlying Observations
9-5a Missing Data
9-5b Nonrandom Samples
9-5c Outliers and Influential Observations
9-6 Least Absolute Deviations Estimation
Summary
Key Terms
Problems
Computer Exercises
Part 2: Regression Analysis with Time Series Data
Chapter 10: Basic Regression Analysis with Time Series Data
10-1 The Nature of Time Series Data
10-2 Examples of Time Series Regression Models
10-2a Static Models
10-2b Finite Distributed Lag Models
10-2c A Convention about the Time Index
10-3 Finite Sample Properties of OLS under Classical Assumptions
10-3a Unbiasedness of OLS
10-3b The Variances of the OLS Estimators and the Gauss-Markov Theorem
10-3c Inference under the Classical Linear Model Assumptions
10-4 Functional Form, Dummy Variables, and Index Numbers
10-5 Trends and Seasonality
10-5a Characterizing Trending Time Series
10-5b Using Trending Variables in Regression Analysis
10-5c A Detrending Interpretation of Regressions with a Time Trend
10-5d Computing R-Squared When the Dependent Variable Is Trending
10-5e Seasonality
Summary
Key Terms
Problems
Computer Exercises
Chapter 11: Further Issues in Using OLS with Time Series Data
11-1 Stationary and Weakly Dependent Time Series
11-1a Stationary and Nonstationary Time Series
11-1b Weakly Dependent Time Series
11-2 Asymptotic Properties of OLS
11-3 Using Highly Persistent Time Series in Regression Analysis
11-3a Highly Persistent Time Series
11-3b Transformations on Highly Persistent Time Series
11-3c Deciding Whether a Time Series Is I(1)
11-4 Dynamically Complete Models and the Absence of Serial Correlation
11-5 The Homoskedasticity Assumption for Time Series Models
Summary
Key Terms
Problems
Computer Exercises
Chapter 12: Serial Correlation and Heteroskedasticity in Time Series Regressions
12-1 Properties of OLS with Serially Correlated Errors
12-1a Unbiasedness and Consistency
12-1b Efficiency and Inference
12-1c Goodness-of-Fit
12-1d Serial Correlation in the Presence of Lagged Dependent Variables
12-2 Serial Correlation–Robust Inference after OLS
12-3 Testing for Serial Correlation
12-3a A t Test for AR(1) Serial Correlation with Strictly Exogenous Regressors
12-3b The Durbin-Watson Test under Classical Assumptions
12-3c Testing for AR(1) Serial Correlation without Strictly Exogenous Regressors
12-3d Testing for Higher-Order Serial Correlation
12-4 Correcting for Serial Correlation with Strictly Exogenous Regressors
12-4a Obtaining the Best Linear Unbiased Estimator in the AR(1) Model
12-4b Feasible GLS Estimation with AR(1) Errors
12-4c Comparing OLS and FGLS
12-4d Correcting for Higher-Order Serial Correlation
12-4e What If the Serial Correlation Model Is Wrong?
12-5 Differencing and Serial Correlation
12-6 Heteroskedasticity in Time Series Regressions
12-6a Heteroskedasticity-Robust Statistics
12-6b Testing for Heteroskedasticity
12-6c Autoregressive Conditional Heteroskedasticity
12-6d Heteroskedasticity and Serial Correlation in Regression Models
Summary
Key Terms
Problems
Computer Exercises
Part 3: Advanced Topics
Chapter 13: Pooling Cross Sections across Time: Simple Panel Data Methods
13-1 Pooling Independent Cross Sections across Time
13-1a The Chow Test for Structural Change across Time
13-2 Policy Analysis with Pooled Cross Sections
13-2a Adding an Additional Control Group
13-2b A General Framework for Policy Analysis with Pooled Cross Sections
13-3 Two-Period Panel Data Analysis
13-3a Organizing Panel Data
13-4 Policy Analysis with Two-Period Panel Data
13-5 Differencing with More Than Two Time Periods
13-5a Potential Pitfalls in First Differencing Panel Data
Summary
Key Terms
Problems
Computer Exercises
Chapter 14: Advanced Panel Data Methods
14-1 Fixed Effects Estimation
14-1a The Dummy Variable Regression
14-1b Fixed Effects or First Differencing?
14-1c Fixed Effects with Unbalanced Panels
14-2 Random Effects Models
14-2a Random Effects or Pooled OLS?
14-2b Random Effects or Fixed Effects?
14-3 The Correlated Random Effects Approach
14-3a Unbalanced Panels
14-4 General Policy Analysis with Panel Data
14-4a Advanced Considerations with Policy Analysis
14-5 Applying Panel Data Methods to Other Data Structures
Summary
Key Terms
Problems
Computer Exercises
Chapter 15: Instrumental Variables Estimation and Two-Stage Least Squares
15-1 Motivation: Omitted Variables in a Simple Regression Model
15-1a Statistical Inference with the IV Estimator
15-1b Properties of IV with a Poor Instrumental Variable
15-1c Computing R-Squared after IV Estimation
15-2 IV Estimation of the Multiple Regression Model
15-3 Two-Stage Least Squares
15-3a A Single Endogenous Explanatory Variable
15-3b Multicollinearity and 2SLS
15-3c Detecting Weak Instruments
15-3d Multiple Endogenous Explanatory Variables
15-3e Testing Multiple Hypotheses after 2SLS Estimation
15-4 IV Solutions to Errors-in-Variables Problems
15-5 Testing for Endogeneity and Testing Overidentifying Restrictions
15-5a Testing for Endogeneity
15-5b Testing Overidentification Restrictions
15-6 2SLS with Heteroskedasticity
15-7 Applying 2SLS to Time Series Equations
15-8 Applying 2SLS to Pooled Cross Sections and Panel Data
Summary
Key Terms
Problems
Computer Exercises
Chapter 16: Simultaneous Equations Models
16-1 The Nature of Simultaneous Equations Models
16-2 Simultaneity Bias in OLS
16-3 Identifying and Estimating a Structural Equation
16-3a Identification in a Two-Equation System
16-3b Estimation by 2SLS
16-4 Systems with More Than Two Equations
16-4a Identification in Systems with Three or More Equations
16-4b Estimation
16-5 Simultaneous Equations Models with Time Series
16-6 Simultaneous Equations Models with Panel Data
Summary
Key Terms
Problems
Computer Exercises
Chapter 17: Limited Dependent Variable Models and Sample Selection Corrections
17-1 Logit and Probit Models for Binary Response
17-1a Specifying Logit and Probit Models
17-1b Maximum Likelihood Estimation of Logit and Probit Models
17-1c Testing Multiple Hypotheses
17-1d Interpreting the Logit and Probit Estimates
17-2 The Tobit Model for Corner Solution Responses
17-2a Interpreting the Tobit Estimates
17-2b Specification Issues in Tobit Models
17-3 The Poisson Regression Model
17-4 Censored and Truncated Regression Models
17-4a Censored Regression Models
17-4b Truncated Regression Models
17-5 Sample Selection Corrections
17-5a When Is OLS on the Selected Sample Consistent?
17-5b Incidental Truncation
Summary
Key Terms
Problems
Computer Exercises
Chapter 18: Advanced Time Series Topics
18-1 Infinite Distributed Lag Models
18-1a The Geometric (or Koyck) Distributed Lag Model
18-1b Rational Distributed Lag Models
18-2 Testing for Unit Roots
18-3 Spurious Regression
18-4 Cointegration and Error Correction Models
18-4a Cointegration
18-4b Error Correction Models
18-5 Forecasting
18-5a Types of Regression Models Used for Forecasting
18-5b One-Step-Ahead Forecasting
18-5c Comparing One-Step-Ahead Forecasts
18-5d Multiple-Step-Ahead Forecasts
18-5e Forecasting Trending, Seasonal, and Integrated Processes
Summary
Key Terms
Problems
Computer Exercises
Chapter 19: Carrying Out an Empirical Project
19-1 Posing a Question
19-2 Literature Review
19-3 Data Collection
19-3a Deciding on the Appropriate Data Set
19-3b Entering and Storing Your Data
19-3c Inspecting, Cleaning, and Summarizing Your Data
19-4 Econometric Analysis
19-5 Writing an Empirical Paper
19-5a Introduction
19-5b Conceptual (or Theoretical) Framework
19-5c Econometric Models and Estimation Methods
19-5d The Data
19-5e Results
19-5f Conclusions
19-5g Style Hints
Summary
Key Terms
Sample Empirical Projects
List of Journals
Data Sources
Math Refresher A: Basic Mathematical Tools
A-1 The Summation Operator and Descriptive Statistics
A-2 Properties of Linear Functions
A-3 Proportions and Percentages
A-4 Some Special Functions and Their Properties
A-4a Quadratic Functions
A-4b The Natural Logarithm
A-4c The Exponential Function
A-5 Differential Calculus
Summary
Key Terms
Problems
Math Refresher B: Fundamentals of Probability
B-1 Random Variables and Their Probability Distributions
B-1a Discrete Random Variables
B-1b Continuous Random Variables
B-2 Joint Distributions, Conditional Distributions, and Independence
B-2a Joint Distributions and Independence
B-2b Conditional Distributions
B-3 Features of Probability Distributions
B-3a A Measure of Central Tendency: The Expected Value
B-3b Properties of Expected Values
B-3c Another Measure of Central Tendency: The Median
B-3d Measures of Variability: Variance and Standard Deviation
B-3e Variance
B-3f Standard Deviation
B-3g Standardizing a Random Variable
B-3h Skewness and Kurtosis
B-4 Features of Joint and Conditional Distributions
B-4a Measures of Association: Covariance and Correlation
B-4b Covariance
B-4c Correlation Coefficient
B-4d Variance of Sums of Random Variables
B-4e Conditional Expectation
B-4f Properties of Conditional Expectation
B-4g Conditional Variance
B-5 The Normal and Related Distributions
B-5a The Normal Distribution
B-5b The Standard Normal Distribution
B-5c Additional Properties of the Normal Distribution
B-5d The Chi-Square Distribution
B-5e The t Distribution
B-5f The F Distribution
Summary
Key Terms
Problems
Math Refresher C: Fundamentals of Mathematical Statistics
C-1 Populations, Parameters, and Random Sampling
C-1a Sampling
C-2 Finite Sample Properties of Estimators
C-2a Estimators and Estimates
C-2b Unbiasedness
C-2c The Sampling Variance of Estimators
C-2d Efficiency
C-3 Asymptotic or Large Sample Properties of Estimators
C-3a Consistency
C-3b Asymptotic Normality
C-4 General Approaches to Parameter Estimation
C-4a Method of Moments
C-4b Maximum Likelihood
C-4c Least Squares
C-5 Interval Estimation and Confidence Intervals
C-5a The Nature of Interval Estimation
C-5b Confidence Intervals for the Mean from a Normally Distributed Population
C-5c A Simple Rule of Thumb for a 95% Confidence Interval
C-5d Asymptotic Confidence Intervals for Nonnormal Populations
C-6 Hypothesis Testing
C-6a Fundamentals of Hypothesis Testing
C-6b Testing Hypotheses about the Mean in a Normal Population
C-6c Asymptotic Tests for Nonnormal Populations
C-6d Computing and Using p-Values
C-6e The Relationship between Confidence Intervals and Hypothesis Testing
C-6f Practical versus Statistical Significance
C-7 Remarks on Notation
Summary
Key Terms
Problems
Advanced Treatment D: Summary of Matrix Algebra
D-1 Basic Definitions
D-2 Matrix Operations
D-2a Matrix Addition
D-2b Scalar Multiplication
D-2c Matrix Multiplication
D-2d Transpose
D-2e Partitioned Matrix Multiplication
D-2f Trace
D-2g Inverse
D-3 Linear Independence and Rank of a Matrix
D-4 Quadratic Forms and Positive Definite Matrices
D-5 Idempotent Matrices
D-6 Differentiation of Linear and Quadratic Forms
D-7 Moments and Distributions of Random Vectors
D-7a Expected Value
D-7b Variance-Covariance Matrix
D-7c Multivariate Normal Distribution
D-7d Chi-Square Distribution
D-7e t Distribution
D-7f F Distribution
Summary
Key Terms
Problems
Advanced Treatment E: The Linear Regression Model in Matrix Form
E-1 The Model and Ordinary Least Squares Estimation
E-1a The Frisch-Waugh Theorem
E-2 Finite Sample Properties of OLS
E-3 Statistical Inference
E-4 Some Asymptotic Analysis
E-4a Wald Statistics for Testing Multiple Hypotheses
Summary
Key Terms
Problems
Answers to Going Further Questions
Statistical Tables
References
Glossary
Index