**Probability, Statistics, and Random Processes for Engineers 4th Edition by Henry Stark, ISBN-13: 978-0132311236**

[PDF eBook eTextbook]

- Publisher: Pearson; 4th edition (August 10, 2011)
- Language: English
- 704 pages
- ISBN-10: 0132311232
- ISBN-13: 978-0132311236

For courses in Probability and Random Processes.

*Probability, Statistics, and Random Processes for Engineers, 4e* is a useful text for electrical and computer engineers. This book is a comprehensive treatment of probability and random processes that, more than any other available source, combines rigor with accessibility. Beginning with the fundamentals of probability theory and requiring only college-level calculus, the book develops all the tools needed to understand more advanced topics such as random sequences, continuous-time random processes, and statistical signal processing. The book progresses at a leisurely pace, never assuming more knowledge than contained in the material already covered. Rigor is established by developing all results from the basic axioms and carefully defining and discussing such advanced notions as stochastic convergence, stochastic integrals, and resolution of stochastic processes.

**Table of Contents:**

Title Page

Copyright Page

Contents

Preface

1 Introduction to Probability

1.1 Introduction: Why Study Probability?

1.2 The Different Kinds of Probability

Probability as Intuition

Probability as the Ratio of Favorable to Total Outcomes (Classical Theory)

Probability as a Measure of Frequency of Occurrence

Probability Based on an Axiomatic Theory

1.3 Misuses, Miscalculations, and Paradoxes in Probability

1.4 Sets, Fields, and Events

Examples of Sample Spaces

1.5 Axiomatic Definition of Probability

1.6 Joint, Conditional, and Total Probabilities; Independence

Compound Experiments

1.7 Bayes’ Theorem and Applications

1.8 Combinatorics

Occupancy Problems

Extensions and Applications

1.9 Bernoulli Trials—Binomial and Multinomial Probability Laws

Multinomial Probability Law

1.10 Asymptotic Behavior of the Binomial Law: The Poisson Law

1.11 Normal Approximation to the Binomial Law

Summary

Problems

References

2 Random Variables

2.1 Introduction

2.2 Definition of a Random Variable

2.3 Cumulative Distribution Function

Properties of F_X(x)

Computation of F_X(x)

2.4 Probability Density Function (pdf)

Four Other Common Density Functions

More Advanced Density Functions

2.5 Continuous, Discrete, and Mixed Random Variables

Some Common Discrete Random Variables

2.6 Conditional and Joint Distributions and Densities

Properties of Joint CDF F_XY(x, y)

2.7 Failure Rates

Summary

Problems

References

Additional Reading

3 Functions of Random Variables

3.1 Introduction

Functions of a Random Variable (FRV): Several Views

3.2 Solving Problems of the Type Y = g(X)

General Formula for Determining the pdf of Y = g(X)

3.3 Solving Problems of the Type Z = g(X, Y)

3.4 Solving Problems of the Type V = g(X, Y), W = h(X, Y)

Fundamental Problem

Obtaining f_VW Directly from f_XY

3.5 Additional Examples

Summary

Problems

References

Additional Reading

4 Expectation and Moments

4.1 Expected Value of a Random Variable

On the Validity of Equation 4.1-8

4.2 Conditional Expectations

Conditional Expectation as a Random Variable

4.3 Moments of Random Variables

Joint Moments

Properties of Uncorrelated Random Variables

Jointly Gaussian Random Variables

4.4 Chebyshev and Schwarz Inequalities

Markov Inequality

The Schwarz Inequality

4.5 Moment-Generating Functions

4.6 Chernoff Bound

4.7 Characteristic Functions

Joint Characteristic Functions

The Central Limit Theorem

4.8 Additional Examples

Summary

Problems

References

Additional Reading

5 Random Vectors

5.1 Joint Distribution and Densities

5.2 Multiple Transformation of Random Variables

5.3 Ordered Random Variables

Distribution of Area Random Variables

5.4 Expectation Vectors and Covariance Matrices

5.5 Properties of Covariance Matrices

Whitening Transformation

5.6 The Multidimensional Gaussian (Normal) Law

5.7 Characteristic Functions of Random Vectors

Properties of CF of Random Vectors

The Characteristic Function of the Gaussian (Normal) Law

Summary

Problems

References

Additional Reading

6 Statistics: Part 1 Parameter Estimation

6.1 Introduction

Independent, Identically Distributed (i.i.d.) Observations

Estimation of Probabilities

6.2 Estimators

6.3 Estimation of the Mean

Properties of the Mean-Estimator Function (MEF)

Procedure for Getting a δ-Confidence Interval on the Mean of a Normal Random Variable When σ_X Is Known

Confidence Interval for the Mean of a Normal Distribution When σ_X Is Not Known

Procedure for Getting a δ-Confidence Interval Based on n Observations on the Mean of a Normal Random Variable

Interpretation of the Confidence Interval

6.4 Estimation of the Variance and Covariance

Confidence Interval for the Variance of a Normal Random Variable

Estimating the Standard Deviation Directly

Estimating the Covariance

6.5 Simultaneous Estimation of Mean and Variance

6.6 Estimation of Non-Gaussian Parameters from Large Samples

6.7 Maximum Likelihood Estimators

6.8 Ordering, More on Percentiles, Parametric Versus Nonparametric Statistics

The Median of a Population Versus Its Mean

Parametric Versus Nonparametric Statistics

Confidence Interval on the Percentile

Confidence Interval for the Median When n Is Large

6.9 Estimation of Vector Means and Covariance Matrices

Estimation of μ

Estimation of the Covariance K

6.10 Linear Estimation of Vector Parameters

Summary

Problems

References

Additional Reading

7 Statistics: Part 2 Hypothesis Testing

7.1 Bayesian Decision Theory

7.2 Likelihood Ratio Test

7.3 Composite Hypotheses

Generalized Likelihood Ratio Test (GLRT)

How Do We Test for the Equality of Means of Two Populations?

Testing for the Equality of Variances for Normal Populations: The F-test

Testing Whether the Variance of a Normal Population Has a Predetermined Value

7.4 Goodness of Fit

7.5 Ordering, Percentiles, and Rank

How Ordering Is Useful in Estimating Percentiles and the Median

Confidence Interval for the Median When n Is Large

Distribution-Free Hypothesis Testing: Testing If Two Populations Are the Same Using Runs

Ranking Test for Sameness of Two Populations

Summary

Problems

References

8 Random Sequences

8.1 Basic Concepts

Infinite-length Bernoulli Trials

Continuity of Probability Measure

Statistical Specification of a Random Sequence

8.2 Basic Principles of Discrete-Time Linear Systems

8.3 Random Sequences and Linear Systems

8.4 WSS Random Sequences

Power Spectral Density

Interpretation of the psd

Synthesis of Random Sequences and Discrete-Time Simulation

Decimation

Interpolation

8.5 Markov Random Sequences

ARMA Models

Markov Chains

8.6 Vector Random Sequences and State Equations

8.7 Convergence of Random Sequences

8.8 Laws of Large Numbers

Summary

Problems

References

9 Random Processes

9.1 Basic Definitions

9.2 Some Important Random Processes

Asynchronous Binary Signaling

Poisson Counting Process

Alternative Derivation of Poisson Process

Random Telegraph Signal

Digital Modulation Using Phase-Shift Keying

Wiener Process or Brownian Motion

Markov Random Processes

Birth–Death Markov Chains

Chapman–Kolmogorov Equations

Random Process Generated from Random Sequences

9.3 Continuous-Time Linear Systems with Random Inputs

White Noise

9.4 Some Useful Classifications of Random Processes

Stationarity

9.5 Wide-Sense Stationary Processes and LSI Systems

Wide-Sense Stationary Case

Power Spectral Density

An Interpretation of the psd

More on White Noise

Stationary Processes and Differential Equations

9.6 Periodic and Cyclostationary Processes

9.7 Vector Processes and State Equations

State Equations

Summary

Problems

References

10 Advanced Topics in Random Processes

10.1 Mean-Square (m.s.) Calculus

Stochastic Continuity and Derivatives [10-1]

Further Results on m.s. Convergence [10-1]

10.2 Mean-Square Stochastic Integrals

10.3 Mean-Square Stochastic Differential Equations

10.4 Ergodicity [10-3]

10.5 Karhunen–Loève Expansion [10-5]

10.6 Representation of Bandlimited and Periodic Processes

Bandlimited Processes

Bandpass Random Processes

WSS Periodic Processes

Fourier Series for WSS Processes

Summary

Appendix: Integral Equations

Existence Theorem

Problems

References

11 Applications to Statistical Signal Processing

11.1 Estimation of Random Variables and Vectors

More on the Conditional Mean

Orthogonality and Linear Estimation

Some Properties of the Operator Ê

11.2 Innovation Sequences and Kalman Filtering

Predicting Gaussian Random Sequences

Kalman Predictor and Filter

Error-Covariance Equations

11.3 Wiener Filters for Random Sequences

Unrealizable Case (Smoothing)

Causal Wiener Filter

11.4 Expectation-Maximization Algorithm

Log-likelihood for the Linear Transformation

Summary of the E-M Algorithm

E-M Algorithm for Exponential Probability Functions

Application to Emission Tomography

Log-likelihood Function of Complete Data

E-step

M-step

11.5 Hidden Markov Models (HMM)

Specification of an HMM

Application to Speech Processing

Efficient Computation of P[E|M] with a Recursive Algorithm

Viterbi Algorithm and the Most Likely State Sequence for the Observations

11.6 Spectral Estimation

The Periodogram

Bartlett’s Procedure–Averaging Periodograms

Parametric Spectral Estimate

Maximum Entropy Spectral Density

11.7 Simulated Annealing

Gibbs Sampler

Noncausal Gauss–Markov Models

Compound Markov Models

Gibbs Line Sequence

Summary

Problems

References

Appendix A: Review of Relevant Mathematics

A.1 Basic Mathematics

Sequences

Convergence

Summations

Z-Transform

A.2 Continuous Mathematics

Definite and Indefinite Integrals

Differentiation of Integrals

Integration by Parts

Completing the Square

Double Integration

Functions

A.3 Residue Method for Inverse Fourier Transformation

Fact

Inverse Fourier Transform for psd of Random Sequence

A.4 Mathematical Induction

References

Appendix B: Gamma and Delta Functions

B.1 Gamma Function

B.2 Incomplete Gamma Function

B.3 Dirac Delta Function

References

Appendix C: Functional Transformations and Jacobians

C.1 Introduction

C.2 Jacobians for n = 2

C.3 Jacobian for General n

Appendix D: Measure and Probability

D.1 Introduction and Basic Ideas

Measurable Mappings and Functions

D.2 Application of Measure Theory to Probability

Distribution Measure

Appendix E: Sampled Analog Waveforms and Discrete-time Signals

Appendix F: Independence of Sample Mean and Variance for Normal Random Variables

Appendix G: Tables of Cumulative Distribution Functions: the Normal, Student t, Chi-square, and F

Index


**What makes us different?**

• Instant Download

• Always Competitive Pricing

• 100% Privacy

• FREE Sample Available

• 24/7 Live Customer Support
