Applied Probability Models with Optimization Applications
About this ebook
This book offers a concise introduction to some of the stochastic processes that frequently arise in applied probability. Emphasis is on optimization models and methods, particularly in the area of decision processes. After reviewing some basic notions of probability theory and stochastic processes, the author presents a useful treatment of the Poisson process, including compound and nonhomogeneous Poisson processes. Subsequent chapters deal with such topics as renewal theory and Markov chains; semi-Markov, Markov renewal, and regenerative processes; inventory theory; and Brownian motion and continuous time optimization models.
Each chapter is followed by a section of useful problems that illustrate and complement the text. There is also a short list of relevant references at the end of every chapter. Students will find this a largely self-contained text that requires little previous knowledge of the subject. It is especially suited for a one-year course in applied probability at the advanced undergraduate or beginning postgraduate level. 1970 edition.
Sheldon M. Ross
Dr. Sheldon M. Ross is a professor in the Department of Industrial and Systems Engineering at the University of Southern California. He received his PhD in statistics at Stanford University in 1968. He has published many technical articles and textbooks in the areas of statistics and applied probability. Among his texts are A First Course in Probability, Introduction to Probability Models, Stochastic Processes, and Introductory Statistics. Professor Ross is the founding and continuing editor of the journal Probability in the Engineering and Informational Sciences. He is a Fellow of the Institute of Mathematical Statistics, a Fellow of INFORMS, and a recipient of the Humboldt US Senior Scientist Award.
Applied Probability Models
with
Optimization Applications
SHELDON M. ROSS
DOVER PUBLICATIONS, INC., New York
TO MY PARENTS
Copyright © 1970 by Sheldon M. Ross.
All rights reserved
This Dover edition, first published in 1992, is an unabridged, unaltered republication of the work first published by Holden-Day, Inc., San Francisco, 1970, in the Holden-Day Series in Management Science.
Library of Congress Cataloging-in-Publication Data
Ross, Sheldon M.
Applied probability models with optimization applications/Sheldon M. Ross.
p. cm.
An unabridged, unaltered republication of the work first published by Holden-Day, Inc., San Francisco, 1970, in the Holden-Day series in management science
—T.p. verso.
Includes bibliographical references.
ISBN 0-486-67314-6 (pbk.)
1. Probabilities. 2. Mathematical optimization. I. Title.
QA273.R8 1992
CIP
Manufactured in the United States by Courier Corporation
67314607
www.doverpublications.com
PREFACE
This text developed from a series of courses in applied probability and optimization theory given by the author at the University of California at Berkeley. A prerequisite to its reading would be some familiarity with probability theory at the level of Volume I of Feller’s well-known book.
Ideally, this book would be used as a text in a one-year course in applied probability models with optimization applications. However, because of its flexibility, it would also be suitable as a text in more specialized topics. For instance, Chapters 1 through 5 and Chapter 9 might be used in a one- or two-quarter course in applied probability or stochastic processes. Also, Chapters 6 through 9 might be used as a text for a course in sequential decision theory. It should also be mentioned that this book has been designed so as to include most of the stochastic models of operations research, and hence may be readily used as a text in this subject.
The approach I have attempted to employ in this book is to view models from a probabilistic point of view. For instance, wherever possible, proofs are based on probabilistic, rather than analytic, arguments. I feel that this approach quickly gives the student a feeling for what is essential in this subject.
I am indebted to many people for their part in making this book possible. In particular, I would like to thank M. Brown, D. Iglehart, C. Myles Lavinsky, G. J. Lieberman, S. Lippman, and I. R. Savage. Finally, I must acknowledge Gary Ross, without whose help this book could not have been possible.
Sheldon M. Ross
Berkeley, California
October 1969
CONTENTS
1. INTRODUCTION TO STOCHASTIC PROCESSES
1.1 Random Variables and Probability Theory
1.2 Conditional Expectation
1.3 Stochastic Processes
Problems
2. THE POISSON PROCESS
2.1 Introduction and Definitions
2.2 Interarrival and Waiting Time Distributions
2.3 Conditional Distribution of the Arrival Times
2.4 Compound and Nonhomogeneous Poisson Processes
2.5 Stationary Point Processes
Problems
References
3. RENEWAL THEORY
3.1 Introduction and Preliminaries
3.2 Renewal Equation and Generalizations
3.3 Limit Theorems
3.4 Wald’s Equation
3.5 Back to Renewal Theory
3.6 Excess Life and Age Distribution
3.7 Delayed Renewal Processes
3.8 Counter Models
3.9 Renewal Reward Process
3.10 Nonterminating versus Terminating Renewal Processes
3.11 Age Dependent Branching Processes
Problems
References
4. MARKOV CHAINS
4.1 Preliminaries and Examples
4.2 Classification of States
4.3 Limit Theorems
4.4 Transitions Among Classes
4.5 Branching Processes
4.6 Transient States
Problems
References
5. SEMI-MARKOV, MARKOV RENEWAL AND REGENERATIVE PROCESSES
5.1 Introduction and Preliminaries
5.2 Classification of States
5.3 Some Simple Relationships
5.4 Regenerative Processes
5.5 A Queueing Application
5.6 Back to Markov Renewal Processes—Limiting Probabilities
5.7 Limiting Distribution of the Markov Renewal Process
5.8 Continuous Time Markov Chains
5.9 Birth and Death Processes
Problems
References
6. MARKOV DECISION PROCESSES
6.1 Introduction
6.2 Expected Discounted Cost
6.3 Some Examples
6.4 Positive Costs, No Discounting
6.5 Applications: Optimal Stopping and Sequential Analysis
6.6 Expected Average Cost Criterion-Introduction and Counterexamples
6.7 Expected Average Cost Criterion-Results
6.8 Finite State Space—Computational Approaches
Problems
References
7. SEMI-MARKOV DECISION PROCESSES
7.1 Introduction
7.2 Discounted Cost Criterion
7.3 Average Cost—Preliminaries and Equality of Criteria
7.4 Average Cost—Results
7.5 Some Examples
Problems
References
8. INVENTORY THEORY
8.1 Introduction
8.2 A Single Period Model
8.3 Multi-Period Models
8.4 A Multi-Period Stationary Optimal Policy
8.5 Inventory Issuing Policies
Problems
References
9. BROWNIAN MOTION AND CONTINUOUS TIME OPTIMIZATION MODELS
9.1 Introduction and Preliminaries
9.2 Maximum of the Wiener Process
9.3 The Wiener Process and Optimization
9.4 The Maximum Variable—A Renewal Application
9.5 Optimal Dispatching of a Poisson Process
9.6 Infinitesimal Look-Ahead Stopping Rules
Problems
References
APPENDICES
INDEX
1
INTRODUCTION TO STOCHASTIC PROCESSES
1.1. Random Variables and Probability Theory
In order to understand the theory of stochastic processes, it is necessary to have a firm grounding in the basic concepts of probability theory. As a result, we shall briefly review some of these concepts and at the same time establish some useful notation.
The distribution function F(·) of the random variable X is defined for any real number x by

F(x) = P{X ≤ x}.
A random variable X is said to be discrete if its set of possible values is countable. For discrete random variables, the probability mass function p(x) is defined by

p(x) = P{X = x}.

Clearly,

Σ_x p(x) = 1.
A random variable is called continuous if there exists a function f(x), called the probability density function, such that

P{X ∈ B} = ∫_B f(x) dx

for every Borel set B. Since F(x) = ∫_{−∞}^{x} f(t) dt, it follows that f(x) = F′(x) wherever the derivative exists.
The expectation or mean of the random variable X, denoted by EX, is defined by

EX = ∫_{−∞}^{∞} x dF(x)     (1)

provided the above integral exists. In the discrete case this reduces to EX = Σ_x x p(x), and in the continuous case to EX = ∫ x f(x) dx.
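As a numerical illustration of the two special cases of (1), the short Python sketch below computes EX for a discrete and for a continuous random variable (the particular pmf, density, and integration grid are illustrative choices, not taken from the text).

```python
import math

# Computing EX in the two standard special cases of the defining integral:
# a discrete pmf (a sum) and a continuous density (numerical integration).

# Discrete case: an explicit pmf table on {0, 1, 2}.
pmf = {0: 0.25, 1: 0.5, 2: 0.25}
ex_discrete = sum(x * p for x, p in pmf.items())
print(ex_discrete)  # 1.0

# Continuous case: exponential density f(x) = 2*exp(-2x), so EX = 1/2.
def f(x):
    return 2.0 * math.exp(-2.0 * x)

a, b, steps = 0.0, 20.0, 200_000
h = (b - a) / steps
grid = [a + i * h for i in range(steps + 1)]
# Trapezoidal rule for the integral of x*f(x) over [a, b].
ex_continuous = h * (sum(x * f(x) for x in grid) - 0.5 * (a * f(a) + b * f(b)))
print(round(ex_continuous, 4))  # close to 0.5
```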
Equation (1) also defines the expectation of any function of X, say h(X). Since h(X) is itself a random variable, it follows from (1) that

Eh(X) = ∫_{−∞}^{∞} x dFh(x),

where Fh is the distribution function of h(X), i.e.,

Fh(x) = P{h(X) ≤ x}.

However, it may be shown that

Eh(X) = ∫_{−∞}^{∞} h(x) dF(x).     (2)

The above equation is sometimes known as the law of the unconscious statistician [as statisticians have been accused of using the identity (2) without realizing that it is not a definition].
The variance of the random variable X is defined by

Var X = E(X − EX)² = EX² − (EX)².
Jointly Distributed Random Variables
The joint distribution F of two random variables X and Y is defined by

F(x, y) = P{X ≤ x, Y ≤ y}.

The distributions

FX(x) = P{X ≤ x}  and  FY(y) = P{Y ≤ y}

are called the marginal distributions of X and Y. X and Y are said to be independent if

F(x, y) = FX(x)FY(y)

for all real x and y. It can be shown that X and Y are independent if and only if

E[g(X)h(Y)] = Eg(X) · Eh(Y)

for all functions g and h for which the above expectations exist.
Two jointly distributed random variables X and Y are said to be uncorrelated if their covariance, defined by

Cov(X, Y) = E[(X − EX)(Y − EY)] = E(XY) − EX · EY,

is zero. It follows that independent random variables are uncorrelated. However, the converse is not true. (Give an example.)
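One standard counterexample (not taken from the text): let X be uniform on {−1, 0, 1} and Y = X². The Python sketch below checks that the covariance vanishes even though Y is a function of X.

```python
# X takes values -1, 0, 1 with probability 1/3 each, and Y = X**2.
outcomes = [-1, 0, 1]
p = 1.0 / 3.0

ex = sum(x * p for x in outcomes)               # E[X] = 0
ey = sum((x ** 2) * p for x in outcomes)        # E[Y] = 2/3
exy = sum(x * (x ** 2) * p for x in outcomes)   # E[XY] = E[X**3] = 0
cov = exy - ex * ey
print(cov)  # 0.0 -> X and Y are uncorrelated

# Yet X and Y are not independent:
# P{X = 1, Y = 0} = 0, while P{X = 1} * P{Y = 0} = (1/3) * (1/3).
p_joint = 0.0
p_prod = p * p
print(p_joint == p_prod)  # False -> dependent
```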
The random variables X and Y are said to be jointly continuous if there exists a function f(x, y), called the joint probability density function, such that

P{X ∈ A, Y ∈ B} = ∫_B ∫_A f(x, y) dx dy

for every two Borel sets A and B.
The joint distribution of any collection X1, X2, ..., Xn of random variables is defined by

F(x1, x2, ..., xn) = P{X1 ≤ x1, X2 ≤ x2, ..., Xn ≤ xn}.

Furthermore, the n random variables are said to be independent if

F(x1, x2, ..., xn) = FX1(x1)FX2(x2) ··· FXn(xn),

where

FXi(xi) = P{Xi ≤ xi}.
Characteristic Functions and Laplace Transforms
The characteristic function φ of X is defined, for any real number u, by

φ(u) = E[e^{iuX}] = ∫_{−∞}^{∞} e^{iux} dF(x).
A random variable always possesses a characteristic function (that is, the above integral always exists) and, in fact, it can be proven that there is a one-to-one correspondence between distribution functions and characteristic functions. This result is quite important as it enables us to characterize the probability law of a random variable by its characteristic function. (See Tables 1 and 2.)
A further useful result is that if X1, ..., Xn are independent, then the characteristic function of their sum X1 + ··· + Xn is just the product of the individual characteristic functions. This result is quite useful, as it often enables us to determine the distribution of the sum of independent random variables by first calculating the characteristic function and then attempting to identify it.
TABLE 1
TABLE 2
EXAMPLE. Let X and Y be independent and identically distributed normal random variables having mean μ and variance σ². Then,

φX(u) = φY(u) = exp{iuμ − u²σ²/2},

where the subscript indicates the random variable associated with the characteristic function. Hence (see Table 2),

φX+Y(u) = φX(u)φY(u) = exp{2iuμ − u²σ²},
which is the characteristic function of a normal random variable having mean 2μ and variance 2σ². Therefore, by the uniqueness of the characteristic function, this is the distribution of X + Y.
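The conclusion can also be checked empirically; the Python sketch below (sample size, seed, and parameter values are arbitrary illustrative choices, and a simulation is of course not a proof) compares the sample mean and variance of simulated sums X + Y against 2μ and 2σ².

```python
import random
import statistics

# Empirical check: if X and Y are independent normals with mean mu and
# variance sigma**2, then X + Y should look like a normal with mean 2*mu
# and variance 2*sigma**2.
random.seed(1)
mu, sigma = 1.0, 2.0
n = 200_000

sums = [random.gauss(mu, sigma) + random.gauss(mu, sigma) for _ in range(n)]
mean_hat = statistics.fmean(sums)
var_hat = statistics.pvariance(sums)

print(round(mean_hat, 2))  # near 2*mu = 2.0
print(round(var_hat, 1))   # near 2*sigma**2 = 8.0
```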
We may also define the joint characteristic function of the random variables X1, ..., Xn by

φ(u1, ..., un) = E[exp{i(u1X1 + ··· + unXn)}].
It may be proven that the joint characteristic function uniquely determines the joint distribution.
When dealing with random variables which only assume nonnegative values, it is sometimes more convenient to use Laplace transforms rather than characteristic functions. The Laplace transform of the distribution F (or, more precisely, of the random variable having distribution F) is defined by

F̃(s) = ∫_0^∞ e^{−sx} dF(x).

This integral exists for any complex variable s = a + bi with a ≥ 0. As in the case of characteristic functions, the Laplace transform uniquely determines the distribution.
We may also define Laplace transforms for arbitrary functions in the following manner: the Laplace transform g̃ of the function g is defined by

g̃(s) = ∫_0^∞ e^{−sx} dg(x),

and it may be shown that g̃ determines g up to an additive constant.
Convolutions
If X and Y are independent random variables, with X having distribution F and Y having distribution G, then the distribution of X + Y is given by F * G, where

(F * G)(t) = ∫_{−∞}^{∞} F(t − x) dG(x).

F * G is called the convolution of F and G. If G = F, then F * F is denoted by F2. Similarly, we denote by Fn the n-fold convolution of F with itself, i.e., Fn = F * Fn−1. That is,

Fn(t) = ∫_{−∞}^{∞} Fn−1(t − x) dF(x).
It is easy to show that the characteristic function (Laplace transform) of a convolution is just the product of the characteristic functions (Laplace transforms).
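For discrete distributions the convolution reduces to a finite sum over the pmfs; the Python sketch below illustrates this with a fair six-sided die (an illustrative choice, not an example from the text).

```python
# The pmf of X + Y, for independent discrete X and Y, is the convolution
# of their individual pmfs, computed here with plain dicts.
def convolve(pmf_f, pmf_g):
    """Return the pmf of the sum of two independent discrete variables."""
    out = {}
    for x, px in pmf_f.items():
        for y, py in pmf_g.items():
            out[x + y] = out.get(x + y, 0.0) + px * py
    return out

die = {k: 1.0 / 6.0 for k in range(1, 7)}  # fair six-sided die
two_dice = convolve(die, die)              # the 2-fold convolution

print(round(two_dice[7], 4))            # 6/36 = 0.1667
print(round(sum(two_dice.values()), 6)) # total mass 1.0
```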
Limit Theorems
Some of the most important results in probability theory are in the form of limit theorems. The two most important are:
Law of Large Numbers. If X1, X2, ... are independent and identically distributed with mean μ, then, with probability one,

(X1 + ··· + Xn)/n → μ  as n → ∞.
Central Limit Theorem. If X1, X2, ... are independent and identically distributed with mean μ and variance σ², then for every real x,

P{(X1 + ··· + Xn − nμ)/(σ√n) ≤ x} → (1/√(2π)) ∫_{−∞}^{x} e^{−t²/2} dt  as n → ∞.
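Both theorems are easy to observe by simulation. The Python sketch below uses exponential(1) samples (an illustrative choice, with μ = 1 and σ² = 1): the sample mean should settle near μ, and about 95% of standardized sums should fall in the normal interval [−1.96, 1.96].

```python
import math
import random
import statistics

random.seed(7)

# Law of large numbers: the sample mean of n terms approaches mu = 1.
n = 100_000
sample_mean = statistics.fmean(random.expovariate(1.0) for _ in range(n))
print(round(sample_mean, 2))  # near 1.0

# Central limit theorem: (X1 + ... + Xm - m*mu)/(sigma*sqrt(m)) is roughly
# standard normal for large m, so about 95% of standardized sums should
# land within +/- 1.96.
m, reps = 400, 2_000
inside = 0
for _ in range(reps):
    s = sum(random.expovariate(1.0) for _ in range(m))
    if abs((s - m) / math.sqrt(m)) <= 1.96:
        inside += 1
frac = inside / reps
print(round(frac, 2))  # near 0.95
```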
1.2. Conditional Expectation
If X and Y are discrete random variables, then the conditional probability mass function of Y, given X, is defined, for all x such that P{X = x} > 0, by

pY|X(y | x) = P{Y = y | X = x} = P{X = x, Y = y}/P{X = x}.

The conditional probability distribution function of Y, given X, is defined, for all x such that P{X = x} > 0, by

FY|X(y | x) = P{Y ≤ y | X = x} = Σ_{a ≤ y} pY|X(a | x).

The conditional expectation of Y, given X, is defined for all x such that P{X = x} > 0, by

E[Y | X = x] = Σ_y y pY|X(y | x).
If X and Y have a joint probability density function fX,Y(x, y), the conditional probability density function of Y, given X, is defined for all x such that fX(x) > 0 by

fY|X(y | x) = fX,Y(x, y)/fX(x),

and the conditional probability distribution function of Y, given X, by

FY|X(y | x) = ∫_{−∞}^{y} fY|X(t | x) dt.

The conditional expectation of Y, given X, is defined for all x such that fX(x) > 0, by

E[Y | X = x] = ∫_{−∞}^{∞} y fY|X(y | x) dy.
By E[Y | X] we mean the function of X whose value at X = x is E[Y | X = x]. An extremely important property of conditional expectation is that for all random variables X and Y,

EY = E[E[Y | X]],     (3)

provided the relevant expectations exist. Hence, (3) states that the expectation of Y may be obtained by first conditioning on X (to obtain the random variable E[Y | X]) and then taking the expectation (with respect to X) of this quantity.
EXAMPLE. Suppose that the number of accidents occurring in a factory in a month is a random variable with distribution F. Suppose also that the numbers of workmen injured in these accidents are independent and have a common distribution G. What is the expected number of workmen injured each month?
Let N denote the number of accidents which occur, and let X1, ..., XN denote the numbers of workmen injured in these accidents, so that the total number injured is X1 + ··· + XN. Now

E[X1 + ··· + XN] = E[E[X1 + ··· + XN | N]].     (4)

However,

E[X1 + ··· + XN | N = n] = nEX1,

where EX1 is the mean of the injury distribution G. Thus, from (4),

E[X1 + ··· + XN] = E[NEX1] = EN · EX1.
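The identity "expected total = EN · EX" can be checked by simulation. In the Python sketch below the distributions F and G are replaced by simple illustrative choices (not taken from the text): N uniform on {0, ..., 8}, so EN = 4, and each injury count uniform on {1, 2, 3}, so EX = 2.

```python
import random
import statistics

# Simulation check of E[X1 + ... + XN] = EN * EX with illustrative
# distributions: N ~ uniform{0..8} (EN = 4), X_i ~ uniform{1..3} (EX = 2).
random.seed(3)

def month_total():
    n_accidents = random.randint(0, 8)
    return sum(random.randint(1, 3) for _ in range(n_accidents))

trials = 100_000
avg = statistics.fmean(month_total() for _ in range(trials))
print(round(avg, 1))  # near EN * EX = 4 * 2 = 8
```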
Example. A prisoner is placed in a cell containing three doors. The first door leads immediately to freedom. The second door leads into a tunnel which returns him to the cell after one day’s travel. The third door leads to a similar tunnel which returns him to his cell after three days. Assuming that the prisoner is at all times equally likely to choose any one of the doors, what is the expected length of time until the prisoner reaches freedom?
Let Y denote the time until the prisoner reaches freedom, and let X denote the door that he initially chooses. We first note that

EY = (1/3)E[Y | X = 1] + (1/3)E[Y | X = 2] + (1/3)E[Y | X = 3].     (5)

We claim that E[Y | X = 2] = 1 + EY, and reason as follows: If the prisoner chooses the second door, then he spends one day in the tunnel and then returns to his cell. But once he returns to his cell the problem is as before, and hence his expected time until freedom from that moment on is just EY. Similarly, E[Y | X = 3] = 3 + EY, while clearly E[Y | X = 1] = 0. Therefore, from (5) we obtain

EY = (1/3)(0) + (1/3)(1 + EY) + (1/3)(3 + EY),

or

EY = 4.
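Solving the conditioning equation for the problem as stated gives an expected escape time of 4 days, which a direct simulation of the prisoner's choices confirms (Python sketch; the seed and trial count are arbitrary).

```python
import random
import statistics

# Direct simulation of the prisoner problem: on each attempt he picks one
# of the three doors uniformly at random, independently of the past.
random.seed(11)

def escape_time():
    days = 0
    while True:
        door = random.randint(1, 3)
        if door == 1:                  # first door: immediate freedom
            return days
        days += 1 if door == 2 else 3  # tunnels return him after 1 or 3 days

trials = 100_000
avg = statistics.fmean(escape_time() for _ in range(trials))
print(round(avg, 2))  # near 4 days
```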
Functional Equations and Lack of Memory of the Exponential Distribution
The following two functional equations occur quite frequently in the theory of applied probability:

g(s + t) = g(s)g(t)  for all s, t ≥ 0,     (6)

and

f(s + t) = f(s) + f(t)  for all s, t ≥ 0.     (7)

It turns out that the only (measurable) solutions to these functional equations are of the respective forms

g(s) = e^{−λs}

and

f(s) = cs.
We shall now use (6) to prove that the exponential is the unique distribution without memory.
A random variable X is said to be without memory, or memoryless, if

P{X > s + t | X > t} = P{X > s}  for all s, t ≥ 0.     (8)
If we think of X as being the lifetime of some instrument, then (8) states that the probability that the instrument lives for at least s + t hours given that it has survived t hours