Stochastic processes model random systems that change over time. They provide a mathematical framework for predicting and analyzing unpredictable events in fields like physics, biology, and economics. These processes consist of random variables indexed by time, representing the system's state at different points.
Stochastic processes are classified by their state space, time index, and memory. The state space can be discrete or continuous, while time can be discrete or continuous. Some processes are memoryless, while others depend on past states. Examples include random walks, Poisson processes, and Brownian motion.
Definition of stochastic processes
- Stochastic processes model systems that evolve over time in a probabilistic manner
- Provide a mathematical framework for analyzing and predicting the behavior of random phenomena
- Enable the study of complex systems in fields such as physics, biology, economics, and engineering
Random variables over time
- Stochastic processes consist of a collection of random variables indexed by time
- Each random variable represents the state of the system at a specific time point
- The set of all possible states forms the state space of the process
- Examples: stock prices over time (continuous state space), number of customers in a queue (discrete state space)
Probabilistic models
- Stochastic processes assign probabilities to different possible outcomes or trajectories
- Probability distributions describe the likelihood of the system being in a particular state at a given time
- Joint probability distributions capture the dependencies between random variables at different time points
- Transition probabilities specify the likelihood of moving from one state to another
Dynamical systems with randomness
- Stochastic processes incorporate randomness into the evolution of a system over time
- Randomness can arise from inherent uncertainties, external noise, or unpredictable events
- The future state of the system depends on both the current state and random factors
- Stochastic differential equations (SDEs) model continuous-time processes with random perturbations
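To make this concrete, the sketch below simulates one sample path of the Ornstein-Uhlenbeck SDE $dX_t = -\theta X_t\,dt + \sigma\,dW_t$ using the Euler-Maruyama scheme. NumPy is assumed; the parameter values, step count, and seed are arbitrary illustrative choices, not part of any standard model.

```python
import numpy as np

# Minimal Euler-Maruyama sketch for the SDE dX_t = -theta*X_t dt + sigma dW_t.
# theta, sigma, T, n_steps, and the seed are illustrative choices.
rng = np.random.default_rng(seed=0)
theta, sigma = 1.0, 0.3        # mean-reversion speed and noise intensity
T, n_steps = 5.0, 1000         # time horizon and number of grid points
dt = T / n_steps

x = np.empty(n_steps + 1)
x[0] = 1.0                     # initial state X_0
for k in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt))                 # Brownian increment ~ N(0, dt)
    x[k + 1] = x[k] - theta * x[k] * dt + sigma * dW  # drift step + random perturbation

print(x[-1])                   # value of this particular realization at time T
```

Each run produces a different path, which is exactly the point: the deterministic drift is the same, but the random increments differ from realization to realization.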
Classification by state space
- The state space of a stochastic process refers to the set of all possible values that the random variables can take
- The nature of the state space determines the mathematical tools and techniques used to analyze the process
Discrete state space
- In a discrete state space, the random variables can only take on a countable number of distinct values
- Examples: number of defective items in a production line, number of customers in a queue
- Discrete-time Markov chains are a common class of processes with discrete state spaces
Continuous state space
- In a continuous state space, the random variables can take on any value within a continuous range
- Examples: stock prices, temperature measurements, particle positions in a fluid
- Processes with continuous state spaces often involve real-valued random variables
- Brownian motion and diffusion processes are examples of continuous state space processes
Finite vs infinite state space
- State spaces can be either finite or infinite
- Finite state spaces have a fixed number of possible states, while infinite state spaces have an unlimited number of states
- Markov chains with a finite number of states are easier to analyze; a stationary distribution always exists, and it is unique when the chain is irreducible
- Processes with infinite state spaces, such as Poisson processes or random walks, require more advanced mathematical techniques
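For a finite-state chain, the stationary distribution mentioned above can be computed directly from the transition matrix as the left eigenvector with eigenvalue 1. The sketch below does this for a hypothetical three-state chain; NumPy is assumed and the matrix entries are made up for illustration.

```python
import numpy as np

# Stationary distribution of a finite discrete-time Markov chain: solve pi P = pi
# with the entries of pi summing to 1.  The transition matrix P is a made-up example.
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

# Left eigenvector of P for eigenvalue 1, rescaled to a probability vector.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()

print(pi)   # long-run fraction of time spent in each of the three states
```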
Classification by time index
- Stochastic processes can be classified based on the nature of the time index, which represents the temporal evolution of the system
Discrete-time processes
- In discrete-time processes, the time index takes on integer values (e.g., t = 0, 1, 2, ...)
- The state of the system is observed at fixed time intervals
- Examples: daily stock prices, monthly sales figures, annual population counts
- Discrete-time Markov chains and autoregressive models are commonly used for discrete-time processes
Continuous-time processes
- In continuous-time processes, the time index takes on real values (e.g., $t \in [0, \infty)$)
- The state of the system can change at any point in time
- Examples: particle motion, chemical reactions, financial asset prices
- Poisson processes, Brownian motion, and stochastic differential equations are used to model continuous-time processes
Classification by memory
- Stochastic processes can be classified based on their dependence on past states or history
Memoryless processes
- In memoryless processes, the future state of the system depends only on the current state, not on the past states
- The probability distribution of the next state is independent of the history of the process
- Examples: Poisson processes, exponentially distributed waiting times, continuous-time Markov chains with exponential holding times
Processes with memory
- In processes with memory, the future state of the system depends on both the current state and the past states
- The probability distribution of the next state is influenced by the history of the process
- Examples: autoregressive models, moving average models, hidden Markov models
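As a small illustration of memory, the sketch below simulates an AR(1) process $X_t = \phi X_{t-1} + \varepsilon_t$, where each new value explicitly depends on the previous one. NumPy is assumed; the coefficient $\phi$, noise scale, length, and seed are illustrative choices.

```python
import numpy as np

# AR(1) process with memory: X_t = phi * X_{t-1} + eps_t, eps_t ~ N(0, sigma^2).
# phi, sigma, n, and the seed are illustrative choices.
rng = np.random.default_rng(seed=1)
phi, sigma, n = 0.8, 1.0, 500

x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)   # next value depends on the past value

print(x[:5])   # first few values of one realization
```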
Markov vs non-Markov
- Markov processes have the shortest possible memory: the future state depends only on the current state, not on the entire history
- Markov property: $P(X_{t+1} = x | X_t, X_{t-1}, ..., X_0) = P(X_{t+1} = x | X_t)$
- Markov chains are a common example of Markov processes
- Non-Markov processes have a more complex dependence structure, where the future state may depend on the entire history of the process
- Examples: long-memory processes, fractional Brownian motion
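The Markov property above can be checked empirically on a simulated chain: the frequency of moving to a state given the current state should be unchanged by additionally conditioning on an earlier state. The sketch below does this for a hypothetical two-state chain; NumPy is assumed, and the transition matrix, run length, and seed are illustrative.

```python
import numpy as np

# Empirical check of the Markov property for a simulated two-state chain:
# P(X_{t+1}=1 | X_t=0) should match P(X_{t+1}=1 | X_t=0, X_{t-1}=1).
# The transition matrix, run length, and seed are illustrative.
rng = np.random.default_rng(seed=9)
P = np.array([[0.7, 0.3],
              [0.2, 0.8]])

n = 200_000
x = np.empty(n, dtype=int)
x[0] = 0
for t in range(1, n):
    x[t] = rng.choice(2, p=P[x[t - 1]])

# Conditional relative frequencies estimated from the single realization.
given_current = np.mean(x[1:][x[:-1] == 0] == 1)
mask = (x[1:-1] == 0) & (x[:-2] == 1)
given_current_and_past = np.mean(x[2:][mask] == 1)

print(given_current, given_current_and_past)   # both close to P[0, 1] = 0.3
```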
Examples of stochastic processes
- Various stochastic processes have been developed to model different real-world phenomena and serve as building blocks for more complex models
Random walks
- Random walks model the trajectory of an object or particle that takes random steps in a space
- In a simple random walk, the object moves either up or down (or left or right) with equal probability at each time step
- Random walks have applications in physics (Brownian motion), finance (stock prices), and biology (animal foraging)
- Variations include biased random walks, correlated random walks, and random walks with barriers
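The sketch below simulates one realization of a simple symmetric random walk on the integers: at each step the walker moves $+1$ or $-1$ with equal probability. NumPy is assumed; the number of steps and the seed are arbitrary choices.

```python
import numpy as np

# Simple symmetric random walk: i.i.d. steps of +1 or -1 with probability 1/2 each.
# The number of steps and the seed are illustrative choices.
rng = np.random.default_rng(seed=2)
n_steps = 1000

steps = rng.choice([-1, 1], size=n_steps)        # random +/-1 steps
path = np.concatenate(([0], np.cumsum(steps)))   # positions X_0, X_1, ..., X_n

print(path[-1])   # position of this realization after n_steps steps
```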
Poisson processes
- Poisson processes model the random occurrence of events at a constant average rate over an interval of time or space
- The number of events in disjoint intervals is independent and follows a Poisson distribution
- The inter-arrival times between events are exponentially distributed
- Applications include modeling customer arrivals, phone calls, website hits, and radioactive decay
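A standard way to simulate a Poisson process is to draw exponential inter-arrival times and accumulate them until the time horizon is reached, as sketched below. NumPy is assumed; the rate, horizon, and seed are illustrative values.

```python
import numpy as np

# Homogeneous Poisson process on [0, T]: inter-arrival times are i.i.d.
# Exponential(rate), and event times are their cumulative sums.
# rate, T, and the seed are illustrative choices.
rng = np.random.default_rng(seed=3)
rate, T = 2.0, 10.0

arrival_times = []
t = rng.exponential(1.0 / rate)           # waiting time until the first event
while t <= T:
    arrival_times.append(t)
    t += rng.exponential(1.0 / rate)      # add the next inter-arrival time

print(len(arrival_times))                 # number of events in [0, T]; its mean is rate*T
```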
Brownian motion
- Brownian motion, also known as the Wiener process, models the random motion of particles suspended in a fluid
- It is a continuous-time, continuous-state process with independent, normally distributed increments
- Brownian motion has applications in physics, financial modeling (stock prices), and stochastic calculus
- Geometric Brownian motion is a variant used to model asset prices in the Black-Scholes model
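Brownian motion can be simulated on a grid by summing independent Gaussian increments, and geometric Brownian motion is then obtained by exponentiating a drifted copy, as sketched below. NumPy is assumed; the drift, volatility, initial price, grid size, and seed are illustrative values, not calibrated to any market.

```python
import numpy as np

# Brownian motion on a time grid: independent increments ~ N(0, dt), cumulatively summed.
# mu, sigma, S0, T, n_steps, and the seed are illustrative choices.
rng = np.random.default_rng(seed=4)
T, n_steps = 1.0, 1000
dt = T / n_steps
t = np.linspace(0.0, T, n_steps + 1)

dW = rng.normal(0.0, np.sqrt(dt), size=n_steps)
W = np.concatenate(([0.0], np.cumsum(dW)))               # standard Brownian path W_t

mu, sigma, S0 = 0.05, 0.2, 100.0
S = S0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * W)   # geometric Brownian motion S_t

print(W[-1], S[-1])   # terminal values of the two paths
```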
Markov chains
- Markov chains are discrete-time stochastic processes with the Markov property
- The state space is typically discrete (countable), though the term is sometimes extended to general state spaces
- Transition probabilities govern the movement between states
- Applications include modeling weather patterns, machine learning algorithms, and customer behavior in marketing
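The sketch below simulates a discrete-time Markov chain by drawing each next state from the transition-matrix row of the current state. NumPy is assumed; the two-state "sunny"/"rainy" matrix, run length, and seed are made up for illustration.

```python
import numpy as np

# Discrete-time Markov chain: the next state is drawn from the row of the
# transition matrix corresponding to the current state.
# The two-state "sunny"/"rainy" matrix, run length, and seed are illustrative.
rng = np.random.default_rng(seed=5)
P = np.array([[0.9, 0.1],    # sunny -> sunny / rainy
              [0.5, 0.5]])   # rainy -> sunny / rainy

state, n_steps = 0, 10_000
states = [state]
for _ in range(n_steps):
    state = rng.choice(2, p=P[state])   # depends only on the current state
    states.append(state)

print(np.mean(np.array(states) == 1))   # long-run fraction of "rainy" days
```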
Stationarity of stochastic processes
- Stationarity means that the statistical characteristics of a stochastic process do not change under shifts in time
Strict vs wide-sense stationarity
- Strict stationarity: the joint probability distribution of the process is invariant under time shifts
- $P(X_{t_1}, X_{t_2}, ..., X_{t_n}) = P(X_{t_1+\tau}, X_{t_2+\tau}, ..., X_{t_n+\tau})$ for any time points $t_1, t_2, ..., t_n$ and time shift $\tau$
- Wide-sense (weak) stationarity: only the mean and covariance of the process are invariant under time shifts
- $E[X_t] = \mu$ (constant mean) and $Cov(X_t, X_{t+\tau}) = R(\tau)$ (covariance depends only on the time lag $\tau$)
- Strict stationarity implies wide-sense stationarity, but the converse is not always true
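Wide-sense stationarity can be probed empirically: the sample mean and the lag-$k$ sample covariance should look the same no matter where in time they are estimated. The sketch below does this for a simulated stationary AR(1) process. NumPy is assumed; the coefficient, run length, window positions, and seed are illustrative.

```python
import numpy as np

# Rough empirical check of wide-sense stationarity for a stationary AR(1) process:
# the sample mean and lag-k covariance should not depend on where they are estimated.
# phi, n, the window positions, and the seed are illustrative choices.
rng = np.random.default_rng(seed=6)
phi, n = 0.6, 200_000
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

def lag_cov(segment, k):
    """Sample covariance between a segment and itself shifted by lag k."""
    return np.cov(segment[:-k], segment[k:])[0, 1]

early, late = x[1_000:51_000], x[100_000:150_000]   # two widely separated windows
print(early.mean(), late.mean())                    # both close to the constant mean 0
print(lag_cov(early, 5), lag_cov(late, 5))          # both close to phi**5 / (1 - phi**2)
```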
Stationary increments
- A process has stationary increments if the distribution of the increments $(X_{t+\tau} - X_t)$ depends only on the time lag $\tau$, not on the starting time $t$
- Brownian motion and Poisson processes have stationary increments
- Processes with stationary increments are not necessarily stationary themselves
Ergodicity
- Ergodicity is a property of stationary processes that goes beyond stationarity itself
- In an ergodic process, the time average of a single realization converges to the ensemble average as the time horizon increases
- Ergodicity allows estimating statistical properties (mean, variance) from a single long realization of the process
- Many stationary processes, such as stationary Markov chains, are also ergodic
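The sketch below illustrates ergodicity for a two-state Markov chain: the fraction of time one long realization spends in state 1 is compared with the probability of being in state 1 at a large fixed time, estimated from many independent short runs. NumPy is assumed; the transition matrix, run lengths, and seed are illustrative.

```python
import numpy as np

# Ergodicity illustration: time average along one long path vs. ensemble average
# over many independent paths, for a two-state Markov chain.
# The transition matrix, run lengths, and seed are illustrative.
rng = np.random.default_rng(seed=7)
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

def run_chain(n_steps, start=0):
    state, visits_to_1 = start, 0
    for _ in range(n_steps):
        state = rng.choice(2, p=P[state])
        visits_to_1 += (state == 1)
    return state, visits_to_1 / n_steps

# Time average of "being in state 1" along a single long realization.
_, time_avg = run_chain(100_000)

# Ensemble average: state at time 100, averaged over many independent realizations.
ensemble_avg = np.mean([run_chain(100)[0] for _ in range(1_000)])

print(time_avg, ensemble_avg)   # both approach the stationary probability 1/3
```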
Sample paths of stochastic processes
- A sample path, also known as a realization or trajectory, is a single instance of a stochastic process over time
Realizations and trajectories
- Each realization represents a possible outcome of the random process
- Sample paths are functions of time, denoted as $X(\omega, t)$, where $\omega$ represents a particular outcome and $t$ is the time index
- Different realizations can have different shapes and properties, depending on the underlying probability distribution
Continuity of sample paths
- Sample paths can be either continuous or discontinuous
- Continuous sample paths have no jumps or discontinuities
- Examples: Brownian motion, Ornstein-Uhlenbeck process
- Discontinuous sample paths have jumps or discontinuities
- Examples: Poisson processes, compound Poisson processes
- The continuity of sample paths affects the mathematical tools used for analysis and simulation
Differentiability of sample paths
- Some continuous sample paths may be differentiable, while others are not
- Brownian motion has continuous but almost surely nowhere differentiable sample paths
- The Ornstein-Uhlenbeck process, being driven by Brownian motion, also has continuous but nowhere differentiable sample paths; smoother processes, such as the time integral of Brownian motion, do have differentiable paths
- Path (non-)differentiability matters for stochastic calculus: because Brownian paths are not differentiable, ordinary calculus fails and Itô's lemma is needed
Filtrations and adapted processes
- Filtrations and adapted processes are fundamental concepts in the study of stochastic processes, particularly in the context of martingales and stochastic calculus
Information accumulation over time
- A filtration $\{\mathcal{F}_t\}_{t \geq 0}$ is an increasing sequence of $\sigma$-algebras that represents the accumulation of information over time
- $\mathcal{F}_t$ contains all the information available up to time $t$
- The filtration satisfies $\mathcal{F}_s \subseteq \mathcal{F}_t$ for all $s \leq t$
- Examples: natural filtration (generated by the process itself), Brownian filtration (generated by a Brownian motion)
Adapted vs predictable processes
- A stochastic process $\{X_t\}_{t \geq 0}$ is adapted to a filtration $\{\mathcal{F}_t\}_{t \geq 0}$ if $X_t$ is $\mathcal{F}_t$-measurable for all $t \geq 0$
- Intuitively, the value of $X_t$ is known at time $t$ based on the information in $\mathcal{F}_t$
- A process is predictable if $X_t$ is $\mathcal{F}_{t-}$-measurable for all $t \geq 0$, where $\mathcal{F}_{t-}$ is the information available strictly before time $t$
- Adapted processes are more common and include Brownian motion, Poisson processes, and Itô processes
Martingales
- A martingale is an adapted process $\{M_t\}_{t \geq 0}$ that satisfies the conditional expectation property:
- $E[M_t | \mathcal{F}_s] = M_s$ for all $s \leq t$
- Intuitively, the best prediction of a future value of a martingale, given the current information, is the current value itself
- Martingales are essential in stochastic calculus, financial mathematics, and the study of fair games
- Examples: Brownian motion, the compensated Poisson process, and stochastic integrals with respect to Brownian motion (under suitable integrability conditions); a quick Monte Carlo check is sketched below
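As a numerical illustration, the sketch below checks the martingale property of the compensated Poisson process $M_t = N_t - \lambda t$: because its increments are independent of the past with mean zero, the Monte Carlo average of $M_t - M_s$ should be close to zero. NumPy is assumed; $\lambda$, $s$, $t$, the number of paths, and the seed are arbitrary illustrative values.

```python
import numpy as np

# Monte Carlo check of the martingale property for the compensated Poisson
# process M_t = N_t - lambda*t: increments are independent of the past with
# mean zero, so the average of M_t - M_s over many paths should be near 0.
# lam, s, t, n_paths, and the seed are illustrative choices.
rng = np.random.default_rng(seed=8)
lam, s, t, n_paths = 3.0, 1.0, 2.5, 100_000

# N_t - N_s ~ Poisson(lam * (t - s)), independently for each simulated path.
increments_N = rng.poisson(lam * (t - s), size=n_paths)
increments_M = increments_N - lam * (t - s)      # compensated increments M_t - M_s

print(increments_M.mean())   # close to 0, consistent with E[M_t | F_s] = M_s
```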