Document Details


Uploaded by SleekQuartz

Stanford State University

Junmark A. Sagosoy, Rosie V. Vasquez

Tags

probability theory, mathematics, statistics, probability

Summary

This document is a review of probability theory, covering fundamental concepts such as experiments, events, sample spaces, and probability measures. It also explains the different types of probability (classical, empirical, subjective), conditional probability, and independence, then discusses random variables, probability distributions, expectation, variance, common probability distributions (binomial, normal, Poisson), the law of large numbers, and the central limit theorem, before summarizing applications in domains such as statistics, finance, machine learning, and physics.

Full Transcript


Review on Probability Theory

Introduction

Probability theory is a branch of mathematics that deals with the study of randomness and uncertainty. It provides a formal framework for quantifying the likelihood of different outcomes in various situations, from simple games of chance to complex real-world phenomena.

Fundamental Concepts

The building blocks are the experiment, the sample space (S), events, and the probability measure (P).

Experiment: A procedure that yields one of a set of possible outcomes. For example, rolling a die or flipping a coin are experiments.

Sample Space (S): The set of all possible outcomes of an experiment. For instance, the sample space for a die roll is S = {1, 2, 3, 4, 5, 6}.

Event: A subset of the sample space. An event occurs if the outcome of an experiment belongs to the corresponding subset. For example, getting an even number when rolling a die can be represented as the event E = {2, 4, 6}.

Probability Measure (P): A function that assigns a number between 0 and 1 to an event, representing the likelihood of that event. The probability of an event E is denoted P(E), where 0 ≤ P(E) ≤ 1. If P(E) = 0, the event is impossible; if P(E) = 1, the event is certain.

Axioms of Probability

The foundations of probability theory are based on three axioms introduced by Andrey Kolmogorov:

1. Non-negativity: For any event E, P(E) ≥ 0.
2. Normalization: The probability of the sample space S is 1, i.e., P(S) = 1.
3. Additivity: If E1 and E2 are two mutually exclusive events (events that cannot occur simultaneously), then the probability of either event occurring is the sum of their probabilities: P(E1 ∪ E2) = P(E1) + P(E2).

Types of Probability

Classical Probability: Based on equally likely outcomes. For example, the probability of rolling a 3 with a fair die is 1/6.

Empirical Probability: Based on observations or experiments. If an event E occurs f times out of n trials, its empirical probability is P(E) = f/n.

Subjective Probability: Based on personal belief or opinion. It is often used in situations where classical or empirical probability is hard to determine.

Conditional Probability and Independence

Conditional Probability: The probability of an event A occurring given that another event B has occurred is denoted P(A|B) and is calculated as P(A|B) = P(A ∩ B) / P(B), provided P(B) > 0.

Independence: Two events A and B are said to be independent if the occurrence of one does not affect the probability of the occurrence of the other, i.e., P(A ∩ B) = P(A) · P(B).

Random Variables

Random Variable (X): A variable that takes numerical values based on the outcome of a random phenomenon. There are two types:

Discrete Random Variables: Can take a finite or countable number of values (e.g., the number of heads in 10 coin flips).

Continuous Random Variables: Can take an infinite number of values within a given range (e.g., the height of a randomly selected person).

Probability Distribution: Describes how probabilities are distributed over the values of the random variable. For a discrete random variable, the probability mass function (PMF) P(X = x) gives the probability that X takes the value x. For a continuous random variable, the probability density function (PDF) f(x) describes the density of probability at x.

Expectation and Variance

Expected Value (Mean): The average value of a random variable X in the long run. For a discrete random variable it is calculated as E[X] = Σ x · P(X = x), summing over all values x that X can take.

Variance: Measures the spread or dispersion of a random variable around its mean μ = E[X]. For a discrete random variable: Var(X) = Σ (x - μ)² · P(X = x).
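To make the two discrete formulas above concrete, here is a minimal Python sketch (an illustration added to this review, not part of the original slides); the fair-die example and the variable names are assumptions chosen only for demonstration.

# Expected value and variance of a discrete random variable, computed from its PMF.
# Example: a fair six-sided die, where each outcome 1..6 has probability 1/6.
pmf = {x: 1 / 6 for x in range(1, 7)}

# E[X] = sum over x of x * P(X = x)
mean = sum(x * p for x, p in pmf.items())

# Var(X) = sum over x of (x - E[X])^2 * P(X = x)
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())

print(f"E[X]   = {mean:.4f}")      # 3.5000
print(f"Var(X) = {variance:.4f}")  # 2.9167

The two sums are exactly E[X] = Σ x · P(X = x) and Var(X) = Σ (x - μ)² · P(X = x) as stated above, applied to a uniform PMF over {1, ..., 6}.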
Common Probability Distributions

Binomial Distribution: Describes the number of successes in a fixed number of independent trials, each with the same probability of success p. The probability of exactly k successes in n trials is given by P(X = k) = C(n, k) · p^k · (1 - p)^(n - k), where C(n, k) = n! / (k! (n - k)!).

Normal Distribution: Also known as the Gaussian distribution, it describes a continuous random variable with a symmetric, bell-shaped curve. It is characterized by its mean μ and variance σ², with density f(x) = (1 / (σ√(2π))) · e^(-(x - μ)² / (2σ²)).

Poisson Distribution: Describes the number of events occurring in a fixed interval of time or space, given that these events happen with a known constant mean rate λ and independently of the time since the last event. The probability of k events is P(X = k) = (λ^k · e^(-λ)) / k!.

The Law of Large Numbers and the Central Limit Theorem

Law of Large Numbers: States that as the number of trials of a random experiment increases, the sample mean converges to the expected value (mean) of the random variable.

Central Limit Theorem (CLT): States that the sum (or average) of a large number of independent and identically distributed random variables, each with finite mean and variance, is approximately normally distributed, regardless of the original distribution. A simulation sketch illustrating both results appears at the end of this transcript.

Applications of Probability Theory

Probability theory has applications in various domains:

Statistics: Inference, hypothesis testing, regression analysis, and Bayesian statistics are all grounded in probability theory.

Finance: Pricing of financial derivatives, risk management, and portfolio optimization rely on probabilistic models.

Machine Learning and AI: Probabilistic models underpin Bayesian inference, classification, and decision-making under uncertainty.

Physics and Engineering: Quantum mechanics, statistical mechanics, and reliability engineering are deeply rooted in probability.

Thank you!
Reporters: Junmark A. Sagosoy and Rosie V. Vasquez
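As a closing illustration of the Law of Large Numbers and the Central Limit Theorem summarized above, the following minimal Python sketch (also an addition to this review, assuming NumPy is available; the exponential source distribution, the random seed, and the sample sizes are arbitrary illustrative choices) simulates sample means and checks that they concentrate around the true mean and spread roughly as the CLT predicts.

import numpy as np

rng = np.random.default_rng(0)

# Source distribution: exponential with mean 1 (and variance 1).
# It is clearly non-normal, which makes the CLT effect visible.
true_mean = 1.0

# Law of Large Numbers: the sample mean approaches the true mean
# as the number of trials grows.
for n in (10, 1_000, 100_000):
    sample_mean = rng.exponential(scale=1.0, size=n).mean()
    print(f"n = {n:>6}: sample mean = {sample_mean:.4f} (true mean = {true_mean})")

# Central Limit Theorem: the distribution of many independent sample means
# (each over n = 30 draws) is approximately normal with mean 1 and
# standard deviation 1 / sqrt(30).
n, repeats = 30, 10_000
means = rng.exponential(scale=1.0, size=(repeats, n)).mean(axis=1)
print(f"mean of sample means      = {means.mean():.4f} (CLT predicts {true_mean})")
print(f"std. dev. of sample means = {means.std():.4f} (CLT predicts {1 / np.sqrt(n):.4f})")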
