Epithelium

Questions and Answers

What is the primary function of cilia found on the apical surface of epithelial cells?

  • Providing structural support to the cell.
  • Aiding in absorption and secretion.
  • Moving substances across the cell surface. (correct)
  • Secreting substances into the bloodstream.

Which of the following is a specialized form of epithelium involved in the secretion of substances?

  • Glandular epithelium (correct)
  • Ciliated epithelium
  • Transitional epithelium
  • Squamous epithelium

What structural feature separates epithelium from the underlying connective tissue?

  • Adherens junctions
  • Basement membrane (correct)
  • Gap junctions
  • Desmosomes

Which characteristic is used to classify epithelia?

  • Simple (A) (correct)

What shape are squamous epithelial cells?

  • Squamous (flat, pavement-like) (A) (correct)

Which type of epithelial tissue specializes in moving particles across its surface and is found in airways and lining of the oviduct?

  • Pseudostratified ciliated columnar (A) (correct)

Which of the following is NOT a primary function of epithelium?

  • Integument (D) (correct)

Which is NOT a primary characteristic of epithelium?

  • Vascular (B) (correct)

When observing epithelial cells under a microscope, the cells are arranged in a single layer and look tall and narrow, with the nucleus located close to the basal side of the cell. What type of epithelial tissue is the specimen?

  • Columnar (A) (correct)

Which of the following is the epithelial tissue that lines the interior of blood vessels?

  • Simple Squamous (C) (correct)

What epithelium lines the oral cavity?

  • Non-keratinized stratified squamous epithelium (C) (correct)

What type of epithelium covers the hard palate, the tongue, and the external surface of the gingiva?

  • Keratinized stratified squamous epithelium (D) (correct)

What type of epithelium is found within salivary glands?

  • Simple cuboidal or columnar in acinar regions and stratified in ductal regions (A) (correct)

What membrane is found inside the oral cavity, and where is its lubricant formed?

  • Mucous membrane; lubricant secreted by the salivary glands (C) (correct)

What is the key structural difference between endocrine and exocrine glands?

  • Endocrine glands release hormones directly into the bloodstream, while exocrine glands secrete substances through ducts. (D) (correct)

Which of the following organs functions as both an endocrine and exocrine gland?

  • The pancreas, releasing hormones and producing digestive enzymes (B) (correct)

A group of organized _____ working together forms _____ . An organized group of the latter work together to form _____ .

  • cells; tissue; organs (C) (correct)

What type of tissue covers body surfaces, lines body cavities, and forms parts of most glands?

  • Epithelial tissue (B) (correct)

Microvilli best fit which description?

  • Small finger-like projections that aid in absorption and secretion (A) (correct)

What is a common tissue that endocrine and exocrine glands are composed of?

  • Epithelial tissue (B) (correct)

Flashcards

Cilia

Helps to move substances across the cell surface.

Microvilli

Small finger-like projections that aid in absorption and secretion.

Glandular epithelium

Involved in secretion of substances.

Endocrine glands

Do not contain ducts; release hormones directly into the bloodstream.


Exocrine glands

Contain ducts to carry secretion to the surface of an organ or body cavity.


Ciliated epithelium

Hairlike structures on the surface to help move substances. Aids in trapping and moving particles out of the airways.


Epithelium-connective tissue separation

Basement membrane (basal lamina)

Columnar Epithelial Tissue

Cells are arranged in a single layer and look tall and narrow, the nucleus is located close to the basal side of the cell.


Simple Squamous

Epithelial tissue that lines the interior of blood vessels


Functions of epithelium

Protection, transportation, excretion, absorption, sensation


Characteristics of epithelium

The most cellular tissue in the body; cells tightly packed together; avascular; rapid cell division; regenerative


Non-keratinised stratified squamous epithelium

Lines the oral cavity.

Keratinised stratified squamous epithelium

Covers the hard palate and the tongue and the external surface of the gingiva


Epithelium found in acinar regions

Simple cuboidal or columnar in acinar regions responsible for secretion


Epithelium found in ductal regions

Stratified epithelium in ductal regions: passages that transport secretions from the acinar regions

Membrane found inside the oral cavity

The mucous membrane; its lubricant is secreted by the salivary glands.

Difference between an endocrine gland and an exocrine gland

Endocrine glands release hormones directly into the bloodstream whereas exocrine glands secrete substances to epithelial surfaces.


Epithelial tissue

Tissue that covers a body surface or lines a body cavity and forms parts of most glands

Three features used to classify epithelia

Number of cell layers (e.g. simple vs stratified), cell shape (e.g. squamous), and specialization (e.g. transitional)

Shapes of epithelial cells

Squamous (flat, pavement-like), Cuboidal (square-shaped with roughly equal length and height), Columnar (long, rectangular)

Study Notes

Review of Estimators

  • Bayes Estimator formula: $\hat{\theta}_{B} = \mathbb{E}[\theta|X]$
  • Maximum a posteriori (MAP) estimator formula: $\hat{\theta}_{MAP} = \arg\max_{\theta} f_{\theta|X}(\theta|X) = \arg\max_{\theta} f_{X|\theta}(X|\theta) \cdot \pi(\theta)$
  • Maximum Likelihood Estimator (MLE) formula: $\hat{\theta}_{MLE} = \arg\max_{\theta} f_{X|\theta}(X|\theta)$

Example Estimations

  • Assume $X_{1},...,X_{n}$ are independent and identically distributed following a Bernoulli distribution $Ber(p)$
  • With prior distribution: $p \sim Beta(\alpha, \beta)$
  • Posterior distribution: $p|X \sim Beta(\alpha + \sum_{i=1}^{n}X_{i}, \beta + n - \sum_{i=1}^{n}X_{i})$
  • Bayes Estimator: $\hat{p}_{B} = \mathbb{E}[p|X] = \frac{\alpha + \sum_{i=1}^{n}X_{i}}{\alpha + \beta + n}$
  • MAP Estimator: $\hat{p}_{MAP} = \arg\max_{p} f_{X|p}(X|p) \cdot \pi(p) = \frac{\alpha - 1 + \sum_{i=1}^{n}X_{i}}{\alpha + \beta - 2 + n}$
  • MLE Estimator: $\hat{p}_{MLE} = \frac{\sum_{i=1}^{n}X_{i}}{n}$
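The three estimators can be compared numerically. A minimal sketch in Python; the data and the Beta hyperparameters are made up for illustration:

```python
# Bayes, MAP, and ML estimates for a Bernoulli parameter under a Beta prior.
alpha, beta_ = 2.0, 2.0          # Beta(alpha, beta) prior (assumed values)
X = [1, 0, 1, 1, 0, 1, 1, 1]     # i.i.d. Bernoulli(p) observations (made up)
n, s = len(X), sum(X)            # sample size and number of successes

p_bayes = (alpha + s) / (alpha + beta_ + n)          # posterior mean
p_map   = (alpha - 1 + s) / (alpha + beta_ - 2 + n)  # posterior mode
p_mle   = s / n                                      # sample frequency

print(p_bayes, p_map, p_mle)
```

With a symmetric prior centered at 0.5, both Bayesian estimates are pulled slightly toward 0.5 relative to the MLE.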

Conjugate Prior Definition

  • Prior and posterior are in the same family
  • Beta prior is a conjugate prior for the Bernoulli likelihood.

Conjugate Prior Example

  • $\hat{p}_{B} = \frac{\alpha + \sum_{i=1}^{n}X_{i}}{\alpha + \beta + n} = \frac{n}{\alpha + \beta + n} \cdot \hat{p}_{MLE} + \frac{\alpha + \beta}{\alpha + \beta + n} \cdot \frac{\alpha}{\alpha + \beta}$
  • $\hat{p}_{B} = w \cdot \hat{p}_{MLE} + (1-w) \cdot \mathbb{E}[p]$
  • Where $w = \frac{n}{\alpha + \beta + n}$
  • As $n \to \infty$, $w \to 1$, therefore $\hat{p}_{B} \to \hat{p}_{MLE}$

Gaussian Estimation Example 1

  • Where $X_{1},...,X_{n}$ are independent and identically distributed following a Normal distribution $N(\mu, \sigma^{2})$
  • $\sigma^{2}$ is known.
  • Prior: $\mu \sim N(\mu_{0}, \tau^{2})$
  • Likelihood: $f_{X|\mu}(X|\mu) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^{2}}}e^{-\frac{(X_{i} - \mu)^{2}}{2\sigma^{2}}}$

Gaussian Posterior

  • $f_{\mu|X}(\mu|X) \propto f_{X|\mu}(X|\mu) \cdot \pi(\mu) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^{2}}}e^{-\frac{(X_{i} - \mu)^{2}}{2\sigma^{2}}} \cdot \frac{1}{\sqrt{2\pi\tau^{2}}}e^{-\frac{(\mu - \mu_{0})^{2}}{2\tau^{2}}}$
  • $f_{\mu|X}(\mu|X) \propto \exp\{ -\frac{1}{2} [\sum_{i=1}^{n}\frac{(X_{i} - \mu)^{2}}{\sigma^{2}} + \frac{(\mu - \mu_{0})^{2}}{\tau^{2}}] \}$
  • $= \exp\{ -\frac{1}{2} [\mu^{2}(\frac{n}{\sigma^{2}} + \frac{1}{\tau^{2}}) - 2\mu(\frac{\sum_{i=1}^{n}X_{i}}{\sigma^{2}} + \frac{\mu_{0}}{\tau^{2}}) + \sum_{i=1}^{n}\frac{X_{i}^{2}}{\sigma^{2}} + \frac{\mu_{0}^{2}}{\tau^{2}}] \}$
  • Completing the square with $\mu_{n} = \frac{\frac{\sum_{i=1}^{n}X_{i}}{\sigma^{2}} + \frac{\mu_{0}}{\tau^{2}}}{\frac{n}{\sigma^{2}} + \frac{1}{\tau^{2}}}$: $= \exp\{ -\frac{1}{2}(\frac{n}{\sigma^{2}} + \frac{1}{\tau^{2}})[(\mu - \mu_{n})^{2} - \mu_{n}^{2}] - \frac{1}{2}[\sum_{i=1}^{n}\frac{X_{i}^{2}}{\sigma^{2}} + \frac{\mu_{0}^{2}}{\tau^{2}}] \}$
  • $\propto \exp\{ -\frac{1}{2} (\frac{n}{\sigma^{2}} + \frac{1}{\tau^{2}}) (\mu - \mu_{n})^{2} \}$
  • Final form: $\mu|X \sim N(\mu_{n}, \sigma_{n}^{2})$
    • Where $\mu_{n} = \frac{(\frac{\sum_{i=1}^{n}X_{i}}{\sigma^{2}} + \frac{\mu_{0}}{\tau^{2}})}{(\frac{n}{\sigma^{2}} + \frac{1}{\tau^{2}})} = \frac{n\tau^{2}\bar{X} + \sigma^{2}\mu_{0}}{n\tau^{2} + \sigma^{2}}$
    • And $\sigma_{n}^{2} = (\frac{n}{\sigma^{2}} + \frac{1}{\tau^{2}})^{-1} = \frac{\sigma^{2}\tau^{2}}{n\tau^{2} + \sigma^{2}}$

Bayes Estimator of Gaussian

  • $\hat{\mu}_{B} = \mathbb{E}[\mu|X] = \mu_{n} = \frac{n\tau^{2}}{n\tau^{2} + \sigma^{2}}\bar{X} + \frac{\sigma^{2}}{n\tau^{2} + \sigma^{2}}\mu_{0}$
  • $= w\bar{X} + (1-w)\mu_{0}$
  • $w = \frac{n\tau^{2}}{n\tau^{2} + \sigma^{2}}$
  • $\hat{\mu}_{MAP} = \hat{\mu}_{B}$
  • $\hat{\mu}_{MLE} = \bar{X}$

Variance of Gaussian

  • $\sigma_{n}^{2} = \frac{\sigma^{2}\tau^{2}}{n\tau^{2} + \sigma^{2}}$
  • As $n \to \infty$, $\sigma_{n}^{2} \to 0$
  • As $\tau^{2} \to \infty$, $\sigma_{n}^{2} \to \frac{\sigma^{2}}{n}$
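The closed forms for $\mu_{n}$ and $\sigma_{n}^{2}$ can be checked numerically. A sketch; the prior parameters and the sample are illustrative assumptions:

```python
# Posterior mean and variance for mu under a N(mu0, tau^2) prior
# with known sigma^2, following the closed forms above.
mu0, tau2 = 0.0, 4.0      # prior mean and variance (assumed)
sigma2 = 1.0              # known observation variance (assumed)
X = [1.2, 0.8, 1.0, 1.4]  # made-up sample
n = len(X)
xbar = sum(X) / n

w = n * tau2 / (n * tau2 + sigma2)           # shrinkage weight
mu_n = w * xbar + (1 - w) * mu0              # posterior mean (Bayes = MAP estimate)
var_n = sigma2 * tau2 / (n * tau2 + sigma2)  # posterior variance

print(mu_n, var_n)
```

The posterior mean sits between the sample mean and the prior mean, and the posterior variance shrinks as $n$ grows.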

Gaussian Estimation Example 2

  • Where $X_{1},...,X_{n}$ are independent and identically distributed following a Normal distribution $N(\mu, \sigma^{2})$
  • $\mu$ is known.
  • Prior: $\frac{1}{\sigma^{2}} \sim Gamma(\alpha, \beta)$
  • $f_{X|\sigma^{2}}(X|\sigma^{2}) = \prod_{i=1}^{n}\frac{1}{\sqrt{2\pi\sigma^{2}}}e^{-\frac{(X_{i} - \mu)^{2}}{2\sigma^{2}}}$
  • $= (\frac{1}{2\pi\sigma^{2}})^{\frac{n}{2}} \exp\{ -\frac{1}{2\sigma^{2}}\sum_{i=1}^{n}(X_{i}-\mu)^{2} \}$
  • $\pi(\frac{1}{\sigma^{2}}) = \frac{\beta^{\alpha}}{\Gamma(\alpha)} (\frac{1}{\sigma^{2}})^{\alpha - 1} e^{-\frac{\beta}{\sigma^{2}}}$

Gaussian Posterior

  • $f_{\sigma^{2}|X}(\sigma^{2}|X) \propto (\frac{1}{\sigma^{2}})^{\frac{n}{2}} \exp\{ -\frac{1}{2\sigma^{2}}\sum_{i=1}^{n}(X_{i} - \mu)^{2} \} \cdot (\frac{1}{\sigma^{2}})^{\alpha - 1} e^{-\frac{\beta}{\sigma^{2}}}$
  • $= (\frac{1}{\sigma^{2}})^{\alpha + \frac{n}{2} - 1} \exp\{ -\frac{1}{\sigma^{2}} (\beta + \frac{1}{2}\sum_{i=1}^{n}(X_{i} - \mu)^{2}) \}$
  • $\frac{1}{\sigma^{2}}|X \sim Gamma(\alpha + \frac{n}{2}, \beta + \frac{1}{2}\sum_{i=1}^{n}(X_{i} - \mu)^{2})$
  • $E[\frac{1}{\sigma^{2}}|X] = \frac{\alpha + \frac{n}{2}}{\beta + \frac{1}{2}\sum_{i=1}^{n}(X_{i} - \mu)^{2}}$

Estimators of Gaussian

  • Bayes Estimator: $\hat{\sigma^{2}}_{B} = \mathbb{E}[\sigma^{2}|X]$; since $\frac{1}{\sigma^{2}}|X$ is Gamma, $\sigma^{2}|X$ follows an inverse-gamma distribution, so $\mathbb{E}[\sigma^{2}|X] = \frac{\beta + \frac{1}{2}\sum_{i=1}^{n}(X_{i} - \mu)^{2}}{\alpha + \frac{n}{2} - 1}$ (for $\alpha + \frac{n}{2} > 1$)
  • MAP Estimator: the posterior mode of the precision is $\frac{\alpha + \frac{n}{2} - 1}{\beta + \frac{1}{2}\sum_{i=1}^{n}(X_{i} - \mu)^{2}}$, since a $Gamma(a, b)$ density $\propto \lambda^{a-1}e^{-b\lambda}$ peaks at $\lambda = \frac{a-1}{b}$

Linear Momentum

  • The linear momentum of a particle of mass $m$ moving with a velocity $\vec{v}$ is defined as the product of the mass and velocity: $\vec{p} = m\vec{v}$
    • It is a vector quantity.
    • Its direction is the same as the direction of $\vec{v}$.
    • The dimensions of momentum are $ML/T$.
    • The SI units of momentum are kg m/s.

Newton's Second Law and Momentum

  • $\sum \vec{F} = \frac{d\vec{p}}{dt} = m\vec{a}$
  • "The time rate of change of the momentum of a particle is equal to the resultant force acting on the particle."

Impulse and Momentum

  • $\int_{t_i}^{t_f} \sum \vec{F} dt = \int_{t_i}^{t_f} \frac{d\vec{p}}{dt} dt = \int_{\vec{p}_i}^{\vec{p}_f} d\vec{p} = \vec{p}_f - \vec{p}_i = \Delta \vec{p}$
  • Impulse $\vec{I}$ of the resultant force $\sum \vec{F}$ acting on the particle: $\vec{I} = \int_{t_i}^{t_f} \sum \vec{F} dt = \Delta \vec{p}$
  • "The impulse of the resultant force acting on a particle during a time interval equals the change in momentum of the particle during that interval."

Impulse Approximation

  • $\vec{I} = \int_{t_i}^{t_f} \sum \vec{F} dt \approx \int_{t_i}^{t_f} \vec{F} dt$
  • $\Delta \vec{p} = \vec{p}_f - \vec{p}_i = \vec{I}$

Average Force

  • $\vec{F}_{avg} = \frac{1}{\Delta t} \int_{t_i}^{t_f} \sum \vec{F} dt = \frac{\vec{I}}{\Delta t}$
  • $\vec{I} = \vec{F}_{avg} \Delta t$
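A quick numerical illustration of the impulse and average-force relations above; the mass, velocities, and contact time are made up:

```python
# Impulse-momentum sketch: a 0.5 kg ball hits a wall at 4 m/s and rebounds
# at 3 m/s over a 0.02 s contact time (illustrative numbers).
m = 0.5
v_i, v_f = -4.0, 3.0          # 1-D velocities; the sign encodes direction
dt = 0.02

impulse = m * v_f - m * v_i   # I = delta_p = p_f - p_i
F_avg = impulse / dt          # I = F_avg * delta_t

print(impulse, F_avg)         # kg m/s and N
```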

Two-Particle System: Momentum Conservation

  • $\vec{F}_{12} = - \vec{F}_{21}$
  • $\sum \vec{F} = \vec{F}_{12} + \vec{F}_{21} = 0$
  • $\sum \vec{F} = \frac{d}{dt} (\vec{p}_1 + \vec{p}_2) = 0$
  • $\vec{p}_{1} + \vec{p}_{2} = \text{constant}$
  • $\vec{p}_{1i} + \vec{p}_{2i} = \vec{p}_{1f} + \vec{p}_{2f}$
  • "The total momentum of an isolated system is conserved."
  • System is isolated if the net external force on the system is zero.

Conservation of Linear Momentum Equation

  • $\vec{P} = \vec{p}_1 + \vec{p}_2 +... + \vec{p}_n = \text{constant}$
  • $\vec{P}_i = \vec{P}_f$
  • $\sum_{i=1}^{n} m_i \vec{v}_{i} = \text{constant}$

Collisions of Isolated System

  • Momentum is conserved in all collisions
  • Kinetic energy is generally not conserved in a collision. Choose your system carefully

Types of Collisions

  • An elastic collision is one in which both momentum and kinetic energy are conserved.
  • An inelastic collision is one in which momentum is conserved but kinetic energy is not.
    • If the objects stick together after the collision, it is a perfectly inelastic collision.

Elastic Collisions Equations

  • $m_1 v_{1i} + m_2 v_{2i} = m_1 v_{1f} + m_2 v_{2f}$
  • $\frac{1}{2} m_1 v_{1i}^2 + \frac{1}{2} m_2 v_{2i}^2 = \frac{1}{2} m_1 v_{1f}^2 + \frac{1}{2} m_2 v_{2f}^2$

Special Cases

  1. Equal masses: the particles exchange velocities; $v_{1f} = v_{2i}$ and $v_{2f} = v_{1i}$
  2. Particle 2 initially at rest:
    • $v_{1f} = (\frac{m_1 - m_2}{m_1 + m_2}) v_{1i}$
    • $v_{2f} = (\frac{2m_1}{m_1 + m_2}) v_{1i}$
      • a) If $m_1 \gg m_2$, $v_{1f} \approx v_{1i}$ and $v_{2f} \approx 2 v_{1i}$
      • b) If $m_1 \ll m_2$, $v_{1f} \approx -v_{1i}$ and $v_{2f} \approx 0$
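The special-case formulas for a target initially at rest can be sketched as:

```python
# One-dimensional elastic collision with particle 2 initially at rest,
# using the closed-form results above. Masses and speed are illustrative.
def elastic_target_at_rest(m1, m2, v1i):
    """Return (v1f, v2f) for an elastic collision with v2i = 0."""
    v1f = (m1 - m2) / (m1 + m2) * v1i
    v2f = 2 * m1 / (m1 + m2) * v1i
    return v1f, v2f

# Equal masses: the particles exchange velocities.
print(elastic_target_at_rest(1.0, 1.0, 5.0))      # (0.0, 5.0)
# m1 >> m2: the incoming particle barely slows; the light one leaves at ~2*v1i.
print(elastic_target_at_rest(1000.0, 1.0, 5.0))
```

Both momentum and kinetic energy are conserved by these formulas, which is easy to verify for any mass pair.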

Algorithmic Game Theory Vocabulary

  • $n$ players
  • Each wants to route traffic from $s_i$ to $t_i$
  • Strategy: choose a path from $s_i$ to $t_i$
  • Social Cost: makespan, average latency, etc.

Atomic vs. Non-Atomic Games

  • Non-Atomic:
    • Many tiny players.
    • Each controls a negligible amount of traffic.
    • A single player cannot noticeably affect the performance of the game.
  • Atomic:
    • Each player controls a non-negligible amount of traffic.
    • A single player can noticeably affect the performance of the game.

Cost of Anarchy (CoA)

  • Measures the extent to which non-cooperative behavior degrades the performance of a system
  • Definition: $\textrm{CoA} = \frac{\textrm{Social welfare of worst-case Nash equilibrium}}{\textrm{Optimal social welfare}}$
  • $\textrm{Price of Anarchy} = \frac{1}{\textrm{CoA}}$

Cost of Stability (CoS)

  • Measures the extent to which cooperation can improve the performance of a system
  • Definition: $\textrm{CoS} = \frac{\textrm{Social welfare of best-case Nash equilibrium}}{\textrm{Optimal social welfare}}$

Braess's Paradox

  • Adding a link to a network can hurt all players
  • A Nash equilibrium may be worse than optimal

Pigou's Example

  • Traffic rate of 1
  • Two parallel links
  • Cost on link 1: $x$
  • Cost on link 2: 1

Pigou's Example Analysis

  • In Nash equilibrium:
  • All traffic uses link 1
  • Each user pays 1, so the social cost = 1
  • Optimal solution:
  • Send $\frac{1}{2}$ of the traffic on each link
  • Social cost: $\frac{1}{2} \cdot \frac{1}{2} + \frac{1}{2} \cdot 1 = \frac{3}{4}$

Therefore, $PoA = \frac{1}{3/4} = \frac{4}{3}$ and $CoS = \frac{3/4}{1} = \frac{3}{4}$
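The equilibrium and optimal costs in Pigou's example can be verified numerically. A small sketch (the grid search is just for illustration):

```python
# Pigou's example: unit traffic, link 1 costs l1(x) = x, link 2 costs 1.
# Average social cost when a fraction x of the traffic takes link 1:
def social_cost(x):
    return x * x + (1 - x) * 1  # x users pay x each; the rest pay 1 each

nash_cost = social_cost(1.0)    # in equilibrium, all traffic takes link 1
opt_cost = min(social_cost(i / 1000) for i in range(1001))  # grid search

print(nash_cost, opt_cost, nash_cost / opt_cost)  # ratio is the PoA = 4/3
```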

Non-atomic network

  • Source node $s$
  • Destination node $t$
  • Path 1: Edge $e_1$ with cost function $l_1(x) = 1$
  • Path 2: Edge $e_2$ with cost function $l_2(x) = x$

Algorithmic Trading

  • Also known as "Algo Trading" or "Black-Box Trading"
  • A set of rules (an algorithm) that when followed generates buy and sell orders automatically, submitting them to the market.
  • Goal: Generate a profit at a speed and frequency that is impossible for a human trader.

Definition of Algorithm

  • A process or set of rules to be followed in calculations or other Problem-solving operations, especially by a computer.

Algorithmic Trading Example

  • IF
  • The 50-day moving average of stock XYZ is above the 200-day moving average…
  • THEN
  • Buy 100 shares of XYZ
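A minimal sketch of such a crossover rule, using simple moving averages over made-up prices; the window sizes are shortened for illustration and the signal logic is an assumption, not a trading recommendation:

```python
# Emit a buy signal when the short moving average crosses above the long one.
def sma(prices, window):
    """Simple moving average of the last `window` prices."""
    return sum(prices[-window:]) / window

def crossover_signal(prices, short_w=3, long_w=5):
    """Return 'BUY' on an upward short/long SMA crossover, else None."""
    if len(prices) <= long_w:
        return None  # not enough history for both averages
    prev_short = sma(prices[:-1], short_w)
    prev_long = sma(prices[:-1], long_w)
    if prev_short <= prev_long and sma(prices, short_w) > sma(prices, long_w):
        return "BUY"
    return None

print(crossover_signal([12, 11, 10, 9, 10, 13]))  # BUY
```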

Algorithmic Trading Advantages

  • Trades are executed at the best possible price.
  • Order placement is instant and accurate.
  • Trades can be timed correctly and instantly, avoiding significant price changes (e.g. around earnings announcements).
  • Reduced transaction costs.
  • Simultaneous automated checks on multiple market conditions.
  • Reduced risk of manual errors when placing trades.
  • Can be backtested to see if profitable.

Algorithmic Trading Disadvantages

  • Technology and software costs.
  • Requires monitoring.
  • Risk of system errors.
  • Requires programming and trading knowledge.

Common Algorithmic Trading Strategies

  • Trend Following Strategies
    • Moving averages
    • Channel Breakouts
    • Oscillators - MACD, RSI
  • Arbitrage
    • Exploiting tiny differences in price in identical or similar assets.
    • A risk-free profit.
  • Index Fund Rebalancing
    • Capitalize on predictable trades that take place when a fund rebalances to its benchmark.
  • Mathematical Model-Based Strategies
    • Use mathematical models, such as mean reversion, to generate trading signals and execute trades.
  • Volume-Weighted Average Price (VWAP)
    • Breaks up a large order and releases pieces of the order to the market at specific time intervals.
    • Goal: Execute the order close to the VWAP.
  • Time-Weighted Average Price (TWAP)
    • Similar to VWAP, but the order is released based on time intervals only, not volume.

High-Frequency Trading (HFT) Characteristics

  • Extremely High Speeds.
  • High Turnover Rates.

Chemical Kinetics: Rate of Reaction

For the reaction $aA + bB \rightarrow cC + dD$

$Rate = -\frac{1}{a}\frac{d[A]}{dt} = -\frac{1}{b}\frac{d[B]}{dt} = \frac{1}{c}\frac{d[C]}{dt} = \frac{1}{d}\frac{d[D]}{dt}$

Factors Affecting Reaction Rates

  1. Concentration: Increase in concentration increases the reaction rate.
  2. Temperature: Increase in temperature increases the reaction rate.
  3. Catalysis:
    • Positive Catalysis: Increases the rate of reaction.
    • Negative Catalysis: Decreases the rate of reaction.
  4. Surface Area: Increase in surface area increases the reaction rate (only for heterogeneous reactions).
  5. Radiation: Increase in radiation increases the reaction rate.

Chemical Kinetics: Rate Law

For the reaction $aA + bB \rightarrow cC + dD$, the rate law is $Rate = k[A]^x[B]^y$

  • k = Rate constant or specific rate constant
  • x = Order with respect to A
  • y = Order with respect to B
  • x + y = Overall order of reaction

Order of Reaction

  • Sum of the powers of the concentration of the reactants in the rate law expression.
  • Can be 0, fractional or integer.
  • Experimentally determined.

Molecularity

  • Number of reacting species taking part in an elementary step.
  • Always a positive integer
  • Theoretically determined.

Chemical Kinetics: Integrated Rate Law

Zero Order Reaction

$R \rightarrow P$
$Rate = -\frac{d[R]}{dt} = k[R]^0$
$[R] = [R]_0 - kt$
$t_{1/2} = \frac{[R]_0}{2k}$

Zero Order Reaction Graphs
  • [R] vs t
    • Slope = -k
    • Intercept = $[R]_0$
  • $t_{1/2}$ vs $[R]_0$
    • Slope = $\frac{1}{2k}$
    • Passes through origin

First Order Reaction

$R \rightarrow P$
$Rate = -\frac{d[R]}{dt} = k[R]^1$
$[R] = [R]_0e^{-kt}$
$k = \frac{2.303}{t}\log\frac{[R]_0}{[R]}$
$t_{1/2} = \frac{0.693}{k}$

First Order Reaction Graphs
  • $\ln[R]$ vs $t$
    • Slope = $-k$
    • Intercept = $\ln[R]_0$
  • $t_{1/2}$ vs $[R]_0$
    • Slope = 0
    • Straight line
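A small numerical sketch of the integrated first-order rate law; the rate constant and initial concentration are assumed values:

```python
import math

# First-order decay: [R] = [R]0 * exp(-k t), half-life t_1/2 = 0.693 / k.
k = 0.05   # rate constant in 1/s (assumed)
R0 = 1.0   # initial concentration in mol/L (assumed)

def conc(t):
    """Concentration at time t from the integrated first-order rate law."""
    return R0 * math.exp(-k * t)

t_half = 0.693 / k
print(t_half, conc(t_half))  # after one half-life, [R] is about [R]0 / 2
```

Unlike the zero-order case, the half-life here does not depend on the initial concentration, matching the flat $t_{1/2}$ vs $[R]_0$ graph above.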

Pseudo First Order Reaction

Reactions which are not truly of the first order but under certain conditions become reactions of the first order.

Example: Acid hydrolysis of ethyl acetate
$CH_3COOC_2H_5 + H_2O \xrightarrow{H^+} CH_3COOH + C_2H_5OH$
$Rate = k[CH_3COOC_2H_5][H_2O]$
If $[H_2O]$ is taken in large excess, $Rate = k'[CH_3COOC_2H_5]$, where $k' = k[H_2O]$

Temperature Dependence of Reaction Rates

Arrhenius Equation

$k = Ae^{-E_a/RT}$

  • k = Rate constant
  • A = Arrhenius factor or frequency factor
  • $E_a$ = Activation energy
  • R = Gas constant
  • T = Temperature

$\ln k = \ln A - \frac{E_a}{RT}$

Arrhenius Equation Graph
  • $\ln k$ vs $1/T$
    • Slope = $-\frac{E_a}{R}$
    • Intercept = $\ln A$

$\ln\frac{k_2}{k_1} = \frac{E_a}{R}[\frac{1}{T_1} - \frac{1}{T_2}]$
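The two-temperature form can be solved for the activation energy. A sketch with made-up rate constants:

```python
import math

# Solve the two-temperature Arrhenius relation for Ea:
# ln(k2/k1) = (Ea/R) * (1/T1 - 1/T2)
R = 8.314                  # gas constant, J/(mol K)
T1, T2 = 300.0, 310.0      # temperatures in K (illustrative)
k1, k2 = 1.0e-3, 2.0e-3    # rate constants (made up)

Ea = R * math.log(k2 / k1) / (1.0 / T1 - 1.0 / T2)
print(Ea)                  # activation energy in J/mol
```

A doubling of the rate over a 10 K rise near room temperature corresponds to an activation energy of roughly 50 kJ/mol, a common rule of thumb.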

Collision Theory

  • Molecules are assumed to be hard spheres.
  • Reaction occurs when molecules collide with each other. $k = PZ_{AB}e^{-E_a/RT}$
  • $Z_{AB}$ = Collision frequency
  • P = Probability factor or steric factor
  • $e^{-E_a/RT}$ = Fraction of molecules with energy equal to or greater than $E_a$

Activated Complex Theory or Transition State Theory

  1. Reactant molecules form an activated complex.
  2. Activated complex is in equilibrium with reactant molecules.
  3. Activated complex decomposes to form products.

Sampling Definitions

  • Population (denoted $\mathcal{P}$): the set of people/objects of interest
  • Sample (denoted $\mathcal{E}$): a subset of the population
  • Individual/Statistical Unit: an element of the population/sample
  • Variable/Character: the information to be studied
  • Sampling Frame: a list of all individuals in the population

Sample Advantages

  • Study fewer people
  • Save time
  • Less manpower
  • Possible even when the study is destructive

Sample Disadvantages

  • The sample must be representative of the population
  • Parameter estimates from a sample carry a margin of error

Sample Probabilistic Methods

  • Each individual has a known probability of being selected
  • Non-probabilistic techniques are subjective, and their results cannot be extrapolated to the population

Simple Random Sample (SRS)

  • Definition:
    • Every individual in the population has equal probability of being selected.
    • Extractions are independent.
  • With Replacement: Individual returned to population after selection
  • Without Replacement: Individual not returned, most common

How To Complete SRS

  1. Number all units in sampling frame from 1 to $N$
  2. Randomly sample $n$ numbers using a random number generator

Stratified Random Sample

  • Definition:
    • Divide population into homogenous subgroups, take SRS from each
  • Objectives:
    • Improve estimation precision
    • Obtain estimations for each stratum

How To Complete Stratified Sample

  1. Divide the population into $K$ strata ($P_1, P_2,..., P_K$)
    • Strata disjoint, union covers population
  2. Sample size in each level should be $n_1, n_2,..., n_K$
    • $\sum_{i=1}^{K} n_i = n$
  3. SRS for each stratum

Different Stratum Allocation Methods

  • Proportional Allocation: sample size proportional to stratum population size:
    • $n_i = n \cdot \frac{N_i}{N}$
  • Uniform Allocation: select the same number of people in each stratum:
    • $n_i = \frac{n}{K}$
  • Optimal Allocation: account for variability; minimize the estimation variance:
    • $n_i = n \cdot \frac{N_i \sigma_i}{\sum_{i=1}^{K} N_i \sigma_i}$
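The three allocation rules can be sketched as follows; the stratum sizes and standard deviations are made up:

```python
# Allocate a total sample of n = 100 across K = 3 strata three ways.
n = 100
N = [500, 300, 200]          # stratum population sizes (made up)
sigma = [4.0, 2.0, 1.0]      # stratum standard deviations (assumed)
N_total = sum(N)
K = len(N)

proportional = [n * Ni / N_total for Ni in N]          # n_i = n * N_i / N
uniform = [n / K for _ in N]                           # n_i = n / K
weights = [Ni * si for Ni, si in zip(N, sigma)]
optimal = [n * w / sum(weights) for w in weights]      # Neyman-style weights

print(proportional, uniform, optimal)
```

Optimal allocation oversamples the large, highly variable stratum relative to proportional allocation.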

Systematic Sample

  • Select an individual at random from the first $k$ individuals in the frame, then every $k$-th individual afterwards
  • How to complete the sample:
  1. Sampling interval: $k = \frac{N}{n}$
  2. Select a random number $r$ between 1 and $k$.
  3. Select the individuals $r, r+k, r+2k,..., r+(n-1)k$.
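The steps above can be sketched as (population and sample sizes are illustrative):

```python
import random

# Systematic sample: random start r in 1..k, then every k-th unit.
N, n = 1000, 50
k = N // n                        # sampling interval
r = random.randint(1, k)          # random start (randint bounds are inclusive)
sample = [r + i * k for i in range(n)]

print(sample[:5])                 # first few selected unit numbers
```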

Systematic Sample Advantages

  • Simple to implement
  • More precise than SRS if population ordered

Systematic Sample Disadvantages

  • Very imprecise if the population is periodic
  • If $N$ is not a multiple of $n$, the actual sample size differs from the expected one

Grab Sample

  • Definition:
    • Divide the population into clusters, select a certain number of clusters at random
    • Include all individuals of the selected clusters
  • How to complete the sample:
  1. Divide the population into $C$ clusters.
  2. Select a sample of $c$ clusters at random.
  3. Include all individuals of the $c$ selected clusters in the sample.

Grab Sample Advantages

  • Cheaper than SRS if the clusters are geographically concentrated
  • No sampling frame needed if a list of clusters is available

Grab Sample Disadvantages

  • Less precise than SRS if the clusters differ from one another
  • Risk of bias if the selected clusters are not representative

Estimation: Point Estimate

  • Estimate a single value of a population parameter from the sample
  • Estimator: the function of the sample used to compute the estimate
  • Properties of a good estimator:
  1. Unbiased: its expectation equals the true value
  2. Consistent: the estimate gets closer to the true value as the sample size increases
  3. Efficient: its variance is as low as possible

Interval Estimation

  • Estimate a population parameter by an interval that contains the true value with a given probability
  • Interval estimate:
    • An interval calculated from the sample, which has probability $(1 - \alpha)$ of containing the true parameter
    • $1 - \alpha$ is the "confidence level"

Confidence Interval Creation Process

  1. Choose a confidence level $1 - \alpha$
  2. Compute the estimator of the parameter from the sample.
  3. Determine the distribution of the estimator.
  4. Compute the margin of error.
  5. Build the confidence interval: estimator $\pm$ margin of error

Margin of Error

  • Depends on the confidence level, the sample size, and the variability of the studied variable.
  • Factors affecting the width of the interval:
  1. Confidence level
  2. Sample size
  3. Variability of the studied variable

Common Estimation Methods

  • Mean
    • Estimator: sample mean $\bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i$
    • Confidence intervals:
  1. If the variable follows a normal law and the variance is known: $\bar{X} \pm z_{1-\frac{\alpha}{2}} \frac{\sigma}{\sqrt{n}}$
  2. If the variable follows a normal law and the variance is unknown: $\bar{X} \pm t_{n-1, 1-\frac{\alpha}{2}} \frac{s}{\sqrt{n}}$
  3. If the sample size is large: $\bar{X} \pm z_{1-\frac{\alpha}{2}} \frac{s}{\sqrt{n}}$
  • Proportion
    • Estimator: sample proportion $\hat{p} = \frac{X}{n}$
    • Confidence interval: $\hat{p} \pm z_{1-\frac{\alpha}{2}} \sqrt{\frac{\hat{p}(1-\hat{p})}{n}}$
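The proportion interval can be computed directly. A sketch with illustrative counts, using the normal approximation above:

```python
import math

# 95% confidence interval for a proportion (normal approximation).
n, x = 400, 120                  # sample size and number of successes (made up)
p_hat = x / n                    # sample proportion
z = 1.96                         # z_{1 - alpha/2} for alpha = 0.05
margin = z * math.sqrt(p_hat * (1 - p_hat) / n)
low, high = p_hat - margin, p_hat + margin

print(round(low, 3), round(high, 3))
```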
