Mathematical Phrases, Symbols, and Formulas

English Phrases Written Mathematically

When the English says: Interpret this as:
X is at least 4. X ≥ 4
The minimum of X is 4. X ≥ 4
X is no less than 4. X ≥ 4
X is greater than or equal to 4. X ≥ 4
X is at most 4. X ≤ 4
The maximum of X is 4. X ≤ 4
X is no more than 4. X ≤ 4
X is less than or equal to 4. X ≤ 4
X does not exceed 4. X ≤ 4
X is greater than 4. X > 4
X is more than 4. X > 4
X exceeds 4. X > 4
X is less than 4. X < 4
There are fewer X than 4. X < 4
X is 4. X = 4
X is equal to 4. X = 4
X is the same as 4. X = 4
X is not 4. X ≠ 4
X is not equal to 4. X ≠ 4
X is not the same as 4. X ≠ 4
X is different from 4. X ≠ 4

Symbols and Their Meanings

Chapter (1st used) Symbol Spoken Meaning
Sampling and Data \sqrt{\phantom{x}} the square root of same
Sampling and Data \pi Pi 3.14159… (a specific number)
Descriptive Statistics Q1 Quartile one the first quartile
Descriptive Statistics Q2 Quartile two the second quartile
Descriptive Statistics Q3 Quartile three the third quartile
Descriptive Statistics IQR interquartile range {Q}_{3}-{Q}_{1}=IQR
Descriptive Statistics \stackrel{-}{x} x-bar sample mean
Descriptive Statistics \mu mu population mean
Descriptive Statistics s s sample standard deviation
Descriptive Statistics {s}^{2} s squared sample variance
Descriptive Statistics \sigma sigma population standard deviation
Descriptive Statistics {\sigma }^{2} sigma squared population variance
Descriptive Statistics \Sigma capital sigma sum
Probability Topics \left\{\right\} brackets set notation
Probability Topics S S sample space
Probability Topics A Event A event A
Probability Topics P\left(A\right) probability of A probability of A occurring
Probability Topics P\left(\mathit{\text{A}}|\mathit{\text{B}}\right) probability of A given B prob. of A occurring given B has occurred
Probability Topics P\left(A\cup B\right) prob. of A or B prob. of A or B or both occurring
Probability Topics P\left(A\cap B\right) prob. of A and B prob. of both A and B occurring (same time)
Probability Topics A\prime A-prime, complement of A complement of A, not A
Probability Topics P\left(A\prime \right) prob. of complement of A same
Probability Topics G1 green on first pick same
Probability Topics P(G1) prob. of green on first pick same
Discrete Random Variables PDF prob. density function same
Discrete Random Variables X X the random variable X
Discrete Random Variables X ~ the distribution of X same
Discrete Random Variables \ge greater than or equal to same
Discrete Random Variables \le less than or equal to same
Discrete Random Variables = equal to same
Discrete Random Variables \ne not equal to same
Continuous Random Variables f(x) f of x function of x
Continuous Random Variables pdf prob. density function same
Continuous Random Variables U uniform distribution same
Continuous Random Variables Exp exponential distribution same
Continuous Random Variables f(x) = f of x equals same
Continuous Random Variables m m decay rate (for exp. dist.)
The Normal Distribution N normal distribution same
The Normal Distribution z z-score same
The Normal Distribution Z standard normal dist. same
The Central Limit Theorem \stackrel{-}{X} X-bar the random variable X-bar
The Central Limit Theorem {\mu }_{\stackrel{-}{x}} mean of X-bars the average of X-bars
The Central Limit Theorem {\sigma }_{\stackrel{-}{x}} standard deviation of X-bars same
Confidence Intervals CL confidence level same
Confidence Intervals CI confidence interval same
Confidence Intervals EBM error bound for a mean same
Confidence Intervals EBP error bound for a proportion same
Confidence Intervals t Student’s t-distribution same
Confidence Intervals df degrees of freedom same
Confidence Intervals {t}_{\frac{\alpha }{2}} student t with α/2 area in right tail same
Confidence Intervals p\prime p-prime sample proportion of success
Confidence Intervals q\prime q-prime sample proportion of failure
Hypothesis Testing {H}_{0} H-naught, H-sub 0 null hypothesis
Hypothesis Testing {H}_{a} H-a, H-sub a alternate hypothesis
Hypothesis Testing {H}_{1} H-1, H-sub 1 alternate hypothesis
Hypothesis Testing \alpha alpha probability of Type I error
Hypothesis Testing \beta beta probability of Type II error
Hypothesis Testing {\overline{X}}_{1}-{\overline{X}}_{2} X1-bar minus X2-bar difference in sample means
Hypothesis Testing {\mu }_{1}-{\mu }_{2} mu-1 minus mu-2 difference in population means
Hypothesis Testing {{P}^{\prime }}_{1}-{{P}^{\prime }}_{2} P1-prime minus P2-prime difference in sample proportions
Hypothesis Testing {p}_{1}-{p}_{2} p1 minus p2 difference in population proportions
Chi-Square Distribution {\chi }^{2} ky-square Chi-square
Chi-Square Distribution O Observed Observed frequency
Chi-Square Distribution E Expected Expected frequency
Linear Regression and Correlation y = a + bx y equals a plus b-x equation of a straight line
Linear Regression and Correlation \stackrel{^}{y} y-hat estimated value of y
Linear Regression and Correlation r sample correlation coefficient same
Linear Regression and Correlation \epsilon error term for a regression line same
Linear Regression and Correlation SSE Sum of Squared Errors same
F-Distribution and ANOVA F F-ratio F-ratio

Formulas

Symbols you must know
Population Sample
N Size n
\mu Mean \overline{x}
{\sigma }^{2} Variance {s}^{2}
\sigma Standard deviation s
p Proportion p\prime
Single data set formulae
Population Sample
\mu =E\left(x\right)=\frac{1}{N}\sum _{i=1}^{N}\left({x}_{i}\right) Arithmetic mean \stackrel{-}{x}=\frac{1}{n}\sum _{i=1}^{n}\left({x}_{i}\right)
Geometric mean \stackrel{~}{x}={\left(\prod _{i=1}^{n}{X}_{i}\right)}^{\frac{1}{n}}
IQR={Q}_{3}-{Q}_{1} Inter-quartile range, where {Q}_{3}=\frac{3\left(n+1\right)}{4} and {Q}_{1}=\frac{\left(n+1\right)}{4} locate the quartiles in the ordered data
{\sigma }^{2}=\frac{1}{N}\sum _{i=1}^{N}{\left({x}_{i}-\mu \right)}^{2} Variance {s}^{2}=\frac{1}{n-1}\sum _{i=1}^{n}{\left({x}_{i}-\overline{x}\right)}^{2}
Grouped data formulae (class midpoints {m}_{i} with frequencies {f}_{i})
Population Sample
\mu =E\left(x\right)=\frac{1}{N}\sum _{i=1}^{N}\left({m}_{i}·{f}_{i}\right) Arithmetic mean \stackrel{-}{x}=\frac{1}{n}\sum _{i=1}^{n}\left({m}_{i}·{f}_{i}\right)
Geometric mean \stackrel{~}{x}={\left(\prod _{i=1}^{n}{X}_{i}\right)}^{\frac{1}{n}}
{\sigma }^{2}=\frac{1}{N}\sum _{i=1}^{N}{\left({m}_{i}-\mu \right)}^{2}·{f}_{i} Variance {s}^{2}=\frac{1}{n-1}\sum _{i=1}^{n}{\left({m}_{i}-\overline{x}\right)}^{2}·{f}_{i}
CV=\frac{\sigma }{\mu }·100 Coefficient of variation CV=\frac{s}{\overline{x}}·100
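As a quick sketch (the data values are an assumption, not from the text), the single-data-set formulas above can be computed with only the Python standard library; note the sample variance divides by n − 1.

```python
import math

data = [2, 4, 4, 4, 5, 5, 7, 9]   # assumed sample data
n = len(data)

mean = sum(data) / n                                    # arithmetic mean, x-bar
geo_mean = math.prod(data) ** (1 / n)                   # geometric mean
var_s = sum((x - mean) ** 2 for x in data) / (n - 1)    # sample variance s^2
s = math.sqrt(var_s)                                    # sample standard deviation
cv = s / mean * 100                                     # coefficient of variation
```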
Basic probability rules
P\left(A\cap B\right)=P\left(A|B\right)·P\left(B\right) Multiplication rule
P\left(A\cup B\right)=P\left(A\right)+P\left(B\right)-P\left(A\cap B\right) Addition rule
P\left(A\cap B\right)=P\left(A\right)·P\left(B\right) or P\left(A|B\right)=P\left(A\right) Independence test
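The rules above can be checked numerically on a toy sample space; the die example and the events A and B below are assumptions for illustration.

```python
# Toy sample space: one roll of a fair six-sided die (assumed example).
outcomes = set(range(1, 7))
A = {2, 4, 6}        # event A: roll is even
B = {1, 2, 3, 4}     # event B: roll is at most 4

def P(event):
    # Equally likely outcomes: probability = favorable / total
    return len(event) / len(outcomes)

# Addition rule: P(A or B) = P(A) + P(B) - P(A and B)
addition_holds = abs(P(A | B) - (P(A) + P(B) - P(A & B))) < 1e-12
# Independence test: does P(A and B) equal P(A) * P(B)?
independent = abs(P(A & B) - P(A) * P(B)) < 1e-12
```

For these particular events the independence test passes, since P(A ∩ B) = 2/6 equals P(A)·P(B) = (3/6)(4/6).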
Hypergeometric distribution formulae
nCx=\left(\genfrac{}{}{0}{}{n}{x}\right)=\frac{n!}{x!\left(n-x\right)!} Combinatorial equation
P\left(x\right)=\frac{\left(\genfrac{}{}{0}{}{A}{x}\right)\left(\genfrac{}{}{0}{}{N-A}{n-x}\right)}{\left(\genfrac{}{}{0}{}{N}{n}\right)} Probability equation
E\left(X\right)=\mu =np Mean
{\sigma }^{2}=\left(\frac{N-n}{N-1}\right)np\left(q\right) Variance
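A minimal sketch of the hypergeometric probability equation, using the standard library's `math.comb` for the combinatorial terms; the lot size, number of items of interest, and sample size are assumed example values.

```python
from math import comb

def hypergeom_pmf(x, N, A, n):
    # P(x) = C(A, x) * C(N - A, n - x) / C(N, n)
    return comb(A, x) * comb(N - A, n - x) / comb(N, n)

# Assumed example: N = 20 items, A = 5 of interest, sample of n = 4
p2 = hypergeom_pmf(2, N=20, A=5, n=4)
```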
Binomial distribution formulae
P\left(x\right)=\frac{n!}{x!\left(n-x\right)!}{p}^{x}{\left(q\right)}^{n-x} Probability density function
E\left(X\right)=\mu =np Arithmetic mean
{\sigma }^{2}=np\left(q\right) Variance
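The binomial formulas above translate directly; n = 10 and p = 0.3 are assumed parameters for the sketch.

```python
from math import comb

def binom_pmf(x, n, p):
    # P(x) = C(n, x) * p^x * q^(n - x), with q = 1 - p
    q = 1 - p
    return comb(n, x) * p**x * q**(n - x)

n, p = 10, 0.3          # assumed parameters
mu = n * p              # mean: np
var = n * p * (1 - p)   # variance: npq
```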
Geometric distribution formulae
P\left(X=x\right)={\left(1-p\right)}^{x-1}\left(p\right) Probability when x is the first success. Probability when x is the number of failures before first success P\left(X=x\right)={\left(1-p\right)}^{x}\left(p\right)
\mu =\frac{1}{p} Mean Mean \mu =\frac{1-p}{p}
{\sigma }^{2}=\frac{\left(1-p\right)}{{p}^{2}} Variance Variance {\sigma }^{2}=\frac{\left(1-p\right)}{{p}^{2}}
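A sketch of the first formulation above (x counts the trial on which the first success occurs); p = 0.25 is an assumed success probability.

```python
def geom_pmf(x, p):
    # P(X = x) = (1 - p)^(x - 1) * p,  for x = 1, 2, 3, ...
    # (x is the trial on which the first success occurs)
    return (1 - p) ** (x - 1) * p

p = 0.25               # assumed success probability
mu = 1 / p             # mean number of trials until first success
var = (1 - p) / p**2   # variance
```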
Poisson distribution formulae
P\left(x\right)=\frac{{e}^{-\mu }{\mu }^{x}}{x!} Probability equation
E\left(X\right)=\mu Mean
{\sigma }^{2}=\mu Variance
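The Poisson probability equation as code; mu = 3.0 is an assumed rate (the mean and variance both equal mu, per the formulas above).

```python
from math import exp, factorial

def poisson_pmf(x, mu):
    # P(x) = e^(-mu) * mu^x / x!
    return exp(-mu) * mu**x / factorial(x)

mu = 3.0   # assumed rate; mean and variance both equal mu
```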
Uniform distribution formulae
f\left(x\right)=\frac{1}{b-a} for a\le x\le b PDF
E\left(X\right)=\mu =\frac{a+b}{2} Mean
{\sigma }^{2}=\frac{{\left(b-a\right)}^{2}}{12} Variance
Exponential distribution formulae
P\left(X\le x\right)=1-{e}^{-mx} Cumulative probability
E\left(X\right)=\mu =\frac{1}{m} or m=\frac{1}{\mu } Mean and decay factor
{\sigma }^{2}=\frac{1}{{m}^{2}}={\mu }^{2} Variance
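A combined sketch of the uniform and exponential formulas above; the endpoints a, b and the decay rate m are assumed example values.

```python
from math import exp

# Uniform on [a, b] (assumed endpoints)
a, b = 2.0, 10.0
unif_mean = (a + b) / 2          # mean: (a + b) / 2
unif_var = (b - a) ** 2 / 12     # variance: (b - a)^2 / 12

# Exponential with assumed decay rate m, so the mean is 1/m
m = 0.5
def exp_cdf(x):
    # Cumulative probability: P(X <= x) = 1 - e^(-m x)
    return 1 - exp(-m * x)
```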
The following page of formulae requires the use of the "Z", "t", "{\chi }^{2}", or "F" tables.
Z=\frac{x-\mu }{\sigma } Z-transformation for normal distribution
Z=\frac{x-np\prime }{\sqrt{np\prime \left(q\prime \right)}} Normal approximation to the binomial
Each entry below pairs a hypothesis-test statistic with the corresponding confidence interval. Subscripts denote locations on the respective distribution tables (probability statements ignore subscripts); bracketed symbols equal the margin of error.
{Z}_{c}=\frac{\stackrel{-}{x}-{\mu }_{0}}{\frac{\sigma }{\sqrt{n}}} Interval for the population mean when sigma is known
\stackrel{-}{x}±\left[{Z}_{\left(\alpha /2\right)}\frac{\sigma }{\sqrt{n}}\right]
{Z}_{c}=\frac{\stackrel{-}{x}-{\mu }_{0}}{\frac{s}{\sqrt{n}}} Interval for the population mean when sigma is unknown but n>30
\stackrel{-}{x}±\left[{Z}_{\left(\alpha /2\right)}\frac{s}{\sqrt{n}}\right]
{t}_{c}=\frac{\stackrel{-}{x}-{\mu }_{0}}{\frac{s}{\sqrt{n}}} Interval for the population mean when sigma is unknown but n<30
\stackrel{-}{x}±\left[{t}_{\left(n-1\right),\left(\alpha /2\right)}\frac{s}{\sqrt{n}}\right]
{Z}_{c}=\frac{p\prime -{p}_{0}}{\sqrt{\frac{{p}_{0}{q}_{0}}{n}}} Interval for the population proportion
p\prime ±\left[{Z}_{\left(\alpha /2\right)}\sqrt{\frac{p\prime q\prime }{n}}\right]
{t}_{c}=\frac{\stackrel{-}{d}-{\delta }_{0}}{\frac{{s}_{d}}{\sqrt{n}}} Interval for difference between two means with matched pairs
\stackrel{-}{d}±\left[{t}_{\left(n-1\right),\left(\alpha /2\right)}\frac{{s}_{d}}{\sqrt{n}}\right] where {s}_{d} is the standard deviation of the differences
{Z}_{c}=\frac{\left(\stackrel{-}{{x}_{1}}-\stackrel{-}{{x}_{2}}\right)-{\delta }_{0}}{\sqrt{\frac{{\sigma }_{1}^{2}}{{n}_{1}}+\frac{{\sigma }_{2}^{2}}{{n}_{2}}}} Interval for difference between two means when sigmas are known
\left(\stackrel{-}{{x}_{1}}-\stackrel{-}{{x}_{2}}\right)±\left[{Z}_{\left(\alpha /2\right)}\sqrt{\frac{{\sigma }_{1}^{2}}{{n}_{1}}+\frac{{\sigma }_{2}^{2}}{{n}_{2}}}\right]
{t}_{c}=\frac{\left({\overline{x}}_{1}-{\overline{x}}_{2}\right)-{\delta }_{0}}{\sqrt{\left(\frac{{\left({s}_{1}\right)}^{2}}{{n}_{1}}+\frac{{\left({s}_{2}\right)}^{2}}{{n}_{2}}\right)}} Interval for difference between two means when sigmas are unknown and not assumed equal
\left({\overline{x}}_{1}-{\overline{x}}_{2}\right)±\left[{t}_{df,\left(\alpha /2\right)}\sqrt{\left(\frac{{\left({s}_{1}\right)}^{2}}{{n}_{1}}+\frac{{\left({s}_{2}\right)}^{2}}{{n}_{2}}\right)}\right] where df=\frac{{\left(\frac{{\left({s}_{1}\right)}^{2}}{{n}_{1}}+\frac{{\left({s}_{2}\right)}^{2}}{{n}_{2}}\right)}^{2}}{\left(\frac{1}{{n}_{1}-1}\right){\left(\frac{{\left({s}_{1}\right)}^{2}}{{n}_{1}}\right)}^{2}+\left(\frac{1}{{n}_{2}-1}\right){\left(\frac{{\left({s}_{2}\right)}^{2}}{{n}_{2}}\right)}^{2}}
{Z}_{c}=\frac{\left({p\prime }_{1}-{p\prime }_{2}\right)-{\delta }_{0}}{\sqrt{\frac{{p\prime }_{1}\left({q\prime }_{1}\right)}{{n}_{1}}+\frac{{p\prime }_{2}\left({q\prime }_{2}\right)}{{n}_{2}}}} Interval for difference between two population proportions
\left({p\prime }_{1}-{p\prime }_{2}\right)±\left[{Z}_{\left(\alpha /2\right)}\sqrt{\frac{{p\prime }_{1}\left({q\prime }_{1}\right)}{{n}_{1}}+\frac{{p\prime }_{2}\left({q\prime }_{2}\right)}{{n}_{2}}}\right]
{\chi }_{c}^{2}=\frac{\left(n-1\right){s}^{2}}{{\sigma }_{0}^{2}} Test for a single population variance
{\chi }_{c}^{2}=\Sigma \frac{{\left(O-E\right)}^{2}}{E} Tests for GOF, Independence, and Homogeneity, where O = observed values and E = expected values
{F}_{c}=\frac{{s}_{1}^{2}}{{s}_{2}^{2}} Test for equality of two variances, where {s}_{1}^{2} is the larger of the two sample variances
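Three of the formulas above can be sketched directly in Python. The data values, and the table values z = (x − μ)/σ for assumed x, μ, σ and t₉,₀.₀₂₅ ≈ 2.262 (read from a t table), are assumptions for illustration.

```python
from math import sqrt

# Z-transformation: Z = (x - mu) / sigma, with assumed x, mu, sigma
z = (170 - 160) / 5

# 95% interval for a mean, sigma unknown, n = 10:
# t_{(n-1), (alpha/2)} = t_{9, 0.025} ≈ 2.262, read from a t table
xbar, s, n = 50.0, 4.0, 10                # assumed sample results
margin = 2.262 * s / sqrt(n)              # bracketed margin-of-error term
interval = (xbar - margin, xbar + margin)

# Chi-square statistic: sum of (O - E)^2 / E over the categories
O = [18, 22, 20, 40]                      # assumed observed counts
E = [20, 20, 20, 40]                      # expected counts
chi2 = sum((o - e) ** 2 / e for o, e in zip(O, E))
```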
The next 3 formulae are for determining sample size with confidence intervals.
(note: E represents the margin of error)
n=\frac{{Z}_{\left(\frac{\alpha }{2}\right)}^{2}{\sigma }^{2}}{{E}^{2}}
Use when sigma is known
E=\overline{x}-\mu
n=\frac{{Z}_{\left(\frac{\alpha }{2}\right)}^{2}\left(0.25\right)}{{E}^{2}}
Use when p\prime is unknown
E=p\prime -p
n=\frac{{Z}_{\left(\frac{\alpha }{2}\right)}^{2}\left[p\prime \left(q\prime \right)\right]}{{E}^{2}}
Use when an estimate of p\prime is known
E=p\prime -p
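As a sketch of the second sample-size formula above (p′ unknown, so 0.25 is the most conservative value for p′q′): with an assumed 95% confidence level (z = 1.96 from a Z table) and margin of error E = 0.03, rounding up gives the required n.

```python
from math import ceil

# 95% confidence: z_{alpha/2} = 1.96 (Z table); margin of error E = 0.03
z, E = 1.96, 0.03
n = ceil(z**2 * 0.25 / E**2)   # round up, since n must be a whole number
```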
Simple linear regression formulae for y=a+b\left(x\right)
r=\frac{\Sigma \left[\left(x-\overline{x}\right)\left(y-\overline{y}\right)\right]}{\sqrt{\Sigma {\left(x-\overline{x}\right)}^{2}*\Sigma {\left(y-\overline{y}\right)}^{2}}}=\frac{{S}_{xy}}{{S}_{x}{S}_{y}}=\sqrt{\frac{SSR}{SST}} Correlation coefficient
b=\frac{\Sigma \left[\left(x-\overline{x}\right)\left(y-\overline{y}\right)\right]}{\Sigma {\left(x-\overline{x}\right)}^{2}}=\frac{{S}_{xy}}{{SS}_{x}}={r}_{y,x}\left(\frac{{s}_{y}}{{s}_{x}}\right) Coefficient b (slope)
a=\overline{y}-b\left(\overline{x}\right) y-intercept
{s}_{e}^{2}=\frac{\Sigma {\left({y}_{i}-{\stackrel{^}{y}}_{i}\right)}^{2}}{n-k}=\frac{\underset{i=1}{\overset{n}{\Sigma }}{e}_{i}^{2}}{n-k} Estimate of the error variance
{S}_{b}^{2}=\frac{{s}_{e}^{2}}{\Sigma {\left({x}_{i}-\overline{x}\right)}^{2}}=\frac{{s}_{e}^{2}}{\left(n-1\right){s}_{x}^{2}} Variance of coefficient b; its square root {S}_{b} is the standard error for coefficient b
{t}_{c}=\frac{b-{\beta }_{0}}{{s}_{b}} Hypothesis test for coefficient β
b±\left[{t}_{n-2,\alpha /2}{S}_{b}\right] Interval for coefficient β
\stackrel{^}{y}±\left[{t}_{\alpha /2}*{s}_{e}\left(\sqrt{\frac{1}{n}+\frac{{\left({x}_{p}-\overline{x}\right)}^{2}}{\Sigma {\left(x-\overline{x}\right)}^{2}}}\right)\right] Interval for expected value of y
\stackrel{^}{y}±\left[{t}_{\alpha /2}*{s}_{e}\left(\sqrt{1+\frac{1}{n}+\frac{{\left({x}_{p}-\overline{x}\right)}^{2}}{\Sigma {\left(x-\overline{x}\right)}^{2}}}\right)\right] Prediction interval for an individual y
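The slope and intercept formulas above can be sketched on a small assumed data set:

```python
# Assumed data; slope b and intercept a follow the formulas above.
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]

n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n
Sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))  # cross-deviation sum
Sxx = sum((xi - xbar) ** 2 for xi in x)                       # squared x-deviation sum
b = Sxy / Sxx            # slope
a = ybar - b * xbar      # y-intercept
```

For these points the fit is y = 2.2 + 0.6x.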
ANOVA formulae
SSR=\underset{i=1}{\overset{n}{\Sigma }}{\left({\stackrel{^}{y}}_{i}-\overline{y}\right)}^{2} Sum of squares regression
SSE=\underset{i=1}{\overset{n}{\Sigma }}{\left({y}_{i}-{\stackrel{^}{y}}_{i}\right)}^{2} Sum of squares error
SST=\underset{i=1}{\overset{n}{\Sigma }}{\left({y}_{i}-\overline{y}\right)}^{2} Sum of squares total
{R}^{2}=\frac{SSR}{SST} Coefficient of determination
The following is the breakdown of a one-way ANOVA table for linear regression.
Source of variation Sum of squares Degrees of freedom Mean squares F-ratio
Regression SSR 1 or k-1 MSR=\frac{SSR}{d{f}_{R}} F=\frac{MSR}{MSE}
Error SSE n-k MSE=\frac{SSE}{d{f}_{E}}
Total SST n-1
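The sum-of-squares identities above can be verified numerically; the data and its least-squares fit (b = 0.6, a = 2.2 for these points) are assumptions for the sketch.

```python
# Assumed data with its least-squares fit; verifies SST = SSR + SSE
# and R^2 = SSR / SST.
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
a, b = 2.2, 0.6                           # least-squares fit for these data
yhat = [a + b * xi for xi in x]           # fitted values
ybar = sum(y) / len(y)

SSR = sum((yh - ybar) ** 2 for yh in yhat)            # regression sum of squares
SSE = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))  # error sum of squares
SST = sum((yi - ybar) ** 2 for yi in y)               # total sum of squares
R2 = SSR / SST                                        # coefficient of determination
```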

License


Introductory Business Statistics by OSCRiceUniversity is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
