estimators

Information
Category: nCreator TI-Nspire
Author: usertest4804
Type: Classeur 3.0.1
Page(s): 1
Size: 2.68 KB
Uploaded: 18/04/2025 - 02:14:06
Updated: 18/04/2025 - 02:16:16
Uploader: usertest4804 (Profile)
Downloads: 1
Visibility: Public archive
Shortlink: http://ti-pla.net/a4587925
Description
Nspire file generated on TI-Planet.org.
Compatible with OS 3.0 and later.
<<
General Concept
Point estimation uses sample data to produce a single number (called a statistic) to estimate an unknown population parameter. Examples include the sample mean (X_bar), the sample proportion (p_hat), and the sample variance (S^2).

7.3.1 Unbiased Estimators
An estimator theta_hat is unbiased for a parameter theta if its expected value equals theta, that is:
  E(theta_hat) = theta
Example: Let X_1, X_2, ..., X_n be a sample with mean mu. Then X_bar = (1/n) * sum(X_i) is an unbiased estimator of mu because E(X_bar) = mu.
Counter-example: the sample variance with divisor n,
  sigma_hat^2 = (1/n) * sum((X_i - X_bar)^2),
is a biased estimator of sigma^2. The unbiased version uses divisor (n - 1).

7.3.2 Variance of a Point Estimator
  Var(theta_hat) = E[(theta_hat - E(theta_hat))^2]
This measures how much theta_hat fluctuates from sample to sample.
Example: Var(X_bar) = sigma^2 / n. A smaller variance means a more stable estimator.

7.3.3 Standard Error (SE)
The SE is the standard deviation of an estimator:
  SE(theta_hat) = sqrt(Var(theta_hat))
Example: SE(X_bar) = sigma / sqrt(n). If sigma is unknown, use the sample standard deviation s: SE(X_bar) = s / sqrt(n).

7.3.4 Bootstrap Standard Error
The bootstrap estimates the SE by resampling the sample itself.
Steps:
1. Resample the data with replacement to create many new samples.
2. Compute theta_hat for each resample.
3. SE_bootstrap = the standard deviation of the theta_hat values from all resamples.
Example: Original sample {4, 5, 7, 10}. Take 1000 bootstrap samples, compute the 1000 means, and take the SE to be the standard deviation of those 1000 sample means.

7.3.5 Mean Squared Error (MSE)
  MSE(theta_hat) = E[(theta_hat - theta)^2]
Also written as:
  MSE = Var(theta_hat) + (Bias(theta_hat))^2
Example 1 (unbiased): theta_hat1 has Var = 2, so MSE = 2.
Example 2 (biased): theta_hat2 has bias = 1 and Var = 1, so MSE = 1 + 1^2 = 2.

7.4.1 Method of Moments (MoM)
Steps:
1. Write a population moment such as E[X] in terms of the parameter(s).
2. Set it equal to the corresponding sample moment (such as X_bar).
3. Solve for the parameter.
Example: Let X ~ Exp(lambda), so E[X] = 1 / lambda. The sample mean is X_bar, so setting X_bar = 1 / lambda gives the MoM estimate lambda_hat = 1 / X_bar.

7.4.2 Method of Maximum Likelihood (MLE)
Steps:
1. Write the likelihood: L(theta) = product of f(x_i; theta).
2. Take the log: logL(theta).
3. Differentiate and set equal to 0.
4. Solve for theta.
Example: X_1, ..., X_n ~ Exp(lambda), with f(x) = lambda * e^(-lambda * x).
  Likelihood: L(lambda) = lambda^n * e^(-lambda * sum(X_i))
  Log-likelihood: logL(lambda) = n * log(lambda) - lambda * sum(X_i)
  Differentiate: d/dlambda logL(lambda) = n / lambda - sum(X_i) = 0
  Solve: lambda_hat = n / sum(X_i) = 1 / X_bar

Code sketches for sections 7.3.1, 7.3.3, 7.3.4, and 7.4 are given below.
Made with nCreator - tiplanet.org
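A minimal simulation sketch for 7.3.1, using only the Python standard library: averaging many sample variances shows that the divisor-n estimator underestimates sigma^2 while the divisor-(n - 1) version does not. The Gaussian population with mu = 0 and sigma = 2, the sample size n = 5, and the number of trials are arbitrary choices made only for this illustration.

import random
import statistics

random.seed(1)                                # fixed seed for reproducibility
mu, sigma, n, trials = 0.0, 2.0, 5, 20000     # illustrative values only

biased, unbiased = [], []
for _ in range(trials):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(x) / n
    ss = sum((xi - xbar) ** 2 for xi in x)    # sum of squared deviations
    biased.append(ss / n)                     # divisor n: biased estimator
    unbiased.append(ss / (n - 1))             # divisor n - 1: S^2, unbiased

print("true sigma^2        :", sigma ** 2)                          # 4.0
print("average of ss/n     :", round(statistics.mean(biased), 2))   # about 3.2
print("average of ss/(n-1) :", round(statistics.mean(unbiased), 2)) # about 4.0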
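A quick check of the formula SE(X_bar) = s / sqrt(n) from 7.3.3, computed on the illustrative sample {4, 5, 7, 10} borrowed from the bootstrap example in 7.3.4.

import math
import statistics

data = [4, 5, 7, 10]
s = statistics.stdev(data)            # sample std dev, divisor (n - 1)
se_xbar = s / math.sqrt(len(data))    # SE(X_bar) = s / sqrt(n)
print("s =", round(s, 3), " SE(X_bar) =", round(se_xbar, 3))   # about 2.646 and 1.323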
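A minimal sketch of the bootstrap standard error from 7.3.4. The sample {4, 5, 7, 10} and the 1000 resamples follow the example in the notes; the estimator being bootstrapped here is the sample mean.

import random
import statistics

random.seed(0)                 # fixed seed so the sketch is reproducible
data = [4, 5, 7, 10]
B = 1000                       # number of bootstrap resamples

boot_means = []
for _ in range(B):
    # resample the original data WITH replacement, same size as the original
    resample = [random.choice(data) for _ in data]
    boot_means.append(statistics.mean(resample))

# SE_bootstrap = standard deviation of the bootstrap estimates
se_boot = statistics.stdev(boot_means)
print("bootstrap SE of the mean:", round(se_boot, 3))

For a sample this small the bootstrap SE comes out comparable to, though not identical to, the formula value s / sqrt(n) from the previous sketch, because the bootstrap implicitly uses the divisor-n variance of the data.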
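A sketch tying together 7.4.1 and 7.4.2: for Exp(lambda), both the method-of-moments and the maximum-likelihood estimate reduce to lambda_hat = 1 / X_bar. The code draws exponential data with a "true" lambda of 2 (a value assumed only for this demo), computes 1 / X_bar, and checks with a crude grid search that no nearby value gives a larger log-likelihood logL(lambda) = n * log(lambda) - lambda * sum(X_i).

import math
import random

random.seed(2)
true_lambda, n = 2.0, 500                 # assumed values for the demo
x = [random.expovariate(true_lambda) for _ in range(n)]

xbar = sum(x) / n
lam_hat = 1.0 / xbar                      # MoM estimate = MLE for Exp(lambda)

sx = sum(x)
def loglik(lam):
    # log-likelihood of an Exp(lambda) sample, as derived in 7.4.2
    return n * math.log(lam) - lam * sx

# grid from 0.5 * lam_hat to 1.5 * lam_hat; the closed form should win
grid = [lam_hat * (0.5 + 0.01 * k) for k in range(101)]
best = max(grid, key=loglik)
print("lambda_hat = 1 / X_bar :", round(lam_hat, 3))
print("grid maximiser         :", round(best, 3))   # same value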
>>