MULTIVARIATE

Informations
Category: nCreator TI-Nspire
Author: DrDrunkinho
Type: Document 3.0.1
Page(s): 1
Size: 2.38 KB
Uploaded: 18/12/2023 - 15:27:06
Uploader: DrDrunkinho (Profile)
Downloads: 4
Visibility: Public archive
Shortlink: http://ti-pla.net/a3792319
Description
Nspire file generated on TI-Planet.org.
Compatible with OS 3.0 and later.
<<
Model assumptions in linear regression:
- Linearity: the variables are linearly related.
- Strict exogeneity: the errors cannot be explained by the explanatory variables (this assumption holds if the errors are independent of the explanatory variables).
- Homoskedasticity: the variance of the error term is constant.
- Distribution: the error term is normally distributed.

A dummy variable is a variable that takes the value 0 or 1, which makes it useful for any binary question. Example: the variable Woman equals one if the individual is a female and zero otherwise.

A categorical variable is a variable that takes a finite number d of possible values; it is decomposed into d-1 dummy variables.

An interaction variable is a variable that is the product of two (or more) variables. Example: the variable Educ x Expe is the product of the two variables education and experience; the new variable captures the effect of individuals who are both experienced and educated.

Model assumptions in multiple linear regression: they can be verified in the same way as in simple linear regression, BUT the explanatory variables should not be strongly correlated.

Multicollinearity is the presence of one explanatory variable that is (almost) perfectly correlated with one (or several) other explanatory variable(s). Several clues indicate problems with multicollinearity:
- An independent variable known to be an important predictor ends up being insignificant.
- A parameter that should have a positive sign turns out to be negative, or vice versa.
- When an independent variable is added or removed, there is a drastic change in the values of the remaining coefficients.

Multicollinearity is always a problem when we have more predictors than observations (N < K)!

Made with nCreator - tiplanet.org
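The dummy, categorical, and interaction constructions above can be sketched in a few lines of Python. The field names (sex, region, educ, expe) and the sample records are illustrative, not taken from any particular dataset:

```python
# Sketch of dummy, categorical, and interaction variable construction.
# All field names and values below are hypothetical examples.
records = [
    {"sex": "F", "region": "South", "educ": 16, "expe": 5},
    {"sex": "M", "region": "North", "educ": 12, "expe": 10},
    {"sex": "F", "region": "East",  "educ": 18, "expe": 2},
]

for r in records:
    # Dummy variable: Woman = 1 if the individual is female, 0 otherwise.
    r["woman"] = 1 if r["sex"] == "F" else 0
    # Categorical variable with d = 3 levels -> d - 1 = 2 dummies
    # ("North" is the omitted baseline category).
    r["south"] = 1 if r["region"] == "South" else 0
    r["east"] = 1 if r["region"] == "East" else 0
    # Interaction variable: product of education and experience.
    r["educ_x_expe"] = r["educ"] * r["expe"]

print([(r["woman"], r["south"], r["east"], r["educ_x_expe"]) for r in records])
# [(1, 1, 0, 80), (0, 0, 0, 120), (1, 0, 1, 36)]
```

Dropping one category as the baseline is what keeps the d-1 dummies from being perfectly collinear with the intercept.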
>>
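A simple first check for the (almost) perfect correlation described in the notes is the pairwise Pearson correlation between predictors. A minimal sketch, with a hypothetical pair of redundant predictors (education in years vs. in months) and a rule-of-thumb threshold:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical predictors: the second is an exact multiple of the first,
# so r = 1 -- a textbook case of perfect multicollinearity.
educ_years = [12.0, 16.0, 18.0, 12.0, 14.0]
educ_months = [12 * e for e in educ_years]

r = pearson_r(educ_years, educ_months)
if abs(r) > 0.9:  # illustrative rule-of-thumb threshold, not a fixed rule
    print(f"possible multicollinearity: |r| = {abs(r):.4f}")
```

Pairwise correlations only catch two-variable redundancy; multicollinearity involving combinations of several predictors needs a fuller diagnostic such as variance inflation factors.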