Volume 13 | Issue 4
Despite the abundance of creative tools available in applied mathematics, the linear model remains the fundamental tool of the mathematician. Although its assumptions, namely linearity, constant variance, normality, and independence, appear simple and even restrictive, linear models and their associated methods are remarkably flexible and powerful. Because most advanced statistical techniques arise as generalizations of linear models, a solid grasp of these models is essential before taking up more advanced statistical methods. This article focuses on the Simple Linear Regression Model, the Multiple Linear Regression Model, and the Least Squares Estimation (LSE) of their parameters, together with the properties of the least squares estimators. In addition, a proof of the Gauss-Markov theorem is developed using principles of matrix calculus, and the notion of the Best Linear Unbiased Estimator (BLUE) is explained.
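As a brief preview of the setting treated in the article, the multiple linear regression model and its least squares estimator can be sketched in the standard matrix formulation; the notation below is the usual convention and is assumed here rather than quoted from the article itself:
\[
\mathbf{y} = X\boldsymbol{\beta} + \boldsymbol{\varepsilon}, \qquad E(\boldsymbol{\varepsilon}) = \mathbf{0}, \qquad \operatorname{Var}(\boldsymbol{\varepsilon}) = \sigma^{2} I_{n},
\]
and, provided $X$ has full column rank, the least squares estimator minimizing $\lVert \mathbf{y} - X\boldsymbol{\beta} \rVert^{2}$ is
\[
\hat{\boldsymbol{\beta}} = (X^{\top}X)^{-1}X^{\top}\mathbf{y},
\]
which the Gauss-Markov theorem identifies as the best linear unbiased estimator (BLUE) of $\boldsymbol{\beta}$ under the stated assumptions.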