Correlation and Regression
The Body Mass Index (BMI), defined as the ratio between body mass (kg) and the square of height (m²), has long been used as an indicator of obesity in adults and to assess the degree of chronic energy deficiency. Earlier work defined chronic energy deficiency and, by combining BMI with physical activity level (expressed in multiples of the daily basal metabolic rate), established cut-off points for three degrees of this deficiency. A simplification was later suggested in which the degree of chronic energy deficiency in adults is assessed from BMI alone. Since then, several studies have applied these cut-off points, and several methodological discussions have taken place on the subject. This article therefore aims to examine these aspects and to establish the relationship between BMI and other variables.
Independent Variables
The independent variables used in the regression analysis of BMI are glucose, physical activity, and household income. Elevated glucose promotes lipid biosynthesis and fat deposition, which in turn raises BMI. Physical activity reflects how actively a person works to maintain a healthy BMI. Household income shapes quality of life and the likelihood of relying on junk food. Other independent variables that can be included are sex and age, which also provide descriptive statistics about the respondents.
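As an illustration only, the sketch below fits an ordinary least-squares model of BMI on these three variables using statsmodels; the data file and the column names (bmi, glucose, activity, income) are hypothetical placeholders, not the study's actual dataset.

```python
# Minimal sketch (not the study's actual analysis): OLS regression of BMI
# on glucose, physical activity, and household income.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("bmi_survey.csv")  # hypothetical data file

# BMI regressed on glucose level, physical-activity score, and household income
model = smf.ols("bmi ~ glucose + activity + income", data=df).fit()
print(model.summary())  # coefficients, standard errors, R-squared
```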
Regression Statistics
The term collinearity refers to the existence of a perfect linear relationship between two of the explanatory variables, while multicollinearity refers to more than one linear relationship involving some or all of them. The least-squares method assumes no perfect multicollinearity between the explanatory variables, since an exact linear relationship among them would leave their regression coefficients indeterminate, with infinite standard errors (Kasuya, 2019). In practice, perfect multicollinearity is rare, and the problem becomes one of degree rather than existence: the higher the degree of multicollinearity, the larger the standard errors of the regression coefficients and the lower the precision of their estimates.
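One common way to gauge that degree of multicollinearity, sketched below under the same hypothetical variable names, is the variance inflation factor (VIF); values well above 10 are usually read as a warning sign.

```python
# Sketch: variance inflation factors for the hypothetical explanatory variables.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("bmi_survey.csv")  # hypothetical data file
X = sm.add_constant(df[["glucose", "activity", "income"]])

for i, name in enumerate(X.columns):
    if name == "const":
        continue
    print(name, variance_inflation_factor(X.values, i))  # VIF >> 10 suggests collinearity
```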
The assumption of homoscedasticity is fundamental to the construction of confidence intervals and to hypothesis testing; without it, the least-squares method cannot be guaranteed to produce the best linear unbiased estimators. In the presence of heteroscedasticity, the least-squares estimators of the model parameters remain linear and unbiased, but the estimators of the parameter variances are biased (Holmes et al., 2017). Graphical analysis of the residuals is an important tool for identifying a relationship between the regression residuals and each explanatory variable and, if one exists, its form (Holmes et al., 2017). When the number of data points is too small for such a relationship to be identified visually, formal procedures such as the Park test, the Spearman rank correlation, and the Breusch-Pagan test can be applied instead.
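The Breusch-Pagan test mentioned above is available in statsmodels; the sketch below assumes the same hypothetical dataset and model as earlier and reports the LM p-value, where a small value speaks against homoscedasticity.

```python
# Sketch of the Breusch-Pagan test: the squared residuals are regressed on the
# explanatory variables and tested for joint significance.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.diagnostic import het_breuschpagan

df = pd.read_csv("bmi_survey.csv")  # hypothetical data file
fit = smf.ols("bmi ~ glucose + activity + income", data=df).fit()

lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(fit.resid, fit.model.exog)
print("Breusch-Pagan LM p-value:", lm_pvalue)  # small p-value -> evidence of heteroscedasticity
```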
Tests such as the Park test assume that the variance of the random errors is a function of the explanatory variables. Using the squares of the observed residuals as approximations of the error variances and applying logarithms, the log of the squared residuals is regressed on the log of each explanatory variable in turn. If any of these regressions yields a significant coefficient, a relationship between the residuals and the corresponding variable is accepted and the hypothesis of homoscedasticity is rejected; if no coefficient is significant, there is no indication for rejecting the homoscedasticity hypothesis.
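As a rough sketch of this procedure, the Park-style regression below takes the log of the squared residuals and regresses it on the log of one hypothetical explanatory variable (income); a significant slope would argue against homoscedasticity.

```python
# Sketch of a Park-style test on one explanatory variable.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("bmi_survey.csv")  # hypothetical data file
fit = smf.ols("bmi ~ glucose + activity + income", data=df).fit()

log_e2 = np.log(fit.resid ** 2)   # log of squared residuals (proxy for error variance)
log_x = np.log(df["income"])      # log of one explanatory variable (assumed positive)
park = sm.OLS(log_e2, sm.add_constant(log_x)).fit()
print(park.pvalues["income"])     # significant slope -> reject homoscedasticity
```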
References
Kasuya, E. (2019). On the use of r and r squared in correlation and regression. Ecological Research, 34(1), 235–236.
Holmes, A., Illowsky, B., & Dean, S. (2017). Introductory business statistics. OpenStax. https://openstax.org/details/books/introductory-business-statistics