Hierarchical Linear Models


02 Nov 2017


A regression model has both a deterministic and a probabilistic component [12]. In a deterministic model, the value of one variable can be predicted from another; it is written y = f(x), meaning the value of y depends entirely on x, which is why the model is called deterministic. The prediction such a model generates is a hypothetical, "what-if" statement and need not have occurred in the past, present or future. In reality, however, the chance that y is fully determined by x is very slim, which is why we use probabilistic models.

A probabilistic (or probability) model [12] predicts the value of a variable on the basis of previous information and is written Y ~ p(y), where Y is drawn at random from the probability distribution p(y). The model makes "what-if" predictions about the value of y, but it does not say precisely what that value will be, and its predictions need not have occurred in the past, present or future. When a large number of values of y is observed, however, a probability model lets us predict aggregate outcomes. Because a probability model alone cannot say exactly what Y will be, the deterministic and probabilistic features are combined to increase prediction accuracy, and this combination is the regression model.

Like a deterministic model [12], a regression model [13] predicts the value of one variable from another; it is written Y ~ p(y|x), where Y is drawn at random from the probability distribution for a known x. Regression has proven to be a powerful tool for making predictions about past, present or future events using information about past or present events. To construct a regression model, values of x and y are taken from a sample of objects; compared with other models, regression typically needs less time and fewer resources to retrieve the information required to compute a prediction.

4.2 Types of Regression

Regression is used to predict the value of a variable, which in turn supports recommendation. The two basic types discussed here are simple regression and multiple regression; each may in turn be linear or non-linear.

[Figure: taxonomy of regression — Regression divides into Simple (Linear, Non Linear) and Multiple (Linear, Non Linear)]

4.2.1 Simple Regression

Simple linear regression represents the relationship between a scalar dependent variable and a single explanatory variable. When only one explanatory variable is present, the model is called simple regression; it uses that one independent variable to predict the outcome. Models in which the data are fitted with linear predictor functions, and the unknown model parameters are estimated from the data, are called linear models.

4.2.2 Multiple Regression

Multiple regression determines the value of a scalar dependent variable with the help of two or more explanatory variables; whenever there is more than one explanatory variable, the model is called multiple regression.

The general form of Simple and Multiple type of Regression is:

Simple Linear Regression: Y = a + bX

Multiple Linear Regression: Y = a + b1X1 + b2X2 + b3X3 + ... + btXt

Where

Y = the predicted (dependent) variable, i.e. the variable whose value we want to find. Regression predicts the dependent variable by starting from a set of known y values and using them to build the model; these known values of y are also referred to as observed values.

X = the explanatory variable, i.e. the variable we use to predict Y. In regression the dependent variable is a function of the explanatory variables.

a = the regression intercept. It is the predicted value of the dependent variable when all explanatory variables are zero.

b = the slope, i.e. the regression coefficient computed by the regression tool. Each explanatory variable has a regression coefficient that represents the strength and type of relationship that variable has with the dependent variable.
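As a concrete illustration of the simple model Y = a + bX, the intercept a and slope b can be computed from the closed-form least-squares estimates. This is only a sketch; the data points below are invented for illustration.

```python
# Minimal sketch of simple linear regression Y = a + bX using the
# closed-form least-squares estimates; the data are illustrative only.
def simple_linear_regression(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope b = covariance(x, y) / variance(x)
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x  # intercept: the line passes through the means
    return a, b

xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]
a, b = simple_linear_regression(xs, ys)
print("intercept a =", round(a, 2), " slope b =", round(b, 2))
```

With more than one explanatory variable the same least-squares idea applies, but the coefficients are obtained by solving the matrix normal equations rather than the two scalar formulas above.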

4.2.1.1 Linear Regression: Linear regression is an approach to modelling the relationship between a scalar dependent variable and one or more explanatory variables. To use standard estimation techniques, linear regression models make a number of assumptions about the predictor variables, the response variable and their relationship, such as exogeneity, linearity, constant variance and independence. Numerous extensions relax these assumptions, and in some cases eliminate them entirely; some methods relax several assumptions at once, or achieve this by combining different extensions. These extensions make the estimation procedure more complex and time-consuming and require more data to obtain an accurate model. Some of the linear regression models are discussed below.

1. General Linear Models

The general linear model considers the situation when the response variable Y is not a scalar but a vector. Conditional linearity of E(y|x) = Bx is still assumed, with a matrix B replacing the vector b of the classical linear regression model.

2. Generalized Linear Models

Generalized linear models are a framework for modelling a response variable y that is bounded or discrete. They extend fixed-effects linear models and are used where the standard assumptions are violated. Examples include Poisson regression for count data, logistic and probit regression for binary data, multinomial logistic and multinomial probit regression for categorical data, and ordered probit regression for ordinal data.
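As a hedged sketch of one such model, logistic regression for binary data can be fitted by maximising the log-likelihood with plain gradient ascent. The data, learning rate and step count below are illustrative assumptions, not anything from the text.

```python
import math

# Logistic regression (a GLM with a logit link) fitted by gradient ascent
# on the log-likelihood; data and hyperparameters are illustrative.
def fit_logistic(xs, ys, steps=3000, lr=0.1):
    a, b = 0.0, 0.0  # intercept and slope on the log-odds scale
    for _ in range(steps):
        grad_a = grad_b = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(a + b * x)))  # predicted P(y = 1)
            grad_a += y - p        # gradient of the log-likelihood w.r.t. a
            grad_b += (y - p) * x  # gradient w.r.t. b
        a += lr * grad_a
        b += lr * grad_b
    return a, b

xs = [0, 1, 2, 3, 4, 5]
ys = [0, 0, 0, 1, 1, 1]  # binary response
a, b = fit_logistic(xs, ys)
p_at_4 = 1.0 / (1.0 + math.exp(-(a + b * 4)))
print("P(y=1 | x=4) =", round(p_at_4, 3))
```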

3. Hierarchical Linear Models

Hierarchical linear models (or multilevel regression) organize the data into a hierarchy of regressions, for example where A is regressed on B and B is regressed on C. They are often used where the data have a natural hierarchical structure, as in educational statistics: students are nested in classrooms, classrooms are nested in schools, and schools are nested in some administrative grouping such as a school district. The response variable might be a measure of student achievement such as a test score, with different covariates collected at the classroom, school and school-district levels.
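The A-on-B, B-on-C structure can be sketched as a crude two-stage procedure: fit a regression within each classroom, then regress the fitted classroom intercepts on a classroom-level covariate. This is only an illustration of the idea (real multilevel software estimates all levels jointly); every name and number below is invented.

```python
# Two-stage sketch of a hierarchical (multilevel) regression.
# Level 1: score ~ hours studied, within each classroom.
# Level 2: classroom intercepts ~ class size.
# All data are illustrative.
def ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b  # (intercept, slope)

classrooms = {
    "A": {"hours": [1, 2, 3], "scores": [52, 58, 65], "class_size": 20},
    "B": {"hours": [1, 2, 3], "scores": [48, 53, 57], "class_size": 30},
    "C": {"hours": [1, 2, 3], "scores": [55, 62, 68], "class_size": 15},
}
intercepts, sizes = [], []
for room in classrooms.values():
    a, _ = ols(room["hours"], room["scores"])  # level-1 fit per classroom
    intercepts.append(a)
    sizes.append(room["class_size"])

g0, g1 = ols(sizes, intercepts)  # level-2 fit across classrooms
print("level-2 slope (class-size effect on baseline score):", round(g1, 3))
```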

4. Heteroscedastic Models

Various models have been created that allow for heteroscedasticity, i.e. the errors for different response variables may have different variances. For example, weighted least squares is a method for estimating linear regression models when the response variables may have different error variances, possibly with correlated errors.
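Weighted least squares can be sketched for the simple one-predictor case: each observation's weight would typically be the reciprocal of its error variance, so noisier points pull the fit less. The data and weights below are illustrative.

```python
# Weighted least squares for Y = a + bX; each weight is typically
# 1 / variance of that observation's error. Data are illustrative.
def weighted_linear_regression(xs, ys, ws):
    sw = sum(ws)
    mx = sum(w * x for w, x in zip(ws, xs)) / sw  # weighted mean of x
    my = sum(w * y for w, y in zip(ws, ys)) / sw  # weighted mean of y
    b = sum(w * (x - mx) * (y - my) for w, x, y in zip(ws, xs, ys)) \
        / sum(w * (x - mx) ** 2 for w, x in zip(ws, xs))
    a = my - b * mx
    return a, b

xs = [1, 2, 3, 4, 5]
ys = [2.0, 4.1, 6.0, 8.3, 9.6]
ws = [1.0, 1.0, 1.0, 0.25, 0.25]  # down-weight the two noisier points
print(weighted_linear_regression(xs, ys, ws))
```

With all weights equal, this reduces to ordinary least squares.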

4.2.1.2 Non Linear Regression

A non-linear model [15] is one in which at least one parameter appears non-linearly; such models play an important role in understanding complex interrelationships among variables. In a non-linear model, at least one derivative with respect to a parameter still involves that parameter. Simple linear regression relates two variables X and Y by a straight line (y = mx + b), whereas non-linear regression fits a curve to data in which the value of Y is random. The goal of non-linear regression is to minimise the sum of squares, using logarithmic, trigonometric and exponential functions among other fitting methods. Like the linear model, non-linear regression seeks to graphically track a particular response from a set of variables. Because the function is built through a series of approximations that may stem from trial and error, the non-linear model is more complex. One of the major advantages of non-linear regression is the broad range of functions that can be fitted. Examples of non-linear regression models include the logistic, Malthus and Gompertz models. In generalised form a non-linear model can be written as:

Y(t) = exp(at + bt²)   or

Y(t) = at + exp(−bt)
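In this trial-and-error spirit, the first general form above, Y(t) = exp(at + bt²), can be fitted by directly searching for the (a, b) pair that minimises the sum of squared errors. A brute-force grid search is far cruder than real methods such as Gauss-Newton, and the data here are generated artificially, but it makes the goal of minimising the sum of squares concrete:

```python
import math

# Brute-force nonlinear least squares for y(t) = exp(a*t + b*t^2).
# The grid resolution and the synthetic data are illustrative.
def sse(a, b, ts, ys):
    # sum of squared errors between the model and the observations
    return sum((math.exp(a * t + b * t * t) - y) ** 2 for t, y in zip(ts, ys))

ts = [0.0, 0.5, 1.0, 1.5, 2.0]
true_a, true_b = 0.8, -0.3
ys = [math.exp(true_a * t + true_b * t * t) for t in ts]  # noise-free data

# search a and b over [-2, 2] in steps of 0.01
best = min(
    ((a / 100, b / 100) for a in range(-200, 201) for b in range(-200, 201)),
    key=lambda ab: sse(ab[0], ab[1], ts, ys),
)
print("recovered (a, b) =", best)
```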

Some of the non-linear models are discussed below.

1. Malthus Model

This model, also called the simple exponential growth model, is essentially exponential growth at a constant rate, analogous to compound interest.

In this model the rate of growth of the population size is given by

dN/dt = rN

After integration we get

N(t) = N0 exp(rt)

where N(t) denotes the population size at time t, r is the intrinsic growth rate, and N0 denotes the population size at t = 0.

2. Logistic Model

This type of model is symmetric and is represented by the differential equation

dN/dt = rN(1 − N/K)

where K is the carrying capacity. After integration,

N(t) = K / (1 + ((K − N0)/N0) exp(−rt))

3. Gompertz Model

This type of model has sigmoid behaviour and is quite useful in biological work. Unlike the logistic model, it is not symmetric. It is represented by the differential equation

dN/dt = rN loge(K/N)

After integration,

N(t) = K exp[loge(N0/K) exp(−rt)]
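The three growth models can be compared side by side by evaluating their integrated forms. The parameter values below (K = 1000, r = 0.5, N0 = 10) are arbitrary choices for illustration; note that the Malthus curve grows without bound while the logistic and Gompertz curves level off at the carrying capacity K:

```python
import math

# Evaluate the integrated forms of the three growth models.
# K = carrying capacity, r = intrinsic growth rate, N0 = size at t = 0.
# Parameter values are illustrative.
K, r, N0 = 1000.0, 0.5, 10.0

def malthus(t):
    return N0 * math.exp(r * t)  # unbounded exponential growth

def logistic(t):
    return K / (1.0 + ((K - N0) / N0) * math.exp(-r * t))  # symmetric sigmoid

def gompertz(t):
    return K * math.exp(math.log(N0 / K) * math.exp(-r * t))  # asymmetric sigmoid

for t in (0, 5, 10, 20):
    print(t, round(malthus(t)), round(logistic(t)), round(gompertz(t)))
```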

4.3 Application of Linear Regression

Regression is used to predict the value of one variable from another. Linear regression is widely used in the biological, behavioural and social sciences to describe possible relationships between variables. In business it is used for trend lines, which show how data change over time; linear regression determines the position and slope of the trend line through the data points. It is a simple technique that requires no control group, experimental design or sophisticated analysis. In economics, linear regression is the predominant empirical tool, used to predict consumption spending, fixed investment spending, inventory investment, purchases of a country's exports, spending on imports, the demand to hold liquid assets, labour demand and labour supply. In finance, linear regression is used to analyse and quantify the systematic risk of an investment.

4.4 Limitation of Linear Regression

Linear regression implements a statistical model that gives optimal results only when the relationship between the independent and dependent variables is almost linear.

One limitation of linear regression is that it predicts numeric output only.

Linear regression is often used inappropriately to model non-linear relationships.

The lack of insight into what has been learned can also be a problem.

4.5 Advantage and Disadvantage of Linear Regression over Non Linear Regression

The major advantage of linear regression is that its parameters are computed analytically, whereas the parameters of a non-linear model must be found iteratively. The biggest advantage of non-linear regression over most other techniques is the broad range of functions that can be fitted. Linear models do not describe processes that asymptote very well, because a linear function cannot increase or decrease at a declining rate as the explanatory variables go to extremes.


