Simple linear regression model — revisited using the maximum likelihood estimator

This post revisits the model from the previous post; see https://functor.network/user/1751/entry/653.

Goal

Write a mathematical model that describes the relationship between two variables $x$ and $y$.

Setup

Given observations $(x_1, y_1), \dots, (x_n, y_n)$, consider a model of the form

$$ y_i = \beta_0 + \beta_1 x_i + \epsilon_i, \qquad i = 1, \dots, n, $$

where the errors $\epsilon_i$ are i.i.d. with $\epsilon_i \sim N(0, \sigma^2)$. The aim is to find estimates for $\beta_0$, $\beta_1$, and $\sigma^2$.
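
As a concrete illustration (my own addition, not part of the derivation), here is a minimal Python sketch that simulates data from this model; the parameter values, sample size, and seed are arbitrary choices for the example.

```python
import numpy as np

# Hypothetical parameter values chosen only for illustration.
rng = np.random.default_rng(0)
n = 100
beta0_true, beta1_true, sigma2_true = 1.0, 2.5, 0.49

x = rng.uniform(0.0, 10.0, size=n)                     # predictor values
eps = rng.normal(0.0, np.sqrt(sigma2_true), size=n)    # i.i.d. N(0, sigma^2) errors
y = beta0_true + beta1_true * x + eps                  # responses from the assumed model
```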

Likelihood function

The likelihood function, denoted $L(\beta_0, \beta_1, \sigma^2)$, is

$$ L(\beta_0, \beta_1, \sigma^2) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left( -\frac{(y_i - \beta_0 - \beta_1 x_i)^2}{2\sigma^2} \right), $$

and the negative log likelihood function is

$$ -\log L(\beta_0, \beta_1, \sigma^2) = \frac{n}{2}\log(2\pi\sigma^2) + \frac{1}{2\sigma^2} \sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i)^2, $$

or equivalently

$$ -\log L(\beta_0, \beta_1, \sigma^2) = \frac{n}{2}\log(2\pi\sigma^2) + \frac{1}{2\sigma^2} \sum_{i=1}^{n} e_i^2, $$

where $e_i = y_i - \beta_0 - \beta_1 x_i$ are the residuals.
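
To make this concrete, here is a sketch of the negative log likelihood as a Python function, together with a direct numerical minimization using scipy; the optimizer, starting point, and the reuse of the simulated x and y from the sketch above are my own choices. The numerical minimizer should approximately reproduce the closed-form estimates derived below.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, x, y):
    """Negative log likelihood of the simple linear regression model."""
    beta0, beta1, sigma2 = params
    if sigma2 <= 0:
        return np.inf  # keep the variance parameter in its valid range
    resid = y - beta0 - beta1 * x
    n = len(y)
    return 0.5 * n * np.log(2 * np.pi * sigma2) + np.sum(resid**2) / (2 * sigma2)

# Numerical check: minimize the negative log likelihood directly.
result = minimize(neg_log_likelihood, x0=[0.0, 1.0, 1.0], args=(x, y),
                  method="Nelder-Mead")
beta0_hat, beta1_hat, sigma2_hat = result.x
```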

Solving for $\beta_0$

Computing $\frac{\partial}{\partial \beta_0}\left(-\log L\right)$ and setting it to zero yields

$$ -\frac{1}{\sigma^2} \sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i) = 0. $$

Per the work in the previous post, this results in the following estimate for $\beta_0$:

$$ \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}. \tag{1} $$

Solving for $\beta_1$

Computing $\frac{\partial}{\partial \beta_1}\left(-\log L\right)$ and setting it to zero yields

$$ -\frac{1}{\sigma^2} \sum_{i=1}^{n} x_i (y_i - \beta_0 - \beta_1 x_i) = 0. $$

Per the work in the previous post, this results in the following estimate for $\beta_1$:

$$ \hat{\beta}_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}. \tag{2} $$
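
Estimates (1) and (2) are straightforward to compute directly. The following sketch, continuing with the simulated x and y from above, compares them with numpy's least squares fit; the two should agree, since the maximum likelihood and least squares estimates of the coefficients coincide under this model.

```python
import numpy as np

x_bar, y_bar = x.mean(), y.mean()

# Equation (2): slope estimate.
beta1_hat = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
# Equation (1): intercept estimate.
beta0_hat = y_bar - beta1_hat * x_bar

# Cross-check against numpy's least squares fit (coefficients: highest degree first).
beta1_ls, beta0_ls = np.polyfit(x, y, deg=1)
assert np.isclose(beta1_hat, beta1_ls) and np.isclose(beta0_hat, beta0_ls)
```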

Solving for $\sigma^2$

Computing $\frac{\partial}{\partial \sigma^2}\left(-\log L\right)$ and setting it to zero yields

$$ \frac{n}{2\sigma^2} - \frac{1}{2\sigma^4} \sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i)^2 = 0. $$

This gives the following estimate for $\sigma^2$:

$$ \hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} \hat{e}_i^2, \qquad \hat{e}_i = y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i. \tag{3} $$

The interpretation of equation (3) is that the average of the squared residuals is an estimate of the variance of the error.
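
A short sketch of equation (3), again using the simulated data and the coefficient estimates from the sketches above. For comparison, the sketch also computes the usual unbiased estimator, which divides by $n - 2$ instead of $n$.

```python
import numpy as np

resid = y - beta0_hat - beta1_hat * x        # fitted residuals
n = len(y)

sigma2_mle = np.mean(resid**2)               # equation (3): average of squared residuals
sigma2_unbiased = np.sum(resid**2) / (n - 2) # for comparison only
```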

The critical point is a local minimum of $-\log L$ and therefore a local maximum of $L$

A computation shows that the Hessian of $-\log L$ at $(\beta_0, \beta_1, \sigma^2)$ is

$$ \begin{pmatrix} \dfrac{n}{\sigma^2} & \dfrac{\sum_i x_i}{\sigma^2} & \dfrac{\sum_i e_i}{\sigma^4} \\[1ex] \dfrac{\sum_i x_i}{\sigma^2} & \dfrac{\sum_i x_i^2}{\sigma^2} & \dfrac{\sum_i x_i e_i}{\sigma^4} \\[1ex] \dfrac{\sum_i e_i}{\sigma^4} & \dfrac{\sum_i x_i e_i}{\sigma^4} & -\dfrac{n}{2\sigma^4} + \dfrac{\sum_i e_i^2}{\sigma^6} \end{pmatrix}. $$

Using the critical point conditions $\sum_i \hat{e}_i = 0$, $\sum_i x_i \hat{e}_i = 0$, and $\sum_i \hat{e}_i^2 = n \hat{\sigma}^2$, the Hessian at $(\hat{\beta}_0, \hat{\beta}_1, \hat{\sigma}^2)$ is

$$ \begin{pmatrix} \dfrac{n}{\hat{\sigma}^2} & \dfrac{\sum_i x_i}{\hat{\sigma}^2} & 0 \\[1ex] \dfrac{\sum_i x_i}{\hat{\sigma}^2} & \dfrac{\sum_i x_i^2}{\hat{\sigma}^2} & 0 \\[1ex] 0 & 0 & \dfrac{n}{2\hat{\sigma}^4} \end{pmatrix}. $$

This matrix is positive definite because the $(3,3)$ entry is positive and the upper-left $2 \times 2$ block is a positive definite matrix, as shown in the previous post. Therefore the critical point is a local minimum of $-\log L$ and a local maximum of $L$.
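
As a numerical sanity check (my own addition), the following sketch builds the Hessian at the critical point from the formula above, using the simulated data and estimates from the earlier sketches, and confirms that its eigenvalues are all positive.

```python
import numpy as np

n = len(x)
sigma2_hat = np.mean((y - beta0_hat - beta1_hat * x) ** 2)

# Hessian of the negative log likelihood at the critical point (cross terms vanish).
H = np.array([
    [n / sigma2_hat,         np.sum(x) / sigma2_hat,    0.0],
    [np.sum(x) / sigma2_hat, np.sum(x**2) / sigma2_hat, 0.0],
    [0.0,                    0.0,                       n / (2 * sigma2_hat**2)],
])

# Positive definiteness of a symmetric matrix <=> all eigenvalues positive.
assert np.all(np.linalg.eigvalsh(H) > 0)
```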

Summing up

The maximum likelihood approach recovers the same estimates (1) and (2) for $\beta_0$ and $\beta_1$ as in the previous post, and in addition yields the estimate (3) for $\sigma^2$, the average of the squared residuals.

Comments

  • The model is the same as the model in the previous post.

Reference

Simple linear regression model, https://functor.network/user/1751/entry/653
