Mean Absolute Error — another widely used regression loss

Step by step implement with the gradients

neuralthreads
Dec 2, 2021

In this post, we will talk about Mean Absolute Error (MAE) and its gradients. Like Mean Squared Error, it is used as a regression loss.

You can download the Jupyter Notebook from here.

Back to the previous post

Back to the first post

4.2 What is Mean Absolute Error and how to compute its gradients?

Suppose we have true values,

y_true = [y_1, y_2, y_3]ᵀ

and predicted values,

y_pred = [ŷ_1, ŷ_2, ŷ_3]ᵀ

Then Mean Absolute Error is calculated as follows:

MAE = (1/N) * Σᵢ |y_i - ŷ_i| (the sum running over all N entries)
We can easily calculate Mean Absolute Error in Python like this.

import numpy as np                           # importing NumPy
np.random.seed(42)

def mae(y_true, y_pred):                     # MAE
    return np.mean(np.abs(y_true - y_pred))

Defining MAE

Now, we know that

∂MAE/∂ŷ_i = -(y_i - ŷ_i) / (N * |y_i - ŷ_i|) = -sign(y_i - ŷ_i) / N

So, like MSE, we have a Jacobian for MAE:

∂MAE/∂y_pred = [∂MAE/∂ŷ_1, ∂MAE/∂ŷ_2, ∂MAE/∂ŷ_3]ᵀ

We can easily find each term in this Jacobian.

Note — Here, 3 represents ‘N’, i.e., the number of entries in y_true and y_pred

We can easily define the MAE Jacobian in Python like this.

def mae_grad(y_true, y_pred):                # MAE Jacobian
    N = y_true.shape[0]
    return -((y_true - y_pred) / (np.abs(y_true - y_pred) + 10**-100)) / N

Note — the tiny constant 10**-100 in the denominator is for numerical stability: it prevents division by zero when an entry of y_true equals the corresponding entry of y_pred.

MAE Jacobian
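As a quick sanity check (my own addition, not in the original notebook), we can compare mae_grad against a central finite-difference approximation of the gradient; the two should agree wherever y_true and y_pred differ:

```python
import numpy as np

def mae(y_true, y_pred):                     # MAE
    return np.mean(np.abs(y_true - y_pred))

def mae_grad(y_true, y_pred):                # MAE Jacobian
    N = y_true.shape[0]
    return -((y_true - y_pred) / (np.abs(y_true - y_pred) + 10**-100)) / N

y_true = np.array([[1.5], [0.2], [3.9]])
y_pred = np.array([[1.2], [0.5], [3.2]])

eps = 1e-6
numeric = np.zeros_like(y_pred)
for i in range(y_pred.shape[0]):
    up = y_pred.copy(); up[i] += eps         # nudge one prediction up
    down = y_pred.copy(); down[i] -= eps     # and down
    numeric[i] = (mae(y_true, up) - mae(y_true, down)) / (2 * eps)

print(np.allclose(numeric, mae_grad(y_true, y_pred)))   # True
```

Each entry of the analytic Jacobian is -sign(y_i - ŷ_i)/N, which the finite differences reproduce as long as no entry of y_pred sits exactly on its target (MAE is not differentiable at those points).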

Let us have a look at an example.

y_true = np.array([[1.5], [0.2], [3.9], [6.2], [5.2]])
y_true
y_pred = np.array([[1.2], [0.5], [3.2], [4.2], [3.2]])
y_pred
y_true and y_pred
mae(y_true, y_pred)
mae_grad(y_true, y_pred)
MAE and the gradients
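To see the gradients in action, here is a small sketch (not from the original post) that uses mae_grad to move y_pred toward y_true with plain gradient descent; the learning rate and step count are assumptions chosen for this toy example:

```python
import numpy as np

def mae(y_true, y_pred):                     # MAE
    return np.mean(np.abs(y_true - y_pred))

def mae_grad(y_true, y_pred):                # MAE Jacobian
    N = y_true.shape[0]
    return -((y_true - y_pred) / (np.abs(y_true - y_pred) + 10**-100)) / N

y_true = np.array([[1.5], [0.2], [3.9], [6.2], [5.2]])
y_pred = np.array([[1.2], [0.5], [3.2], [4.2], [3.2]])

lr = 0.1                                     # step size (assumed)
for step in range(200):
    y_pred = y_pred - lr * mae_grad(y_true, y_pred)

print(mae(y_true, y_pred))                   # much smaller than the initial 1.06
```

Note that the MAE gradient has constant magnitude 1/N regardless of how far off a prediction is, so fixed-step updates overshoot and oscillate around the minimum; in practice a decaying learning rate helps.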

I hope you now understand how to implement Mean Absolute Error.

Watch the video on YouTube and subscribe to the channel for more videos and posts like this.
Every slide is 3 seconds long and without sound. You may pause the video whenever you like.
You may put on some music too if you like.

The video is basically everything in the post, only in slides.

Many thanks for your support and feedback.

If you like this course, then you can support me at

It would mean a lot to me.

Continue to the next post — 4.3 Categorical cross-entropy loss and its derivative.
