What function defines accuracy in Keras when the loss is mean squared error (MSE)?

There are at least two separate issues with your question.

The first one should be clear by now from the comments by Dr. Snoopy and the other answer: accuracy is meaningless in a regression problem, such as yours; see also the comment by patyork in this Keras thread. For better or worse, the fact is that Keras will not “protect” you or any other user from making meaningless requests in your code, i.e. you will not get any error, or even a warning, that you are attempting something that does not make sense, such as requesting accuracy in a regression setting.

Having clarified that, the other issue is:

Since Keras does indeed return an “accuracy”, even in a regression setting, what exactly is it and how is it calculated?

To shed some light here, let’s turn to a public dataset (since you do not provide any details about your data), namely the Boston house price dataset (saved locally as housing.csv), and run a simple experiment as follows:

import numpy as np
import pandas
import keras

from keras.models import Sequential
from keras.layers import Dense

# load dataset
dataframe = pandas.read_csv("housing.csv", delim_whitespace=True, header=None)
dataset = dataframe.values
# split into input (X) and output (Y) variables
X = dataset[:,0:13]
Y = dataset[:,13]

model = Sequential()
model.add(Dense(13, input_dim=13, kernel_initializer="normal", activation='relu'))
model.add(Dense(1, kernel_initializer="normal"))
# Compile model asking for accuracy, too:
model.compile(loss="mean_squared_error", optimizer="adam", metrics=['accuracy'])

model.fit(X, Y,
          batch_size=5,
          epochs=100,
          verbose=1)

As in your case, the model fitting history (not shown here) shows a decreasing loss and a roughly increasing accuracy. Let’s now evaluate the model’s performance on the same training set, using the appropriate Keras built-in function:

score = model.evaluate(X, Y, verbose=0)
score
# [16.863721372581754, 0.013833992168483997]

The exact contents of the score array depend on what exactly we have requested during model compilation; in our case here, the first element is the loss (MSE), and the second one is the “accuracy”.
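If in doubt about the ordering, you can ask the model itself; the names returned by model.metrics_names line up one-to-one with the entries of score (the exact label, e.g. 'acc' vs 'accuracy', may differ across Keras versions):

model.metrics_names
# ['loss', 'acc']   # the second entry is the (meaningless here) "accuracy"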

At this point, let us have a look at the definition of Keras binary_accuracy in the metrics.py file, which is what Keras silently dispatches to in this case (the model having a single output unit):

def binary_accuracy(y_true, y_pred):
    return K.mean(K.equal(y_true, K.round(y_pred)), axis=-1)

So, after Keras has generated the predictions y_pred, it first rounds them, and then checks to see how many of them are equal to the true labels y_true, before taking the mean.
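To see why this recipe is sensible for binary classification but not for regression, here is a minimal illustration with some made-up numbers (not from the Boston dataset):

import numpy as np

# Made-up continuous targets & predictions (regression): rounding the
# predictions and checking exact equality almost never produces a match.
y_true_reg = np.array([21.6, 34.7, 15.2])
y_pred_reg = np.array([22.3, 33.9, 15.8])
np.mean(np.round(y_pred_reg) == y_true_reg)
# 0.0

# The same operation on made-up 0/1 labels (binary classification),
# where it actually measures something meaningful:
y_true_bin = np.array([0., 1., 1.])
y_pred_bin = np.array([0.1, 0.8, 0.4])
np.mean(np.round(y_pred_bin) == y_true_bin)
# 0.6666666666666666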

Let’s replicate this operation using plain Python & Numpy code in our case, where the true labels are Y:

y_pred = model.predict(X)
l = len(Y)
acc = sum([np.round(y_pred[i])==Y[i] for i in range(l)])/l
acc
# array([0.01383399])

Well, bingo! This is actually the same value returned by score[1] above…

To make a long story short: since you (erroneously) request metrics=['accuracy'] in your model compilation, Keras will do its best to satisfy you, and will return some “accuracy” indeed, calculated as shown above, despite this being completely meaningless in your setting.
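If you want a meaningful quantity to monitor alongside the MSE loss, a regression metric such as the mean absolute error is the usual choice; a minimal sketch, using the same model as above:

# Request a regression-appropriate metric instead of 'accuracy':
model.compile(loss="mean_squared_error", optimizer="adam", metrics=["mae"])
# score = model.evaluate(X, Y, verbose=0) will now return [MSE, MAE]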


There are quite a few settings where Keras, under the hood, performs rather meaningless operations without giving any hint or warning to the user; this is just one of several such cases I have happened to encounter.
