Scikit-learn GaussianProcessRegressor returns a 2d vector instead of a 2d covariance matrix


I'm fitting data (2d input, 2d output) with a Gaussian process from scikit-learn, but when I ask for the covariance matrix I get back a 2d vector, not a matrix.

For some examples it works fine and returns a proper square matrix (see the single-output snippet further down), but I don't understand what's wrong with my case.

Should I interpret the two returned values as the diagonal entries of a diagonal covariance matrix? (I sketch what I mean at the bottom, after the shape outputs.)

This is a simple example to reproduce the issue:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RationalQuadratic

# Simple dataset: 2 training points with 2d inputs and 2d targets (y has shape (2, 2))
X = np.array([[0, 1], [1, 0]])
y = np.array([np.sin(12 * X[:, 0]),
              np.cos(12 * X[:, 1])])

kernel = RationalQuadratic()
gpr = GaussianProcessRegressor(kernel=kernel, random_state=0)
gpr.fit(X, y)

# A single 2d query point
new_x = np.array([[0.1, 0.5]])

y_pred, y_cov = gpr.predict(new_x, return_cov=True)

print("y_pred:", y_pred)
print("y_cov:", y_cov)

Which prints:

y_pred: [[ 1.71640581e-05 -2.47700145e-03]]

y_cov: [[[0.99997834 0.99997834]]]
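For comparison, this is one of the cases that does give me a square matrix: the same inputs but a single 1d target (just a minimal sketch with the same kernel):

# Same X, but only one scalar target per sample: y_cov comes back square
y_1d = np.sin(12 * X[:, 0])                               # shape (2,)
gpr_1d = GaussianProcessRegressor(kernel=RationalQuadratic(), random_state=0)
gpr_1d.fit(X, y_1d)
_, cov_1d = gpr_1d.predict(new_x, return_cov=True)
print(cov_1d.shape)                                       # (1, 1) -- a proper covariance matrix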

And if, instead of the covariance, I ask the original 2d-target model for the standard deviation (not sure what that even means for a 2d target):

y_pred, y_std = gpr.predict(np.array([[0.1, 0.5]]), return_std=True)
print("y_pred:", y_pred)
print("y_std:", y_std)

y_pred: [[ 1.71640581e-05 -2.47700145e-03]]

y_std: [[0.99998917 0.99998917]]

Note that y_cov has one more dimension than y_std:

y_cov.shape   # (1, 1, 2)
y_std.shape   # (1, 2)
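If my interpretation above is right (the two values are per-target variances), then I would expect to be able to pull out a separate 1x1 covariance matrix per target by slicing the last axis, and the reported y_std would just be its square root. A sketch of what I mean (this is my assumption, not something I found documented):

# Assumed interpretation: the last axis of y_cov indexes the two targets,
# so y_cov[:, :, k] would be the (1 x 1) covariance matrix for target k
for k in range(y_cov.shape[-1]):
    cov_k = y_cov[:, :, k]
    print("target", k, "cov:", cov_k, "std:", np.sqrt(np.diag(cov_k)))

The numbers do line up (sqrt(0.99997834) is about 0.99998917, which is exactly what y_std reports), so I suspect that's what the shape means, but I'd like confirmation.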
