I am running a big GLMM with glmmTMB and am wondering what this warning means. I did not see any information about it in the package's troubleshooting documentation. Warning:
In finalizeTMB(TMBStruc, obj, fit, h, data.tmb.old) : failed to invert Hessian from numDeriv::jacobian(), falling back to internal vcov estimate
Would it be bad to ignore this issue and proceed with a drop1()?
This large GLMM has quadratic terms and their interactions. Taking out some of the interactions and/or terms helped resolve the warning.
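For context, here is a hypothetical sketch of the kind of model involved (the variable names, data, and formula below are invented for illustration, not the actual model): a fit with quadratic terms and their interactions, plus a simpler refit with the quadratic-by-quadratic interactions dropped.

```r
## Hypothetical sketch only -- variables, data, and formula are made up to
## mimic the situation: quadratic terms and their interactions, then a
## simplified refit without the interactions between the quadratic terms.
library(glmmTMB)
set.seed(1)
dat <- data.frame(x1 = rnorm(500), x2 = rnorm(500),
                  group = factor(sample(20, 500, replace = TRUE)))
dat$y <- rnorm(500, 1 + dat$x1 - 0.5 * dat$x1^2 + 0.3 * dat$x2)

m_full    <- glmmTMB(y ~ poly(x1, 2) * poly(x2, 2) + (1 | group), data = dat)
m_reduced <- glmmTMB(y ~ poly(x1, 2) + poly(x2, 2) + (1 | group), data = dat)
```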
tl;dr
drop1() should be fine, although this warning should alert you to the fact that the fit may be numerically unstable and/or that the computed standard errors may be unreliable.

There are (at least) two ways to compute the covariance matrix of the estimated coefficients (which is used to get the standard errors, p-values, etc. of the individual coefficients). In either case we have to estimate the Hessian, the matrix of second derivatives of the negative log-likelihood with respect to the parameters, then find its inverse.
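As a minimal stand-alone illustration (a toy Poisson GLM with simulated data, not glmmTMB internals), the covariance matrix can be obtained by numerically computing the Hessian of the negative log-likelihood at the fitted coefficients and inverting it:

```r
## Toy example (plain R, not glmmTMB internals): get the covariance matrix of a
## Poisson GLM's coefficients by inverting the Hessian of the negative
## log-likelihood at the fitted values, and compare with vcov().
set.seed(1)
d <- data.frame(x = rnorm(200))
d$y <- rpois(200, exp(0.5 + 0.3 * d$x))

negll <- function(beta) {                 # negative log-likelihood
  -sum(dpois(d$y, exp(beta[1] + beta[2] * d$x), log = TRUE))
}

fit <- glm(y ~ x, family = poisson, data = d)
H   <- optimHess(coef(fit), negll)        # Hessian (second derivatives) at the MLE
solve(H)                                  # inverse Hessian ...
vcov(fit)                                 # ... matches the usual vcov estimate
```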
Internally, the TMB package uses the base-R function optimHess(), which computes finite differences of the gradient function (see ?optimHess). glmmTMB tries to use numDeriv::jacobian() instead, which uses Richardson extrapolation and should in theory be more accurate.

The warning is saying that inverting the Hessian computed by numDeriv::jacobian() failed, and that the optimHess version is being used instead.
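Here is a sketch of the two routes just described, reusing negll() and fit from the previous toy example; note that numDeriv::grad() stands in here for TMB's analytic gradient of the real model.

```r
## The two Hessian computations described above, on the toy Poisson example.
library(numDeriv)
gradfun <- function(beta) grad(negll, beta)   # numerical gradient of negll
H_jac <- jacobian(gradfun, coef(fit))         # Richardson extrapolation (numDeriv)
H_opt <- optimHess(coef(fit), negll)          # finite differences (optimHess)
solve(H_jac)                                  # the two inverted Hessians should
solve(H_opt)                                  # agree closely for this toy model
```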
drop1() is using the likelihood ratio test to compare nested models; it is independent of the estimate of the covariance matrix.
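A hedged sketch of that point, using a made-up model on the Salamanders data shipped with glmmTMB: drop1() with test = "Chisq" refits each nested model and compares log-likelihoods, which is the same comparison you could do by hand with anova() on nested fits.

```r
## Toy illustration (the formula is invented for this sketch): drop1() with a
## likelihood-ratio test refits each nested model and compares log-likelihoods;
## it never touches the Hessian-based covariance matrix.
library(glmmTMB)
m <- glmmTMB(count ~ mined + spp + (1 | site), family = poisson,
             data = Salamanders)
drop1(m, test = "Chisq")              # LRT for each term that can be dropped
anova(m, update(m, . ~ . - mined))    # the same comparison done by hand
```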