The models I'm running won't converge, or they show an error. Do you have any advice for working with non-normal focal data? Most of the data are zero-inflated or highly skewed.
The model I want to run is this (the response is just an example):
model <- glmer(Duration_behavior_1 ~
status * days_since_event +
focal_age +
focal_rank +
pink +
year +
hour +
(1|FocalID) +
(1|partner.ID),
family = 'poisson',
offset = Focal_duration,
data = data)
But I get this error:
Error in (function (fr, X, reTrms, family, nAGQ = 1L, verbose = 0L, maxit = 100L, :
(maxstephalfit) PIRLS step-halvings failed to reduce deviance in pwrssUpdate
I think the problem is the offset. If I remove it, I get this warning instead:
Warning messages:
1: In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv, :
Model failed to converge with max|grad| = 0.00252314 (tol = 0.002, component 1)
2: In checkConv(attr(opt, "derivs"), opt$par, ctrl = control$checkConv, :
Model is nearly unidentifiable: very large eigenvalue
- Rescale variables?;Model is nearly unidentifiable: large eigenvalue ratio
- Rescale variables?
But, of course, the offset is necessary to control for the duration of the observation...
Would you recommend changing the distribution to a negative binomial instead of using a Poisson?
You almost certainly want to use log(Focal_duration) as your offset, rather than Focal_duration. Suppose eta is the linear predictor not including the offset, e.g. beta0 + beta1*x1 + beta2*x2 + ... (also including the components based on the random effects). Then the full linear predictor is eta + offset, and the expected mean is exp(eta)*exp(offset). If the values of the offset (Focal_duration) are reasonably large (say, 60 seconds?), then the values of exp(offset) are huge (e.g. exp(60) ~ 1.1e26), which, besides being not what you want, messes up numerical computations. In fact, the GLMM FAQ says something very similar:
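Here is a minimal sketch of the point, using base R's glm() with simulated data (the variable names and true coefficient values are made up for illustration): counts generated at a constant per-second rate recover the true coefficients only when the offset is log(duration), because the log link makes E[y] = exp(eta + log(duration)) = rate * duration.

```r
set.seed(1)
n <- 500
duration <- runif(n, 30, 300)            # observation time in seconds
x <- rnorm(n)
rate <- exp(-4 + 0.5 * x)                # true events-per-second rate
y <- rpois(n, lambda = rate * duration)  # counts scale with observation time

# Correct: offset on the log scale, so exp(offset) = duration
fit <- glm(y ~ x, family = poisson, offset = log(duration))
coef(fit)  # intercept near -4, slope near 0.5

# For contrast: exp() of a raw duration is astronomically large,
# which is what breaks the numerics when the offset is not logged
exp(60)    # ~ 1.1e26
```

The same offset = log(Focal_duration) (or an offset(log(Focal_duration)) term in the formula) applies unchanged in glmer(); glm() is used here only so the sketch runs without lme4 or real data.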