This function generates samples, and functions of those samples, from an approximated posterior of a fitted model (an inla-object).

inla.posterior.sample(n = 1L, result, selection = list(),
                      intern = FALSE,
                      use.improved.mean = TRUE, skew.corr = TRUE,
                      add.names = TRUE, seed = 0L, num.threads = NULL,
                      verbose = FALSE)
inla.posterior.sample.eval(fun, samples, return.matrix = TRUE, ...)

Arguments

n

Number of samples.

result

The inla-object, i.e. the output from an inla() call. The inla-object must be created with control.compute = list(config = TRUE).

selection

Select which parts of the sample to return. By default, the whole sample is returned. selection is a named list, where the names are the components of the sample and the values are the indices to return. Valid names include APredictor, Predictor, (Intercept), and otherwise the names in the formula. The values of the list are interpreted as indices: negative indices mean 'not', a zero means 'all', and positive indices mean 'only'. The names of the elements of each sample refer to the indices in the full sample.
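
For example, a minimal sketch of a selection list, assuming a model with a covariate z and an f()-term xx (these names are hypothetical and must match the terms in your own formula):

    ## the first three linear predictors, the intercept,
    ## all of 'xx' except its first two elements, and all of 'z'
    sel = list(Predictor = 3:5, "(Intercept)" = 0, xx = -(1:2), z = 0)
    samples = inla.posterior.sample(10, result, selection = sel)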

intern

Logical. If TRUE then produce samples on the internal scale for the hyperparameters; if FALSE then produce samples on the user scale. (For example, log-precision (internal) versus precision (user scale).)

use.improved.mean

Logical. If TRUE then use the marginal mean values when constructing samples. If FALSE then use the mean in the Gaussian approximations.

skew.corr

Logical. If TRUE then correct the samples for skewness; if FALSE, do not correct the samples for skewness (i.e. use the Gaussian approximation).

add.names

Logical. If TRUE then add names for the elements of each sample. If FALSE, only add names for the first sample. (This saves space.)

seed

Control the RNG of inla.qsample; see ?inla.qsample for further information. If seed = 0L then GMRFLib will set the seed intelligently/at 'random'. If seed < 0L then the saved state of the RNG will be reused if possible, otherwise GMRFLib will set the seed intelligently/at 'random'. If seed > 0L then this value is used as the seed for the RNG. If you want reproducible results, you ALSO need to control the seed for the RNG in R, by setting the variable .Random.seed or using the function set.seed; see the Examples for how this can be done.

num.threads

The number of threads that can be used. num.threads > 1L requires seed = 0L. The default value is controlled by inla.getOption("num.threads").

verbose

Logical. Run in verbose mode or not.

fun

The function to evaluate for each sample. Upon entry, the variable names defined in the model are bound to the values of the sample. The list of names is given in result$misc$configs$contents, where result is an inla-object. This includes predefined names for the linear predictor (Predictor and APredictor) and the intercept ((Intercept) or Intercept). The hyperparameters are available as theta, regardless of whether they are on the internal scale or not. The function fun can also return a vector.
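
A minimal sketch, assuming the fitted model contains an intercept and at least one hyperparameter (here theta[1] is taken to be a precision, as on the default user scale):

    ## '(Intercept)' is available as 'Intercept' inside 'fun',
    ## and the hyperparameters as the vector 'theta'
    fun = function(...) c(Intercept, 1 / theta[1])
    inla.posterior.sample.eval(fun, samples)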

samples

The output from inla.posterior.sample().

return.matrix

Logical. If TRUE, then return the samples of fun as a matrix; otherwise, as a list.

...

Additional arguments passed on to fun.

Details

The hyperparameters are sampled from the configurations used to do the numerical integration, hence if you want a higher resolution you need to change the int.strategy option and friends (see the sketch below). The latent field is sampled from the Gaussian approximation conditioned on the hyperparameters, but with a correction for the mean (by default), and optionally (and by default) a correction for the estimated skewness.
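
As an illustration, a finer integration over the hyperparameters can be requested through control.inla; this is only a sketch, and the chosen int.strategy, dz and diff.logdens values are arbitrary:

    r = inla(y ~ 1 + z, data = data.frame(y, z),
             family = "gaussian",
             control.compute = list(config = TRUE),
             control.inla = list(int.strategy = "grid",
                                 dz = 0.5, diff.logdens = 3))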

The reported log.density is only exact when there are no constraints. With constraints, the Gaussian part of the sample is corrected for the constraints.

After the sample is (optionally) corrected for skewness, the log.density is no longer exact with respect to the constraint correction, but the error is very small in most cases.

Value

inla.posterior.sample returns a list of samples, where each sample is a list with names hyperpar and latent, with their marginal densities in logdens$hyperpar and logdens$latent and the joint density in logdens$joint. inla.posterior.sample.eval returns a list or a matrix of fun applied to each sample.
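
A short sketch of how the returned structure can be inspected (assuming samples was created by inla.posterior.sample, as in the Examples below):

    s1 = samples[[1]]
    str(s1$hyperpar)      ## sampled hyperparameters
    str(s1$latent)        ## sampled latent field, named by component and index
    s1$logdens$joint      ## joint log-density of this sample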

Author

Havard Rue hrue@r-inla.org and Cristian Chiuchiolo cristian.chiuchiolo@kaust.edu.sa

Examples

  r = inla(y ~ 1, data = data.frame(y = rnorm(1)), control.compute = list(config = TRUE))
  samples = inla.posterior.sample(2, r)

  ## reproducible results:
  inla.seed = as.integer(runif(1)*.Machine$integer.max)
  set.seed(12345)
  x = inla.posterior.sample(100, r, seed = inla.seed)
  set.seed(12345)
  xx = inla.posterior.sample(100, r, seed = inla.seed)
  all.equal(x, xx)

 set.seed(1234)
 n = 25
 xx = rnorm(n)
 yy = rev(xx)
 z = runif(n)
 y = rnorm(n)
 r = inla(y ~ 1 + z + f(xx) + f(yy, copy="xx"),
         data = data.frame(y, z, xx, yy), 
         control.compute = list(config=TRUE),
         family = "gaussian")
 r.samples = inla.posterior.sample(100, r)
 
 fun = function(...) {
     mean(xx) - mean(yy)
 }
 f1 = inla.posterior.sample.eval(fun, r.samples)
 
 fun = function(...) {
     c(exp(Intercept), exp(Intercept + z))
 }
 f2 = inla.posterior.sample.eval(fun, r.samples)
 
 fun = function(...) {
     return (theta[1]/(theta[1] + theta[2]))
 }
 f3 = inla.posterior.sample.eval(fun, r.samples)

 ## Predicting nz new observations, and
 ## comparing the estimated one with the true one
 set.seed(1234)
 n = 100
 alpha = beta = s = 1
 z = rnorm(n)
 y = alpha + beta * z + rnorm(n, sd = s)
 r = inla(y ~ 1 + z, 
         data = data.frame(y, z), 
         control.compute = list(config=TRUE),
         family = "gaussian")
 r.samples = inla.posterior.sample(10^3, r)
 nz = 3
 znew = rnorm(nz)
 fun = function(zz = NA) {
     ## theta[1] is the precision
     return (Intercept + z * zz +
             rnorm(length(zz), sd = sqrt(1/theta[1])))
 }
 par(mfrow=c(1, nz))
 f1 = inla.posterior.sample.eval(fun, r.samples, zz = znew)
 for(i in 1:nz) {
     hist(f1[i, ], n = 100, prob = TRUE)
     m = alpha + beta * znew[i]
     xx = seq(m-4*s, m+4*s, by = s/100)
     lines(xx, dnorm(xx, mean=m, sd = s), lwd=2)
 }