pymc.model.core.Model

class pymc.model.core.Model(*args, **kwargs)

Encapsulates the variables and likelihood factors of a model.

The Model class can be used for creating class-based models. To create a class-based model you should inherit from Model and override the __init__ method with arbitrary definitions (do not forget to call the base class pymc.Model.__init__() first); a sketch of this pattern is shown at the end of the Examples section.

Parameters:
name : str

name that will be used as prefix for names of all random variables defined within model

coords : dict

Xarray-like coordinate keys and values. These coordinates can be used to specify the shape of random variables and to label (but not specify) the shape of Deterministic, Potential and Data objects. Other than specifying the shape of random variables, coordinates have no effect on the model. They can’t be used for label-based broadcasting or indexing. You must use numpy-like operations for those behaviors.

check_bounds : bool

Ensure that input parameters to distributions are in a valid range. If your model is built in a way where you know your parameters can only take on valid values you can set this to False for increased speed. This should not be used if your model contains discrete variables.

model : PyMC model, optional

A parent model that this model belongs to. If not specified and the current model is created inside another model’s context, the parent model will be set to that model. If None the model will not have a parent.

Examples

Use context manager to define model and respective variables

import pymc as pm

with pm.Model() as model:
    x = pm.Normal("x")

Use object API to define model and respective variables

import pymc as pm

model = pm.Model()
x = pm.Normal("x", model=model)

Use coords for defining the shape of random variables and labeling other model variables

import pymc as pm
import numpy as np

coords = {
    "feature": ["A", "B", "C"],
    "trial": [1, 2, 3, 4, 5],
}

with pm.Model(coords=coords) as model:
    # Variable will have default dim label `intercept__dim_0`
    intercept = pm.Normal("intercept", shape=(3,))
    # Variable will have shape (3,) and dim label `feature`
    beta = pm.Normal("beta", dims=("feature",))

    # Dims below are only used for labeling, they have no effect on shape
    # Variable will have default dim label `idx__dim_0`
    idx = pm.Data("idx", np.array([0, 1, 1, 2, 2]))
    x = pm.Data("x", np.random.normal(size=(5, 3)), dims=("trial", "feature"))
    # single dim can be passed as string
    mu = pm.Deterministic("mu", intercept[idx] + x @ beta, dims="trial")

    # Dims controls the shape of the variable
    # If not specified, it would be inferred from the shape of the observations
    y = pm.Normal("y", mu=mu, observed=[-1, 0, 0, 1, 1], dims=("trial",))

Define nested models, and provide a name for variable-name prefixing

import pymc as pm

with pm.Model(name="root") as root:
    x = pm.Normal("x")  # Variable will be named "root::x"

    with pm.Model(name="first") as first:
        # Variable will belong to root and first
        y = pm.Normal("y", mu=x)  # Variable will be named "root::first::y"

    # Can pass parent model explicitly
    with pm.Model(name="second", model=root) as second:
        # Variable will belong to root and second
        z = pm.Normal("z", mu=y)  # Variable will be named "root::second::z"

    # Set None for standalone model
    with pm.Model(name="third", model=None) as third:
        # Variable will belong to third only
        w = pm.Normal("w")  # Variable will be named "third::w"

Set check_bounds to False for models with only continuous variables and default transformers. PyMC will remove the bounds check from the model logp, which can speed up sampling

import pymc as pm

with pm.Model(check_bounds=False) as model:
    sigma = pm.HalfNormal("sigma")
    x = pm.Normal("x", sigma=sigma)  # No bounds check will be performed on `sigma`
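
Define a class-based model by subclassing Model, as described above. A minimal sketch; the CustomModel name and its variables are illustrative

import pymc as pm

class CustomModel(pm.Model):
    # `name` is passed to the base class and used as a prefix for variable names
    def __init__(self, mean=0, sigma=1, name=""):
        # Call the parent constructor first so variables register with this model
        super().__init__(name)
        pm.Normal("x", mu=mean, sigma=sigma)

model = CustomModel(mean=5, name="custom")  # Variable will be named "custom::x"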

Methods

Model.__init__([name, coords, check_bounds, ...])

Model.add_coord(name[, values, length])

Register a dimension coordinate with the model.

Model.add_coords(coords, *[, lengths])

Vectorized version of Model.add_coord.

Model.add_named_variable(var[, dims])

Add a random graph variable to the named variables of the model.

Model.check_start_vals(start, **kwargs)

Check that the logp is defined and finite at the starting point.

Model.compile_d2logp([vars, jacobian, ...])

Compiled log probability density hessian function.

Model.compile_dlogp([vars, jacobian])

Compiled log probability density gradient function.

Model.compile_fn(outs, *[, inputs, mode, ...])

Compile a PyTensor function.

Model.compile_logp([vars, jacobian, sum])

Compiled log probability density function.

Model.copy()

Clone the model.

Model.create_value_var(rv_var, *, ...[, ...])

Create a TensorVariable that will be used as the random variable's "value" in log-likelihood graphs.

Model.d2logp([vars, jacobian, negate_output])

Hessian of the model's log-probability with respect to the given variables.

Model.debug([point, fn, verbose])

Debug model function at point.

Model.dlogp([vars, jacobian])

Gradient of the model's log-probability with respect to the given variables.

Model.eval_rv_shapes()

Evaluate shapes of untransformed AND transformed free variables.

Model.get_context([error_if_none, ...])

Model.initial_point([random_seed])

Compute the initial point of the model.

Model.logp([vars, jacobian, sum])

Elemwise log-probability of the model.

Model.logp_dlogp_function([grad_vars, ...])

Compile a PyTensor function that computes logp and gradient.

Model.make_obs_var(rv_var, data, dims, ...)

Create a TensorVariable for an observed random variable.

Model.name_for(name)

Check whether name has the model prefix and add it if needed.

Model.name_of(name)

Check whether name has the model prefix and remove it if needed.

Model.point_logps([point, round_vals])

Compute the log probability of point for all random variables in the model.

Model.profile(outs, *[, n, point, profile])

Compile and profile a PyTensor function that returns outs and takes a dict of model variable values as its argument.

Model.register_data_var(data[, dims])

Register a data variable with the model.

Model.register_rv(rv_var, name, *[, ...])

Register an (un)observed random variable with the model.

Model.replace_rvs_by_values(graphs, **kwargs)

Clone and replace random variables in graphs with their value variables.

Model.set_data(name, values[, coords])

Change the values of a data variable in the model.

Model.set_dim(name, new_length[, coord_values])

Update a mutable dimension.

Model.set_initval(rv_var, initval)

Set an initial value (strategy) for a random variable.

Model.shape_from_dims(dims)

Model.to_graphviz(*[, var_names, ...])

Produce a graphviz Digraph from a PyMC model.
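
For example, a few of the methods above can be combined to inspect a model's log-probability at its default starting point (a minimal sketch)

import pymc as pm

with pm.Model() as model:
    sigma = pm.HalfNormal("sigma")
    x = pm.Normal("x", sigma=sigma, observed=[0.1, -0.3, 0.2])

# Default starting point, keyed by value-variable names
ip = model.initial_point()

# Compile the joint log-probability and evaluate it at that point
logp_fn = model.compile_logp()
print(logp_fn(ip))

# Per-variable log-probabilities, handy for spotting -inf or nan terms
print(model.point_logps(ip))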

Attributes

basic_RVs

List of random variables the model is defined in terms of.

continuous_value_vars

All the continuous value variables in the model.

coords

Coordinate values for model dimensions.

datalogp

PyTensor scalar of log-probability of the observed variables and potential terms.

dim_lengths

The symbolic lengths of dimensions in the model.

discrete_value_vars

All the discrete value variables in the model.

isroot

observedlogp

PyTensor scalar of log-probability of the observed variables.

parent

potentiallogp

PyTensor scalar of log-probability of the Potential terms.

prefix

root

unobserved_RVs

List of all random variables, including deterministic ones.

unobserved_value_vars

List of all random variables (including untransformed projections), as well as deterministics used as inputs and outputs of the model's log-likelihood graph.

value_vars

List of unobserved random variables used as inputs to the model's log-likelihood (which excludes deterministics).

varlogp

PyTensor scalar of log-probability of the unobserved random variables (excluding deterministic).

varlogp_nojac

PyTensor scalar of log-probability of the unobserved random variables (excluding deterministic) without jacobian term.
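
For example, a few of these attributes can be inspected directly on a model (a minimal sketch)

import pymc as pm

with pm.Model() as model:
    mu = pm.Normal("mu")
    y = pm.Normal("y", mu=mu, observed=[0.1, -0.3])

print(model.basic_RVs)       # [mu, y]
print(model.unobserved_RVs)  # [mu]

# varlogp and datalogp are symbolic PyTensor scalars; compile them
# with Model.compile_fn to evaluate them at a point
fn = model.compile_fn([model.varlogp, model.datalogp])
print(fn(model.initial_point()))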