API#
Update functions#
Update functions are the heart of probabilistic networks as they shape the propagation of beliefs in the neural hierarchy. The library implements the standard variational updates for value and volatility coupling, as described in Weber et al. (2023).
The updates module contains the update functions used during belief propagation. These functions are available through three sub-modules, organized according to their functional roles. We usually dissociate the prediction steps, which are triggered first, top-down (from the leaves to the roots of the network), and recover the current state of inference; the prediction-error steps, which signal the divergence between the prediction and the new observation (for input nodes) or the new state (for state nodes); and, interleaved with these, the posterior-update steps, in which a node receives the prediction errors from its child nodes and estimates new sufficient statistics.
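Schematically, one belief-propagation step chains these three kinds of updates as follows (a hypothetical object-oriented sketch for exposition only; the library implements these steps as pure functions applied over the time series, and none of the names below belong to its API):

```python
def belief_propagation_step(network, observation):
    # 1. Prediction: every node computes its expected mean and precision
    #    from the previous posteriors (from the leaves to the roots).
    for node in network.nodes:
        node.predict()

    # 2. Prediction error: input nodes receive the new observation and
    #    signal the divergence between prediction and observation.
    for node in network.input_nodes:
        node.observe(observation)
        node.compute_prediction_errors()

    # 3. Posterior updates, interleaved with further prediction errors:
    #    each parent receives the errors from its child nodes, estimates
    #    new sufficient statistics, and propagates its own errors upward.
    for node in network.update_order:
        node.update_posterior()
        node.compute_prediction_errors()
```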
Posterior updates#
Update the sufficient statistics of a state node after receiving prediction errors from its child nodes. The prediction errors from all the child nodes below should be computed before calling the posterior update step.
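As an illustration, the standard variational update for a value parent receiving a value prediction error from one of its children (Weber et al., 2023) reduces to the following (a minimal sketch, with `alpha` the value coupling strength):

```python
def value_parent_posterior_update(mu_hat_j, pi_hat_j, pi_hat_i, delta_i, alpha=1.0):
    # Posterior precision and mean of a value parent j after receiving the
    # value prediction error delta_i from child i, where pi_hat_* are the
    # expected precisions and mu_hat_j the expected mean of the parent.
    pi_j = pi_hat_j + alpha**2 * pi_hat_i                  # precision update
    mu_j = mu_hat_j + (alpha * pi_hat_i / pi_j) * delta_i  # mean update
    return mu_j, pi_j
```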
Categorical nodes#
Update the categorical input node given an array of binary observations.
Continuous nodes#
Update the posterior of a continuous node using the standard HGF update.
Update the posterior of a continuous node using the eHGF update.
Exponential family#
Update the hyperparameters of an exponential family state node using HGF-implied learning rates.
Prediction steps#
Compute the expectations for future observations given the influence of the parent nodes. The prediction steps are executed for all nodes, top-down, before any observation.
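For a continuous state node with a volatility parent, the prediction step reduces to the following (an illustrative sketch of the standard HGF equations; `rho` is the node's drift, `omega` its tonic volatility, and `kappa` the volatility coupling strength):

```python
import numpy as np

def continuous_node_prediction(mu, pi, rho, omega, mu_volatility_parent,
                               kappa=1.0, time_step=1.0):
    mu_hat = mu + time_step * rho                                  # expected mean
    nu = time_step * np.exp(omega + kappa * mu_volatility_parent)  # predicted volatility
    pi_hat = 1.0 / (1.0 / pi + nu)                                 # expected precision
    return mu_hat, pi_hat
```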
Binary nodes#
Get the new expected mean and precision of a binary state node.
Continuous nodes#
Compute the expected mean of a continuous state node.
Compute the expected precision of a continuous state node.
Update the expected mean and expected precision of a continuous node.
Dirichlet processes#
Prediction of a Dirichlet process node.
Prediction error steps#
Compute the value and volatility prediction errors of a given node. The prediction errors can only be computed after the posterior update (or observation) of the node.
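For a continuous state node, the two quantities reduce to the following (an illustrative sketch of the standard HGF definitions):

```python
def continuous_node_prediction_errors(mu, pi, mu_hat, pi_hat):
    delta = mu - mu_hat                              # value prediction error
    big_delta = pi_hat / pi + pi_hat * delta**2 - 1  # volatility prediction error
    return delta, big_delta
```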
Binary state nodes#
Compute the value prediction errors and predicted precision of a binary node.
Update the posterior of a binary node given finite precision of the input.
Categorical state nodes#
Prediction error from a categorical state node.
Continuous state nodes#
Compute the value prediction error of a state node.
Compute the volatility prediction error of a state node.
Store prediction errors in an input node.
Dirichlet state nodes#
Compute the prediction error and update the child networks of a Dirichlet process node.
Update an existing cluster.
Create a new cluster.
Find the best cluster candidate given previous clusters and an input value.
Sample likely new belief distributions given pre-existing clusters.
Likelihood of a parametrized candidate under the new observation.
Exponential family#
Update the parameters of an exponential family distribution.
Pass the expected sufficient statistics to the implied continuous nodes.
Distribution#
The Hierarchical Gaussian Filter as a PyMC distribution. This distribution can be embedded in models using PyMC>=5.0.0.
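As an illustration, a minimal sketch of embedding the distribution in a PyMC model, here assuming a two-level continuous HGF, a Gaussian-surprise response function, and a prior over the second level's tonic volatility (argument names such as `tonic_volatility_2` follow the library's documented examples and should be checked against the installed version):

```python
import numpy as np
import pymc as pm

from pyhgf.distribution import HGFDistribution
from pyhgf.response import first_level_gaussian_surprise

input_data = np.random.normal(size=200)  # toy continuous observations

# log-probability Op wrapping the belief propagation over the time series
hgf_logp_op = HGFDistribution(
    n_levels=2,
    input_data=input_data[np.newaxis, :],
    response_function=first_level_gaussian_surprise,
)

with pm.Model() as two_level_hgf:
    # prior over the tonic volatility of the second level
    tonic_volatility_2 = pm.Normal("tonic_volatility_2", -4.0, 2.0)

    # add the trajectories' log-probability to the model's joint density
    pm.Potential("hgf_loglike", hgf_logp_op(tonic_volatility_2=tonic_volatility_2))

    idata = pm.sample()
```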
Compute the log-probability of a decision model under belief trajectories.
Compute the log-probabilities of a batch of Hierarchical Gaussian Filters.
Gradient Op for the HGF distribution.
The HGF distribution, compatible with PyMC >= 5.0.
The HGF distribution returning the pointwise log-probability.
Model#
The main class is used to create a standard Hierarchical Gaussian Filter for binary or continuous inputs, with two or three levels. This class wraps the previous JAX modules and creates a standard node structure for these models.
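For example, the standard model can be instantiated directly, or an equivalent structure assembled node by node (a minimal sketch; argument names like `kind` and `volatility_children` follow the library's documented usage and should be checked against the installed version):

```python
import numpy as np
from pyhgf.model import HGF, Network

u = np.random.normal(size=200)  # toy continuous input

# the standard two-level continuous HGF
hgf = HGF(n_levels=2, model_type="continuous")
hgf.input_data(input_data=u)

# an equivalent structure assembled node by node:
# node 1 is the volatility parent of node 0
network = (
    Network()
    .add_nodes(kind="continuous-state")
    .add_nodes(kind="continuous-state", volatility_children=0)
)
```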
The two-level and three-level Hierarchical Gaussian Filters (HGF).
A predictive coding neural network.
Add continuous state node(s) to a network.
Add binary state node(s) to a network.
Add exponential family state node(s) to a network.
Add categorical state node(s) to a network.
Add a Dirichlet Process node to a network.
Transform a coupling parameter into a tuple of indexes and strengths.
Update the default node parameters using keyword arguments and a dictionary.
Insert a set of parametrised nodes in a network.
Plots#
Plotting functionalities to visualize parameter trajectories and correlations after observing new data. Graphviz is currently fully supported; NetworkX is also available for some functions.
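A minimal usage sketch, assuming a fitted two-level continuous HGF (method names follow the library's documentation and should be checked against the installed version):

```python
import numpy as np
from pyhgf.model import HGF

u = np.random.normal(size=200)  # toy continuous input

hgf = HGF(n_levels=2, model_type="continuous")
hgf.input_data(input_data=u)

hgf.plot_network()       # Graphviz view of the node structure
hgf.plot_trajectories()  # sufficient statistics and surprise over time
hgf.plot_correlations()  # heatmap of the trajectories' correlations
```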
Graphviz#
Plot the trajectories of the nodes' sufficient statistics and surprise.
Plot the correlation heatmap of the sufficient statistics trajectories.
Visualization of a node network using Graphviz.
Plot the trajectory of expected sufficient statistics of a set of nodes.
NetworkX#
Visualization of a node network using NetworkX and the pydot layout.
Response#
A collection of response functions. A response function is simply a callable that takes at least the HGF instance as input after observation and returns the surprise.
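As an illustration, any callable with the following shape can serve as a response function; this hypothetical example simply delegates to the library's first-level Gaussian surprise (the extra keyword parameters are an assumption about the expected signature; check the inference back-end's documentation):

```python
from pyhgf.response import first_level_gaussian_surprise

def my_response_function(hgf, response_function_inputs=None,
                         response_function_parameters=None):
    # Receives the HGF instance after observation and returns a scalar
    # surprise (lower values indicate a better fit). A custom decision
    # model would compute its own surprise from the belief trajectories.
    return first_level_gaussian_surprise(hgf)
```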
Gaussian surprise at the first level of a probabilistic network.
Sum of the Gaussian surprise across the probabilistic network.
Time series of binary surprises for all binary state nodes.
Surprise under the binary softmax model.
Surprise from a binary softmax parametrized by the inverse temperature.
Utils#
Utilities for manipulating neural networks.
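For example, after observing a time series, the trajectories can be exported for further analysis (a minimal sketch; `to_pandas` follows the library's documentation and should be checked against the installed version):

```python
import numpy as np
from pyhgf.model import HGF

u = np.random.normal(size=200)  # toy continuous input

hgf = HGF(n_levels=2, model_type="continuous")
hgf.input_data(input_data=u)  # triggers belief propagation

df = hgf.to_pandas()          # node trajectories and surprise as a DataFrame
print(df.head())
```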
Update the network's parameters after observing new data point(s).
Return the branch of a network from a given set of root nodes.
Generate a binary network implied by categorical state(-transition) nodes.
Generate an update sequence from the network's structure.
Export the nodes' trajectories and surprise as a Pandas data frame.
Add a value or volatility coupling link between a set of nodes.
List all possible default input nodes.
Add a new continuous-state parent node to the attributes and edges of a network.
Remove a given node from the network.
Math#
Math functions and probability densities.
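As an illustration, the Gaussian and binary surprise functions reduce to the following definitions (an illustrative NumPy sketch of the standard formulas; the library implements JAX versions):

```python
import numpy as np

def gaussian_surprise(x, expected_mean, expected_precision):
    # Negative log-density of x under a Gaussian prediction parametrized
    # by its mean and precision.
    return 0.5 * (
        np.log(2 * np.pi)
        - np.log(expected_precision)
        + expected_precision * (x - expected_mean) ** 2
    )

def binary_surprise(x, expected_mean):
    # Negative log-probability of a binary outcome x given the expected
    # probability of observing 1.
    return -(x * np.log(expected_mean) + (1 - x) * np.log(1 - expected_mean))
```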
The multivariate normal as an exponential family distribution.
The univariate normal as an exponential family distribution.
Density of the Gaussian-predictive distribution.
Gaussian density as defined by mean and precision.
Logistic sigmoid function.
Surprise at a binary outcome.
Surprise at an outcome under a Gaussian prediction.
Compute the Kullback-Leibler divergence between two Dirichlet distributions.
Compute the binary surprise with finite precision.