pyhgf.updates.posterior.exponential.posterior_update_exponential_family_dynamic#
- pyhgf.updates.posterior.exponential.posterior_update_exponential_family_dynamic(attributes: Dict, edges: Tuple[AdjacencyLists, ...], node_idx: int, **args) → Dict[int | str, Dict] [source]#
Update the hyperparameters of an exponential family state node using HGF-implied learning rates.
This posterior update step is usually placed at the end of the update sequence, as we have to wait until all parent nodes tracking the expected sufficient statistics have been updated before we can infer the implied learning rate used to update the \(\nu\) vector. The new implied \(\nu\) is given by the ratio:
\[\nu \leftarrow \frac{\delta}{\Delta}\]where \(\delta\) is the prediction error (the new sufficient statistics compared to the expected sufficient statistics), and \(\Delta\) is the differential of expectation (what was expected before the update compared to what is expected after). This ratio quantifies how much the model is learning from new observations.
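The ratio above can be sketched numerically. This is a minimal illustration, not the library's implementation: the helper name `implied_learning_rate` and the toy numbers are hypothetical, assuming each sufficient statistic has an expectation tracked before and after the parent updates.

```python
import numpy as np

def implied_learning_rate(xis, expected_before, expected_after):
    """Sketch of the nu update: nu <- delta / Delta (elementwise).

    - delta: prediction error, i.e. the new sufficient statistics
      minus the previously expected sufficient statistics.
    - Delta: differential of expectation, i.e. the expectation after
      the parent updates minus the expectation before.
    """
    xis = np.asarray(xis, dtype=float)
    before = np.asarray(expected_before, dtype=float)
    after = np.asarray(expected_after, dtype=float)
    delta = xis - before        # prediction error
    big_delta = after - before  # differential of expectation
    return delta / big_delta

# Toy example: a univariate Gaussian tracked by two sufficient
# statistics (x, x**2); all values below are made up for illustration.
nu = implied_learning_rate(
    xis=[1.2, 1.44],              # statistics of the new observation
    expected_before=[1.0, 1.25],  # expectations before the parent updates
    expected_after=[1.02, 1.27],  # expectations after the parent updates
)
print(nu)  # a large nu means the parents moved little relative to the error
```

A large implied \(\nu\) indicates that the expectations barely moved relative to the prediction error, i.e. a small effective learning rate.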
- Parameters:
- attributes
The attributes of the probabilistic nodes.
- edges
The edges of the probabilistic nodes as a tuple of
pyhgf.typing.Indexes
. The tuple has the same length as the number of nodes. For each node, the index lists the value and volatility parents and children.
- node_idx
Pointer to the value parent node that will be updated.
- Returns:
- attributes
The updated attributes of the probabilistic nodes.
References
[1] Mathys, C., & Weber, L. (2020). Hierarchical Gaussian Filtering of Sufficient Statistic Time Series for Active Inference. In Active Inference (pp. 52–58). Springer International Publishing. https://doi.org/10.1007/978-3-030-64919-7_7