Exact probability inference from Graphical Models

Vegeedog
Machine Learning🏄‍♀️
2 min read · May 30, 2022

Intro: Graphical models are composed of nodes and edges. Nodes correspond to random variables; edges encode the dependency relationships between those random variables.

To perform “exact” probability inference (as opposed to approximate inference) on graphical models, the underlying theory is message passing, which is only guaranteed to be exact on tree-structured graphs. Hence, we only consider acyclic graphs here.

For models that are more complex, the Python package “pgmpy” is useful for programming these computations.

Directed graph (i.e., Bayesian Network): the network structure qualitatively describes the dependencies among the random variables. Note that the following applies only to directed acyclic graphs.

For instance,

Step 1: build the model.

Step 2: define probabilities. In pgmpy, conditional probability distributions (CPDs) can be defined using the TabularCPD class.

(reference: https://pgmpy.org/_modules/pgmpy/factors/discrete/CPD.html)

Step 3: after defining the model parameters, add them to the model with the add_cpds method. The check_model method can then be used to verify that the CPDs are consistent with the model structure.

Now we can perform probability inference. Below are three example queries.

1. If John calls, predict the probability of burglary.

2. If a burglary happened, predict the probability of John calling.

3. If John and Mary called and there was no earthquake, predict the probability of burglary.

Undirected graph (i.e., Markov Random Field, MRF) & Factor Graphs

A directed graph can be converted to an undirected graph through moralization, that is, “marrying” all co-parents of each node. Adding these extra links may create cycles. Hence, we further convert the graph into a factor graph: by grouping each child together with its married parents into a single factor node, the graph can become a tree again, so exact message passing applies.

For instance,

Step 1: build the model.

Step 2: define the probabilities as numpy arrays.

Step 3: define the factor nodes in the graph and add the factor potentials.

(reference: https://pgmpy.org/_modules/pgmpy/factors/discrete/DiscreteFactor.html#DiscreteFactor)

Step 4: add the factors to the model.

Now we can perform probability inference. Below are two example queries.

1. Predict the probability of the road being slick.

2. If Jerry had an accident, predict the probability of the road being slick.

Note that for factor graphs the potentials need not be probabilities, so the result of inference must be normalized (divided by the partition function) to obtain a probability distribution.
