Title: Lifted conditioning for pairwise marginals and beyond
Authors: Ahmadi, B.; Kersting, Kristian; Hadiji, Fabian
Type: conference paper
Year: 2010
Deposited: 2022-03-11
Language: en
DDC: 005
URL: https://publica.fraunhofer.de/handle/publica/370677

Abstract: Lifted belief propagation (LBP) can be extremely fast at computing approximate marginal probability distributions over single variables and neighboring ones in the underlying graphical model. However, it does not prescribe a way to compute joint distributions over pairs, triples, or k-tuples of distant random variables. In this paper, we present an algorithm, called conditioned LBP, for approximating these distributions. Essentially, we select variables one at a time for conditioning, running lifted belief propagation after each selection. This naive solution, however, recomputes the lifted network in each step from scratch, often canceling the benefits of lifted inference. We show how to avoid this by efficiently computing the lifted network for each conditioning directly from the one already known for the single-node marginals. This contribution not only advances the theoretical understanding of lifted inference but also makes it possible to efficiently solve many important AI tasks such as finding the MAP assignment, sequential forward sampling, parameter estimation, active learning, and sensitivity analysis, to name only a few. Our experimental results validate that significant efficiency gains are possible and illustrate the potential for second-order parameter estimation of Markov logic networks.
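A minimal sketch of the conditioning idea described in the abstract: a pairwise joint over two distant variables is assembled via the chain rule, P(X_i, X_j) = P(X_i) * P(X_j | X_i = x), where each conditional is obtained by clamping X_i to a value and re-running inference. Everything below is illustrative: the model, potentials, and function names are invented, and brute-force enumeration stands in for the lifted BP oracle, so the paper's actual contribution (reusing the lifted network across conditionings instead of recomputing it) is not reproduced here.

```python
import itertools
import numpy as np

# Toy pairwise model: 4 binary variables on a cycle with random edge potentials.
rng = np.random.default_rng(0)
n_vars, n_states = 4, 2
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
pots = {e: rng.uniform(0.5, 2.0, size=(n_states, n_states)) for e in edges}

def unnormalized(assign):
    """Product of edge potentials for one full assignment."""
    p = 1.0
    for (i, j) in edges:
        p *= pots[(i, j)][assign[i], assign[j]]
    return p

def marginal(var, evidence=None):
    """Single-variable marginal, optionally with clamped evidence.

    In conditioned LBP this inference call would be (lifted) belief
    propagation; exact enumeration keeps the sketch small and correct."""
    evidence = evidence or {}
    m = np.zeros(n_states)
    for assign in itertools.product(range(n_states), repeat=n_vars):
        if any(assign[v] != s for v, s in evidence.items()):
            continue  # inconsistent with the clamped (conditioned) value
        m[assign[var]] += unnormalized(assign)
    return m / m.sum()

def pairwise_joint(i, j):
    """Approximate P(X_i, X_j) by conditioning: for each state x of X_i,
    clamp X_i = x, re-run inference for P(X_j | X_i = x), and combine with
    P(X_i) via the chain rule. This is the per-conditioning inference loop
    whose repeated lifted-network construction the paper avoids."""
    p_i = marginal(i)
    joint = np.zeros((n_states, n_states))
    for x in range(n_states):
        joint[x] = p_i[x] * marginal(j, evidence={i: x})
    return joint

print(pairwise_joint(0, 2))  # joint over the non-adjacent variables 0 and 2
```

With an exact oracle, as here, the chain-rule construction recovers the true pairwise joint; with (lifted) BP as the oracle, it yields the approximation the abstract describes, at the cost of one inference run per conditioned value.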