Neurosymbolic AI: A field integrating neural networks (learning) with symbolic logic (reasoning) to combine data efficiency with interpretability
RBM: Restricted Boltzmann Machine—a stochastic neural network that learns probability distributions over its inputs using an energy function
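The energy function mentioned above can be illustrated with a minimal sketch. This is a generic binary RBM with energy E(v, h) = -vᵀWh - bᵀv - cᵀh (standard notation; the weights and sizes here are arbitrary, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 3, 2
W = rng.normal(size=(n_visible, n_hidden))  # visible-hidden weights
b = np.zeros(n_visible)                     # visible biases
c = np.zeros(n_hidden)                      # hidden biases

def energy(v, h):
    """Energy of a joint configuration; lower energy means higher probability
    under p(v, h) ∝ exp(-E(v, h))."""
    return -(v @ W @ h + b @ v + c @ h)

v = np.array([1.0, 0.0, 1.0])
h = np.array([1.0, 0.0])
print(energy(v, h))
```

Learning adjusts W, b, and c so that training inputs receive low energy, which is what "learns probability distributions over its inputs" amounts to.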
LBM: Logical Boltzmann Machine—the proposed system that encodes logical formulae into an RBM
SAT: Boolean Satisfiability—the problem of determining if there exists an interpretation that satisfies a given Boolean formula
MaxSAT: Maximum Satisfiability—the problem of determining the maximum number of clauses of a given Boolean formula that can be simultaneously satisfied
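The SAT and MaxSAT problems above can both be sketched by brute force over all truth assignments (exponential in the number of variables, so only viable for tiny formulas). The three-variable CNF formula here is a made-up example:

```python
from itertools import product

# Hypothetical CNF formula over (x, y, z): one Python function per clause.
clauses = [
    lambda x, y, z: x or not y,      # (x ∨ ¬y)
    lambda x, y, z: y or z,          # (y ∨ z)
    lambda x, y, z: not x or not z,  # (¬x ∨ ¬z)
]

def max_satisfied(clauses, n_vars):
    """MaxSAT by brute force: the most clauses any assignment makes true."""
    return max(sum(cl(*bits) for cl in clauses)
               for bits in product([False, True], repeat=n_vars))

best = max_satisfied(clauses, 3)
print(best, "of", len(clauses), "clauses satisfiable")
# The formula is SAT iff every clause can be made true at once:
print(best == len(clauses))  # → True (e.g. x=True, y=True, z=False)
```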
LLM: Large Language Model—deep learning models like GPT-4 trained on vast text data
CoT: Chain of Thought—a prompting technique where models generate intermediate reasoning steps
Gibbs sampling: A Markov Chain Monte Carlo (MCMC) algorithm that draws a sequence of samples approximating a specified multivariate probability distribution by repeatedly sampling each variable conditioned on the current values of the others
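In an RBM, Gibbs sampling takes a convenient block form: all hidden units are conditionally independent given the visible units and vice versa, so each round samples h ~ p(h|v) and then v ~ p(v|h). A minimal sketch, with arbitrary weights and sizes of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 4, 3
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))
b = np.zeros(n_visible)  # visible biases
c = np.zeros(n_hidden)   # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v):
    """One round of block Gibbs: sample h ~ p(h|v), then v ~ p(v|h)."""
    p_h = sigmoid(v @ W + c)
    h = (rng.random(n_hidden) < p_h).astype(float)
    p_v = sigmoid(W @ h + b)
    return (rng.random(n_visible) < p_v).astype(float)

v = rng.integers(0, 2, size=n_visible).astype(float)
for _ in range(100):  # burn-in; the chain's samples approach the RBM's distribution
    v = gibbs_step(v)
```

After burn-in, successive values of v are (correlated) approximate samples from the RBM's marginal over visible units.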
Hallucination: When an AI model confidently generates incorrect or fabricated information