LTN: Logic Tensor Networks, a neurosymbolic framework that grounds FOL predicates in neural networks and treats the satisfiability of a logical knowledge base as a differentiable loss function
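A minimal sketch of the satisfiability-as-loss idea, not the actual `ltn` library API: a unary predicate is grounded as a sigmoid over a learned score, universal quantification is approximated by the mean truth value (one common aggregator choice), and the loss is one minus satisfiability. The predicate name and data are hypothetical.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical grounding of a unary predicate P(x): a one-parameter
# "network" whose output is a truth value in [0, 1].
def P(x, w):
    return sigmoid(w * x)

# Satisfiability of the toy knowledge base {forall x in data: P(x)},
# with the universal quantifier approximated by the mean truth value.
def sat(data, w):
    return sum(P(x, w) for x in data) / len(data)

# Loss = 1 - satisfiability; minimizing it pushes truth values toward 1.
def loss(data, w):
    return 1.0 - sat(data, w)

data = [0.5, 1.0, 2.0]
w = 0.1
# One crude gradient step via finite differences (illustration only;
# real LTNs use autodiff over actual neural networks).
eps = 1e-4
grad = (loss(data, w + eps) - loss(data, w - eps)) / (2 * eps)
w -= 1.0 * grad
```

After the step, `loss(data, w)` is lower than before: increasing satisfiability of the knowledge base is the training signal.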
Non-monotonic Reasoning (NMR): A logic in which adding new axioms can invalidate previously derived conclusions (e.g., learning that Tweety is a penguin defeats the default 'all birds fly')
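The classic birds/penguins default can be sketched in a few lines; this is a toy illustration of non-monotonicity, not a real defeasible-logic engine. The knowledge base and predicate names are made up for the example.

```python
# Toy non-monotonic knowledge base: conclusions are recomputed from the
# current set of axioms, so adding an axiom can retract an earlier conclusion.
def flies(kb, x):
    # Default rule: birds fly, unless the exception "x is a penguin" holds.
    if ("penguin", x) in kb:
        return False
    return ("bird", x) in kb

kb = {("bird", "tweety")}
before = flies(kb, "tweety")   # True: the default applies

kb.add(("penguin", "tweety"))  # new axiom defeats the default
after = flies(kb, "tweety")    # False: the earlier "theorem" is retracted
```

In a monotonic logic, a conclusion once derived stays derivable; here the set of conclusions shrinks as the axiom set grows.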
Catastrophic Forgetting: A phenomenon in neural networks where learning new information causes the abrupt loss of previously learned information
Rehearsal: A Continual Learning technique where a subset of previous data (or rules) is mixed in with new data to prevent forgetting
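A sketch of how a rehearsal batch might be assembled; the function name, `replay_frac` parameter, and 50/50 mixing ratio are illustrative choices, not a fixed recipe.

```python
import random

def rehearsal_batch(new_data, memory_buffer, batch_size, replay_frac=0.5, seed=0):
    """Mix replayed old examples into each new batch to mitigate forgetting."""
    rng = random.Random(seed)
    # Draw a fraction of the batch from the memory of past data/rules...
    n_replay = min(int(batch_size * replay_frac), len(memory_buffer))
    replay = rng.sample(memory_buffer, n_replay)
    # ...and fill the rest with fresh examples from the current task.
    fresh = rng.sample(new_data, batch_size - n_replay)
    batch = replay + fresh
    rng.shuffle(batch)
    return batch

old = [("old", i) for i in range(10)]
new = [("new", i) for i in range(10)]
batch = rehearsal_batch(new, old, batch_size=4)
```

With `replay_frac=0.5` and `batch_size=4`, each batch carries two replayed and two fresh examples, so gradients keep reinforcing earlier knowledge.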
Curriculum Learning: Organizing training data/rules into a meaningful sequence (e.g., easy to hard) rather than shuffling them randomly
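The simplest curriculum is a sort by a difficulty proxy; the sketch below uses sentence length as a stand-in difficulty measure, which is an assumption for illustration only.

```python
def curriculum_order(examples, difficulty):
    """Order training examples from easy to hard under a difficulty measure."""
    return sorted(examples, key=difficulty)

# Hypothetical difficulty proxy: number of tokens in the example.
sentences = ["a b c d", "a", "a b"]
ordered = curriculum_order(sentences, difficulty=lambda s: len(s.split()))
# ordered == ["a", "a b", "a b c d"]
```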
Real Logic: A differentiable logic used in LTN that maps formulas to truth values in the interval [0,1], allowing gradient-based optimization of satisfiability
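Real Logic's connectives are fuzzy operators on [0, 1]; the product t-norm family below is one common choice (LTN supports several), sketched here as plain functions rather than the library's implementation.

```python
# Fuzzy connectives on truth values in [0, 1], product t-norm family.
def NOT(a):
    return 1.0 - a

def AND(a, b):
    return a * b                # product t-norm

def OR(a, b):
    return a + b - a * b        # probabilistic sum (dual of the product)

def IMPLIES(a, b):
    return 1.0 - a + a * b      # Reichenbach implication

# On crisp inputs (0 or 1) these reduce to classical Boolean logic;
# on intermediate values they are smooth, so a formula's truth value
# is differentiable and can be maximized by gradient descent.
```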
Grounding: Mapping symbolic logical constants to vector embeddings and predicates to neural networks