System 1 vs. System 2: In dual-process theories of cognition, System 1 is fast, intuitive, and unconscious, while System 2 is slow, deliberate, and logical.
Chain-of-Thought (CoT): A prompting technique that encourages models to generate intermediate reasoning steps before producing a final answer.
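A CoT prompt can be sketched as a worked demonstration with explicit intermediate steps, followed by the new question. The question and demonstration below are hypothetical, and the call to an actual LLM is omitted:

```python
# Minimal sketch of a chain-of-thought (CoT) prompt. The demonstration
# shows the intermediate reasoning steps the model is encouraged to imitate.
def build_cot_prompt(question: str) -> str:
    demo = (
        "Q: A shop has 5 apples and buys 7 more. How many apples now?\n"
        "A: Start with 5 apples. Buying 7 more gives 5 + 7 = 12. "
        "The answer is 12.\n\n"
    )
    # The model continues after "A:", ideally producing its own steps.
    return demo + f"Q: {question}\nA:"

prompt = build_cot_prompt("Tom has 3 pens and buys 4 more. How many pens?")
```

The prompt would then be sent to a model; the demonstration's step-by-step style nudges the completion toward intermediate reasoning before the final answer.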
Foundation Models: Large-scale models (e.g., GPT-4, Llama 2) trained on broad data that can be adapted to downstream tasks via fine-tuning or prompting.
In-Context Learning (ICL): The ability of a model to learn from a few examples provided in the prompt without updating its weights.
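ICL amounts to placing a few labeled demonstrations directly in the prompt; the model's weights are never updated. A minimal sketch, with made-up sentiment examples:

```python
# Build a few-shot in-context-learning prompt from (input, label) pairs.
# The model is expected to infer the task pattern and complete the last label.
def build_icl_prompt(examples, query):
    lines = [f"Input: {x}\nLabel: {y}" for x, y in examples]
    lines.append(f"Input: {query}\nLabel:")  # left open for the model
    return "\n\n".join(lines)

demos = [("great movie", "positive"), ("terrible plot", "negative")]
prompt = build_icl_prompt(demos, "loved every minute")
```

No gradient step occurs anywhere; the "learning" lives entirely in the prompt text.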
Mixture of Experts (MoE): A neural network architecture where different sub-models (experts) specialize in different parts of the input space, activated sparsely.
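The sparse-activation idea can be illustrated with a toy forward pass: a gating network scores all experts, only the top-k are evaluated, and their outputs are mixed by renormalized gate weights. Dimensions, expert count, and the linear experts here are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_experts, k = 4, 8, 2
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]  # toy expert weights
gate_w = rng.standard_normal((d, n_experts))                       # gating network

def moe_forward(x: np.ndarray) -> np.ndarray:
    logits = x @ gate_w                    # score every expert
    top = np.argsort(logits)[-k:]          # indices of the top-k experts
    probs = np.exp(logits[top])
    probs /= probs.sum()                   # renormalize over the chosen experts
    # Only the k selected experts are evaluated -- the sparse activation.
    return sum(p * (x @ experts[i]) for p, i in zip(probs, top))

y = moe_forward(rng.standard_normal(d))
```

Because only k of the n experts run per input, compute grows with k rather than with the total parameter count.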
Zero-shot-CoT: A method in which a model generates a reasoning chain simply by being prompted with "Let's think step by step", without any example demonstrations.
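In contrast to few-shot CoT, the zero-shot variant needs no demonstrations, only the trigger phrase appended after the question (the second-stage answer-extraction prompt used in practice is omitted from this sketch):

```python
# Zero-shot-CoT sketch: the trigger phrase alone elicits step-by-step
# reasoning; no worked examples are included in the prompt.
def zero_shot_cot_prompt(question: str) -> str:
    return f"Q: {question}\nA: Let's think step by step."

p = zero_shot_cot_prompt("A train travels 60 km in 1.5 hours. What is its speed?")
```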
Abductive Reasoning: Inferring the most plausible explanation or hypothesis for a set of observations (inference to the best explanation).
Multimodal Reasoning: Reasoning that integrates and processes information from multiple modalities simultaneously, such as text, images, and audio.