LRM: Large Reasoning Model—LLMs optimized for complex reasoning, often producing long CoT traces (e.g., OpenAI o1, DeepSeek-R1)
SEP: Shortest Effective Path—the shortest sequence of logically self-consistent nodes in a CoT graph sufficient to reach the correct answer
Redundancy Ratio (RR): The proportion of nodes in a CoT that are not part of the Shortest Effective Path (i.e., RR = 1 − |SEP| / total nodes)
Average Degree: A topological metric measuring graph density; values >1.0 indicate branching, looping, or backtracking beyond a linear path
Atomic Node: A single, indivisible functional step in a reasoning chain (e.g., one calculation or one verification check)
Logical Epicenter: A node with unusually high in-degree or out-degree that acts as a hub for repeated branching or looping, typically marking a point where the model gets stuck
PCB: Physics, Chemistry, Biology—a domain grouping for scientific reasoning tasks
DFS: Depth-First Search—an algorithm for traversing tree or graph data structures
CoT: Chain-of-Thought—intermediate reasoning steps generated by an LLM before the final answer
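The graph metrics above can be illustrated on a toy CoT graph. The sketch below is a minimal, hypothetical example: the node names, the edge topology, and the use of BFS shortest path as a proxy for the SEP are all assumptions for illustration, not the paper's actual extraction pipeline.

```python
from collections import deque

# Hypothetical CoT graph: nodes are atomic reasoning steps,
# directed edges follow the flow of reasoning.
edges = {
    "q":  ["s1"],         # question
    "s1": ["s2", "s3"],   # branch: two candidate approaches
    "s2": ["s4"],
    "s3": ["s2", "s1"],   # backtrack rejoins s2; loop back to s1
    "s4": ["ans"],
    "ans": [],
}

def shortest_effective_path(graph, start, goal):
    """BFS shortest path from question to answer (proxy for the SEP)."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

sep = shortest_effective_path(edges, "q", "ans")
n_nodes = len(edges)
n_edges = sum(len(v) for v in edges.values())
rr = 1 - len(sep) / n_nodes    # Redundancy Ratio: nodes off the SEP
avg_degree = n_edges / n_nodes # >1.0 signals branching/looping

print(sep)         # ['q', 's1', 's2', 's4', 'ans']
print(round(rr, 3), round(avg_degree, 3))
```

Here the SEP has 5 of 6 nodes, so RR = 1/6, and 7 edges over 6 nodes give an average degree of about 1.17, reflecting the extra branch and loop beyond a purely linear chain.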