PEFT: Parameter-Efficient Fine-Tuning—methods to adapt large models by training only a small subset of parameters
LoRA: Low-Rank Adaptation—a popular PEFT method that injects trainable rank-decomposition matrices into layers
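A minimal NumPy sketch of the LoRA idea (the dimensions d, k and rank r are illustrative): the pretrained weight W0 stays frozen, and only the low-rank factors B and A are trained. B is zero-initialized so the adapted weight equals W0 at the start of fine-tuning.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, r = 8, 8, 2                      # layer dimensions and adapter rank (r << min(d, k))

W0 = rng.normal(size=(d, k))           # frozen pretrained weight
A = 0.01 * rng.normal(size=(r, k))     # trainable down-projection
B = np.zeros((d, r))                   # trainable up-projection, zero-initialized

W_eff = W0 + B @ A                     # effective weight; only B and A receive gradients
assert np.allclose(W_eff, W0)          # zero-init B makes the adapter a no-op initially
```

The update B @ A has rank at most r, which is what keeps the trainable parameter count small (r·(d + k) instead of d·k).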
SVD: Singular Value Decomposition—a factorization of a matrix as W = U S Vᵀ, where U and V have orthonormal columns (the left and right singular vectors) and S is a diagonal matrix of nonnegative singular values
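A quick NumPy illustration of the factorization and the orthonormality of the singular vectors:

```python
import numpy as np

W = np.arange(12, dtype=float).reshape(4, 3)
U, S, Vt = np.linalg.svd(W, full_matrices=False)  # thin SVD: U is 4x3, S has 3 values, Vt is 3x3

# Singular values are nonnegative and sorted in descending order.
assert np.all(S >= 0) and np.all(np.diff(S) <= 0)

# U and V have orthonormal columns, and W is recovered exactly.
assert np.allclose(U.T @ U, np.eye(3))
assert np.allclose(U @ np.diag(S) @ Vt, W)
```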
Spectral Adapter_A: Additive variant of the proposed method: adds trainable parameters to the top singular vectors
Spectral Adapter_R: Rotational variant of the proposed method: rotates the top singular vectors using orthogonal matrices
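A hypothetical sketch of the additive variant's idea, not the paper's exact parameterization: decompose the frozen weight by SVD, then attach trainable corrections dU, dV only to the top-r singular vectors (zero-initialized, so fine-tuning starts exactly at the pretrained weight).

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, r = 16, 12, 4
W0 = rng.normal(size=(d, k))
U, S, Vt = np.linalg.svd(W0, full_matrices=False)

# Trainable additive corrections to the top-r left/right singular vectors.
dU = np.zeros((d, r))
dV = np.zeros((k, r))

U_ft = U.copy();      U_ft[:, :r] += dU
V_ft = Vt.T.copy();   V_ft[:, :r] += dV
W_ft = U_ft @ np.diag(S) @ V_ft.T

assert np.allclose(W_ft, W0)  # zero-init adapters leave the pretrained weight unchanged
```

The rotational variant would instead multiply the top-r singular vectors by a trainable orthogonal matrix (e.g., via the Cayley parameterization defined below) rather than adding a free perturbation.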
rank capacity: The range of matrix ranks that a fine-tuned weight can theoretically achieve given the adapter's parameterization
Cayley parameterization: A technique that enforces orthogonality during optimization by mapping an unconstrained skew-symmetric matrix A to the orthogonal matrix Q = (I + A)⁻¹(I − A), avoiding expensive projection or re-orthogonalization steps
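A NumPy sketch of the Cayley transform: any skew-symmetric A (which can be built from an unconstrained matrix M as M − Mᵀ) maps to an exactly orthogonal Q, so gradient steps on M never leave the orthogonal manifold's parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
M = rng.normal(size=(n, n))            # unconstrained trainable matrix
A = M - M.T                            # skew-symmetric: A.T == -A
I = np.eye(n)

# Cayley transform: Q = (I + A)^{-1} (I - A); I + A is always invertible
# because a skew-symmetric matrix has purely imaginary eigenvalues.
Q = np.linalg.solve(I + A, I - A)

assert np.allclose(Q.T @ Q, I)         # Q is orthogonal by construction
```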
ESD: Empirical Spectral Distribution—the distribution of eigenvalues/singular values of a matrix
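For a concrete reading of this definition: the ESD of a weight matrix can be summarized by histogramming its singular values, a minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(200, 100)) / np.sqrt(100)   # toy "weight matrix"
sv = np.linalg.svd(W, compute_uv=False)          # its singular values

# The ESD is the empirical distribution of these values,
# e.g. summarized by a histogram.
hist, edges = np.histogram(sv, bins=20)
assert hist.sum() == sv.size
```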
OFT: Orthogonal Fine-Tuning—a PEFT method that multiplies weights by trainable orthogonal matrices, preserving hyperspherical energy (the pairwise angular structure among neurons)
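A minimal sketch of why orthogonal multiplication is energy-preserving (the Cayley construction here is illustrative): transforming W into Q @ W with orthogonal Q leaves the Gram matrix Wᵀ W, and hence all pairwise inner products and angles between the columns of W, unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 6, 4
W = rng.normal(size=(d, k))            # frozen pretrained weight

# Build a trainable orthogonal Q via the Cayley transform.
M = rng.normal(size=(d, d))
A = M - M.T
Q = np.linalg.solve(np.eye(d) + A, np.eye(d) - A)

W_ft = Q @ W                           # fine-tuned weight
assert np.allclose(W_ft.T @ W_ft, W.T @ W)  # Gram matrix (angles, norms) preserved
```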