Bayes-HDC: Probabilistic Vector Symbolic Architectures
JAX library introducing Probabilistic Vector Symbolic Architectures (PVSA) — an algebra of uncertainty for hyperdimensional computing. Every hypervector is a posterior distribution; every VSA primitive propagates moments in closed form. First such framework in the HDC literature.
What PVSA unlocks
- Moment-propagating algebra. Closed-form moments for every core operation (`bind_gaussian`, `bundle_gaussian`, `bind_dirichlet`, `bundle_dirichlet`, `kl_*`), with a Monte Carlo fallback for the rest.
- Calibrated predictive distributions. Post-hoc temperature scaling (Guo et al. 2017), fit via L-BFGS in log-space, reduces ECE by 5–25× on real datasets.
- Coverage-guaranteed prediction sets. Split-conformal with APS scores (Romano et al. 2020); true-label coverage ≥ 1 − α on exchangeable data.
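For intuition, the Gaussian case of moment propagation fits in a few lines: under a MAP-style algebra where binding is element-wise multiplication and bundling is element-wise addition, the first two moments of both operations are exact for independent inputs. The function names below mirror the ones listed above, but the signatures are illustrative assumptions, not the library's actual API:

```python
import jax.numpy as jnp

def bind_gaussian(mu_x, var_x, mu_y, var_y):
    """Element-wise (Hadamard) binding of two independent Gaussian
    hypervectors: exact moments of the element-wise product."""
    mu = mu_x * mu_y
    # Var(XY) = Var(X)Var(Y) + Var(X)E[Y]^2 + Var(Y)E[X]^2 for independent X, Y
    var = var_x * var_y + var_x * mu_y**2 + var_y * mu_x**2
    return mu, var

def bundle_gaussian(mu_x, var_x, mu_y, var_y):
    """Superposition (element-wise sum): means and variances add exactly."""
    return mu_x + mu_y, var_x + var_y
```

Because both results are again Gaussian-moment pairs, the operations compose: a whole bind/bundle expression tree propagates uncertainty without sampling.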
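The conformal step can likewise be sketched in plain NumPy: score a held-out calibration split with APS conformity scores, take the adjusted (1 − α) quantile, and include top-ranked classes in each test prediction set until their cumulative mass reaches that threshold. This is a simplified sketch (no randomized tie-breaking) and the function names are hypothetical, not the library's API:

```python
import numpy as np

def aps_scores(probs, labels):
    """APS conformity score: total probability mass of classes ranked at or
    above the true label (Romano et al. 2020, without tie-break randomization)."""
    order = np.argsort(-probs, axis=1)                 # classes, most probable first
    cum = np.cumsum(np.take_along_axis(probs, order, axis=1), axis=1)
    rank_of_true = np.argmax(order == labels[:, None], axis=1)
    return cum[np.arange(len(labels)), rank_of_true]

def conformal_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split-conformal prediction sets with coverage >= 1 - alpha
    under exchangeability."""
    n = len(cal_labels)
    scores = aps_scores(cal_probs, cal_labels)
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)  # finite-sample correction
    q = np.quantile(scores, q_level, method="higher")
    order = np.argsort(-test_probs, axis=1)
    cum = np.cumsum(np.take_along_axis(test_probs, order, axis=1), axis=1)
    # include top-ranked classes until cumulative mass reaches the threshold q
    k = np.sum(cum < q, axis=1) + 1
    return [set(order[i, :k[i]].tolist()) for i in range(len(test_probs))]
```

Note the quantile level is ⌈(n + 1)(1 − α)⌉/n rather than 1 − α; that finite-sample correction is what yields the marginal coverage guarantee.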
Results vs. TorchHD
Head-to-head on five standard HDC benchmarks (iris, wine, breast-cancer, digits, MNIST), Bayes-HDC wins on every dataset: +3.94 accuracy points on average, and +8.9 on MNIST. No public HDC library currently offers calibration or conformal coverage.
Foundation
Beneath PVSA sits a complete deterministic VSA layer: eight classical models (BSC, MAP, HRR, FHRR, BSBC, CGR, MCR, VTB), five encoders, five classifiers, three associative-memory modules, and four symbolic data structures, each implemented from primary sources. No components are ported from other HDC libraries.
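As a taste of the deterministic layer, Binary Spatter Codes (Kanerva) reduce binding to element-wise XOR and bundling to a bit-wise majority vote. A minimal sketch under those classical definitions, not the library's actual implementation:

```python
import jax.numpy as jnp

def bsc_bind(x, y):
    """BSC binding: element-wise XOR. Self-inverse, so binding with y
    again recovers x exactly."""
    return jnp.bitwise_xor(x, y)

def bsc_bundle(vectors):
    """BSC bundling: bit-wise majority over a stack of hypervectors.
    Ties (possible for an even number of inputs) break toward 0 here."""
    n = vectors.shape[0]
    return (jnp.sum(vectors, axis=0) * 2 > n).astype(jnp.uint8)
```

The self-inverse property of XOR is what makes unbinding trivial in BSC: `bsc_bind(bsc_bind(x, y), y) == x` bit for bit.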
All operations run on CPU, GPU, and TPU via JAX/XLA. Every type is a pytree, so `jit`, `vmap`, `grad`, and `pmap` compose with the whole library.
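What "every type is a pytree" buys can be shown with a toy container; `GaussianHV` and `bundle` below are hypothetical names for illustration, not the library's API. A `NamedTuple` is automatically a pytree, so JAX transformations operate on its array leaves:

```python
from typing import NamedTuple

import jax
import jax.numpy as jnp

class GaussianHV(NamedTuple):
    # NamedTuples are registered pytrees: jit/vmap/grad see mean and var as leaves
    mean: jnp.ndarray
    var: jnp.ndarray

@jax.jit
def bundle(a: GaussianHV, b: GaussianHV) -> GaussianHV:
    """Gaussian superposition: means and variances add."""
    return GaussianHV(a.mean + b.mean, a.var + b.var)

# vmap over a batch of hypervectors stored in a single pytree
batch = GaussianHV(mean=jnp.ones((8, 1024)), var=jnp.full((8, 1024), 0.5))
single = GaussianHV(mean=jnp.zeros((1024,)), var=jnp.ones((1024,)))
out = jax.vmap(lambda hv: bundle(hv, single))(batch)   # GaussianHV with (8, 1024) leaves
```

Because the batch dimension lives in the pytree leaves, `vmap` maps `bundle` over all eight hypervectors without any loop in user code, and the `jit`-compiled kernel runs unchanged on CPU, GPU, or TPU.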
JMLR MLOSS submission in preparation.