Conference on Parsimony and Learning (CPAL)
March 2025, Stanford
Subject Areas
Theory & Foundations
- Theories of sparse coding, structured sparsity, subspace learning, low-dimensional manifolds, and general low-dimensional structures.
- Dictionary learning and representation learning for low-dimensional structures and their connections to deep learning theory.
- Equivariance and invariance modeling.
- Theoretical neuroscience and cognitive science foundations of parsimony, and biologically inspired computational mechanisms.
Optimization & Algorithms
- Optimization, robustness, and generalization methods for learning compact and structured representations.
- Interpretable and efficient deep architectures (e.g., based on unrolled optimization).
- Data-efficient and computation-efficient training and inference.
- Adaptive and robust learning and inference algorithms.
- Distributed, networked, or federated learning at scale.
- Other methods for nonlinear dimensionality reduction and representation learning.
Data, Systems & Applications
- Domain-specific datasets, benchmarks, and evaluation metrics.
- Parsimonious and structured representation learning from data.
- Inverse problems that benefit from parsimonious priors.
- Hardware and system co-design for parsimonious learning algorithms.
- Parsimonious learning in intelligent systems that integrate perception-action cycles.
- Applications in science, engineering, medicine, and social sciences.
The above is intended as a high-level overview of CPAL’s focus and is by no means exhaustive. If you are unsure whether your paper fits the venue, feel free to contact the program chairs via email at pcs@cpal.cc.