Select Architecture
CRSM
Continuous Reasoning State Model
Non-transformer architecture optimized for deep causal analysis and logical precision.
Aetheris
Hybrid Mamba-MoE
Sparse mixture-of-experts model designed for fluid creativity and abstract association.
The Dual Singularity
Two distinct paths to general intelligence. One built on rigid state-space logic, the other on fluid sparse expertise.
CRSM / KERNEL_01
ONLINE
Architecture: Recurrent State Space
Reasoning Depth: Deep (Recursive)
Context Window: Infinite (Rolling)
System Load: 0% (STANDBY)
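A minimal sketch of the kind of recurrent update that makes a rolling, effectively unbounded context possible: each incoming token is folded into a fixed-size latent state rather than appended to a growing attention cache. The state size, update rule, and matrices `A` and `B` below are illustrative assumptions, not CRSM internals.

```python
import numpy as np

STATE_DIM = 256   # assumed latent state size, not a published CRSM figure

def init_state() -> np.ndarray:
    """Start from an empty compressed state."""
    return np.zeros(STATE_DIM)

def step(state: np.ndarray, token_embedding: np.ndarray,
         A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """One recurrent update: fold the new token into the fixed-size state.

    Because the state never grows, an arbitrarily long ("rolling") context
    fits in constant memory; older tokens survive only as compressed traces.
    """
    return np.tanh(A @ state + B @ token_embedding)

# Toy usage: stream tokens through the state one at a time.
rng = np.random.default_rng(0)
A = rng.normal(scale=0.1, size=(STATE_DIM, STATE_DIM))
B = rng.normal(scale=0.1, size=(STATE_DIM, 64))
state = init_state()
for token_embedding in rng.normal(size=(1000, 64)):  # a 1000-token stream
    state = step(state, token_embedding, A, B)
print(state.shape)  # (256,) -- memory footprint is independent of sequence length
```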
Aetheris / CLUSTER_A
ONLINE
Architecture: Hybrid Mamba-MoE
Total Params: 294M
Active Params: 167M (Sparse)
Temperature: 0.85
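A minimal sketch of sparse expert routing, the mechanism behind the gap between total and active parameters: a router scores every expert, but only the top-k experts run for each token. The expert count, top-k value, and dimensions below are illustrative assumptions, not Aetheris's actual configuration.

```python
import numpy as np

N_EXPERTS = 8    # assumed expert count; not a published Aetheris figure
TOP_K = 2        # assumed number of experts activated per token
D_MODEL = 128

rng = np.random.default_rng(0)
router_w = rng.normal(size=(D_MODEL, N_EXPERTS))           # router / gating weights
experts = rng.normal(size=(N_EXPERTS, D_MODEL, D_MODEL))   # one weight matrix per expert

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route one token through its top-k experts only.

    Every expert's parameters exist (total params), but per token only the
    k selected experts execute (active params) -- which is, conceptually,
    where the gap between the 294M total and 167M active figures comes from.
    """
    logits = x @ router_w
    top = np.argsort(logits)[-TOP_K:]                        # indices of the k best experts
    gates = np.exp(logits[top]) / np.exp(logits[top]).sum()  # softmax over the selected experts
    return sum(g * (experts[i] @ x) for g, i in zip(gates, top))

token = rng.normal(size=D_MODEL)
out = moe_forward(token)
print(out.shape)  # (128,)
```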
Architecture Matrix
| Vector | CRSM | Aetheris |
|---|---|---|
| Primary Driver | Internal Latent State | Sparse Expert Routing |
| Inference Style | Deliberative & Linear | Associative & Parallel |
| Best Use Case | Code, Logic, Math | Storytelling, Roleplay |
| Memory | Compressed State | Selective Attention |
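A minimal sketch of dispatching a request to one model or the other based on the "Best Use Case" row above. The `select_model` helper and task labels are hypothetical; only the model names come from the matrix.

```python
# Task categories taken from the "Best Use Case" row of the matrix.
ANALYTICAL_TASKS = {"code", "logic", "math"}
CREATIVE_TASKS = {"storytelling", "roleplay"}

def select_model(task: str) -> str:
    """Pick the architecture the matrix recommends for a given task."""
    task = task.lower()
    if task in ANALYTICAL_TASKS:
        return "CRSM"
    if task in CREATIVE_TASKS:
        return "Aetheris"
    return "CRSM"  # default to the deliberative model for unlisted tasks

print(select_model("math"))          # CRSM
print(select_model("storytelling"))  # Aetheris
```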
// POMILON INTELLIGENCE LABS © 2025 // EXPERIMENTAL BUILD