Project: Synapse-X
An exploration of non-linear memory retention in transformer models. Synapse-X addresses the problem of catastrophic forgetting in continual learning streams.
1.0 Abstract
Standard LLMs require full retraining to assimilate new knowledge clusters. Synapse-X introduces a "bi-cameral" architecture in which a volatile short-term layer interacts with a frozen long-term core, mimicking the hippocampus-cortex relationship in mammals.
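A minimal sketch of the bi-cameral idea, assuming a PyTorch implementation. The class and parameter names (BiCameralModel, d_model, the mixing gate) are illustrative stand-ins, not names from the Synapse-X codebase:

```python
import torch
import torch.nn as nn

class BiCameralModel(nn.Module):
    def __init__(self, core: nn.Module, d_model: int):
        super().__init__()
        self.core = core                      # frozen long-term core
        for p in self.core.parameters():
            p.requires_grad = False           # cortex analogue: never updated
        self.volatile = nn.Sequential(        # volatile short-term layer
            nn.Linear(d_model, d_model),      # hippocampus analogue: trainable
            nn.GELU(),
            nn.Linear(d_model, d_model),
        )
        self.gate = nn.Parameter(torch.zeros(1))  # learned mixing gate (assumed)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        stable = self.core(x)                     # stable representation
        plastic = self.volatile(stable)           # fast-adapting correction
        return stable + torch.tanh(self.gate) * plastic

# Usage: only the volatile layer and gate receive gradients.
core = nn.Linear(64, 64)   # stand-in for a pretrained transformer core
model = BiCameralModel(core, d_model=64)
out = model(torch.randn(2, 64))
```

Freezing the core and routing all gradient flow through the additive volatile path is one straightforward way to realize the two-compartment split; the actual interaction mechanism between the layers may differ.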
2.0 Methodology
We implemented a dynamic weight-decay (DWD) algorithm that identifies "redundant neurons" during sleep cycles (low-traffic periods). These neurons are then repurposed to encode new data without overwriting critical-path logic. A sketch of the maintenance pass follows.
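A hedged sketch of one such maintenance pass, again assuming PyTorch. The redundancy criterion (low mean absolute activation over a probe batch) and the threshold value are assumptions for illustration; the document does not specify how DWD scores neurons:

```python
import torch
import torch.nn as nn

@torch.no_grad()
def dwd_maintenance(layer: nn.Linear, probe: torch.Tensor,
                    threshold: float = 1e-3) -> int:
    """Identify low-activity ("redundant") neurons in a linear layer and
    reinitialize them so they can absorb new data, leaving high-traffic
    (critical-path) neurons untouched."""
    acts = probe @ layer.weight.T + layer.bias      # (batch, out_features)
    activity = acts.abs().mean(dim=0)               # per-neuron traffic
    redundant = activity < threshold                # boolean mask

    # Repurpose: fresh random weights for redundant neurons only.
    fresh = torch.randn_like(layer.weight) * 0.02
    layer.weight[redundant] = fresh[redundant]
    layer.bias[redundant] = 0.0
    return int(redundant.sum())

# Run during a low-traffic window ("sleep cycle") with held-out probes.
layer = nn.Linear(64, 128)
probe = torch.randn(32, 64)
n_repurposed = dwd_maintenance(layer, probe)
```

Reinitializing only the masked rows keeps the weights on critical paths bit-identical, which is the property the methodology relies on; how the repurposed neurons are subsequently trained on the new tokens is a separate step not shown here.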