
Reactive AI
AI & ML interests
AGI, Reactive Awareness Models, Memory Systems, Reactive Neural Networks
We are working on our own idea of Reactive Neural Networks (RxNN) - a special kind of memory-augmented neural network that keeps state/memory between interactions/sequences, rather than between tokens/elements within a sequence, and provides reactive communication patterns.
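The interaction-level memory pattern described above can be sketched in a few lines. This is a toy illustration only - the class name, the gated update rule, and all sizes are hypothetical stand-ins, not the actual RxNN implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # hypothetical hidden size

class ReactiveModelSketch:
    """Toy model keeping state *between interactions*, not between
    tokens of one sequence (names and math are illustrative only)."""

    def __init__(self, dim: int):
        self.dim = dim
        self.memory = np.zeros(dim)  # fixed-size Short-Term Memory
        self.W_gate = rng.normal(size=(dim, dim)) * 0.1

    def interact(self, query_embedding: np.ndarray) -> np.ndarray:
        # Process only the current interaction; past context lives in memory.
        response = np.tanh(query_embedding + self.memory)  # stand-in for decoding
        # Gated update: blend the old state with the new interaction.
        gate = 1.0 / (1.0 + np.exp(-(self.W_gate @ query_embedding)))
        self.memory = gate * self.memory + (1.0 - gate) * response
        return response

model = ReactiveModelSketch(D)
q1 = rng.normal(size=D)
r1 = model.interact(q1)
r2 = model.interact(q1)  # same query, different response: memory changed
```

Note the contrast with a stateless LLM, which would re-process the whole conversation history on every turn; here each call sees only the current query plus a fixed-size memory state.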
Our primary architecture - Reactor - is planned as the first awareness AGI model. It models awareness as an Infinite Chain-of-Thoughts, connected to Short-Term and Long-Term Memory (an Attention-based Memory System) and to Receptor/Effector systems for real-time reactive processing. It will be able to learn constantly and autonomously from interactions in a Continuous Live Learning process.
While Reactor is the main goal, it is extremely hard to achieve - it is by far the most advanced neural network ensemble we have designed. That's why we also designed simplified architectures for an incremental transformation from language/reasoning models to an awareness model:
- Reactive Transformer introduces the Attention-based Memory System and adds Short-Term Memory to Transformer language models
- Preactor adds Long-Term Memory and the ability to learn from interactions
We are currently working on the Reactive Transformer Proof-of-Concept - RxT-Alpha - which will be published soon.
More info soon
RxNN Platform
We are working on a complete Reactive Neural Networks development framework - RxNN (GitHub)
Additional Research
- Sparse Query Attention - the most cost-effective GQA variant, reducing training time/cost by ~3-10% at similar performance. Research in progress.
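As we understand the idea, Sparse Query Attention reduces the number of *query* heads - the factor that drives attention FLOPs - whereas GQA only shrinks the key/value heads (and thus the KV cache). The exact formulation belongs to the in-progress research; below is only a generic numpy sketch of grouped attention with a reduced query-head count, with all sizes hypothetical:

```python
import numpy as np

def grouped_attention(x, wq, wk, wv, n_q, n_kv):
    """Single-sequence attention with n_q query heads and n_kv K/V heads
    (n_q must be a multiple of n_kv). Choosing n_q below the full head
    count shrinks the score/value matmuls proportionally - the cost
    lever that (as we understand it) SQA exploits."""
    T, _ = x.shape
    dh = wq.shape[1] // n_q                    # per-head dimension
    q = (x @ wq).reshape(T, n_q, dh)
    k = (x @ wk).reshape(T, n_kv, dh)
    v = (x @ wv).reshape(T, n_kv, dh)
    # Repeat K/V heads so each query head has a matching group (GQA-style).
    rep = n_q // n_kv
    k = np.repeat(k, rep, axis=1)
    v = np.repeat(v, rep, axis=1)
    scores = np.einsum('tqd,sqd->qts', q, k) / np.sqrt(dh)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    out = np.einsum('qts,sqd->tqd', weights, v)
    return out.reshape(T, n_q * dh)

rng = np.random.default_rng(0)
T, D, dh = 16, 64, 8
x = rng.normal(size=(T, D))
# Full-width baseline would use 8 query heads; this SQA-style variant
# uses 4, so the attention-score computation does roughly half the work.
wq = rng.normal(size=(D, 4 * dh))
wk = rng.normal(size=(D, 4 * dh))
wv = rng.normal(size=(D, 4 * dh))
out = grouped_attention(x, wq, wk, wv, n_q=4, n_kv=4)
```

In a real model an output projection would map the narrower `n_q * dh` result back to the model dimension; that detail is omitted here.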
Models
- ReactiveAI/RxT-Alpha-Micro-Decoder-Plus
- ReactiveAI/RxT-Alpha-Micro-Decoder
- ReactiveAI/RxT-Alpha-Micro-MLM
- ReactiveAI/RxT-Alpha-Micro-Encoder
- ReactiveAI/GQA-Ref-Micro
- ReactiveAI/xSQAT-mm
- ReactiveAI/SQAT-mm
- ReactiveAI/MQA-Ref-Micro
- ReactiveAI/sSQAT-mm