Our Chief Scientist Correy Kowall’s journey began more than 30 years ago, collaborating with AI pioneers such as Jürgen Schmidhuber at IDSIA. Schmidhuber’s work on neural networks, including long short-term memory (LSTM) networks, laid the groundwork for modern AI models by emphasizing the importance of memory and context in AI systems.
Fundamental problems with recent developments in AI drove us to explore new architectural approaches. After decades of research, we identified unexplored aspects of attention mechanisms linked to hallucinations in transformer-based large language models.
ATOMIC is the result of deep research by our team of ML engineers and data scientists. It applies key insights from LSTMs to make transformer models more efficient, accurate, and context-aware.
As the world enters the Age of AI, we are ready to deploy the ATOMIC technology to help your business:
Lower training costs
Reduce hallucinations and improve understanding of context
Improve any open-source large language model you choose
Prepare for the future of Agentic & Edge AI