Lumina AI, a leader in CPU-optimized machine learning solutions, announces the release of PrismRCL 2.6.0, the latest upgrade to its flagship Random Contrast Learning (RCL) software, designed to push the boundaries of performance and efficiency in machine learning. This release introduces a highly anticipated feature: the LLM (large language model) training parameter, further solidifying RCL's capability to build foundation models at unmatched speed and cost-efficiency.
The new LLM parameter lets users train language models on complex datasets seamlessly, reflecting Lumina AI's commitment to advancing text-based AI. By streamlining text data handling, it positions RCL as an essential tool for the next generation of language models, outperforming conventional transformer-based architectures in speed, energy efficiency, and scalability.
“By incorporating the new LLM parameter, we’re providing a foundation for training language models that is faster and more efficient without relying on expensive hardware accelerators,” said Allan Martin, CEO of Lumina AI.
“The beauty of PrismRCL 2.6.0 lies in its simplicity. By adding the LLM parameter, users can signal their intent to build LLMs, and the system takes care of the rest. It’s rewarding to see how well this version performs against transformer networks—it’s proof that innovation doesn’t need to be complicated to be powerful,” said Dr. Morten Middelfart, Chief Data Scientist of Lumina AI.
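To illustrate the workflow Dr. Middelfart describes, a training run with the new parameter might look like the following minimal sketch. Only the llm parameter itself is documented in this release; the executable invocation style, the remaining parameter names, and the file paths below are illustrative assumptions, not confirmed PrismRCL syntax:

    # Minimal sketch (hypothetical): invoking PrismRCL with the new llm parameter.
    # Only the "llm" flag is documented in this release; every other parameter
    # name, path, and value below is an illustrative assumption.
    import subprocess

    subprocess.run(
        [
            "PrismRCL.exe",
            "llm",                                # signals intent to build an LLM (new in 2.6.0)
            "data=C:/train-data",                 # hypothetical training-data location
            "savemodel=C:/models/llm.classify",   # hypothetical output model path
        ],
        check=True,  # raise an error if training does not exit cleanly
    )

As the quote suggests, the single flag is the only signal the user provides; the system handles the rest of the text-training pipeline internally.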
Recent experimental results underscore RCL's performance advantage, demonstrating up to 98.3x faster training speeds than transformer-based models, even on standard CPUs. The development of the LLM feature reflects Lumina AI's strategy of reducing the costs and environmental impact associated with traditional neural network training.
SOURCE: PR Newswire