Aurora genAI: Intel’s Trillion-Parameter Leap into the Future of AI and Scientific Research
A groundbreaking announcement has emerged from Intel: the launch of Aurora genAI, a generative AI model equipped with a staggering 1 trillion parameters. Ready to lock horns with established players like OpenAI’s ChatGPT and Meta’s large language models, Aurora genAI brings a new level of competition to the AI landscape.
But Aurora’s potency isn’t just about the astronomical parameter count. It’s about its designed mission to revolutionize scientific research, backed by the immense computational prowess of Intel’s supercomputers. Powered by more than 60,000 Intel Max Series GPUs, a fast I/O system, and a robust, all-solid-state mass storage system, Aurora is primed to undertake AI model training on an unprecedented scale.
Leading this ambitious endeavor is an international consortium headed by Argonne National Laboratory, in partnership with Intel, Hewlett Packard Enterprise (HPE), U.S. Department of Energy (DOE) laboratories, and a range of US and international universities and nonprofits. Their shared vision? Training and evaluating a trillion-parameter model on a broad corpus that includes general text, code, and scientific text.
Yet Aurora’s appetite for knowledge doesn’t stop there. The model will be trained on trillions of tokens of structured scientific data, spanning diverse disciplines such as biology, medicine, chemistry, materials science, climate science, physics, and X-ray science. It’s not a solitary mission, either: hackathons focused on these models have already commenced, with more slated for the coming months. To all eager contributors, the consortium extends an open invitation to join this transformative journey.
With Aurora genAI’s announcement, we stand on the cusp of a new era in AI and scientific research. Keep your eyes peeled for more updates as Aurora genAI takes shape.