Techno Blender


AI pioneer Cerebras is having ‘a monster year’ in hybrid AI computing

The business "has changed completely" for Cerebras, Feldman told ZDNET. "Rather than buying one or two machines, and putting a job on one machine for a week, customers would rather have it on 16 machines for a few hours" as a cloud service model. The result for Cerebras is, "For hardware sales, you can do fewer, bigger deals, and you're gonna spend a lot of time and effort in the management of your own cloud." On Monday, at a supercomputing conference in Denver called SC23, Feldman and team unveiled the latest…

AI startup Cerebras built a gargantuan AI computer for Abu Dhabi’s G42 with 27 million AI ‘cores’

Condor Galaxy is the result, said Feldman, of months of collaboration between Cerebras and G42, and is the first major announcement of their strategic partnership. The initial contract is worth more than a hundred million dollars to Cerebras, Feldman told ZDNET in an interview. That figure will ultimately expand by multiple times, to hundreds of millions of dollars in revenue, as Cerebras builds out Condor Galaxy in multiple stages. Also: Ahead of AI, this other technology wave is sweeping in fast. Condor Galaxy is named…

AI pioneer Cerebras opens up generative AI where OpenAI goes dark

Such large language models are notoriously compute-intensive. The Cerebras work released Tuesday was developed on a cluster of sixteen of its CS-2 computers, machines the size of dormitory refrigerators that are tuned specially for AI-style programs. The cluster, previously disclosed by the company, is known as its Andromeda supercomputer, and it can dramatically cut the work of training LLMs compared with using thousands of Nvidia's GPU chips. Also: ChatGPT's success could prompt a damaging swing…

AI computing startup Cerebras releases open source ChatGPT-like models

Artificial intelligence chip startup Cerebras Systems on Tuesday said it released open-source ChatGPT-like models for the research and business community to use for free in an effort to foster more collaboration. Silicon Valley-based Cerebras released seven models, all trained on its AI supercomputer called Andromeda, ranging from smaller 111-million-parameter language models to a larger 13-billion-parameter model. "There is a big movement to close what has been open sourced in AI. It's not surprising as there's now huge…

AI challenger Cerebras unveils ‘pay-per-model’ large-model AI cloud service with Cirrascale, Jasper

Cerebras' price schedule in collaboration with Cirrascale promises to be half the average cost of cloud services or specialized clusters to train large models. The partnership for Studio follows a partnership between Cerebras and Cirrascale announced a year ago to offer CS-2 machines in the cloud on a weekly basis. The service will automatically scale the size of clusters depending on the scale of the language model, said Feldman. The company emphasizes that training performance improves…

AI challenger Cerebras assembles modular supercomputer ‘Andromeda’ to speed up large language models

Andromeda was assembled in building-block fashion by combining 16 of Cerebras's CS-2 AI computers interconnected by a switch called SwarmX, communicating with a central coordinating computer, called MemoryX, that updates neural weights. The Andromeda cluster is installed as a cloud-available machine by Santa Clara, California-based Colovore, which competes in the market for hosting services with the likes of Equinix. The secret of the modular design is that the CS-2 machines can be orchestrated as a…
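The coordinator/worker pattern described above — a central computer holding the weights while the compute nodes work on data shards — can be illustrated with a toy sketch. This is not Cerebras's actual software; the `CentralWeightStore` class and `worker_gradient` function are hypothetical stand-ins for the MemoryX-style coordinator and the CS-2-style workers, showing only the broadcast/gradient/update loop in miniature:

```python
# Toy sketch (assumption, not Cerebras's implementation): a central store
# broadcasts weights to workers, gathers their gradients, and applies the
# update itself, so workers never own a copy of the master weights.

from typing import List

class CentralWeightStore:
    """Stand-in for a central weight-holding coordinator."""
    def __init__(self, weights: List[float], lr: float = 0.1):
        self.weights = weights
        self.lr = lr

    def broadcast(self) -> List[float]:
        # Send a copy of the current weights to every worker.
        return list(self.weights)

    def apply(self, gradients: List[List[float]]) -> None:
        # Average the gradients from all workers, then take one SGD step.
        n = len(gradients)
        for i in range(len(self.weights)):
            avg = sum(g[i] for g in gradients) / n
            self.weights[i] -= self.lr * avg

def worker_gradient(weights: List[float], shard: List[float]) -> List[float]:
    # Each worker computes the gradient of a simple squared-error loss
    # mean((w - x)^2) over its local data shard.
    return [sum(2 * (w - x) for x in shard) / len(shard) for w in weights]

# Usage: four toy "workers", each holding one local data shard.
store = CentralWeightStore([0.0, 0.0])
shards = [[1.0], [2.0], [3.0], [4.0]]
for _ in range(50):
    w = store.broadcast()
    grads = [worker_gradient(w, s) for s in shards]
    store.apply(grads)
# Weights converge toward the mean of all shard values (2.5).
```

The design point the sketch captures is that scaling out means adding more workers (more shards in the list), while the weight update stays in one place.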

AI startup Cerebras celebrated for chip triumph where others tried and failed

"Being able to optimize compute in this manner," he said of Cerebras's chip, "will hopefully teach us that there are more potential positive uses of these technologies than the unfortunate things that have occurred as a result of some business models that have surfaced that are, effectively, programming the population without them really being aware of it." The WSE-2 chip, and its predecessor, introduced three years ago, mark an epochal achievement in the history of fabricating transistors, the building block of…

Cerebras says its "wafer-scale" chip sets a record for the largest AI model trained on a single device with up to 20B parameters…

Francisco Pires / Tom's Hardware: Cerebras says its “wafer-scale” chip sets a record for the largest AI model trained on a single device with up to 20B parameters — Democratizing large AI Models without HPC scaling requirements. — Cerebras, the company behind the world's largest accelerator chip in existence …