Amazon Leaps Headlong Into the AI Rat Race


If you thought big daddy Amazon would stay out of the AI rat race, then you’d be wrong. For now, the online retail giant is staying away from user-facing generative AI and is instead offering a business-centric model through its Amazon Web Services arm.

The Wall Street Journal first reported that Amazon had finally decided to join its big tech brothers Microsoft, Google, and Meta in the AI shoving match. According to the report, Amazon isn’t so much offering its own AI as sitting back and providing a “neutral platform” for businesses to incorporate third-party AI models. AWS will offer access to Anthropic’s Claude chatbot, Stability AI’s image generation services, and AI21 Labs’ large language model, which powers programs like Wordtune Spices. There’s also Amazon Titan, the company’s own language model, though according to the report Amazon isn’t designing its own ChatGPT-like interface.
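For the curious, here’s a rough idea of what calling one of these hosted models from a company’s own software could look like through AWS’s Python SDK (boto3). This is a minimal sketch, not anything Amazon has published for this service: the client name, model ID, and request format below are assumptions based on how AWS typically exposes its services.

```python
import json
import boto3

# Hypothetical sketch: assumes AWS exposes the hosted models through its
# standard Python SDK via a runtime client and an invoke_model call.
# The service name, model ID, and request body shape are illustrative
# assumptions, not details confirmed in the report.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.invoke_model(
    modelId="anthropic.claude-v2",  # example model identifier (assumed)
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "prompt": "\n\nHuman: Summarize last quarter's support tickets.\n\nAssistant:",
        "max_tokens_to_sample": 300,
    }),
)

# The response body is a stream of JSON; print the model's completion text.
print(json.loads(response["body"].read())["completion"])
```

The pitch, in other words, is that businesses rent someone else’s model through the same AWS plumbing they already use, rather than building or hosting one themselves.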

All this means the online retail giant isn’t pouring a multibillion-dollar investment into a separate company, as Microsoft has with OpenAI, or sinking billions into its own generative artificial intelligence, as Google and Meta have. The only head-to-head product is Amazon’s new CodeWhisperer, a generative AI tool that writes code. Microsoft’s similar GitHub Copilot has already been sued by developers who say Microsoft blatantly ignored their code licenses.

The report says that Amazon will allow companies to have the AI trained only on their own customers’ data, rather than on the broader trove of information and webpages used to train other models. Of course, this glosses over the fact that all of these language and diffusion models were already trained on hundreds of terabytes of data scraped from the internet. Sure, there are ways to keep a company’s data from being swallowed up into the underlying model, but it’s incredibly hard to parse what data was used to train these models in the first place.

In addition, Amazon said it plans to pitch companies on its own AI-optimized chips. Of course, Amazon could still release an AI model for regular users one day soon. AWS CEO Adam Selipsky told the Journal, “it truly is day one in generative AI.”

As a web-hosting and cloud service, AWS presides over large swaths of the internet—so much so that an AWS outage can effectively knock out many of the most commonly used websites. AWS is such a massive part of the Amazon ecosystem that it accounts for more than 50% of the tech giant’s annual operating income. Amazon also runs specialized versions of AWS for government use, including one called “Secret Region” used by the CIA.

Let’s not forget the problems that come with widespread AI adoption, chiefly the energy and resource requirements. Recent reports show that training OpenAI’s older GPT-3 model consumed hundreds of thousands of liters of water just to cool the data centers. Running these AI models also requires a hell of a lot of electricity: each ChatGPT query has been estimated to use more than a hundred times the electricity of a regular Google search, something companies will need to contend with if they make AI even more prolific than it already is.

Want to know more about AI, chatbots, and the future of machine learning? Check out our full coverage of artificial intelligence, or browse our guides to The Best Free AI Art Generators, The Best ChatGPT Alternatives, and Everything We Know About OpenAI’s ChatGPT.

