
How Google’s PaLM 2 AI model is different from its predecessor

Google recently launched PaLM 2, the company’s new large language model (LLM), at Google I/O. The company said the new model is smaller than its prior LLMs; however, a report says it uses almost five times as much training data as its predecessor, PaLM, which launched in 2022. This allows the model to perform more advanced coding, math and creative writing tasks.

Citing internal documentation, a report by CNBC said that the PaLM 2 model is trained on 3.6 trillion tokens as compared to PaLM’s 780 billion tokens.
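
For context, 3.6 trillion divided by 780 billion works out to roughly 4.6, which is where the “almost five times” figure comes from.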

Tokens are strings of text, typically words or pieces of words, and they are important building blocks for training LLMs: they teach the model to predict the next word that will appear in a sequence.
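
As a rough illustration of that idea (and not how PaLM 2 itself is built), the Python sketch below trains a toy next-word predictor by splitting a tiny corpus into tokens and counting which token tends to follow which. Real LLMs use neural networks and subword tokenizers rather than whitespace splitting and bigram counts; everything here is hypothetical and simplified.

    from collections import Counter, defaultdict

    # Toy next-word predictor: a simplified sketch, for illustration only.
    corpus = "the model predicts the next word the model learns from tokens"
    tokens = corpus.split()  # naive whitespace "tokenizer", just for this sketch

    # Count how often each token is followed by each other token (bigram counts).
    following = defaultdict(Counter)
    for current, nxt in zip(tokens, tokens[1:]):
        following[current][nxt] += 1

    def predict_next(token):
        # Return the token most frequently observed after the given token.
        candidates = following.get(token)
        return candidates.most_common(1)[0][0] if candidates else "<unknown>"

    print(predict_next("the"))   # prints "model"
    print(predict_next("next"))  # prints "word"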

Google’s PaLM 2 features
Google announced that the PaLM 2 language model has improved multilingual, reasoning and coding capabilities. Google said the model is trained on 100 languages and performs a broad range of tasks.

The training on so many languages has significantly improved its ability to understand, generate and translate nuanced text — including idioms, poems and riddles — across a wide variety of languages, a hard problem to solve.

Google claimed that PaLM 2 also passes advanced language proficiency exams at the “mastery” level. PaLM 2 is also trained on a data set that includes scientific papers and web pages that contain mathematical expressions.

PaLM 2 technique
At the Google I/O conference, Google said that PaLM 2 uses a “new technique” called “compute-optimal scaling,” which makes the LLM “more efficient with overall better performance, including faster inference, fewer parameters to serve, and a lower serving cost.”
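
Google has not published the exact recipe behind that phrase, but “compute-optimal scaling” generally refers to balancing model size against the amount of training data for a fixed compute budget, along the lines of DeepMind’s published Chinchilla heuristic (training compute of roughly 6 × parameters × tokens, with about 20 training tokens per parameter). Below is a minimal sketch assuming that heuristic, not anything PaLM 2-specific:

    # Hypothetical sketch of the compute-optimal scaling idea (Chinchilla-style),
    # not Google's actual PaLM 2 recipe. Assumes C ~= 6 * N * D and D ~= 20 * N.
    def compute_optimal_split(compute_budget_flops):
        # Solve C = 6 * N * D with D = 20 * N  =>  N = sqrt(C / 120).
        tokens_per_param = 20.0  # published Chinchilla ratio (an assumption here)
        n_params = (compute_budget_flops / (6.0 * tokens_per_param)) ** 0.5
        n_tokens = tokens_per_param * n_params
        return n_params, n_tokens

    # Purely illustrative budget of 1e24 floating-point operations.
    n, d = compute_optimal_split(1e24)
    print(f"~{n:.2e} parameters trained on ~{d:.2e} tokens")

The point of the trade-off is that, for the same compute, a somewhat smaller model trained on more tokens can match or beat a larger one, which fits Google’s description of PaLM 2 as smaller than PaLM yet trained on far more data.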

PaLM 2 is available in four sizes, from smallest to largest: Gecko, Otter, Bison and Unicorn.

The LLM powers over 25 new Google products and features. The AI model will be available in Workspace apps, in Med-PaLM for medical use cases and in Sec-PaLM for security.
