We’re watching closely to see what comes out of Mountain View as Google kicks off its 2023 I/O event. It goes without saying that Google, despite a long history of AI innovation behind the scenes, is playing catch-up in the public eye.
PaLM 2, Google's latest large language model, was one of the main announcements to come out of this morning's event. Many of the company's new AI features are built on it. Google was tight-lipped about the specifics of the model, declining even to disclose its parameter count (its predecessor, the original PaLM, weighed in at an astounding 540 billion parameters), but we do know that it was trained on Google's latest JAX and TPU v4 infrastructure.
What distinguishes PaLM 2? To begin with, Google claims it is stronger at logic, mathematics, and common-sense reasoning. Unlike many language models, it was trained on a wide variety of math and science materials, so it does not need to rely on third-party plugins to solve math problems. According to Google, PaLM 2 can reason through problems and solve math with ease, even offering visuals when helpful.
PaLM 2 appears to excel at coding as well. It was trained on 20 different programming languages, from mainstays like Python and JavaScript to more specialized languages such as Fortran.
Moreover, PaLM 2 supposedly stands out with its multilingual capabilities. Having been trained on a corpus of over 100 languages, it promises to deliver more nuanced phrasing than previous models.
Google has also developed variations of PaLM 2 designed for specific applications, including Med-PaLM 2 for medical knowledge and Sec-PaLM for security use cases.
Interestingly, a compact version of PaLM 2 is also in the works. It’s designed to run on smartphones and potentially cater to more privacy-centric and edge computing use cases.
Not to be left out, Google's AI chatbot Bard also got a major update. It will now integrate with more services like Google Maps, Gmail, and even third-party apps such as Adobe Firefly. With PaLM 2 powering Bard, we can expect to see more sophisticated and context-aware interactions.
How does PaLM 2 measure up against GPT-4? Both are certainly impressive models. GPT-3 had 175 billion parameters; OpenAI has not disclosed GPT-4's size. Like PaLM 2, GPT-4 is designed to understand and generate human-like text. However, how PaLM 2's capabilities compare to GPT-4's in practice remains to be seen.
As we wait to see these advancements from Google in action, one thing is clear: there is a lot of competition in the red-hot AI space, and it's exciting.
Read more about Google's PaLM 2 announcement here: Introducing PaLM 2.
And watch highlights from the keynote here: