Google made significant announcements in the field of artificial intelligence (AI) at its Google I/O event, with the launch of PaLM 2, a large language model, being the highlight of the day. However, the company had more AI news to share during the event.
Google is introducing a range of updates and enhancements to its open-source machine learning (ML) technology, focused on the growing TensorFlow ecosystem. TensorFlow, the Google-led open-source project, provides the tools developers use to build and train ML models.
One of the notable updates is the introduction of DTensor technology, which accelerates ML training through parallelism: it shards models and data across multiple devices so that training scales more efficiently.
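DTensor is exposed in TensorFlow under tf.experimental.dtensor. As a rough illustration of the parallelism it provides, the minimal sketch below shards a tensor's batch dimension across two logical CPU devices; the mesh shape, device list and tensor shapes are illustrative only, and on real hardware the mesh would typically span GPUs or TPUs.

```python
import tensorflow as tf
from tensorflow.experimental import dtensor

# Split the single physical CPU into two logical devices so the example
# runs anywhere; on real hardware you would list GPUs or TPUs instead.
cpu = tf.config.list_physical_devices("CPU")[0]
tf.config.set_logical_device_configuration(
    cpu, [tf.config.LogicalDeviceConfiguration()] * 2)

# A 1-D logical mesh with a "batch" dimension spanning both devices.
mesh = dtensor.create_mesh([("batch", 2)], devices=["CPU:0", "CPU:1"])

# Shard the first tensor dimension across the mesh; replicate the second.
layout = dtensor.Layout(["batch", dtensor.UNSHARDED], mesh)

# Materialize a tensor with that layout; each device holds one shard.
sharded = dtensor.call_with_layout(tf.ones, layout, shape=(8, 4))
print(dtensor.fetch_layout(sharded))
```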
Google is also offering a preview release of the TF Quantization API, which helps optimize models for resource efficiency, shrinking their memory and compute footprint and ultimately reducing the cost of running them.
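The exact interface of the new preview API wasn't detailed at the event, but the general idea behind quantization can be shown with TensorFlow's existing post-training quantization path in the TFLite converter, a different, long-established API used here purely for illustration:

```python
import tensorflow as tf

# A small example model to quantize.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(10,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Post-training dynamic-range quantization via the TFLite converter.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# The quantized flatbuffer is far smaller than the float32 weights alone.
print(f"Quantized model size: {len(tflite_model)} bytes")
```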
Within the TensorFlow ecosystem, the Keras API suite plays a crucial role, providing deep learning capabilities in Python on top of the core TensorFlow framework. Google is introducing two new tools within Keras: KerasCV for computer vision (CV) applications and KerasNLP for natural language processing (NLP) tasks.
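Both packages ship separately on top of TensorFlow (pip install keras-cv keras-nlp). The sketch below shows the flavor of each: a KerasCV augmentation layer for an image pipeline and a KerasNLP pretrained text classifier. The preset name is taken from the KerasNLP documentation; check the current docs before relying on it, and note that from_preset downloads weights on first use.

```python
import tensorflow as tf
import keras_cv    # pip install keras-cv
import keras_nlp   # pip install keras-nlp

# KerasCV: composable building blocks for vision pipelines,
# e.g. the RandAugment policy applied to a batch of images.
augmenter = keras_cv.layers.RandAugment(value_range=(0, 255))
images = tf.random.uniform((4, 224, 224, 3), maxval=255)
augmented = augmenter(images)

# KerasNLP: a pretrained BERT classifier assembled from a preset.
classifier = keras_nlp.models.BertClassifier.from_preset(
    "bert_tiny_en_uncased", num_classes=2)
preds = classifier.predict(["TensorFlow is still Google's ML workhorse."])
```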
Alex Spinelli, Google’s Vice President of Product Management for Machine Learning, emphasized the company’s commitment to driving new capabilities, efficiency, and performance through open-source strategies. While Google continues to integrate exceptional AI and ML into its products, the company also aims to uplift the broader developer community by creating opportunities and advancements in the open-source space.
Google’s announcements at Google I/O showcase its dedication to innovation and collaboration in the AI and ML domains, providing developers with the tools and technologies necessary to build powerful and efficient models.
TensorFlow remains the ‘workhorse’ of machine learning at Google
In an era when large language models (LLMs) are all the rage, Spinelli emphasized that having the right ML training tools is more critical than ever.
“TensorFlow is still today the workhorse of machine learning,” he said. “It is still … the fundamental underlying infrastructure [in Google] that powers a lot of our own machine learning developments.”
To that end, the DTensor updates will provide more “horsepower” as the requirements of ML training continue to grow. DTensor introduces more parallelization capabilities to help optimize training workflows.
Spinelli said that ML overall is getting ever hungrier for data and compute resources. As such, finding ways to improve performance, so that more data can be processed to serve increasingly large models, is extremely important. The new Keras updates will provide even more power, with modular components that let developers build their own computer vision and natural language processing capabilities.
Still more power will come to TensorFlow thanks to the new JAX2TF technology. JAX is an AI research framework widely used at Google as a computational library; it has been used to build technologies such as the Bard AI chatbot. With JAX2TF, models written in JAX will be more easily usable within the TensorFlow ecosystem.
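The bridge is jax2tf.convert, which wraps a JAX function so TensorFlow can trace and execute it. A minimal sketch, with an illustrative model function and parameters (the names and shapes are not from Google's announcement):

```python
import numpy as np
import tensorflow as tf
import jax.numpy as jnp
from jax.experimental import jax2tf

# A simple model function written in JAX.
def jax_predict(params, x):
    return jnp.tanh(x @ params["w"] + params["b"])

params = {"w": np.ones((4, 2), np.float32),
          "b": np.zeros((2,), np.float32)}

# jax2tf.convert produces a function TensorFlow can trace and run.
tf_predict = tf.function(jax2tf.convert(jax_predict), autograph=False)

x = tf.random.normal((3, 4))
print(tf_predict(params, x))
```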
“One of the things that we’re really excited about is how these things are going to make their way into products — and watch that developer community flourish,” he said.
PyTorch vs TensorFlow
While TensorFlow is the workhorse of Google’s ML efforts, it’s not the only open-source ML training library.
In recent years the open-source PyTorch framework, originally created by Facebook (now Meta), has become increasingly popular. In 2022, Meta contributed PyTorch to the Linux Foundation, creating the new PyTorch Foundation, a multi-stakeholder effort with an open governance model.
Spinelli said that what Google is trying to do is support developer choice when it comes to ML tooling. He also noted that TensorFlow isn’t just an ML framework; it’s a whole ecosystem of ML tools that supports training and development across a broad range of use cases and deployment scenarios.
“This is the same set of technologies, essentially, that Google uses to build machine learning,” Spinelli said. “I think we have a really competitive offering if you really want to build large-scale high-performance systems and you want to know that these are going to work on all the infrastructures of the future.”
One thing Google apparently will not be doing is following Meta’s lead and creating an independent TensorFlow Foundation.
“We feel pretty comfortable with the way it’s developed today and the way it’s managed,” Spinelli said. “We feel pretty comfortable about some of these great updates that we’re releasing now.”