The future of artificial intelligence is distributed

Artificial intelligence (AI) and machine learning (ML) are disrupting every industry. Their influence and integration will only continue to grow.

“Ultimately, the future of AI lies in distributed computing,” Ion Stoica, co-founder and president of Anyscale, told VentureBeat. Distributed computing allows the components of a software system to be spread across multiple computers and run as a single system, improving efficiency and performance. But while there is a clear need for distributed computing, writing distributed applications is difficult, according to Stoica, and arguably harder than it used to be. Distributed computing for AI and ML in particular presents many challenges: the complexity of these systems varies greatly, and engineers must test for network and device failures in all their permutations.

This need gave rise to companies like Anyscale, which offers a set of tools that developers can use to build, deploy, and manage distributed applications. The company was founded by the creators of Ray, an open-source distributed AI framework that simplifies scaling AI workloads to the cloud. Ray lets users turn sequentially running Python code into a distributed application with minimal code modification (a short sketch of what that looks like appears at the end of this article).

As Stoica noted, the computational requirements for training state-of-the-art models continue to grow by orders of magnitude. Google’s Pathways Language Model (PaLM), a single model designed to generalize efficiently across different domains and tasks, contains 540 billion parameters, and some of the largest models now run with more than 1 trillion parameters. There is a huge gap between the needs of ML applications and the capabilities of a single processor or server.

Stoica also pointed out that when Apache Spark reached its 1.0 release in 2014, machines in a cluster were assumed to be homogeneous. That assumption no longer holds, as today’s landscape includes a variety of hardware accelerators.

There are several stages in building an ML application, such as labeling, training on domain-specific data, tuning, and reinforcement learning. “Each phase requires scaling, and each typically requires a different distributed system,” said the Anyscale executive.

Stoica described artificial intelligence as being essentially where big data was ten years ago. In his view, it will take time for the field to mature, because it is not only about the development of the tools but also about the training of experts. It has been roughly five or six years since colleges and universities began offering degrees in data science; more artificial intelligence courses are now being offered, and more applied AI courses will appear, he predicted. We are, he said, only in the first round of this match.
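To make the “minimal code modification” point concrete, here is a minimal sketch of how a plain Python function can be turned into a Ray task. The `square` workload and its values are made up for illustration; `ray.init`, the `@ray.remote` decorator, `.remote()`, and `ray.get` are Ray’s standard task API.

```python
import ray

ray.init()  # start a local Ray runtime; on a cluster this connects to the head node

# An ordinary, sequentially executed Python function.
def square(x):
    return x * x

# The same logic becomes a distributed task by adding the @ray.remote decorator.
@ray.remote
def square_task(x):
    return x * x

# Sequential execution in a single process.
sequential = [square(i) for i in range(8)]

# Distributed execution: .remote() schedules each call as a task and returns a
# future immediately; ray.get() blocks until the results are available.
futures = [square_task.remote(i) for i in range(8)]
distributed = ray.get(futures)

assert sequential == distributed
```

The distributed version differs from the sequential one only by the decorator and the `.remote()`/`ray.get()` calls, which is what lets largely unchanged code move from a laptop to a cluster.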
