TensorFlow in danger! And the one abandoning it is Google itself


Text / Xiao Xiao

Source: Qubit (ID: QbitAI)

TensorFlow, which has garnered nearly 166,000 GitHub stars and witnessed the rise of deep learning, is in jeopardy.

And this time, the blow comes not from its old rival PyTorch, but from Google's own rookie, JAX.

In the latest wave of heated discussion in the AI circle, even Jeremy Howard, founder of fast.ai, said:

It has long been known that JAX is gradually replacing TensorFlow. Now it's happening (at least inside Google).

LeCun even believes that the fierce competition between deep learning frameworks has entered a new stage.

LeCun said that Google's TensorFlow was indeed once more popular than Torch. Since the emergence of Meta's PyTorch, however, PyTorch's popularity has surpassed TensorFlow's.

Now Google Brain, DeepMind, and many external projects have all begun to use JAX.

A typical example is the recently popular DALL·E Mini. To make full use of TPUs, its author programmed it in JAX. One user exclaimed after trying it:

This is much faster than PyTorch.

According to Business Insider, JAX is expected to power all of Google's machine-learning-based products within the next few years.

Seen that way, vigorously promoting JAX internally looks more like a "self-rescue" that Google has launched on the framework front.

Where does JAX come from?

When it comes to JAX, Google actually came prepared.

As early as 2018, it was built by a small three-person team at Google Brain.

The work was published in a paper titled Compiling machine learning programs via high-level tracing:

JAX is a Python library for high-performance numerical computing; deep learning is just one of its capabilities.

Its popularity has been on the rise since its inception.

Its biggest selling point is that it is fast.

Get a feel for it with an example.

Take computing the sum of the first three powers of a matrix: implemented in NumPy, the calculation takes about 478 milliseconds.

With JAX it only takes 5.54 milliseconds, which is 86 times faster than NumPy.
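The article does not reproduce the benchmark code, so the following is only a hedged sketch of what such a comparison might look like; the matrix size and dtype are illustrative assumptions, and actual timings depend on hardware.

```python
import numpy as np
import jax.numpy as jnp
from jax import jit

def sum_powers_np(x):
    # x + x^2 + x^3 with plain NumPy matrix products on the CPU
    return x + x @ x + x @ x @ x

@jit  # compile the whole computation with XLA into one fused kernel
def sum_powers_jax(x):
    return x + x @ x + x @ x @ x

x_np = np.random.randn(2000, 2000).astype(np.float32)
x_jx = jnp.asarray(x_np)  # copies the data to the default device (GPU/TPU if present)

r_np = sum_powers_np(x_np)

# The first call triggers compilation; subsequent calls reuse the compiled kernel.
# block_until_ready() forces JAX's asynchronous dispatch to finish before timing.
r_jx = sum_powers_jax(x_jx).block_until_ready()
```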

Why so fast? There are many reasons, including:

1. It accelerates NumPy. NumPy's importance goes without saying: nobody doing scientific computing or machine learning in Python can do without it, yet it has never natively supported hardware acceleration such as GPUs.

JAX's numerical API mirrors NumPy's, which makes it easy to run models on GPUs and TPUs. This has won over a lot of people.

2. XLA. XLA stands for Accelerated Linear Algebra, an optimizing compiler for linear algebra. JAX is built on top of XLA, which greatly raises the ceiling on its computation speed.

3. JIT. Researchers can use XLA to turn their functions into just-in-time (JIT) compiled versions: adding a simple decorator to a computation function is enough to speed it up by orders of magnitude. See the sketch after this list.
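As a minimal sketch of points 1 and 3 together, assuming an illustrative activation function that is not from the article: jnp mirrors NumPy's API, and a single jax.jit call (or an @jax.jit decorator) compiles the whole function with XLA.

```python
import jax
import jax.numpy as jnp

def selu(x, alpha=1.67, lmbda=1.05):
    # Element-wise SELU activation, written with NumPy-style jnp ops
    return lmbda * jnp.where(x > 0, x, alpha * jnp.exp(x) - alpha)

# One call (or an @jax.jit decorator on the def) turns the op-by-op version
# into a single XLA-compiled, fused kernel that runs on CPU, GPU, or TPU.
selu_jit = jax.jit(selu)

x = jnp.arange(1_000_000, dtype=jnp.float32)
y = selu_jit(x).block_until_ready()  # first call compiles; later calls reuse it
```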

In addition, JAX is fully compatible with Autograd and supports automatic differentiation: through function transformations such as grad, hessian, jacfwd, and jacrev, it supports both reverse-mode and forward-mode differentiation, and the two can be composed in any order.
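For instance, here is a short illustrative sketch of composing these transformations (the function f below is an assumption, not from the article):

```python
import jax
import jax.numpy as jnp

def f(x):
    # A scalar-valued function of a vector
    return jnp.sum(jnp.tanh(x) ** 2)

grad_f = jax.grad(f)              # reverse-mode gradient
hess_f = jax.jacfwd(jax.grad(f))  # forward-over-reverse: the modes compose freely
                                  # (jax.hessian packages this same composition)

x = jnp.array([1.0, 2.0, 3.0])
print(grad_f(x))  # gradient vector, shape (3,)
print(hess_f(x))  # Hessian matrix, shape (3, 3)
```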

Of course, JAX also has some disadvantages.

For example:

1. Although JAX is billed as an accelerator, not every operation is fully optimized when it runs on the CPU.

2. JAX is still too new and has not formed a complete base ecosystem the way TensorFlow has, so Google has not yet rolled it out as a polished product.

3. The time and cost required for debugging are hard to predict, and the behavior of "side effects" inside compiled functions is not entirely transparent.

4. It does not support Windows natively; on Windows it can only run inside a virtual environment (such as WSL).

5. There is no built-in data loader; you have to borrow one from TensorFlow or PyTorch. See the sketch after this list.
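A hedged sketch of the workaround in point 5, following the common pattern of reusing PyTorch's DataLoader and handing JAX plain NumPy arrays; the toy dataset and batch size below are illustrative assumptions.

```python
import numpy as np
import jax.numpy as jnp
from torch.utils.data import DataLoader, Dataset

class ToyDataset(Dataset):
    # Illustrative stand-in for a real dataset
    def __init__(self, n=1024):
        self.x = np.random.randn(n, 32).astype(np.float32)
        self.y = np.random.randint(0, 10, size=(n,))

    def __len__(self):
        return len(self.x)

    def __getitem__(self, i):
        return self.x[i], self.y[i]

def numpy_collate(batch):
    # Stack samples into NumPy arrays instead of torch tensors,
    # so batches can be fed straight into JAX functions.
    xs, ys = zip(*batch)
    return np.stack(xs), np.asarray(ys)

loader = DataLoader(ToyDataset(), batch_size=64, collate_fn=numpy_collate)

for xb, yb in loader:
    xb, yb = jnp.asarray(xb), jnp.asarray(yb)  # device transfer happens here
    # ... run a jitted training step on (xb, yb) ...
    break
```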

Nonetheless, JAX, being simple, flexible, and easy to use, caught on first inside DeepMind. Deep learning libraries born in 2020, such as Haiku and RLax, were developed on top of it.

This year, Adam Paszke, one of the original authors of PyTorch, also joined the JAX team full-time.

At present, the JAX open source project has 18.4k stars on GitHub. That is still far below TensorFlow's count, but it has been climbing rapidly since launch.

Notably, over this period there have been many voices saying it is likely to replace TensorFlow.

That is partly because of JAX's own strength, and partly down to many problems with TensorFlow itself.

Why is Google switching to JAX?

TensorFlow, born in 2015, was once all the rage. After its launch, it quickly overtook a number of veterans such as Torch, Theano, and Caffe to become the most popular machine learning framework.

In 2017, however, the revamped PyTorch “made a comeback”.

PyTorch is a machine learning library built by Meta on top of Torch. Easy to use and easy to understand, it quickly won favor among researchers and even showed signs of overtaking TensorFlow.

In contrast, TensorFlow has become more and more bloated with frequent updates and interface iterations, gradually losing the trust of developers.

(Measured by the share of questions on Stack Overflow, PyTorch has been rising year after year while TensorFlow has stagnated.)

In this competition, TensorFlow's shortcomings were gradually exposed: an unstable API, convoluted implementation, and a steep learning curve were never resolved by updates, while its architecture only grew more complex.

Meanwhile, TensorFlow failed to keep pressing the advantages, such as running efficiency, on which it could still win.

In academia, PyTorch usage is gradually overtaking TensorFlow.

At top conferences such as ACL and ICLR in particular, PyTorch implementations have accounted for more than 80% of papers in recent years, while TensorFlow's share continues to decline.

That is why Google could no longer sit still and is trying to use JAX to regain "dominance" among machine learning frameworks.

While JAX is nominally not a "general-purpose framework built for deep learning," Google's resources have been tilting toward it since its inception.

On the one hand, Google Brain and DeepMind are gradually building more libraries on top of JAX.

These include Google Brain's Trax, Flax, and Jax-md, as well as DeepMind's neural network library Haiku and reinforcement learning library RLax, all built on JAX.

According to Google’s official statement:

As the JAX ecosystem develops, care will also be taken to keep it consistent (as much as possible) with the design of existing TensorFlow libraries such as Sonnet and TRFL.

On the other hand, more projects are being implemented on top of JAX, and the recently viral DALL·E Mini is one of them.

Because it makes better use of Google's TPUs, JAX runs far faster than PyTorch, and more industrial projects built on TensorFlow are switching to JAX as well.

Some netizens even quipped about why JAX is so popular: maybe TensorFlow users just can't stand that framework anymore.

So, can JAX replace TensorFlow and become a new force to rival PyTorch?

Which framework do you prefer?

Overall, many people still stand firm on PyTorch.

They don't seem to love Google's habit of rolling out a new framework every year.

"JAX, while appealing, isn't quite 'revolutionary' enough to pull people away from PyTorch."

But those optimistic about JAX are not a minority.

Some people say that PyTorch is perfect, but JAX is also closing the gap.

Some even cheered wildly for JAX, claiming it is 10 times more powerful than PyTorch, and said: if Meta doesn't keep pushing harder, Google will win. (tongue firmly in cheek)

There are always some, however, who don't care who wins or loses and take the long view:

There is no best, only better. What matters most is that more players and good ideas keep joining in, so that open source becomes synonymous with truly excellent innovation.

The text and pictures in this article are from Sina.com


This article is reprinted from https://www.techug.com/post/tensorflow-danger-google-itself-is-the-one-who-abandoned-it/
This site archives it for reference only; the copyright belongs to the original author.
