Thank you very much for this project, it's really great. I would like to know whether the improved Torch XLA has been tested for the maximum number of TPUs it can support when scaling. Previously, with the original Torch XLA, we ran into unknown errors once the number of TPUs exceeded a certain threshold. Also, has the training speed of Torch XLA been benchmarked against JAX under identical conditions on TPU?
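
For reference, here is a minimal sketch of the kind of like-for-like comparison I have in mind. Everything here is an illustrative assumption on my part, not this project's actual benchmark setup: the model size, step count, and the helper names `bench_torch_xla` / `bench_jax` are all made up, and both sides do a warm-up step so compile time is excluded from the measurement.

```python
# Hypothetical microbenchmark sketch (my assumption, not this repo's harness):
# time STEPS identical training steps of a small linear model under Torch XLA
# and under JAX. In practice the two halves should run in separate processes,
# since both frameworks want exclusive access to the TPU.
import time

import torch
import torch.nn.functional as F
import torch_xla.core.xla_model as xm

import jax
import jax.numpy as jnp

STEPS, BATCH, DIM = 100, 128, 1024  # illustrative sizes

def bench_torch_xla():
    device = xm.xla_device()
    model = torch.nn.Linear(DIM, DIM, bias=False).to(device)
    opt = torch.optim.SGD(model.parameters(), lr=1e-2)
    x = torch.randn(BATCH, DIM, device=device)
    y = torch.randn(BATCH, DIM, device=device)

    def step():
        opt.zero_grad()
        F.mse_loss(model(x), y).backward()
        opt.step()
        xm.mark_step()          # cut the lazy graph and dispatch it

    step()                      # warm-up step, excludes compile time
    xm.wait_device_ops()
    start = time.perf_counter()
    for _ in range(STEPS):
        step()
    xm.wait_device_ops()        # wait for all queued device work
    return time.perf_counter() - start

def bench_jax():
    key = jax.random.PRNGKey(0)
    w = jax.random.normal(key, (DIM, DIM)) * 0.02
    x = jax.random.normal(key, (BATCH, DIM))
    y = jax.random.normal(key, (BATCH, DIM))

    @jax.jit
    def step(w):
        grad = jax.grad(lambda w: jnp.mean((x @ w - y) ** 2))(w)
        return w - 1e-2 * grad

    w = jax.block_until_ready(step(w))  # warm-up, excludes compile time
    start = time.perf_counter()
    for _ in range(STEPS):
        w = step(w)
    jax.block_until_ready(w)
    return time.perf_counter() - start

if __name__ == "__main__":
    print(f"torch_xla: {bench_torch_xla():.3f}s   jax: {bench_jax():.3f}s")
```

A real comparison would of course need matching models, optimizers, and input pipelines, and would have to be repeated across increasing TPU counts to surface the scaling errors mentioned above.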