Facebook's AI Runs 5x Faster on GPUs and Now Beats Google! What Could Be Next for the Tech Giant?


Recently, a team from Facebook AI Research (FAIR) developed a novel low-dimensional design space, dubbed "RegNet," that reportedly outperforms established models from Google and runs up to five times faster on GPUs.

RegNet yields networks that are simple, fast, and versatile, and in the team's experiments it even outperformed Google's state-of-the-art EfficientNet models, according to a paper titled "Designing Network Design Spaces" published on the pre-print repository arXiv.

Facebook vs. Google

The researchers say they aimed for "interpretability and to discover general design principles" that describe networks which are simple, work well, and generalize across settings.

The Facebook AI team conducted controlled comparisons of its networks against EfficientNet, with no training-time enhancements and under the same training setup. Google's EfficientNet, introduced in 2019, combined neural architecture search (NAS) with model scaling rules and represented the state of the art at the time.


With comparable training settings and FLOPs, the RegNet models outperformed the EfficientNet models while being up to five times faster on GPUs.

Instead of designing individual networks, the team focused on designing network design spaces, which comprise huge and possibly infinite populations of model architectures. Analyzing the RegNet design space also gave the researchers unexpected insights into network design. A rough sketch of what sampling such a population might look like is shown below.
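To make the idea of a "design space" more concrete, here is a minimal sketch of how a population of candidate networks could be sampled from a parameterization like the one described in the paper, where each block's width follows a simple linear rule (an initial width w0, a slope wa, and a multiplier wm) before being quantized. The parameter ranges below are illustrative assumptions, not the exact ones used by FAIR.

```python
import numpy as np

def regnet_widths(depth, w0, wa, wm, quantum=8):
    """Generate per-block widths for one candidate network.

    Sketch of the linear width parameterization: a continuous width
    u_j = w0 + wa * j for each block j, quantized to a power of the
    multiplier wm and rounded to a multiple of 8.
    """
    j = np.arange(depth)
    u = w0 + wa * j                                    # continuous widths
    s = np.round(np.log(u / w0) / np.log(wm))          # quantization exponents
    w = w0 * np.power(wm, s)                           # quantized widths
    return (np.round(w / quantum) * quantum).astype(int).tolist()

# Sample a small population of candidate architectures from the design space.
rng = np.random.default_rng(0)
population = []
for _ in range(5):
    cfg = dict(
        depth=int(rng.integers(12, 28)),   # number of blocks (illustrative range)
        w0=int(rng.integers(16, 64)),      # initial width
        wa=float(rng.uniform(8.0, 48.0)),  # width slope
        wm=float(rng.uniform(2.0, 3.0)),   # width multiplier
    )
    cfg["widths"] = regnet_widths(**cfg)
    population.append(cfg)

for cfg in population:
    print(cfg["depth"], cfg["widths"])
```

Analyzing statistics over thousands of such sampled models, rather than tuning one network at a time, is what lets the researchers draw conclusions about the design space as a whole.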

What the team found

The team noticed, for example, that the depth of the best models is stable across compute regimes, with an optimal depth of about 20 blocks, or roughly 60 layers. According to the paper, while it is common for modern mobile networks to employ inverted bottlenecks, the researchers found that inverted bottlenecks degrade performance, and that the best models use neither bottlenecks nor inverted bottlenecks. The sketch below illustrates what that bottleneck setting means in practice.
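For readers unfamiliar with the term, a "bottleneck ratio" controls how much a residual block shrinks (ratio above 1) or expands (ratio below 1, the inverted case) its internal width. The following simplified PyTorch block, which is an illustrative assumption and not FAIR's actual implementation, shows that a ratio of 1.0 simply keeps the full width throughout, which is the setting the findings above favor.

```python
import torch
from torch import nn

class ResBottleneckBlock(nn.Module):
    """Residual block with a configurable bottleneck ratio b.

    b > 1 shrinks the inner width (classic bottleneck), b < 1 expands it
    (inverted bottleneck), and b = 1 keeps the full width throughout.
    Simplified sketch: no strides, group convolutions, or other details.
    """

    def __init__(self, width, bottleneck_ratio=1.0):
        super().__init__()
        inner = int(round(width / bottleneck_ratio))
        self.body = nn.Sequential(
            nn.Conv2d(width, inner, kernel_size=1, bias=False),
            nn.BatchNorm2d(inner),
            nn.ReLU(inplace=True),
            nn.Conv2d(inner, inner, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(inner),
            nn.ReLU(inplace=True),
            nn.Conv2d(inner, width, kernel_size=1, bias=False),
            nn.BatchNorm2d(width),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # Standard residual connection around the block body.
        return self.relu(x + self.body(x))

# With bottleneck_ratio = 1.0 the inner width equals the block width: no bottleneck.
block = ResBottleneckBlock(width=64, bottleneck_ratio=1.0)
out = block(torch.randn(1, 64, 32, 32))
print(out.shape)  # torch.Size([1, 64, 32, 32])
```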

Separately, the Facebook AI Research team recently developed a tool that tricks facial recognition systems into wrongly identifying a person in a video. The de-identification system, which works on live video, uses machine learning to change key facial features of the subject.


FAIR advances the state of the art in artificial intelligence through fundamental and applied research conducted in open collaboration with the community. The social networking giant created the Facebook AI Research group back in 2014 to advance AI technology through open research for the benefit of all.

Since then, FAIR has grown into an international research organization with labs in Menlo Park, Montreal, Paris, Seattle, Tel Aviv, New York, London, and Pittsburgh. Facebook keeps getting better at AI, and with FAIR's help, it has now beaten Google.
