Over the past two weeks, Google has quietly changed the terms of service for its Colab users, adding a stipulation that Colab services may no longer be used to create deepfakes.
The first Internet Archive web capture of the Colab FAQ that features the deepfake ban is from last Tuesday, May 24. The last captured version that does not mention the ban is from May 14.
Of the two popular deepfake-creation packages, DeepFaceLab (DFL) and FaceSwap, both of which are forks of the controversial anonymous code posted to Reddit in 2017, only the more notorious DFL appears to have been directly targeted by the ban. According to deepfake developer “chervonij” of the DFL Discord server, running the software in Google Colab now generates a warning:
“You may be running unauthorized code, which may restrict your ability to use Colab in the future. Please note the prohibited actions specified in our FAQ.”
Notably, however, the user is currently allowed to continue executing the code.
According to a Discord user on the server of the rival distro FaceSwap, that project’s code apparently doesn’t trigger the warning yet, suggesting that the code for DeepFaceLab (which also powers the real-time deepfake streaming implementation DeepFaceLive), by far the most dominant deepfake method, was specifically targeted by Colab.
FaceSwap co-lead developer Matt Tora commented*:
“I find it highly unlikely that Google is doing this for any particularly ethical reasons; rather, Colab’s raison d’être is to let students, data scientists and researchers run computationally expensive GPU code in an easy and accessible way, without overhead. However, I suspect that a sizeable number of users are leveraging this resource to create large-scale deepfake models, which are both computationally expensive and require a sizeable amount of training time to produce results.
“You could say that Colab leans more towards the educational and research side of AI. Running scripts that require little user input or understanding tends to go against that. At Faceswap, we try to focus on educating the user about AI and the mechanics involved, while lowering the barrier to entry. We strongly encourage the ethical use of the software and believe that making these kinds of tools available to a wider audience helps educate people about what is achievable in today’s world, rather than keeping it hidden away for a select few.
“Unfortunately, we cannot control how our tools are ultimately used, or where they are executed. It saddens me that an avenue has been closed for people to experiment with our code; however, in terms of protecting this particular resource to ensure its availability to the actual target audience, I find it understandable.”
Google Colab is a hosted implementation of the Jupyter notebook environment, which enables the remote training of machine learning projects on far more powerful GPUs than many users could otherwise afford.
Since deepfake training is a VRAM-intensive pursuit, and since the onset of the GPU shortage, many deepfakers in recent years have shunned home training in favor of remote training in Colab, where it is possible, depending on luck and subscription tier, to train a deepfake model on powerful cards such as the Tesla T4 (16GB VRAM, currently around $2,000), the V100 (32GB VRAM, around $4,000) and the mighty A100 (80GB VRAM, MSRP $32,097.00), among others.
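Because the allocated card varies from session to session, deepfakers typically check which GPU they received before committing to a long training run. A minimal sketch of such a check is below; the `allocated_gpus` helper and the sample output line are illustrative assumptions, not code from DFL or FaceSwap, though the `nvidia-smi` query flags shown are the tool's real CSV interface.

```python
import csv
import io
import subprocess

def parse_nvidia_smi_csv(output: str) -> list:
    """Parse the CSV emitted by
    `nvidia-smi --query-gpu=name,memory.total --format=csv,noheader`."""
    gpus = []
    for row in csv.reader(io.StringIO(output)):
        if len(row) == 2:
            gpus.append({"name": row[0].strip(), "memory": row[1].strip()})
    return gpus

def allocated_gpus() -> list:
    """Query the GPUs visible to the current (e.g. Colab) session.
    Returns an empty list when no NVIDIA driver is available."""
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name,memory.total",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout
    except (FileNotFoundError, subprocess.CalledProcessError):
        return []
    return parse_nvidia_smi_csv(out)

# Hypothetical sample of what a free-tier T4 session might report:
sample = "Tesla T4, 15109 MiB\n"
print(parse_nvidia_smi_csv(sample))
```

In a notebook, a deepfaker would call `allocated_gpus()` at the top of the session and simply reconnect if the assigned card is too weak for the planned model size.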
Banning Colab training seems likely to shrink the pool of deepfakers capable of training higher-resolution models, where the input and output images are larger, better suited to high-resolution results, and able to capture and reproduce greater facial detail.
Some of the most committed deepfake enthusiasts, according to Discord and forum posts, have invested heavily in local hardware over the past two years, despite high GPU prices.
However, given the high costs involved, sub-communities have emerged to deal with the challenges of training deepfakes on Colab, with random GPU allocation being the most common complaint since Colab began restricting premium GPUs to paying rather than free users.
* In private messages on Discord
First published May 28, 2022. Revised 7:28 AM EST, corrected typo.