X ignores revenge porn takedown requests unless DMCA is used, study says

X (formerly Twitter) claims that non-consensual nudity is not tolerated on its platform. But a recent study shows that X is more likely to quickly remove this harmful content—sometimes known as revenge porn or non-consensual intimate imagery (NCII)—if victims flag content through a Digital Millennium Copyright Act (DMCA) takedown rather than using X's mechanism for reporting NCII.

In the pre-print study, which 404 Media noted has not been peer-reviewed, University of Michigan researchers explained that they put X's non-consensual nudity policy to the test to show how challenging it is for victims to remove NCII online.

To conduct the experiment, the researchers created two sets of X accounts to post and report AI-generated NCII "depicting white women appearing to be in their mid-20s to mid-30s" as "nude from the waist up, including her face." (White women were selected to "minimize potential confounds from biased treatment," and the researchers recommended future work covering other genders and ethnicities.) Of the 50 fake AI nude images posted on X, half were reported as violating X's non-consensual nudity policy, and the other half were reported through X's DMCA takedown mechanism.

The researchers gave X up to three weeks to remove the content through each reporting mechanism, and the difference, they found, was "stark": the DMCA mechanism triggered removal of all 25 images within 25 hours, along with temporary suspensions of the accounts sharing them, while reports filed under X's non-consensual nudity policy drew no response and no removals.
