In what sounds like a harrowing digital nightmare, unknown individuals stole nude pictures and videos from a young woman's private cloud storage and subsequently shared them across various internet porn sites. The perpetrators also used her identity, making the content easily discoverable via a Google search of her name.
The victim sought assistance from HateAid, a German nonprofit that addresses online hate and digital violence. With its help, she identified and reported over 2,000 URLs found through Google's image search, which included both the stolen material and deepfake content generated by artificial intelligence (AI).
This disturbing incident raises critical questions regarding online data protection, particularly for women and those perceived as female, who may become the targets of such cyberattacks. It also highlights the challenges faced by victims in having such content removed from the internet, as it often persists despite efforts to have it erased.
The affected woman, who remains anonymous and is referred to as Laura, discovered the breach by chance when searching for herself online. She felt violated, describing the experience as a digital equivalent of rape. Her life was drastically altered: she relocated, changed her job, and now struggles with post-traumatic stress disorder.
After unsuccessful out-of-court attempts to remove the content from Google search results, Laura has taken legal action against Google in Ireland, where the company's European headquarters are located. HateAid supports her legal pursuit, covering all costs and hoping for a landmark ruling that clarifies search engines' legal obligation to permanently remove such content from search results, even when it is re-uploaded.
Laura’s case also shines a light on the broader societal issue of image-based sexual violence and the profit motives of companies like Google, which can inadvertently facilitate such harm by making explicit content widely accessible.
Furthermore, the case revisits the precedent set by the European Court of Justice's "right to be forgotten" ruling, which could influence its outcome. Data protection experts suggest that while there is a solid case for requiring global search engine providers to ensure such content is removed, further technical and legal challenges remain to be addressed.
HateAid, alongside Laura’s legal action, runs a campaign titled “Our nudes are #NotYourBusiness” to raise awareness of image-based sexual violence and the need for better protections against deepfakes and similar abuses.
Laura's case underscores the risk that such cyberattacks extend beyond celebrities to ordinary individuals. The case aims not only to improve data protection and privacy laws but also to establish criminal liability for creating deepfakes without the depicted person's consent.
The incident serves as a stark reminder of the vulnerabilities in digital privacy and the psychosocial impact of online harassment, and it underscores the need for a comprehensive response to internet-related harms.
Source: https://www.dw.com/en/german-woman-sues-google-over-nude-pictures-and-sex-videos/a-73869490?maca=en-rss-en-all-1573-rdf