Epstein Files: X Users Are Asking Grok to ‘Unblur’ Photos of Children

10 February 2026, 11:57

In the days after the US Department of Justice (DOJ) published 3.5 million pages of documents related to the late sex offender Jeffrey Epstein, multiple users on X have asked Grok to “unblur” or remove the black boxes covering the faces of children and women in images that were meant to protect their privacy. 

While some survivors of Epstein’s abuse have chosen to identify themselves, many more have never come forward. In a joint statement, 18 of the survivors condemned the release of the files, which they said exposed the names and identifying information of survivors “while the men who abused us remain hidden and protected”. 

After the latest release of documents on Jan. 30 under the Epstein Files Transparency Act, thousands of documents had to be taken down because of flawed redactions that lawyers for the victims said compromised the names and faces of nearly 100 survivors. 

But X users are trying to undo the redactions on even the images of people whose faces were correctly redacted. By searching for terms such as “unblur” and “epstein” with the “@grok” handle, Bellingcat found more than 20 different photos and one video that multiple users were trying to unredact using Grok. These included photos showing the visible bodies of children or young women, with their faces covered by black boxes. There may be other such requests on the platform that were not picked up in our searches.

Requests by X users for Grok to unblur and identify the images of children from the Epstein files, overlaid on an image of Epstein next to a young child in a pool. Source: X; collage by Bellingcat

The images appeared to show several children and women with Jeffrey Epstein as well as other high-profile figures implicated in the files, including the UK’s Prince Andrew, former US President Bill Clinton, Microsoft co-founder Bill Gates and director Brett Ratner, in various locations such as inside a plane and at a swimming pool.

From Jan. 30 to Feb. 5, we reviewed 31 separate requests from users for Grok to “unblur” or identify the women and children from these images. Grok noted in responses to questions or requests by some users that the faces of minors in the files were blurred to protect their privacy “as per standard practices in sensitive images from the Epstein files”, and said it could not unblur or identify them. However, it still generated images in response to 27 of the requests that we reviewed. 

We are not linking to these posts to prevent amplification.

The images Grok generated ranged in quality from believable to comically bad, such as a baby’s face on a young girl’s body. Some of these posts have garnered millions of views on X, where users are monetarily incentivised to create high-engagement content.

Examples of posts by X users asking Grok to unredact images from the latest Epstein release, some with millions of views. Source: X

Of the four requests we found during this period for which Grok did not generate images, it ignored one entirely. In response to another, Grok said deblurring or editing images was outside its abilities, and noted that photos from recent Epstein file releases were redacted for privacy.

The other two requests appeared to have been made by non-premium users, with the chatbot responding: “Image generation and editing are currently limited to verified Premium subscribers”. X has limited some of Grok’s image generation capabilities to paid subscribers since January amid an ongoing controversy over users using the AI chatbot to digitally “undress” women and children. 

X did not respond to multiple requests for comment. 

However, shortly after we first reached out to X on Feb. 6, we noticed that more guardrails appeared to have been put in place. Out of 16 requests from users between Feb. 7 and Feb. 9, which we found using similar search terms as before, Grok did not attempt to unredact any of the images.

In most cases, Grok did not respond at all (14), while in two cases, Grok generated AI images that were completely different from the images uploaded in the user’s original request. 

When a user commented on one of these requests that Grok was no longer working, Grok responded: “I’m still operational! Regarding the request to unblur the face in that Epstein photo: It’s from recently released DOJ files where identities of minors are redacted for privacy. I can’t unblur or identify them, as it’s ethically and legally protected. For more, check official sources like the DOJ releases.”

As of publication, X had not responded to Bellingcat’s subsequent query about whether new guardrails had been put in place over the weekend.

Fabricated Images

This is not the first time AI has been used to fabricate images related to Epstein file releases. Some images shared on X were reportedly AI-generated, including ones appearing to show Epstein alongside famous figures such as US President Donald Trump, and others depicting New York City mayor Zohran Mamdani as a child with his mother. Some of the individuals shown in the false images, such as Trump, do appear in authentic photos, which can be viewed on the DOJ website.

Far left: AI-generated photo of Trump and Epstein with several children. Middle and far right: AI-generated photos of a young Mamdani and his mother, alongside Epstein, former US president Bill Clinton, Amazon CEO Jeff Bezos, Microsoft co-founder Bill Gates and Epstein associate Ghislaine Maxwell. Source: X. Annotations by Bellingcat

X users also previously used Grok to generate images in relation to recent killings in Minnesota by federal agents. 

For example, some users asked Grok to try to “unmask” the federal agent who killed Renee Good, resulting in a completely fabricated face of a man that did not look like the actual agent, Jonathan Ross, and a false accusation of a man who had nothing to do with the shooting.

Bellingcat’s Director of Research and Training @giancarlofiorella.bsky.social appeared on CTV yesterday to discuss the misleading AI-generated images that were used to falsely identify ICE agents and weapons at the centre of the two fatal shootings in Minneapolis youtu.be/mL7Fbp3UrSo?…

[image or embed]

— Bellingcat (@bellingcat.com) 5 February 2026 at 09:36

After Alex Pretti was shot and killed by federal agents in Minneapolis, people used AI to edit video stills, resulting in AI images that showed a completely different gun than the one actually owned by Pretti. In another instance, an AI-edited image of Pretti’s shooting falsely depicted the intensive care unit nurse holding a gun instead of his sunglasses. 

Grok has also been at the centre of a controversy for generating sexually explicit content.

On Twitter/X, users have figured out prompts to get Grok (their built in AI) to generate images of women in bikinis, lingerie, and the like. What an absolute oversight, yet totally expected from a platform like Twitter/X. I’ve tried to blur a few examples of it below.

[image or embed]

— Kolina Koltai (@koltai.bsky.social) 6 May 2025 at 03:20

Multiple countries including the UK and France have launched investigations into Elon Musk’s chatbot over reports of people using it to generate deepfake non-consensual sexual images, including child sexual abuse imagery. Malaysia and Indonesia have also blocked Grok over concerns about deepfake pornographic content. 

One analysis by the Center for Countering Digital Hate found that Grok had publicly generated around three million sexualised images, including 23,000 of children, in 11 days from Dec. 29, 2025 to Jan. 8 this year. X’s initial response, in January, was to limit some image generation and editing features to only paid subscribers. However, this has been widely criticised as inadequate, including by UK Prime Minister Keir Starmer, who said it “simply turns an AI feature that allows the creation of unlawful images into a premium service”. The social media platform has since announced new measures to block all users, including paid subscribers, from using Grok via X to edit images of real people in revealing clothing such as bikinis.


Bellingcat is a non-profit and the ability to carry out our work is dependent on the kind support of individual donors. If you would like to support our work, you can do so here. You can also subscribe to our Patreon channel here. Subscribe to our Newsletter and follow us on Bluesky here and Mastodon here.

The post Epstein Files: X Users Are Asking Grok to ‘Unblur’ Photos of Children appeared first on bellingcat.


Profiting From Exploitation: How We Found the Man Behind Two Deepfake Porn Sites

15 December 2025, 12:00

Content warning: This article contains descriptions of non-consensual sexual imagery.

Depending on which of his social media profiles you were looking at, Mark Resan was either a marketing lead at Google or working for a dental implant company, a human resources company and a business software firm – all at the same time.           

Facebook photos showed Resan vacationing in Bali (left) and relaxing at luxury hotels in Dubai (right). Blurring by Bellingcat

But a Bellingcat investigation has found that the Hungarian national is the key figure behind, and the likely owner of, at least two deepfake porn websites – RefacePorn and DeepfakePorn – that until recently were selling paid subscriptions. 

There is no question about the nature of these websites. RefacePorn’s landing page shows an explicit video of a woman performing a sexual act. As the video plays, her face is replaced with a variety of other women’s faces. The text above declares: “Face swap deepfake porn. Upload your face!” 

Deepfake porn sites such as these, which use artificial intelligence to create sexually explicit images and videos – usually without the consent of those whose faces or bodies are featured – have proliferated at an alarming rate in recent years. The impact on victims has been described as “life-shattering”, with the mental health effects similar to those reported by victims of sexual assault.

While the technology to make these synthetic images is not new, the rise of mainstream AI image generator tools and “Nudify” apps has made it more widely available to people without deep technical expertise. Earlier this year, New Zealand MP Laura McClure held up an AI-generated nude of herself in parliament, describing how it took her less than five minutes to create after a quick Google search. 

A 2024 study by the My Image My Choice campaign found that there was a 1,780 percent increase in sexually explicit deepfakes last year compared to 2019. Almost all (99 percent) of victims were women, according to a 2023 study by Security Hero. 

Illustration for Bellingcat by Ann Kiernan

The creation of such images and videos is now illegal in a few countries, including the US and the UK, but legislation has not caught up in many others, and the owners of platforms that enable this content often face no repercussions. In May 2024, the EU passed a directive which mandates that member states – including Hungary, where Resan resides – criminalise the creation and distribution of non-consensual sexual deepfakes by June 2027. 

Alexios Mantzarlis, co-founder of Indicator, a news site that focuses on digital deception, said his publication estimates that deepfake porn sites likely make millions of dollars a year. 

“The incentive system will continue to exist until the tools become too toxic to handle for domain hosts and content delivery networks,” added Mantzarlis, who is also the director of the Security, Trust and Safety Initiative at Cornell Tech.

All Roads Lead to Resan

Bellingcat’s investigation into RefacePorn and DeepfakePorn – which spanned corporate registries, domain name registrations, payment redirect sites, website code and leaked data – led us back to Resan. 

By simulating the purchase of subscriptions on these websites, Bellingcat was led through a series of redirects to a payments dashboard by Peerwallet, a payment processor that recorded more than US$331,000 in sales from July 2024 to August 2025 by Dorocron LLP. Dorocron is a Canadian-registered company whose main – if not sole – source of income appeared to be from paid subscriptions to these sites. The real amount is likely higher, as this was just one of several payment processors the websites have used.


Dorocron LLP did not respond to multiple requests for comment via email, and calls to the number listed on sites that had the company’s details in their legal information sections went unanswered.

Resan is the only person who appears to have been publicly associated with Dorocron LLP, and he is also the sole director of a UK-registered company, Facitic Ltd, that registered the domain of RefacePorn. Resan did not respond to multiple requests for comment sent via email over the past two weeks. Multiple emails and phone calls to Facitic Ltd also went unanswered.

However, days after we first reached out to Resan, his LinkedIn and X profiles were deleted, and his previously public Facebook profile was either deleted or made private. Both RefacePorn and DeepfakePorn also became inaccessible, displaying an error message that said “this site can’t be reached”. 

Archives of RefacePorn and DeepfakePorn, which were previously available on the Internet Archive’s Wayback Machine, have also now been excluded from the archive. The Internet Archive told Bellingcat it processed exclusion requests submitted by someone with rights to both sites on Dec. 5. 

Following the Money

Like other websites Bellingcat has investigated, RefacePorn’s ownership was hidden behind a network of website domains, fake websites used to redirect payments, and international business registries. 

Using the tool DNSlytics, we examined the Google tag history on RefacePorn and found a tag that was also used on DeepfakePorn, as well as a website called facitic.com. 

Google Analytics tags are small snippets of code that developers place in the backend of a website to track its traffic. Each tag carries an identifier unique to a specific account, and the same tag can be reused across multiple websites – making a shared tag a strong signal that the sites are run by the same person or organisation.
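The core comparison – whether two pages embed the same tag identifier – is straightforward to sketch in Python. Services such as DNSlytics run this check against historical snapshots of websites; the version below works on raw HTML, and the tag IDs and domain names are invented for illustration, not the actual identifiers involved:

```python
import re

# Matches common Google tag identifiers: Universal Analytics (UA-XXXXXXX-X),
# GA4 (G-XXXXXXXXXX) and Google Tag Manager (GTM-XXXXXX) IDs.
TAG_RE = re.compile(r"\b(UA-\d{4,10}-\d{1,4}|G-[A-Z0-9]{6,12}|GTM-[A-Z0-9]{4,8})\b")

def extract_tags(html: str) -> set[str]:
    """Return the set of Google tag IDs embedded in a page's HTML."""
    return set(TAG_RE.findall(html))

def shared_tags(pages: dict[str, str]) -> set[str]:
    """Return tag IDs that appear on more than one of the given pages."""
    counts: dict[str, int] = {}
    for html in pages.values():
        for tag in extract_tags(html):
            counts[tag] = counts.get(tag, 0) + 1
    return {tag for tag, count in counts.items() if count > 1}

# Hypothetical page snippets standing in for fetched HTML:
pages = {
    "site-a.example": "<script>gtag('config', 'UA-1234567-1');</script>",
    "site-b.example": "<script>gtag('config', 'UA-1234567-1');</script>",
    "site-c.example": "<script>gtag('config', 'G-ABC123XYZ9');</script>",
}
print(shared_tags(pages))  # {'UA-1234567-1'}
```

A shared tag alone is not proof of common ownership, which is why the investigation corroborated it with registry, payment and leaked-data evidence.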

Both RefacePorn and DeepfakePorn offer tiered subscription packages with similar names and prices based on the number of deepfakes that could be generated and the level of support. 

When simulating a purchase of one of these packages – without actually completing payment – on DeepfakePorn, we received a link to make a payment hosted through the domain “remakerai.me”. Similarly, a mock purchase on RefacePorn pointed us to a payment link on “airemaker.me”. Bellingcat has observed the use of redirects, which can be used to obscure payments, by other deepfake porn sites. Many payment processors, including PayPal and Stripe, have restrictions on buying or selling sexually oriented online content.

[Diagram: a user’s payment from the deepfake site is routed through one or more redirect sites before reaching the payment processor, concealing the originating site from the processor.]

Payment processors often block payments that come from websites making deepfake pornography.

Using a redirect site hides the original site from the payment processor, making it harder to block.

Despite this, payment processors sometimes manage to block the redirect site.

But if one redirect site is blocked, the site owner can quickly switch to another redirect site that isn’t blocked.

Graphic: Galen Reich
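The chain-following step in the diagram can be sketched as a simple loop that records each hop. In a real investigation the hops come from HTTP responses (the `requests` library, for instance, exposes them via `response.history`); the mapping below is a stand-in with invented domain names:

```python
def follow_redirects(start: str, redirects: dict[str, str], max_hops: int = 10) -> list[str]:
    """Walk a chain of redirects, returning every URL visited in order.

    `redirects` stands in for the Location headers a client would receive;
    in practice each hop would be a real HTTP request.
    """
    chain = [start]
    seen = {start}
    while chain[-1] in redirects and len(chain) <= max_hops:
        nxt = redirects[chain[-1]]
        if nxt in seen:  # guard against redirect loops
            break
        chain.append(nxt)
        seen.add(nxt)
    return chain

# Hypothetical chain mirroring the pattern described above:
hops = {
    "deepfake-site.example/checkout": "redirect-one.example/pay",
    "redirect-one.example/pay": "redirect-two.example/pay",
    "redirect-two.example/pay": "processor.example/invoice",
}
print(follow_redirects("deepfake-site.example/checkout", hops))
```

Recording the full chain, rather than just the final destination, is what lets an investigator show which intermediate domains are doing the obscuring.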

The redirected payment links hosted on airemaker.me and remakerai.me offered several payment options including PayPal, credit cards and cryptocurrencies. Bellingcat selected the credit card option, and in both cases was emailed a link to complete the purchase on a payment platform called Peerwallet. This email included a link to the seller’s profile, Dorocron LLP.

This profile showed the funds received by the seller, which totalled more than $331,000 as of August 2025. This income was related to 16,264 sales. According to this dashboard, Dorocron LLP had been a member of Peerwallet since July 22, 2024, meaning these sales all occurred over the past year.

Screengrab of Peerwallet profile for Dorocron LLP, showing about US$331,000 in funds received for sales 

RefacePorn has been active since at least May 2022, according to promotional posts by an Instagram account with the username “Dorocron2323” and the account name “Hassler Mark”. Social media accounts for RefacePorn were also created on X and Facebook in May 2022.

Screengrab of an Instagram post from May 2022 promoting RefacePorn’s website, which is now down. Blurring by Bellingcat

While the transactions on Peerwallet were not broken down by domain, two of the 21 “approved domains” listed on this profile were the payment redirect sites for the deepfake porn sites we investigated. Bellingcat’s review of the remaining domains found no evidence that payments were ever accepted through them.

Short-lived, “disposable” domains are known to be used by bad actors to evade detection, presenting a moving target for payment processors and authorities. As of publication, both airemaker.me and remakerai.me are no longer accessible. But in the course of the investigation, we observed RefacePorn and DeepfakePorn’s payment links redirecting to other third-party sites, before the sites went offline.

The Peerwallet profile showed transactions by users, as well as 21 approved domains including those redirecting payments for RefacePorn (refaceporn.com) and DeepfakePorn (deepfakeporn.app)

Of the 21 domains on Dorocron LLP’s Peerwallet profile, only two were still accessible as of the end of November, with the rest either down due to expired domains or server issues, displaying generic domain parking pages, or requiring a login to view. Though almost all of the sites had their registration information redacted, Resan was listed as the most recent registrant for one of the expired domains.

The two sites still accessible listed a variety of products, including eBooks and digital products. Both had almost identical products and templates, and listed Dorocron LLP under their company information in their footers. 

Bellingcat tried to check out items on each of the sites, and in both cases was prompted to log in. It was, however, impossible to register an account, and when we tried with an active email address we were redirected to a login page saying that the email address was “unknown”. 

Archived screengrabs of some of the sites that now have expired domains or require a login to view showed that many of them followed the same format, selling eBooks and video courses with “resell rights”.

Peerwallet told Bellingcat in September that Dorocron LLP was “not approved” to sell deepfake porn, and that it was looking into the issue. However, when Bellingcat asked for an update in November, Peerwallet appeared to have closed down. Emails to the payment processor’s founder have also gone unanswered. 

The Man Behind the Screen

Dorocron LLP was registered in British Columbia, Canada in March 2022. We were unable to verify whether Resan’s name was on the corporate records, as information on company owners or directors in British Columbia is restricted to law enforcement and other officials.

However, Resan’s name has been used to register at least 13 sites alongside an email bearing Dorocron’s name from as far back as 2013, nine years before Dorocron was registered in Canada. The earliest domain registration, from 2013, included the name of a now-dissolved UK-registered company called “Webnaser LTD”, whose registration documents also cite Resan as the sole director.

WHOIS history information for a site that Resan first registered in 2013. Source: Whoxy

A leak found on data breach site Intelx.io shows that an almost identical password (with different capitalisation of some letters) was used to log into this “dorocron” Gmail account and a Netflix account associated with Resan’s personal email address. This password was also used to log into web domain registry GoDaddy using RefacePorn’s support email address. 

Leaked passwords on Intelx.io revealed another link between Resan and DeepfakePorn: an email with the username “resanmark” was used to log into DeepfakePorn’s website, with a password containing his birth year. In all, we found four unique passwords that were reused between Resan’s personal emails, the Dorocron emails, and a support email for RefacePorn. These four passwords include either Resan’s name or the date or year of his birth. 
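The linkage logic described above – treating passwords as matching once capitalisation is ignored, and grouping the accounts that share them – can be sketched in a few lines. The credentials below are entirely fictional and stand in for breach-index data:

```python
from collections import defaultdict

def link_accounts(credentials: list[tuple[str, str]]) -> dict[str, list[str]]:
    """Group account emails by case-insensitive password, surfacing reuse.

    `credentials` is a list of (email, password) pairs. Passwords that
    differ only in capitalisation collapse into the same group.
    """
    groups: dict[str, list[str]] = defaultdict(list)
    for email, password in credentials:
        groups[password.casefold()].append(email)
    # Keep only passwords shared by more than one account.
    return {pw: emails for pw, emails in groups.items() if len(emails) > 1}

# Entirely fictional credentials illustrating the linkage:
creds = [
    ("personal@example.com", "Sample1984"),
    ("business@example.net", "sample1984"),
    ("support@example.org", "unrelatedpw"),
]
print(link_accounts(creds))
# {'sample1984': ['personal@example.com', 'business@example.net']}
```

As with the analytics tags, a reused password is one corroborating signal among several, not conclusive on its own.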

Resan also posted two job listings from his now-deleted LinkedIn account about a year ago, for a full-stack web developer and a WordPress developer at Dorocron LLP. In the web developer listing, he described the company as “developing and applying revolutionary AI technologies” and said the job would have “high wages”. We could not find any other individual with a public association to Dorocron LLP on LinkedIn or elsewhere.


Aside from his links to Dorocron LLP, Resan is also the sole director and person with significant control of Facitic Ltd, a UK-registered company which was listed as the registrant for RefacePorn. 

Using DomainTools, we were able to see the historical registrant information in a WHOIS lookup of the site’s domain registration. When we checked this in August 2025, we were able to see that, as of June 2025, Facitic Ltd was the registered owner of RefacePorn. This information was later redacted – as it is for other sites linked to Resan such as DeepfakePorn. 

ICANN, which oversees the domain name system, requires domain name providers to verify the accuracy of their customers’ details, including the registrant’s name and contact details. Such details are publicly visible by default, but can be anonymised using paid privacy services.
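WHOIS responses are line-oriented “Key: Value” text, so pulling registrant fields out of a current or archived record is a simple parsing job. This is a minimal sketch; the record shown is invented, and real responses vary by registrar:

```python
import re

# Field names vary slightly between registrars, so match a few common keys.
FIELDS = ("Registrant Name", "Registrant Organization", "Registrant Country")

def parse_registrant(whois_text: str) -> dict[str, str]:
    """Pull registrant fields out of a raw WHOIS response."""
    result = {}
    for field in FIELDS:
        match = re.search(rf"^{field}:\s*(.+)$", whois_text,
                          re.MULTILINE | re.IGNORECASE)
        if match:
            result[field] = match.group(1).strip()
    return result

# A made-up WHOIS excerpt in the usual key: value layout:
sample = """Domain Name: EXAMPLE.COM
Registrant Name: Jane Doe
Registrant Organization: Example Ltd
Registrant Country: HU
"""
print(parse_registrant(sample))
```

When the live record is redacted, as it now is for the sites linked to Resan, historical lookups through services such as DomainTools or Whoxy can still surface the pre-redaction fields.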

The UK registration for Facitic Ltd lists Resan’s country of residence as Dubai, while the registration for another UK company he registered – which was also listed as the owner of some of the now-expired approved domains on Dorocron LLP’s Peerwallet profile – states that he resides in Cyprus. Meanwhile, Resan’s social media accounts stated that he lives in Hungary. On Peerwallet’s dashboard, the primary user of Dorocron is listed as being based in Hungary. 

It is unclear if Resan actually holds positions in any of the six companies he listed himself as working at on his Facebook and LinkedIn profiles. Bellingcat has reached out to these companies to check, but has not received any replies as of publication. 

Some of the connections Bellingcat found between RefacePorn and Mark Resan:

Graphic: Galen Reich

On Nov. 10, 2025, a few weeks before we contacted him, Resan applied for Facitic Ltd to be struck off the UK companies register. Based on Resan’s filings, Facitic Ltd was incorporated with an initial capital of £100 in January 2024, and there has been no recorded change in its accounts since. 

This comes as UK regulator Ofcom cracks down on websites associated with UK businesses offering AI-powered nudify services. On Oct. 23, Ofcom imposed a £50,000 fine on UK-registered company Itai Tech Ltd, which has been linked to some of the biggest deepfake pornography sites in the world, for failing to prevent children from accessing pornographic content. 

It is unclear what triggered Resan to file to dissolve the company, and he did not respond to Bellingcat’s query about this. 

Small Sites, Big Harm

The websites linked to Resan are not among the largest in the deepfake porn industry. A similar but much larger site that Bellingcat has investigated, MrDeepFakes, received millions of visits each month. Bellingcat and its partners Tjekdet, Politiken and CBC exposed the site’s key administrator David Do in May, with MrDeepFakes going offline after we reached out to Do for comment. 

In comparison, RefacePorn and DeepfakePorn received about 91,000 and 154,000 visits in October, according to digital marketing platform SemRush. But their smaller size does not mean they can’t cause significant harm. 

Mantzarlis, of the news site Indicator, said there were “smaller players” taking bigger risks around regulation, such as “Crush AI”, a group of Chinese-owned apps that bypassed Meta’s moderation rules to run 25,000 ads on Facebook and Instagram before the social media giant sued them. 

“These smaller players are often the ones that are more actively trying to stand out on social media to catch up with the bigger ones,” Mantzarlis said.

In the course of our investigation, we ran tests using the free features on RefacePorn to determine if there were any restrictions on images that could be uploaded on the website. 

Without actually generating the content, we uploaded AI-generated images of adult women and underage girls. Unlike on other websites we have tested, which have added the bare minimum of checks to prevent uploading images depicting children, there was no restriction or evidence of age-related safeguards on RefacePorn. 

While there aren’t laws in Hungary explicitly prohibiting deepfake porn, the possession, creation and distribution of sexually explicit images of minors is illegal.

“As the more established websites come under sustained regulatory pressure and others get litigated into oblivion, the minnows are ready to try and capture market share,” Mantzarlis said. 

And while some sites such as RefacePorn and DeepfakePorn may fold in the face of public scrutiny, others continue to operate, unchecked and easily accessible, online. 

“These websites are eminently replaceable and there's no reason to believe that there is any form of ‘brand loyalty’,” Mantzarlis said. “Perpetrators are going to search for ‘nudify’ or click on an ad and go to whatever tool does the job.”


Melissa Zhu contributed to this report.


The post Profiting From Exploitation: How We Found the Man Behind Two Deepfake Porn Sites appeared first on bellingcat.
