GitHub Purges Adult Game Developers, Offers No Explanation

Something strange started rippling through a small, niche corner of the internet not long ago. Developers who build mods and plugins for hentai games and even interactive sex toys began waking up to missing repositories, locked accounts, and dead links. GitHub, the place many of them had treated like home base, had quietly pulled the rug out. No warning. No explanation. Just… gone.

From conversations within the community, the rough headcount quickly took shape: somewhere between 80 and 90 repositories, representing the work of roughly 40 to 50 people, vanished in a short window. Many of the takedowns seemed to cluster around late November and early December. A large number of the affected accounts belonged to modders working on games from Illusion, a now-defunct Japanese studio known for titles that mixed gameplay with varying degrees of erotic content. One banned account alone reportedly hosted contributions from more than 30 people across 40-plus repositories, according to members of the modding scene.

What made the situation feel especially surreal was the silence. Most suspended developers say they were never told which rule they’d broken—if any. Their accounts simply stopped working. Several insisted they’d been careful to stay within GitHub’s acceptable use guidelines, avoiding anything overtly explicit. The code was functional, technical, sometimes cheeky in naming, but never pornographic. At least, not in the way most people would define it.

“Amongst my repositories there were no explicitly sexual names or images anywhere in the code or the readme, the most suggestive naming would be on the level of referencing the dick as ‘the men thing’ or referencing the sex as ‘huffing puffing,’” one developer, Danil Zverev, told me. He makes plugins for an Illusion game called Koikatsu. Zverev said he had been using GitHub for this purpose since 2024, but on November 18 his GitHub page was “completely deleted.” “No notifications anywhere, simply a 404 error when accessing the page and inability to log in on the web or in the mobile app. Also it does not allow me to register a new account with the same name or email.”

The timing raised eyebrows. GitHub had updated its acceptable use policies in October 2025, adding language that forbids “sexually themed or suggestive content that serves little or no purpose other than to solicit an erotic or shocking response, particularly where that content is amplified by its placement in profiles or other social contexts.” The policy explicitly bars pornographic material and “graphic depictions of sexual acts including photographs, video, animation, drawings, computer-generated images, or text-based content.”

At the same time, the policy leaves room for interpretation. “We recognize that not all nudity or content related to sexuality is obscene. We may allow visual and/or textual depictions in artistic, educational, historical or journalistic contexts, or as it relates to victim advocacy,” GitHub’s terms of use state. “In some cases a disclaimer can help communicate the context of the project. However, please understand that we may choose to limit the content by giving users the option to opt in before viewing.”

Zverev didn’t bother appealing. He said writing to support felt pointless and chose instead to move on to another platform. Others tried to fight it—and found themselves stuck in limbo.

A developer who goes by VerDevin, known for Blender modding guides, utility tools, and plugins for the game Custom Order Maid 3D2, said users began reporting trouble accessing his repositories in late October. Oddly, he could still see his account when logged in, but not when browsing while logged out.

“Turned out, as you already know, that my account was ‘signaled’ and I had to purposefully go to the report section of Github to learn about it. I never received any notifications, by mail or otherwise,” VerDevin told me. “At that point I sent a ticket asking politely for clarifications and the proceedings for reinstatement.”

The response from GitHub Trust & Safety was vague and procedural: “If you agree to abide by our Terms of Service going forward, please reply to this email and provide us more information on how you hope to use GitHub in the future. At that time we will continue our review of your request for reinstatement.”

VerDevin replied the next day, agreeing to comply and offering to remove whatever GitHub considered inappropriate—despite still not knowing what that was. “I did not take actual steps toward it as at that point I still didn’t know what was reproach of me,” he said.

A full month passed before GitHub followed up. “Your account was actioned due to violation of the following prohibition found in our Acceptable Use Policies: Specifically, the content or activity that was reported included multiple sexually explicit content in repositories, which we found to be in violation of our Acceptable Use Policies,” GitHub wrote.

“At that point I took down several repositories that might qualify as an attempt to show good faith (like a plugin named COM3D2.Interlewd),” he said. GitHub restored the account on December 17—weeks later, and just one day after additional questions were raised about the ban—but never clarified which content had triggered the action in the first place.

Requests for explanation went unanswered. Even when specific banned accounts were flagged to GitHub’s press team, the response was inconsistent. Some accounts were reinstated. Others weren’t. No clear reasoning was ever shared.

The whole episode highlights a problem that feels painfully familiar to anyone who’s worked on the edges of platform rules: adult content policies that are vague, inconsistently enforced, and devastating when applied without warning. These repositories weren’t fringe curiosities—they were tools used by potentially hundreds of thousands of people. The English-speaking Koikatsu modding Discord alone has more than 350,000 members. Another developer, Sauceke, whose account was suspended without explanation in mid-November, said users of his open-source adult toy mods are now running into broken links or missing files.

“Perhaps most frustratingly, all of the tickets, pull requests, past release builds and changelogs are gone, because those things are not part of Git (the version control system),” Sauceke told me. “So even if someone had the foresight to make mirrors before the ban (as I did), those mirrors would only keep up with the code changes, not these ‘extra’ things that are pretty much vital to our work.”
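Sauceke's distinction is a real one in Git's design: a conventional mirror captures every branch, tag, and commit, but issues, pull requests, and release assets live in GitHub's database rather than in the repository itself, so preserving them means going through GitHub's REST API before an account disappears. Here is a minimal sketch of what that fuller, pre-ban backup might look like; the owner, repository name, and Codeberg destination are placeholders, not the actual projects involved:

```python
import json
import subprocess
import urllib.request

# Hypothetical repository coordinates, for illustration only.
OWNER, REPO = "example-owner", "example-repo"

# 1. Mirror the Git data itself: every branch, tag, and commit.
subprocess.run(
    ["git", "clone", "--mirror", f"https://github.com/{OWNER}/{REPO}.git"],
    check=True,
)

# 2. Releases and their uploaded assets are GitHub metadata, not Git
#    objects, so they have to be fetched separately via the REST API.
url = f"https://api.github.com/repos/{OWNER}/{REPO}/releases"
with urllib.request.urlopen(url) as resp:
    releases = json.load(resp)

for release in releases:
    for asset in release.get("assets", []):
        # Each asset's browser_download_url points at the uploaded file.
        urllib.request.urlretrieve(asset["browser_download_url"], asset["name"])

# Issues and pull requests sit behind the /issues and /pulls endpoints
# and would need the same API treatment (with pagination) to survive.

# 3. Migrating elsewhere is then a single push of the mirror, e.g.:
subprocess.run(
    ["git", "push", "--mirror", f"https://codeberg.org/{OWNER}/{REPO}.git"],
    cwd=f"{REPO}.git",
    check=True,
)
```

Even this sketch only rescues release assets; paging through the issue and pull request endpoints for every repository is exactly the kind of tedious work developers describe below, and none of it helps once an account has already been deleted.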

GitHub eventually reinstated Sauceke’s account on a Tuesday—seven weeks after the original suspension—following renewed questions about the bans. Support sent a brief note: “Thank you for the information you have provided. Sorry for the time taken to get back to you. We really do appreciate your patience. Sometimes our abuse detecting systems highlight accounts that need to be manually reviewed. We’ve cleared the restrictions from your account, so you have full access to GitHub again.”

Even so, the damage lingers. In Sauceke’s account and others, including the IllusionMods repository, release files remain hidden. “This makes the releases both inaccessible to users and impossible to migrate to other sites without some tedious work,” Sauceke said.

Accounts may come back. Repositories might be restored. But for many developers, the trust is already gone—and that’s the kind of thing that doesn’t reinstall quite so easily.

GitHub isn’t just another code host—it’s the town square for open-source developers. For adult creators especially, who are used to being quietly shoved to the margins everywhere else, visibility there actually matters. It’s how people find each other, trade ideas, and build something that feels bigger than a solo side project. “It’s the best place to build a community, to find like-minded people who dig your stuff and want to collaborate,” Sauceke said. But if this wave of bans stretches beyond hentai game and toy modders, he warned, it could trigger a slow exodus. Some developers aren’t waiting around to find out, already packing up their repositories and moving them to GitGoon, a platform built specifically for adult developers, or Codeberg, a nonprofit, Berlin-based alternative that runs on a similar model.

Nevada’s Legal Sex Workers Claim They’re Being Muted on X

It didn’t happen slowly. It wasn’t subtle. Within the past month, legal Nevada sex workers have been hit with a sudden, sweeping wave of account suspensions on X, the platform once known as Twitter. Not for doing anything illegal. Not for soliciting crimes. These are licensed workers, operating in the only state where brothel-based sex work is legal — and yet their voices are vanishing from a platform that once wrapped itself in the language of free speech.

That promise came straight from Elon Musk himself when he set his sights on buying Twitter. At the time, he framed the platform as something almost sacred, saying:

“Free speech is the bedrock of a functioning democracy, and Twitter is the digital town square.”

He followed that with another line meant to reassure skeptics:

“By ‘free speech’ I simply mean that which matches the law.”

By that definition, Nevada sex work clearly qualifies.

Prostitution is legal in Nevada when it takes place inside licensed brothels, as outlined in Nevada Revised Statutes 201.354. Counties are empowered to license and regulate these brothels under Nevada Revised Statutes 244.345, while workers comply with additional health and safety standards set by the state. At present, six counties operate legal brothels. This isn’t a loophole or a gray area — it’s a fully regulated, lawful industry.

At first glance, it might look like X is simply enforcing broad rules around adult content. But the reality cuts deeper. When legal Nevada sex workers lose their accounts, they’re erased from public conversation — conversation that increasingly lives and breathes on platforms like X. What’s left behind is a so-called “digital town square” where only certain voices are allowed to stay standing.

Nevada sex workers understand exactly what’s at stake when they’re shut out. Not long ago, anti-brothel groups attempted to dismantle the legal system through ballot initiatives. When voters heard directly from sex workers, those efforts failed — decisively. In the 2018 Lyon County referendum, for example, nearly 80 percent of voters rejected a proposed brothel ban.

That wasn’t an accident. When sex workers are able to speak publicly, explain how the licensed system actually functions, and share their lived experiences, people listen. Voters learn about the safeguards, the structure, and why legal brothels exist in the first place — not from headlines or fear campaigns, but from the people inside the system.

Silencing those voices on X means the public hears less from those with firsthand knowledge. Anti-sex-work narratives remain visible, amplified, and largely unchallenged. The workers most affected by stigma and policy decisions fade into the background.

This isn’t just about clumsy algorithms sweeping up adult content. It’s about who gets to participate in conversations that can shape laws, livelihoods, and lives. Platforms don’t just host debate — they quietly curate it by deciding who stays and who disappears.

When licensed Nevada sex workers are removed from social media, the public square stops reflecting reality. The debate tilts. The story becomes one-sided. And the people whose livelihoods are on the line — most of them women — lose the chance to speak for themselves.

Maybe that’s the most unsettling part. If this can happen to a group operating legally, transparently, and within the law, it raises an uncomfortable question: who’s next when an algorithm decides a voice is inconvenient?

Age Verification Push Sends Millions of Britons to Unregulated Porn Sites, Charity Says

Something strange has been happening since age verification checks quietly became part of everyday internet life in the UK. You’d think stricter rules would close doors. Instead, a lot of people seem to be wandering down darker hallways.

New research from the Lucy Faithfull Foundation suggests that nearly 45 per cent of UK porn users have visited websites without age verification checks since the rules came into force last summer under the Online Safety Act.

The poll, which surveyed more than 3,700 people across Britain, found that 39 per cent of those who visited these sites ended up watching content that made them feel uncomfortable. Even more telling, 40 per cent said what they saw was enough to put them off returning altogether.

The charity, which focuses on preventing online child sexual abuse, has warned that these unregulated spaces can quietly increase the risk of people stumbling into harmful — and illegal — material.

Vicky Young, who leads the Stop It Now UK and Ireland anonymous helpline, said these sites can become a dangerous stepping stone toward indecent images of children.

“We work with people who have looked at indecent images of children to try and address that behaviour, to help support them to change their behaviour,” she said.

“One of the things that people say to us frequently is that they started looking at legal adult pornography and that their behaviour escalated. In part they were spending maybe longer online, but also the sort of content that they were looking at became more extreme and often started getting younger, and that’s when they then crossed into illegal behaviour, so looking at indecent images of children.”

That pattern — the slow creep from curiosity to something far more serious — is what worries the charity most.

“Because of that pathway and that coming out constantly in conversations we have with people, it concerns us that if people are accessing sites where there is this riskier content, that actually they are putting themselves at a higher chance of accessing indecent images,” she said.

“Sometimes that might not be intentional in the beginning, but what people tell us is that actually, if they come across those images as part of their other pornography, that then sparks curiosity. There’s something that kind of adds to the excitement around the risk, and they don’t necessarily stop at one image. They actually then start looking for more images.”

The survey also revealed a quieter anxiety bubbling under the surface. Nearly 30 per cent of respondents said they were worried about how much pornography they consume. That concern was highest among young men aged 18 to 24, with more than half admitting it’s something that troubles them — a group the Foundation describes as particularly vulnerable.

At the same time, the rules appear to be forcing a moment of self-reflection for many. Almost 47 per cent said they’ve reduced how much pornography they watch since age checks were introduced, while 55 per cent said the changes made them stop and think about their habits.

Recent data backs that up. Enforcement of the Online Safety Act’s “highly effective age assurance” requirement led to an immediate drop in traffic to major porn sites from late July. But as one door closed, another cracked open: VPN use surged as people looked for ways around the new barriers.

The UK’s most visited adult site recorded a drop of around 1.5 million viewers year on year, falling from 11.3 million in August 2024 to 9.8 million this August.

Meanwhile, VPN usage more than doubled after the rules came in, jumping from roughly 650,000 daily users to a peak of over 1.4 million in mid-August 2025. Although that number has since dipped, it still hovered around 900,000 by November.

In other words, fewer people are walking through the front door — but plenty are still trying the side entrance.

Dr Alexandra Bailey, head of psychology at the Foundation and an associate professor at the University of Roehampton, said the intention behind age verification is sound, but the consequences are more complicated.

“Age verification is vital to protect children, and we fully support it,” she said. “But we also need to recognise that some adults are choosing riskier sites to avoid age checks. These sites can expose people to harmful material, including illegal content depicting child sexual abuse. Even if you’re not looking for it, you could encounter it — and that can have serious life-changing consequences.”

She added that the rules have created a pause many people probably needed, but not everyone is responding in a safe way.

“Age verification is also prompting adults to reflect on their online behaviour, which can be a good thing for people worried about their porn use. But we need to address the risks for those who are turning to sites that avoid the new regulations.

“Every day, our advisors speak to people whose pornography use has spiralled into something much more harmful. We know embarrassment can stop people from reaching out, but confidential help is available. If you’re worried about your own behaviour or someone else’s, contact Stop It Now before it’s too late.”

Sometimes the most dangerous part isn’t the rule itself — it’s what people do when they decide to dodge it.

When HBO’s Industry Meets the Age Verification Reckoning

When they decided to take on age verification in their latest season, Industry cocreators Konrad Kay and Mickey Down didn’t expect to wander straight into a political minefield. It probably felt, at first, like one more sharp storyline—edgy, timely, a little dangerous in the way good TV often is. But sometimes a writers’ room accidentally opens a door to something bigger. And once it’s open, there’s no quietly closing it again.

“It was in the ether of British politics, but it wasn’t front and center when we started writing the scripts or shooting it, and then it really flared up as a kind of front-page-of-BBC topic of conversation,” Kay says.

Season 4 of HBO’s sexy, darkly funny financial drama—premiering Sunday—pushes Industry even further beyond the blood-slick trading floors that first defined it. This time, the story spills into tech, porn, age verification, and the uncomfortable politics sitting between them. Early in the season, tensions rise inside Tender, a fintech firm fresh off its IPO, as executives debate whether to keep processing payments for Siren, an adult platform in the OnlyFans mold. Siren—and other porn and gambling businesses—account for a sizable slice of Tender’s revenue. But looming threats of new age-verification laws and a rising tide of anti-porn rhetoric from the UK’s Labour Party have some leaders wondering if reputational cleanup might be more profitable than cashing controversial checks. It’s boardroom fear dressed up as moral clarity, the kind that tends to surface right before regulators do.

In the real world, the UK’s Online Safety Act—requiring age verification to access porn and other restricted content—didn’t take effect until July 2025, long after Kay and Down had mapped out this season’s arc. Still, the parallels are hard to ignore. Platforms like Pornhub saw UK traffic plunge by nearly 80 percent after the rules kicked in, and similar pressures are mounting in the U.S., where roughly half of all states now enforce some form of age-verification law. Even Capitol Hill is circling the issue: in December alone, lawmakers considered 19 bills aimed at protecting minors online. Critics, meanwhile, argue that several of those proposals stray into unconstitutional territory. It’s messy, unresolved, and very much still unfolding.

“It’s kind of shown how fragile free speech absolutism is,” says Down, pointing to the “wildly different” reactions the issue has provoked—from puritan instincts cropping up in liberal circles to a more blunt, censor-first “shut everything down” posture on the conservative side. And that tension, hanging in the air, feels like the real cliffhanger. Not who wins the argument—but what gets lost while everyone’s busy shouting.

Utah Senator Floats Porn Tax to Pay for Age-Verification Enforcement

SALT LAKE CITY—There are some ideas that arrive quietly and others that walk in like they own the place. This one does the latter. At the opening of Utah’s new legislative session, a Republican lawmaker dropped a bill that would tax online adult content, funneling the money toward age-verification enforcement and teen mental health programs.

Sen. Calvin R. Musselman, who represents the small town of West Haven, is the driving force behind Senate Bill (SB) 73. The proposal introduces what it calls a “material harmful to minors tax,” set at seven percent of the “gross receipts” from sales of content classified under that label.

SB 73 has been formally introduced but hasn’t yet landed in a committee. Even so, the odds of it clearing the legislature are widely considered high.

The bill defines “gross receipts” as “the total amount of consideration received for a transaction […] without deduction for the cost of materials, labor, service, or other expenses.” In other words, it’s the top line, not the leftovers.

And the reach is… expansive. The tax would apply to “the gross receipts of all sales, distributions, memberships, subscriptions, performances, and content, amounting to material harmful to minors that is: (a) produced in this state; (b) sold in this state; (c) filmed in this state; (d) generated in this state; or (e) otherwise based in this state.” That’s a wide net, and it’s not subtle about it.

Because of that scope, the tax wouldn’t just hit one corner of the industry. Producers, creators, platforms—anyone touching qualifying content—would likely feel it. And it wouldn’t exist in a vacuum. The levy would stack on top of existing obligations, including Utah’s digital sales tax and other state fees.

Revenue from the tax would flow into a newly created government account, earmarked for teen mental health treatment through the state Department of Health and Human Services. It’s worth noting that Utah is among the states that formally frame pornography consumption as a public health crisis, a position tied to the still-contested concept of “pornography addiction.”

The bill doesn’t stop at taxation. It also introduces a $500 annual fee, paid into accounts overseen by the Division of Consumer Protection. This so-called “notification fee” would apply to companies producing content deemed “harmful to minors” and is tied directly to age-verification compliance.

Those funds would be used by the Division to monitor compliance in a system modeled after the United Kingdom’s Ofcom framework. Companies would need to file the notification annually. Miss that step, and the penalty jumps to $1,000 per day until the paperwork—and compliance—are in order.
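Taken together, the bill's arithmetic is simple to sketch. A toy illustration follows, with invented revenue figures; SB 73 itself specifies only the rate, the fee, and the per-day penalty, so everything else here is hypothetical:

```python
# Toy illustration of SB 73's math; the revenue figures are invented.
gross_receipts = 1_000_000  # total consideration received, no deductions
expenses = 600_000          # irrelevant: the tax applies to the top line

harmful_to_minors_tax = 0.07 * gross_receipts  # 7% of gross, not profit
notification_fee = 500                         # annual, per company
days_out_of_compliance = 30
late_penalty = 1_000 * days_out_of_compliance  # $1,000/day until cured

print(f"Tax owed:         ${harmful_to_minors_tax:,.0f}")  # $70,000
print(f"Notification fee: ${notification_fee:,}")          # $500
print(f"Penalty if late:  ${late_penalty:,}")              # $30,000
```

The point of taxing gross receipts is visible in the first two lines: a producer with $600,000 in expenses owes the same $70,000 as one with none.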

Utah, of course, has already been down this road. It was one of the first states to pass a statewide age-verification law structured as a “bounty law,” allowing private individuals to sue on the state’s behalf over noncompliance. That approach famously led Aylo, the owner of Pornhub, to block Utah IP addresses, just as it has done in other states with similar laws.

Utah wouldn’t be alone in adding a porn-specific tax to the mix. Alabama already has one on the books, imposing a ten percent levy on top of existing digital goods and sales taxes.

And the idea is still spreading. In Pennsylvania, a bipartisan pair of state senators recently announced plans to propose a measure that would tax online pornography subscriptions in that state’s digital marketplace.

2025: The Year Tighter Regulation Came to Town for the Online Adult Industry by Morley Safeword

When I got my start in the online sector of the adult entertainment business, back in the mid-nineties, there was no video streaming. Individual photos often took web users several minutes to download. And you hardly heard a peep from anyone suggesting that the fledgling industry needed to be reined in.

To be fair, many people were only vaguely aware of what was available on the internet at the time, let alone worried about what their kids might be looking for on there – and frankly, the web was so slow, using it exceeded the patience of a lot of kids, anyway.

Oh, how things have changed.

What evolved fastest, of course, was the technology underpinning the internet. As high-speed connectivity became the norm rather than the exception and video streaming capabilities increased year over year, online porn went from something enjoyed by a small subset of early adopters to a massive, multibillion-dollar industry. Along with those changes in technology came ever louder calls for the online adult industry to be more tightly regulated—or regulated at all, in the still-early-internet days of the mid-nineties.

In the United States, Congress began cooking up proposals to prevent minors from accessing online porn. While these proposals enjoyed broad bipartisan support (within the legislature, at least), what they didn’t get was much support from the courts.

Early attempts to impose things like age verification requirements were slapped down by the courts, most notably in cases like Reno v. ACLU, decided in 1997. In Reno, the Supreme Court held that certain provisions of the Communications Decency Act of 1996 (“CDA”) violated the First Amendment. Specifically, the court found that the CDA’s “indecent transmission” and “patently offensive display” provisions trod upon the freedom of speech protected by the First Amendment.

What changed in 2025, as the Supreme Court again considered an age verification proposal, this time a state law passed in Texas (“HB 1181”), was in part the continued forward march of technology. But more crucially, what changed was the court’s disposition as to which “standard of review” ought to be applied.

In previous cases involving online age verification proposals, the court has applied “strict scrutiny,” a high bar that requires the government to show its actions (and laws) are “narrowly tailored” to further a “compelling government interest” and are the “least restrictive means” to further that interest.

In the case Free Speech Coalition v. Paxton, which the Supreme Court decided in June, the district court had applied strict scrutiny and found that HB 1181 failed to satisfy the standard. When the case reached the Supreme Court, however, the majority decided that strict scrutiny was the wrong standard, and that the correct one to apply was “intermediate scrutiny,” which sets the bar much lower for the government.

Writing for the majority, Justice Clarence Thomas asserted that HB 1181 has “only an incidental effect on protected speech.”

“The First Amendment leaves undisturbed States’ traditional power to prevent minors from accessing speech that is obscene from their perspective,” Thomas wrote. “That power includes the power to require proof of age before an individual can access such speech. It follows that no person – adult or child – has a First Amendment right to access such speech without first submitting proof of age.”

Since the law “simply requires adults to verify their age before they can access speech that is obscene to children,” Thomas found that HB 1181 “is therefore subject only to intermediate scrutiny, which it readily survives.”

The three justices who dissented from the majority’s position didn’t see things quite the same way, naturally. In her dissent, Justice Elena Kagan criticized the majority’s holding as “confused” and highlighted the ways in which it departed from the court’s previous rulings in similar cases.

“Cases raising that question have reached this Court on no fewer than four prior occasions – and we have given the same answer, consistent with general free speech principles, each and every time,” Kagan observed. “Under those principles, we apply strict scrutiny, a highly rigorous but not fatal form of constitutional review, to laws regulating protected speech based on its content. And laws like H. B. 1181 fit that description: They impede adults from viewing a class of speech protected for them (even though not for children) and defined by its content. So, when we have confronted those laws before, we have always asked the strict scrutiny question: Is the law the least restrictive means of achieving a compelling state interest? There is no reason to change course.”

Whether there was reason to change course or not, the course has surely now been changed. Make no mistake, laws like HB 1181 are here to stay—and they will be followed by other measures designed to restrict access to sexually explicit materials online, as well as regulation that goes much further and sweeps in an even broader range of controversial content.

The old cliché about the “canary in the coal mine” has often been applied to pornography in the context of free speech discussions. Even those who don’t like or approve of porn have often warned that crackdowns on sexually explicit expression can presage attempts at regulating other forms of speech.

If indeed those of us who work in the adult industry are part of a sentinel species, the warning to our peers in the broader world of entertainment and self-expression could not be more clear, as we look out to 2026 and beyond: Here in the world of online porn canaries, we’re choking on this new regulatory push – and most likely, some of you other birds are going to be feeling short of breath too, soon enough.

Commentary: Age Verification Trounced Free Speech in 2025

Here’s a commentary on age verification from Michael McGrady of AVN.

Congressional Push to Amend – or Simply End – Section 230 Safe Harbor Continues by Morley Safeword

For years now, legislators at both the state and federal level have been calling for reform of the “safe harbor” provisions of Section 230 of the Communications Decency Act of 1996, which have long protected providers (and to an extent, users) of “interactive computer services” from liability stemming from the actions of third parties.

There are several proposals floating around the U.S. House of Representatives and the U.S. Senate currently, some of which are far broader than others in terms of their impact on Section 230 safe harbor and the entities that rely on it. The most extreme of these proposals is one that would simply eliminate Section 230 altogether after December 31, 2026.

The need for reforms to Section 230, according to the legislators pushing for such, is rooted in the belief that changes and advances in communications technology have outpaced the law – and have turned Section 230 into too large a shield, in effect, for the technology companies it protects.

“Changes in technology have created new opportunities for criminals to harass, exploit, intimidate and harm American children,” said Senator Chuck Grassley (R-Iowa) in a statement about the Section 230 reform bills he sponsors or supports. “These horrific crimes – often committed by violent online groups who take advantage of our nation’s outdated laws – have gone unchecked for far too long.”

Senator Dick Durbin (D-Ill.) has joined Grassley in his effort to amend Section 230—and echoed Grassley’s sentiments in the same statement.

“Because of modern technology, child predators from anywhere in the world can target American kids online,” Durbin said. “As technology has evolved, so have online child exploiters. Today, offenders are engaging in sadistic online exploitation and coercing kids to take their own lives. Big Tech continues to fail our most vulnerable because they refuse to incorporate safety-by-design measures into their platforms or make meaningful efforts to detect the increasingly violent and depraved sexual exploitation of children on their services.”

The most extreme Section 230 reform idea being bandied about in Congress right now is the “Sunset To Reform Section 230 Act,” a very short bill that would simply append the following text to the law: “(g) Sunset.—This section shall have no force or effect after December 31, 2026.” The effect of this Act, should it pass, would be a complete repeal of Section 230 rather than a reform of the law.

While it’s perfectly understandable for people to want to do more to protect children who use the internet and other communications technologies and platforms, eliminating Section 230 would have far-reaching implications, some of which I get the feeling Congress has not fully considered.

Publication of user-generated content (UGC) is not limited to the likes of adult tube sites or major social media platforms. It’s one thing to approach Section 230 reform ‘surgically,’ by limiting the scope of its protections, or requiring more of the largest and best-funded platforms in terms of policing the content uploaded by their users, but to repeal Section 230 entirely would create a flood of lawsuits, potentially directed at any site or platform that enables users to publish content.

It’s not hard to imagine the chaos that could ensue, even for legislators themselves. If a representative or senator has a website of their own that allows readers and users to post comments, does the legislator in question want to face liability for anything untoward or illicit those users might post? This is the sort of hypothetical I’m not sure the likes of Grassley and Durbin have fully taken on board.

Reasonable people can disagree on whether the scope of Section 230 immunity, particularly as it has been interpreted by the courts, is too broad. But when it comes to reforming the safe harbor, outright elimination of Section 230 would create far more problems than it would solve.

French Regulator Gets Its Way as Targeted Adult Sites Add Age Verification

PARIS — There’s a particular hush that falls right before the hammer drops. Five high-traffic adult websites based outside of France have now put age-verification systems in place under the country’s Security and Regulation of the Digital Space (SREN) law, after receiving pointed warnings from media regulator Arcom. It’s one of those moments where the room goes quiet and everyone waits to see who blinks first.

Back in August, Arcom sent enforcement notices to xHamster, Xvideos, XNXX, xHamsterLive, and TNAFlix, giving them a tight three-week window to comply or brace for delisting and blocking proceedings. Not exactly a friendly nudge—more like a stopwatch set on the table.

According to the regulator, all five sites now have age-verification solutions in place, and for the moment, that’s been enough to halt further action. No public victory laps, no dramatic announcements—just a sense that compliance, at least for now, has won the day.

That hasn’t stopped the arguments, though. From the start, there’s been real tension over whether France even has the authority to regulate companies based in other EU member states, and how that authority would work in practice. Arcom asked media regulators in Cyprus and the Czech Republic to help enforce its rules against the warned sites, but those agencies declined, saying they simply don’t have the legal tools to enforce French age-verification law within their own borders.

Then came a shift in September. In a case involving WebGroup Czech Republic, which operates XVideos.com, and NKL Associates, which operates XNXX.com, an advocate general of the European Union’s Court of Justice advised that France may, in fact, require pornographic websites based in other EU states to implement age verification in line with French law. It wasn’t a ruling—more like a legal compass—but it pointed in a very clear direction.

The opinion isn’t binding, but if the EU Court of Justice follows it, the ripple effects could be enormous. It would set precedent for other member states wrestling with the same jurisdiction questions, especially as similar litigation plays out in Germany over whether national laws or the EU’s Digital Services Act ultimately take precedence. This is the slow, grinding part of policymaking—courts, counsels, and contradictions, all trying to decide who gets the final word.

And this likely isn’t the end of it. Arcom has made clear that its next move will be to widen enforcement to include smaller adult sites. The message feels unmistakable now: this isn’t a one-off crackdown—it’s a line being drawn, and the rest of the industry is standing just behind it, watching to see how hard it holds.

Florida AG Drops Age-Verification Case Against Segpay

TALLAHASSEE, Fla. — Sometimes legal battles don’t end with a bang, but with a quiet agreement and a collective exhale. On Monday, the Florida attorney general’s office agreed to drop its claims against payment processor Segpay in a lawsuit tied to alleged noncompliance with the state’s age-verification law.

Back in September, Florida Attorney General James Uthmeier brought lawsuits against both Segpay and Aylo in the 12th Judicial Circuit Court of Florida. The accusations centered on alleged violations of HB3, the state’s age-verification statute — a law with real teeth, carrying potential fines of up to $50,000 for each individual violation. The stakes were never abstract; they were painfully concrete.

Then, on Monday, the temperature shifted. The Office of the Attorney General and Segpay jointly filed a stipulation of voluntary dismissal, effectively closing that chapter of the case. No dramatic courtroom showdown. Just a line drawn under it.

Attorney Corey Silverstein, who represented Segpay alongside fellow industry attorney Lawrence Walters, said he and his clients were relieved by how the matter ultimately played out. Anyone who’s spent time in regulatory trench warfare knows that resolution — especially a fair one — can feel like a small miracle.

“We are very appreciative that the Florida AG’s office worked with us to get a clear understanding of the real facts involved here,” Silverstein said.

The lawsuit against Aylo, however, is still moving forward, a reminder that while one door has quietly closed, others remain very much open.
