Unsatisfactory service provided by SeaArt

Dear SeaArt team,

I read your warning message stating that pornography, violence, and other community-guideline violations will be reviewed and blocked, with no refund of energy/compute credits, and I have to say it feels more like a shield than actual policy enforcement.

Let me explain why this approach is costing you the trust of a lot of paying users:

Massive false positives are the norm, not the exception

Swimsuits, tight clothing, artistic nudes, pin-up style, even tasteful lingerie or suggestive-but-clothed poses are constantly getting flagged as NSFW.

Countless users (including on Reddit, Trustpilot, Discord) are asking the same question: “How is this pornography?”

When the filter is so trigger-happy that safe-for-work content gets blurred 70–80% of the time, telling people "just appeal it" is not a solution; it's an admission that the system is broken.

You changed the rules mid-game without warning

For a long time (especially 2024–early/mid 2025), SeaArt was known as one of the more permissive platforms for adult-oriented generation. Many reviews literally called it “NSFW without limits.”

Then suddenly the filter strength was cranked way up—some LoRAs (including innocent swimsuit / fitness ones) started getting flagged, old images retroactively blurred, etc.

From a customer perspective, this feels like bait-and-switch: we paid for one level of creative freedom, and now we’re on a completely different (much more censored) service.

“No refund of credits” for false flags is straight-up unfair

Even when the AI misinterprets a completely innocent prompt and produces something borderline, or when the filter wrongly catches an image after generation, the credits are gone forever.

That’s the platform saying: “Even if we made the mistake, you still pay the price.”

At minimum, confirmed false positives should result in credit restoration for that generation. Anything less is bad faith toward paying users.

“Protecting the community” doesn’t justify blanket artistic suppression

Nobody is arguing against blocking actual illegal content, CSAM, non-consensual deepfakes, gore-for-gore’s-sake, etc.—that should be instant and permanent.

But right now the net is catching fashion photography, fantasy art, cosplay, classical nude studies, romance novel covers, and even mildly suggestive character designs.

This isn’t community protection anymore; it looks a lot like overzealous payment-processor / app-store appeasement dressed up as moral policy.

Bottom line:

Saying “follow the rules” is fine in theory.

But when rule interpretation is arbitrary, enforcement is inconsistent and overly broad, and the financial penalty always falls on the user even for platform errors, it stops being reasonable moderation and starts feeling like a cash grab.

If the real reason is “payment processors and app stores are forcing our hand,” just say that openly.

Or if you’ve decided to pivot away from adult content entirely, announce it clearly so people can decide whether to stay or leave.

Dragging it out with “appeal it if you think it’s wrong” while quietly eating credits is only going to frustrate and drive away the exact creative users who made SeaArt popular in the first place.

Please consider these improvements:

Sharply increase filter precision so SFW content isn’t collateral damage

Restore credits for confirmed false-positive generations

Be transparent about what actually triggers permanent blocks vs. temporary blur

That would actually help build a better user experience and community—way more than another copy-paste warning ever could.

Thanks for reading.

Status: In Progress
Board: 💡 Feature Request
Date: 15 days ago
Author: kevin chung
