> Obviously, the public gets no benefit when you censor some guy from doing his business in private.
> But you're also in danger of providing sexualised child characters for people to jerk off to.
I would say that anything is fine in private, even that. Better than them looking for the real thing, right? I guess some people are afraid that such content will actually turn people into pedos, but I don't think that's true any more than watching gay porn will turn you gay.
CSAM is bad, no nuance. Which is why I welcome an alternative that could collapse the entire CSAM market and therefore save thousands of lives.
But, this is probably not the forum for this topic anyway. I understand that there are many complications to this, including the risk for normalization.
There's a reasonable philosophy where toolmakers don't involve themselves in fine, disputed, and frequently changing cultural differences.
It's one thing to build guardrails into tools where the impact is immediate and catastrophic, like a safety guard on a saw, so that people don't permanently lose a finger to distraction or casual negligence.
It's a different thing -- to be reasonably argued both for and against -- to build guardrails against concerns that are abstract, indirect, cultural, political, etc.
And you can tell that your example is more cultural than catastrophic because not everybody frets over what people do by themselves and in their own mind.
For public consumption, yeah I think it's reasonable to censor the most egregious stuff, but for private use there's really no point. If people want something, they will always find a way to get it. 100% of the time. There is absolutely no amount of censorship or even laws that can prevent it.
Censoring anything meant for private use seems like a pretty pointless exercise. It will either get un-censored, or users will find what they want elsewhere.
E: You also need to ask yourself: assuming that you can't simply suppress other people's desires for things you find objectionable (because you can't), would you rather they use fictional CP with fictional people, or the real deal with real children? Because the real thing very much exists and is quite available despite the laws and social stigma. If you make some vain attempt to prevent anyone from ever creating fictional CP, then what do you think those people will get next?
Do you think you'll switch to running local models at some point, e.g. LambdaLabs/Runpod running Llama, to reduce costs? Just thinking about it as you scale and wanting to reduce cost per token. (I'm not in the GPU hosting biz - just curious because I think we're all thinking about this)
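For what it's worth, a minimal sketch of what that switch often amounts to, assuming the backend already speaks an OpenAI-compatible API on both sides (OpenRouter does, and self-hosted servers like vLLM can expose one); the URLs, key, and model id below are placeholders, not anything from this project:

```python
# Sketch: switching from a hosted router to a self-hosted Llama endpoint is
# often just a base-URL and model-name change when both sides expose an
# OpenAI-compatible API. All endpoints/keys/model ids here are placeholders.
from openai import OpenAI

# Hosted route via OpenRouter
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="OPENROUTER_API_KEY",  # placeholder
)

# Self-hosted route, e.g. vLLM's OpenAI-compatible server on a rented GPU box
# client = OpenAI(
#     base_url="http://my-gpu-box:8000/v1",  # placeholder address
#     api_key="not-needed-locally",
# )

resp = client.chat.completions.create(
    model="meta-llama/llama-3.1-70b-instruct",  # placeholder model id
    messages=[{"role": "user", "content": "Stay in character and greet the user."}],
)
print(resp.choices[0].message.content)
```

The main cost question is then whether your sustained token volume beats the per-token price of the hosted route once you're paying for idle GPU time.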
The thing I like best about character.ai is their TTS. Does anyone have an idea what engine they use? I don't suppose they invented their own system from scratch.
Why compare this to Dippy AI of all things? Looks like a platform/app targeting a different type of AI character consumer. Also I see nothing about Dippy being open source.
It was a bit surprising to click this link on my VPN. Wish it had a NSFW tag. Uncensored is a bit of a crapshoot when it comes to GenAI; half of the time it just means "no instruction tuning."
The link should probably just be the GitHub repo. It's an interesting project and discussion topic (although maybe a little flamewar inviting), but it doesn't seem necessary to link to the product itself in this case.
I built this because users were angry at c.ai for shutting down the old site and implementing more filters and censors.
So I rebuilt the old UI and used models on OpenRouter to power everything.
It has nearly all the features: edit, regenerate, public/private characters, personas.
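For anyone curious how little backend that takes, here's a hedged sketch of routing a character chat through OpenRouter; this is not the project's actual code, and the model id and persona are made-up placeholders:

```python
# Sketch of powering a character chat via OpenRouter's OpenAI-compatible
# chat-completions endpoint. Not the project's real implementation.
import requests

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"
API_KEY = "OPENROUTER_API_KEY"  # placeholder

def character_reply(persona: str, history: list[dict]) -> str:
    """Send the character persona plus chat history and return the reply.
    'Regenerate' is just calling this again with the same history;
    'edit' means modifying a message in history before re-sending."""
    payload = {
        "model": "meta-llama/llama-3.1-70b-instruct",  # placeholder model id
        "messages": [{"role": "system", "content": persona}] + history,
    }
    resp = requests.post(
        OPENROUTER_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json=payload,
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]
```

Features like edit and regenerate then fall out of how you manage the stored message history rather than anything model-side.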