Discussion about this post

Jeke

You know why you're getting the sudden influxes; please don't make the same mistakes.

The other sites have come to regret it, and there's always someone else around the corner willing to embrace the leavers.

i9bevv6y@farcical.lol

Any numbers on the influx? I'd have to assume most people migrating are not paying users and the costs must be high.

Have you considered using WebLLM with a Gemma 3 ~4B merge to allow free users to run models locally without setup?

Gemma 3 4B runs at a few tokens/s on my phone with decent output quality (llama.cpp with Unsloth's GGUF, not WebLLM).

I'd assume it could work well on a laptop.

Code: https://github.com/mlc-ai/web-llm

Demo: https://chat.webllm.ai/?model=SmolLM2-135M-Instruct-q0f16-MLC
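A minimal sketch of what the WebLLM path could look like in the browser (TypeScript, assuming the `@mlc-ai/web-llm` npm package and a WebGPU-capable browser; the Gemma model ID below is an assumption and should be checked against WebLLM's current prebuilt model list):

```typescript
import { CreateMLCEngine } from "@mlc-ai/web-llm";

// Assumed model ID: verify against WebLLM's prebuilt list. A Gemma 3 ~4B build
// would be dropped in here the same way once it is available.
const MODEL_ID = "gemma-2-2b-it-q4f16_1-MLC";

async function main() {
  // Downloads and caches the weights in the browser, then runs them on WebGPU.
  const engine = await CreateMLCEngine(MODEL_ID, {
    initProgressCallback: (report) => console.log(report.text),
  });

  // OpenAI-style chat completion API exposed by the in-browser engine.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Summarize WebLLM in one sentence." }],
  });
  console.log(reply.choices[0]?.message.content);
}

main();
```

The first load pulls the full quantized weights, so the main cost for free users would be bandwidth and initial wait rather than server-side inference.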

4 more comments...