Hacker News

LiteLLM isn't a good choice for a proxy in any case. It introduces a lot of latency, and its features are often half-baked. To me, it looks like a vibe-coded application without a product owner, and the code itself isn't well organized either. I evaluated it for a project a few months ago and would never use it for anything in production. There are a few much better alternatives out there.



Could you name some of these better alternatives?

If your requirement is just to load-balance between self-hosted AI servers: nginx. If you want a more thorough system with configurability, logging, etc.: Bifrost from MaximAI
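
For the nginx route, a minimal sketch of what that load balancing could look like. The hostnames, ports, and path are placeholders I've invented for illustration, not anything from this thread:

```nginx
# Sketch: load balancing across self-hosted AI servers with nginx.
# All addresses below are hypothetical.
upstream ai_backends {
    least_conn;                     # route each request to the least-busy backend
    server 10.0.0.11:8000;
    server 10.0.0.12:8000;
    server 10.0.0.13:8000 backup;   # only used if the others are unreachable
}

server {
    listen 80;

    location /v1/ {
        proxy_pass http://ai_backends;
        proxy_read_timeout 300s;    # LLM responses can stream for a long time
        proxy_buffering off;        # forward streamed tokens immediately
    }
}
```

`least_conn` tends to fit LLM workloads better than the default round-robin, since request durations vary wildly; disabling proxy buffering matters if the backends stream tokens.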


