Comment on What's up, selfhosters? It's selfhosting Sunday again!

chirospasm@lemmy.ml ⁨4⁩ ⁨days⁩ ago

Hello! I recently deployed GPUStack, a self-hosted GPU resource manager.

It helps you deploy AI models across clusters of GPUs, regardless of network or device. Got a Mac? It can toss a model on there and route it into an interface. Got a VM? Same. GPUStack is great at scaling across whatever hardware you have.

I use it to route pre-run LLMs into Open WebUI, another self-hosted interface for AI interactions, via the OpenAI-compatible API that both GPUStack and Open WebUI support!
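For anyone curious what that OpenAI-compatible wiring looks like under the hood, here's a rough sketch of the kind of request Open WebUI sends to a GPUStack-served model. The endpoint URL, API key, and model name below are all placeholders for illustration, not GPUStack defaults -- substitute whatever your deployment actually exposes:

```python
import json
from urllib.request import Request, urlopen

# Placeholder endpoint and credentials -- use your GPUStack server's values.
GPUSTACK_URL = "http://localhost/v1/chat/completions"
API_KEY = "your-gpustack-api-key"

# Standard OpenAI-style chat completion payload; the model name is
# whatever model you deployed through GPUStack.
payload = {
    "model": "llama-3.1-8b-instruct",
    "messages": [{"role": "user", "content": "Hello from my homelab!"}],
}

req = Request(
    GPUSTACK_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
# reply = urlopen(req)  # uncomment to actually send the request
```

Because both ends speak the same API shape, Open WebUI only needs the base URL and key to treat GPUStack like any other OpenAI-style backend.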
