Mostly I’m curious what people’s setups are. Are you using docker or a VM? Which tools are you using to stream and play your roms or steam games?
Looking for suggestions for myself as well… I’m on unraid and looking to support multiple users.
Submitted 6 months ago by couch1potato@lemmy.dbzer0.com to selfhosted@lemmy.world
My box sits in my closet, so I can’t really help much with Docker or VMs, but I use a Sunshine server with the Moonlight client. Keep in mind you can’t fight the latency that comes from the distance between server and client. I can use 4G/5G for turn-based or active-pause games, but I wouldn’t try anything real-time. On cable my ping is under ms, enough to play shooters as badly as I do these days.
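For anyone who does want the Docker route for this pairing, here’s a hedged sketch using the community LinuxServer.io Sunshine image. The image name, paths, and port ranges below are assumptions based on Sunshine’s usual defaults; check the image’s docs before relying on them.

```shell
# Hypothetical sketch: Sunshine in a container, streamed to a Moonlight client.
# Image tag, ports, and /path/to/config are placeholders -- verify against
# the LinuxServer.io docs for your setup.
docker run -d \
  --name sunshine \
  --device /dev/dri:/dev/dri \
  -e PUID=1000 -e PGID=1000 \
  -p 47984-47990:47984-47990/tcp \
  -p 47998-48010:47998-48010/udp \
  -v /path/to/config:/config \
  lscr.io/linuxserver/sunshine
```

Passing `/dev/dri` gives the container the render node for hardware encoding on AMD/Intel; the TCP range covers pairing and the web UI, the UDP range the actual video/audio/control streams. Then you pair from Moonlight against the host’s IP as usual.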
I use AMD for both CPU and GPU, and wouldn’t try Nvidia if using Linux as the server.
I used to run a VM in XenServer/XCP-ng and pass through the GPU with a dummy HDMI plug. It was a Windows 10 VM and it ran very well, bar the pretty weak CPU, but I got around 30fps at 1080p in Tarkov, sometimes more with AMD upscaling. Back then I was using Parsec, but I found Sunshine and Moonlight work better for me.
What kind of machine do you use as a client? And how does performance compare to playing on the server directly?
I use a 2016 Asus Zenbook with an integrated Intel GPU.
The performance is comparable. The only differences are latency, which is obviously there but fairly negligible on LAN, and the encoding/decoding, which sometimes creates artifacts and smudges, though it’s better at higher bandwidth.
I used to run a Proxmox server with Windows in a VM that had a GPU attached via passthrough, then connected to it via Parsec. On my laptop on the local network it was pretty good.
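In case it helps anyone replicating that setup, a hedged sketch of what Proxmox GPU passthrough typically involves. The PCI address (`01:00.0`) and VM ID (`100`) are placeholders, not from the post; adjust for your hardware and double-check against the Proxmox docs.

```shell
# Hypothetical Proxmox passthrough sketch -- PCI address and VM ID are placeholders.

# 1. Make sure the vfio modules load at boot, then rebuild the initramfs:
printf "vfio\nvfio_iommu_type1\nvfio_pci\n" >> /etc/modules
update-initramfs -u

# 2. Enable IOMMU in the kernel cmdline (intel_iommu=on or amd_iommu=on in
#    /etc/default/grub), reboot, and verify it came up:
dmesg | grep -i iommu

# 3. Hand the GPU to the VM (pcie=1 needs a q35 machine type):
qm set 100 --hostpci0 01:00.0,pcie=1,x-vga=1
```

You’ll also usually want to blacklist the host driver for that GPU (e.g. `nouveau`/`amdgpu`) so vfio can claim it before anything else does.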
noobface@lemmy.world 6 months ago
I’ve spent a decade working on and off on this in professional and personal settings.
If budget is no object it’s only kind of a pain in the ass with Nvidia’s vGPU solutions for data centers. Even with ten grand spent there are hypervisor compatibility issues, license servers, compatibility challenges with drivers for games/consumer OS’s on hypervisors, and other inane garbage.
Consumer-wise it’s technically the easiest it’s ever been, with SR-IOV support for hardware-accelerating VMs on Intel 13th & 14th gen procs with iGPUs. However, iGPU performance is kinda dogshit, drivers are currently kinda wonky, and multiple display heads being passed through to VMs is kinda fucking weird for hypervisors.
On the Docker side of things, since containers share the host kernel rather than running their own, YMMV based on what you’re trying to accomplish. Technically the NVIDIA Container Toolkit does support CUDA & display heads for containers: hub.docker.com/r/nvidia/vulkan/tags. I haven’t gotten it working yet, but this is the basis for my next set of experiments.
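If you do try that route, the usual smoke test for the NVIDIA Container Toolkit looks like this. The CUDA image tag is just an example, not the Vulkan image linked above.

```shell
# If the toolkit and the nvidia runtime are wired up correctly, this should
# print the same GPU table nvidia-smi shows on the host.
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
```

Getting `nvidia-smi` working in a container only proves compute passthrough, though; display heads and Vulkan rendering are the harder part, which is presumably why that vulkan image exists.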