Yes, just grab any recent LLM like Mistral-7B and ask it to translate for you. A local client is available at github.com/LostRuins/koboldcpp, but you might need a good GPU to get quick answers.
Alternatively, use lite.koboldai.net to run it on someone else’s computer.
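If you go the local route, koboldcpp exposes a KoboldAI-style HTTP API you can script against. A minimal sketch, assuming a koboldcpp server running on its default port 5001 and the `/api/v1/generate` endpoint; the helper names and the sampling parameters here are just illustrative:

```python
import json
import urllib.request

API_URL = "http://localhost:5001/api/v1/generate"  # assumed default koboldcpp port

def build_prompt(text, target="English"):
    """Wrap the text in a simple translation instruction for the LLM."""
    return f"Translate the following text into {target}:\n{text}\nTranslation:"

def llm_translate(text, target="English", url=API_URL):
    """POST a translation prompt to a local koboldcpp server (sketch, untested
    against any particular build -- field names follow the KoboldAI API)."""
    payload = json.dumps({
        "prompt": build_prompt(text, target),
        "max_length": 200,     # cap the generated translation length
        "temperature": 0.3,    # low temperature for more literal output
    }).encode()
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["results"][0]["text"].strip()

if __name__ == "__main__":
    print(llm_translate("Bonjour tout le monde", target="English"))
```

Quality will depend heavily on the model; instruction-tuned models tend to follow the "Translation:" scaffold better than base ones.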
django@discuss.tchncs.de 8 months ago
You can self-host libretranslate: libretranslate.com
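A self-hosted LibreTranslate instance is easy to hit from a script. A minimal sketch, assuming the instance is running on the default port 5000 and no API key is required (the `/translate` endpoint and its `q`/`source`/`target`/`format` fields are the documented LibreTranslate API; the helper names are just illustrative):

```python
import json
import urllib.request

API_URL = "http://localhost:5000/translate"  # assumed default self-hosted port

def build_payload(text, source="auto", target="en"):
    """Build the JSON body expected by LibreTranslate's /translate endpoint."""
    return {"q": text, "source": source, "target": target, "format": "text"}

def translate(text, source="auto", target="en", url=API_URL):
    """POST to a self-hosted LibreTranslate instance and return the translation."""
    payload = json.dumps(build_payload(text, source, target)).encode()
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["translatedText"]

if __name__ == "__main__":
    print(translate("Bonjour le monde", source="fr", target="en"))
```

If your instance is configured with API keys, add an `api_key` field to the payload.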
acockworkorange@mander.xyz 8 months ago
Looks like the engine behind it is opennmt.net
Which can use a TensorFlow backend, which can potentially be accelerated by a rather cheap Coral TPU. Neat!