Comment on Pentagon to start using Grok as part of a $200 million contract with Elon Musk's xAI

iAvicenna@lemmy.world 1 day ago

On a 200mil laptop? You can run Llama 4, which is already an upper-scale model, on a 64 GB RAM machine, albeit slowly. TBF, I didn’t do the math on how much that would add up to along with salaries, server costs, etc., mainly because this is the Pentagon we are talking about, so it should already have access to some pretty decent computational capacity. So yeah, 200mil feels like too much when you already have most of the resources needed (compute, and open LLM models for specific tasks).
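To make the “run it locally” point concrete, here is a minimal sketch of pure-CPU inference with a quantized open-weight model via llama-cpp-python. The model file name, quantization level, and context size are assumptions, not a tested setup; the point is only that inference can happen entirely on local hardware, with nothing leaving the machine.

```python
# Minimal sketch: local CPU-only inference with llama-cpp-python.
# The GGUF file name and settings below are assumptions for illustration.
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-4-scout-Q4_K_M.gguf",  # hypothetical quantized model file
    n_ctx=4096,       # context window; larger values need more RAM
    n_gpu_layers=0,   # 0 = pure CPU, the "64 GB RAM, albeit slowly" scenario
)

out = llm("Summarize the following briefing:\n...", max_tokens=256)
print(out["choices"][0]["text"])
```

Quantized weights trade a little output quality for a much smaller memory footprint, which is what makes the 64 GB RAM scenario plausible in the first place.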

The really huge upside is that you don’t have to share confidential information with a company whose CEO is a lunatic and which will likely have no qualms about sharing that data with other parties when money and power are involved. Hell, you shouldn’t share any confidential/sensitive information with any of the large tech companies, to be honest. They have become what they are not by sticking to ethical principles, and they are likely to grossly overcharge (which defeats the purpose of outsourcing).
