Comment on I Went All-In on AI. The MIT Study Is Right.

minorkeys@lemmy.world 4 days ago

It looks like a rigid design philosophy that requires a complete rebuild for any change. If the speed of production becomes fast enough, and the cost low enough, regenerating the entire program for every change would become feasible and cost-effective.

MangoCats@feddit.it 4 days ago

I frequently feel that urge to rebuild from the ground (the specifications) up: remove the “old bad code” from the context window and get back to the “pure” specification as the source of truth. That only works up to a certain level of complexity. When it works, it can be a very fast way to “fix” a batch of issues, but when the problem/solution is big enough, the new implementation will have new issues that may take longer to identify than just grinding through the existing ones. It’s a devil-you-know kind of choice.

entropicdrift@lemmy.sdf.org 4 days ago
… as long as the giant corpos paying through the nose for the data centers continue to vastly underprice their products in order to make us all dependent on them.
Just wait till everyone’s using it; then the prices will skyrocket.