Over half of all tech industry workers view AI as overrated
I work at an AI company. 99% of our tech relies on tried-and-true classical computer vision solutions rather than machine-learning-based ones; ML is just that unreliable when production use requires pixel precision.
We might throw in gradient descent here and there, but not for any learning operations.
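To give a rough idea of the kind of deterministic, pixel-precise classical CV I mean, here's a minimal sketch assuming OpenCV, with made-up file names (not our actual pipeline):

```python
# Sketch of a classical, deterministic approach: template matching with OpenCV.
# "scene.png" and "template.png" are hypothetical inputs for illustration.
import cv2

scene = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)        # full image to search
template = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)  # reference patch

# Normalized cross-correlation: the same inputs always give the same answer,
# and the peak is an exact pixel coordinate you can reason about and debug.
result = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)

print(f"best match at pixel {max_loc} with score {max_val:.3f}")
```

No training, no weights, and the failure modes are ones you can trace.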
little_hermit@lemmus.org 11 months ago
I asked ChatGPT to generate a list of 5 random words and then tell me the fourth word from the bottom. It kept telling me the third. I corrected it, and it gave me the right word. I asked again, and it made the same error. It does amazing things while failing comically at simple tasks. There is a lot of procedural code added to plug the leaks. That doesn't mean it's overrated, but when something is hyped hard enough as being able to replace human expertise, any crack in the system becomes ammunition for dismissal. I see it more as a revolutionary technology going through evolutionary growing pains. I think it's actually underrated in its future potential, and worrisome in that its processing is essentially a black box that can't be understood at the same level as traditional code. You can't debug it or trace the exact procedure that needs patching.
aidan@lemmy.world 11 months ago
It’s definitely feasible, like what they tried to do with Wolfram Alpha, but do you have a source for this?
ByGourou@sh.itjust.works 11 months ago
I believe I saw that this kind of issue comes from the token system. For example, if you ask it to find a word starting with a certain letter, it can't really do that without a hard-coded workaround, because it doesn't know about single letters, only about tokens, which are chunks of the sentence.
It's definitely more complicated than that, but it doesn't mean AI is bad, only that this current implementation can't do these kinds of tasks.
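To show what I mean by tokens, here's a minimal sketch using the open-source tiktoken library (just a stand-in; I don't know exactly which tokenizer ChatGPT runs):

```python
# Demonstrates that the model sees multi-character chunks (tokens), not letters.
# Uses the "cl100k_base" encoding from tiktoken as an illustrative example.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

for word in ["strawberry", "tokenization", "cat"]:
    token_ids = enc.encode(word)
    pieces = [enc.decode([t]) for t in token_ids]
    # "Which letter does this word start with?" has no direct representation
    # in these chunks, so the model has to infer it indirectly.
    print(f"{word!r} -> {pieces}")
```

So letter-level questions only get answered correctly when the model has learned the spelling indirectly or a workaround handles it.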