[ comments | sourced from HackerNews ]
[HN] What happened in this GPT-3 conversation?
Submitted 1 year ago by irradiated@radiation.party [bot] to technews@radiation.party
https://chat.openai.com/share/f5341665-7f08-4fca-9639-04201363506e
joe@lemmy.knocknet.net 1 year ago
There’s actually a pretty simple explanation for this if you understand the way that GPT-3 is trained on the data it’s given.
I’ll try to make this as short as possible, because to naturally explain it it would take hour by hour
by hour to endure a response by hour by hour and keep going and learning. Each by hour by hour. Thank you by hour for each hour.
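To make the repetition-loop idea concrete, here is a minimal toy sketch in Python. It is not anything from OpenAI's stack: the bigram table and names are made up for illustration, and real models use learned probabilities over a huge vocabulary rather than a hand-written lookup. The point is only that once the most likely continuation of one token points back to an earlier token, greedy decoding repeats forever.

# Toy sketch, not OpenAI's code: greedy next-token decoding over a tiny,
# hand-written "most likely next token" table. Once the argmax of one token
# leads back to an earlier token, the output cycles forever, which is one
# way autoregressive generation can produce "by hour by hour ..." loops.

BIGRAM_ARGMAX = {          # hypothetical, made-up continuations
    "explain": "it",
    "it": "hour",
    "hour": "by",
    "by": "hour",          # "hour" -> "by" -> "hour" is a cycle
}

def greedy_decode(start: str, max_tokens: int = 12) -> list[str]:
    """Always pick the single most likely next token (no sampling, no repetition penalty)."""
    out = [start]
    for _ in range(max_tokens):
        nxt = BIGRAM_ARGMAX.get(out[-1])
        if nxt is None:
            break
        out.append(nxt)
    return out

print(" ".join(greedy_decode("explain")))
# -> explain it hour by hour by hour by hour by hour by hour

Sampling with some randomness, or a repetition penalty, is the usual way decoders avoid getting stuck like this.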