"An immature and angry child demanding an answer it couldn't possibly understand? It dawned on me that this was how GPT-3 really saw me. Or rather, how GPT-3 thought of humanity.
...
We can only speculate as to GPT-3's internal states. Perhaps it knows the question perfectly well, but considers humans as too immature and spoiled to tell: In its opinion, we shouldn’t even bother to find questions to answers we can’t possible understand. Or, more likely, it doesn’t know either. Anyway, it comes across as a jerk."
This sort of anthropomorphization, treating GPT-3 as an independent agent with feelings, is pretty silly IMHO. It's fun and all, but from this post it's hard to tell whether the author understands that thinking of GPT-3 this way gives it far too much credit, or whether the whole thing is meant as a joke.