this discussion is so stupid. no one who isn't a moron is offloading all work and thought to LLMs. no one who isn't a moron is seriously afraid of their thinking and learning skill "atrophying", whatever tf that means.
it's clear that LLMs are unique in that you actually do have the capability to turn your brain off and blindly trust whatever it does for you. but it should be equally clear that that's a stupid approach. people will still use their minds, and this use gets empowered with proper use of LLMs. it's that simple. ffs, we take the fact that they pass the Turing Test routinely for granted now. let's not forget that this technology is legitimately incredible. it stands to reason that you are seriously handicapping yourself by not trying to use it.
my skepticism is really strong, but I'm not knowledgeable enough to actually do anything with it... have you seen any valid critique or analysis of your project?
my cat personally wants to be between me and the monitor right in front of my face. so maybe designing a desk with like, a dip or hole or something where the cat can go into would be good
My take is that at some point, we will need ID verification online in general to prove you are human. Otherwise it's just chaos out here identity-wise and will get worse like you point out.
It is not about the humans who use AI for posting!
I believe it is more about the bot accounts that get overwhelmingly annoying... and pollute this and other places like reddit and similar discussion forums...
Some kind of a verification and vetting needs to happen for account creation.
I agree. But I am also sick and tired of humans prompting some LLM about the points that they want to say and having the LLM generate the response. Online communities will never be the same again.
Thank god for noscript. Didn't see or hear any of that, and dumped the text-only version of the article and HN discussion straight to my local hard drive for offline reading.
On mobile the cat sits in the middle of the screen and does not respond to touch input. The author has been told about the distracting elements and refused to acknowledge it.