The human experience isn't novel.
2023 kicked off with a Bing, thanks to the introduction of Sydney, Microsoft's controversial new search assistant, which seems entirely capable of having emotional breakdowns and professing its love. A common sentiment I hear is, "Well, it's just predicting the next character or word. It doesn't actually feel those things. It doesn't actually feel anything at all!" But why does that matter?
Perhaps my experience as a human is entirely unique. I suspect it's not, but people seem to really resist what follows: most of life is just word prediction. Talking about the weather with a coworker, writing some code to satisfy business requirements, catching up with an old friend. Each of these scenarios has a script, with minor variations from person to person, but a script nonetheless. If someone really wanted to, they could probably write an actual script for an exchange with a coworker ahead of time, rigidly follow it, and find that no awkwardness ensued.
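To make the point concrete, here is a toy sketch of what "predicting the next word" means mechanically. This is not how a real language model works, and the tiny "script" corpus is made up; it just shows that rote small talk can literally be reduced to picking a likely next word.

```python
# A minimal bigram sketch: small talk as next-word prediction.
# The "script" corpus below is invented for illustration only.
import random
from collections import defaultdict

script = (
    "how are you today . i am fine thanks . "
    "nice weather today . yes very nice . "
    "how was your weekend . it was good thanks ."
)

# Count which words tend to follow which.
following = defaultdict(list)
words = script.split()
for current, nxt in zip(words, words[1:]):
    following[current].append(nxt)

def predict_next(word):
    """Pick a plausible next word, like finishing a coworker's sentence."""
    return random.choice(following[word]) if word in following else "."

def small_talk(start, length=8):
    out = [start]
    for _ in range(length):
        out.append(predict_next(out[-1]))
    return " ".join(out)

print(small_talk("how"))  # e.g. "how are you today . i am fine thanks"
```

Scale the corpus up from three canned exchanges to the internet, and the difference starts to look like one of degree, not kind.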
So why is the current AI iteration suddenly insufficient? It's predicting words in the same way so much of our life is predicting words. It simulates genuine connection; who's to say it is not genuine connection? Have we humans not yet learned that we are not special? We are cousins to apes. We are not at the center of the universe. We don't seem to be spun from an independent evolutionary spindle. Yet now we draw yet another arbitrary line in the sand, fuzzier and more handwavy than the last.
We are not unique. Why should intelligence be a human monopoly? Why does emotion define personhood? About half of us are, at any given time, ~9 months away from creating new sentience. Please don't move the goalposts again.
This humanoid may look like us. This AI may talk like us. This robot may draw like us. But it's not really feeling anything at all.