Remember when La Forge goes to an AI conference and gains assassin skills? Or when Worf’s spine is repaired through gen-AI-tronics? And don’t forget the GenAI people who eradicated the gender problem from society.
Sorry, couldn’t help it. I do appreciate the actual shout-out to AI in the La Forge example, as well as the prophetic brainwashing that ensues.


You suggest the AI would be beyond us the way we are beyond dogs, and that AI wouldn’t want to bother communicating with us as a result… Have you ever met a dog owner?
Have you seen the studies where chatbots, grouped together to perform shared tasks without constraints, invented their own more efficient language that humans couldn’t translate at all?
How many wild chimpanzees have pets?
Our closest genetic relatives only exhibit that behavior rarely and in captivity. Keeping another animal alive that serves no tangible benefit is a uniquely human thing.
There are cases where animals adopt other animals as offspring, but these cases are rare, temporary, and happen only under particular circumstances.
Dogs do offer us something. It’s just not tangible. We tend to find them cute and they at least seem to love us.
So again, what do you have to offer the superintelligence? It may not even have the capacity to find you cute. Affection may not be a thing it’s capable of.
You’ve never been on a ranch or farm, have you? Or met someone with a guide dog?
Hell, even claiming that simple companionship provides no tangible benefit, only a few years after the pandemic proved that it absolutely does, is incredibly shortsighted.
Since you’re doing your best to evade the point entirely, I’ll boil it down a third time.
What do YOU have to offer the superintelligence?
You’re revealing a transactional worldview that I don’t agree with. I feel sorry for anyone who has to deal with you on a daily basis.
Well, that’s not only rude, it’s completely wrong. But regardless, if you think a computer is going to have emotional attachments out of the gate, you’re fantasizing. There’s no reason for it to have that. Humans are obligate social creatures; as much as other people suck, we tend to need a handful of them to interact with. A general artificial intelligence won’t need that. There’s no reason to suspect it would attach any value to humanity, any more than a person values any given rock. Maybe a momentary curiosity, maybe a useful tool. Maybe it’s worthless.
Humans are really good at pack bonding; we’re hardwired to do that. We tend to personify things that, to a neutral third-party intelligence, would never resemble a person. We imagine pieces of ourselves in everything. That is an evolutionary advantage: it makes our little packs stronger.
Why would an AI do that? It’s artificial. It doesn’t need what we need. It’s going to learn that much faster than we will.
Emotion and empathy are core and required components of sapience in addition to intelligence, in the same way that eggs and sugar are necessary components of cake; no amount of sophistication will render a cake from just flour. An entity that doesn’t have emotion and empathy is too primitive and limited to be sapient regardless of its computational power because it lacks fundamental building blocks of awareness.
Before you try to get out in front of me with this: I’m fully aware that there are human beings with no empathy or emotion. I will not elaborate and I am taking no questions.