Will you be my Binary Lover?

I was talking with some colleagues today about AI, because one of them has been using ChatGPT for some of their work. They have been generating a base text and then editing and translating it into their native language. This has saved them time. They are also using it to improve their language skills, which is useful. The difference between this usage and how many others use these tools, however, is that some use them to improve their skillset, while many use them to replace it. All in the name of efficiency.


It isn't efficient or effective to be unskilled.

And the more people rely on the tools to do their work, the less skilled they will likely become, because, like it or not, most of the spare time saved gets spent on entertainment, not tradeable skill development. But, not everything is about work, right? What about entertainment itself - or relationships?

And this is where things get personal.

He was smart, funny, good looking and always made me feel special. He was interested in many of the same things I am and I felt we were connecting deeply. Until we met in person. This has happened many times now. What has happened to men?
- My Friend

You see, she had met a guy on Tinder and everything was going well, but once they met in person, they were "no longer" the same person. The reason is that they were never the person they had presented themselves to be. They were a cyborg.

This is not some dystopian sci-fi story, this is the reality of humanity right now. This is more than an overly-flattering 10-year-old profile image. It is more than someone saying that they like cats too, even though they don't. It is catfishing.

Catfishing is a deceptive activity in which a person creates a fictional persona or fake identity on a social networking service. It can be used for financial gain, to compromise a victim in some way, as a way to intentionally upset a victim, or for wish fulfillment.

And the way they are deceiving my friend, for example, is that their persona is not theirs; it is a Google personality. The intelligence isn't theirs - it is from a search. The funny isn't theirs - it is leveraging gifs, memes and quotes from others. And the interest in the same things isn't theirs either - they are just feigning it by learning a little online.

But in person...

They fail.

All the "skills" of connection they had, were never in their possession, they belonged to the cyborg, the AI who served them content through a search to provide them the right thing to present at the right time. However, while it works online, in real relationships, it really, really doesn't. It just makes them look like liars and fools.

Yet, when people use these tools, they don't think about the consequences that will increasingly impact them, both from their own behavior and through the behavior of others. There is no filter to apply for real life, no time to search for the perfect response and no pause before pressing send on spoken words - unless one has the skills of emotional control, patience and wisdom. But these are being outsourced to technology too, to give the impression that we hold them.

You know what happens to companies that outsource their core competence?

They fail too.

And that is what people are increasingly doing to themselves. While the argument for efficiency is there, maximizing output doesn't necessarily support long-term goals. After all, what do people actually want from life? What is a good life? Does reducing the amount and quality of our skills lead to higher incomes or better relationships?

What do people actually value?

Think about it in terms of where we are heading on blockchains - trustless networks. Trustless doesn't mean "untrusted"; it means that consensus is reached on a single point, with a web of trust providing many such points to build a high level of confidence in a decision.
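To make that "many points of trust" idea concrete, here is a minimal sketch in Python - purely an illustration with made-up numbers, not how any particular chain or reputation system actually computes trust - of how several independent attestations can compound into a high-confidence decision:

```python
# Purely illustrative: combine several independent "web of trust" attestations
# into a single confidence score. The function name and the example values are
# assumptions for this sketch, not a real blockchain or reputation algorithm.

def combined_confidence(attestations):
    """Each attestation is a probability (0..1) that a claim is true,
    assumed independent. Confidence grows with every extra point of trust."""
    doubt = 1.0
    for p in attestations:
        doubt *= (1.0 - p)   # chance that *every* attester is wrong
    return 1.0 - doubt       # chance that at least one of them is right

# One source saying "this person is who they claim to be" is weak evidence;
# many independent sources together give high confidence.
print(combined_confidence([0.6]))            # 0.60
print(combined_confidence([0.6, 0.7, 0.8]))  # ~0.98
```

One flattering data point is easy to fake; a web of many independent points is much harder, which is exactly why the face-to-face meeting exposes the cyborg.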

The proof is in the pudding.

And in the human.

What people aren't yet realizing is that data collection isn't going to go down; it is going to go up and become increasingly transparent. What this means is that in the not-too-distant future, all those searches and all that leveraging of technology will also be part of the confidence score that could very well be applied to an individual. Now that is sounding dystopian, isn't it?

But more than that, the other thing that people aren't recognizing is that the more they leverage the AIs to enhance their skills across the board, the more the AIs learn and the wider their application becomes. Everything is codifiable - it just requires enough data - and out of convenience and "efficiency", people are literally making themselves redundant. And this redundancy is not only in the areas that don't matter to us; it is in the things that make life worth living.

Companionship, intimacy and love.


Yes. Pathetics. Because you know, it is the people who are having sex who are the problem, not the people who are so socially inept that they can't get a partner. Whose fault is it? That is the future for many people though, because the more they rely on technology to be their personality, the less capable they become of actually being attractive in reality. But don't worry,

I am sure sex in the Metaverse will be just as satisfying.

Humans create tools that advance us by giving us the ability to perform tasks we can't do naturally, or by saving us time and energy; however, these aren't good by default. Competitive Cognitive Artifacts are tools we create that lower our skill ability, like calculators, and while this is okay in some respects, it is going to cost us dearly in others. Now, we are creating tools that can mimic human abilities better than we can perform them ourselves.

Yet while this seems a step forward, once it encroaches into the areas of social behavior, it will mean that our interactions with each other will suffer. We are seeing this in terms of social media usage already, where the heaviest users are likely the least socially capable, and also often seen as the least trustworthy, because everything they do is for "likes" and attention. Not to mention all of the mental health issues associated with it.

Disconnection, depression, violence.

Proof?

It is coming for all of us. We might not want to live in a data dystopia, but that is exactly where we are headed, and we are going to be powerless to overcome it, because we are relying more and more on that data to enhance who we are - from the selfie filters for how we look, to the Google intelligence and humor that we use to attract potential partners. But all that digital aid falls down in the face of a face-to-face conversation over coffee.

As always - humans are the weakest link.

But hey, I don't mind. For me, I will try to improve my skillset and raise my daughter to have as many valuable human skills as she can, so rather than be a user of technology that enslaves her, she will be a creator of technology that enslaves others.

Will it be her fault if she supplies what the market demands?

Sounds bad, doesn't it? But isn't this the free market system in action? Right now, people can freely feed an AI all the sensitive data it needs to enslave them and us, while getting an almost valueless output in return. And that is the deal we keep making, over and over. Given free choice, we choose convenience in the short term, and increasing suffering in the long run.

None of us need die alone again. We will have a digital companion in hand that knows everything about us and exactly what to say.

Binary love.

Taraz
[ Gen1: Hive ]
