Press the Any Key

One of my colleagues and I were playing with ChatGPT yesterday to see what its capabilities are from a work perspective. For example, we asked it to tell us about the company we work for. We asked it to give us reasons why what we do for customers is important. We asked it to write a sales pitch. Overall, it did a great job and would make creating a base for the basics easier, but...

It is definitely plagiarism.

While it was able to write in natural language well, it was also taking content almost directly from available marketing material and content that both of us know quite well. In fact, I have worked on some of that content and there were slices of my own sentences in there too. So, what it is doing is spinning the content already available, and it isn't asking whether that is okay to do - especially for commercial usage.


Spinning isn't just frowned upon on Hive, it is heavily discouraged through downvotes, as is plagiarism of course, so why would AI-created content that automatically plagiarizes and spins content be treated any differently than if a person did the same?

Some people seem to think that AI content isn't an issue and should be rewarded, but how would you feel if, for example, my content was generated through AI and I was getting rewarded for it as I do? Or @taskmaster4450's, or that of any other well-rewarded content creator? What if you knew that I was using AI as a base for my work and, instead of taking hours to create a single piece, it was taking me ten minutes?

Happy with that efficiency?

Of course, while I don't know about others, I am definitely not doing this, because not only is AI-generated content plagiarism, it is also not mine. It would be interesting to see what it would create if it only had access to my work, as I have written almost 6000 posts across hundreds of topics on Hive, equating to something like 30x 300-page novels, plus comments. What would an AI create?

Would it still be mine?

I don't think so.

And, the reason is that even if it is writing a new article using my voice, it isn't using my experience garnered between then and now. Since we are always changing, it won't actually represent my opinions in the moment, even if it is close.

People don't have an intuition for this, especially those who are unable to produce content of the same quality as the AIs. No one is going to steal the work or spin the content of low-quality, low-experience content creators, are they? Yet, most people don't want their own work stolen, especially for gains they get no part of. We all have experience with people who take our ideas and present them as their own, and I have yet to meet someone who likes it happening. Just because the content comes from sources off Hive, that doesn't make it okay.

And with an AI, how do you know?

The AI isn't citing sources, it is just trawling, collecting and combining. In the case of the testing with my colleague, the only way we knew where it came from was because of our very direct relationship with its creation, and with the people who originally wrote it.

It raises moral questions about those who use it for reward, doesn't it? Even using it as a base, there is so much room for content theft and of course, the easier it is, the more it happens. The first article uses it as a base, the second spins it, the third just cuts and pastes it in its entirety.

Are you still a content creator?

Your content has value.
Your voice has value.

Where does the line start between what is yours, and what belongs to the unseen and unreferenced voices of those who originally created the output?

And if it isn't your voice, people shouldn't be providing value to you. Citing the usage of AI isn't enough either, because that doesn't reflect the actual sources of the information provided.

And for a lot of the people using AI to create their content, it would appear that they themselves have little to no skill or experience in these areas, so how clever can their usage of the tool be, and what value can they add through comments?

Google answers, no doubt.

What I like about the usage of AI is that it is raising questions of what is valuable, something that has to happen going forward. On top of this, it is also raising the question of ownership, and in an age where content source is rewarded, this is going to provide much-needed use cases for blockchain, through content verification and proof of human.

These are key to blockchain adoption and require careful consideration, but there is already a massive need for being able to trust content and verify its source, as well as its legitimacy. So far, the algorithms are providing responses based on relevancy, but that doesn't mean what is provided is trustworthy; instead, it tends to feed a process of confirmation bias in the consumer.

Everyone is getting fed what they want to eat, believing that because they agree with it, it must be correct.

People seem to think that I am negative on AI, but I am not; it is going to play a vital role in the future. However, whether it is going to be a tool that improves us, or one that enslaves us through our own laziness and complacency, is the question. In a world of decentralised, transparent and trustworthy data, AI can be an amazing help. But in an opaque box, fed disinformation from a centralized source - do you really want it to become your god?

Praying is useless.

Taraz
[ Gen1: Hive ]
