
Prompt Engineering: the skill that turns Artificial Intelligence into a tool that makes you more productive.

Artificial Intelligence and Machine Learning

  • I recently learned of a new field in Artificial Intelligence called Prompt Engineering, and it seems like an important innovation, possibly a historic occurrence we may use as a divider between what the world was like before AI and after AI. So this post is me sharing what I have learned, along with an archive of sources to learn more for free.
  • I was also very interested in learning more about this subject because I read that Prompt Engineering is the difference between losing your job due to AI and doing a better job because of AI. Learning Prompt Engineering may change your outlook from seeing AI as something which will replace you to something which will allow you to do your job better. The main point is that AI should be a tool used by humans to do a better job. AI should not be thought of as a tool to replace you. Tools don't replace users; they help users perform better.


source

Define Prompt Engineering

Prompt engineering is the process of structuring text that can be interpreted and understood by a generative AI model. A prompt is natural language text describing the task that an AI should perform.
source

Prompt engineering is the process where you guide generative artificial intelligence (generative AI) solutions to generate desired outputs. Even though generative AI attempts to mimic humans, it requires detailed instructions to create high-quality and relevant output.
source

Define Prompt

A prompt is a natural language text that requests the generative AI to perform a specific task.
source

So a prompt is the way you explain to ChatGPT, or a similar AI bot, in plain English, how to do a task.
source: me

Why are Prompts important?

Prompts, or the way you formulate the questions and tasks you give chatbots like ChatGPT, are very important, because not every type of input generates helpful output. Generative AI systems require context and detailed information to produce accurate and relevant responses. When you systematically design prompts, you get more meaningful and usable creations. In prompt engineering, you continuously refine prompts until you get the desired outcomes from the AI system.
source

Why are many companies employing Prompt Engineers?

  • In my opinion, it is because employees who are already trained to do a job, and who are then trained to interact effectively with AI bots like ChatGPT, are incredibly more productive than other employees.
  • There are two important concepts here I wish to focus on.
  • First, the AI chatbot's primary purpose is not to replace the employee with a great deal of expertise. The AI chatbot is a tool which makes the employee with the subject matter expertise much more productive, and thus makes the company much more profitable.
  • Second, the AI chatbot is a complex computer intelligence tool with all the knowledge of the Internet at its disposal, but only by having a subject matter expert ask it the right questions can the AI chatbot really be more than a big library of difficult-to-use knowledge.

Video Introductions to Prompt Engineering

  • First

Generative AI in a Nutshell - how to survive and thrive in the age of AI

  • This one is a very good explanation, with words, pictures and drawings, and it explains Prompt Engineering in a non-technical way.

source

  • Second

Prompt Engineering Tutorial – Master ChatGPT and LLM Responses

source

  • Third

Official ChatGPT Prompt Engineering Guide From OpenAI

source

Google Top Search Results for Prompt Engineering

Bonus

  • If you're interested in a brief overview of Artificial Intelligence, I recommend this video; it's ten minutes long and summarizes a four-hour Google video.

Google’s AI Course for Beginners (in 10 minutes)


source

Last Words

  • A prompt is a natural language text that requests the generative AI to perform a specific task.
  • Prompt engineering is the process of structuring text that can be interpreted and understood by a generative AI model. A prompt is natural language text describing the task that an AI should perform.
  • Learning Prompt Engineering may change your outlook from seeing AI as something which will replace you to something which will allow you to do your job better. The main point is that AI should be a tool used by humans to do a better job. AI should not be thought of as a tool to replace you.
  • I hope this brief introduction motivates you to find out more.
  • @shortsegments



About me.

I am a writer who has focused on cryptocurrency, the blockchain, non-fungible digital tokens or NFTs, and decentralized finance for over six years. But I also like to write about life, my garden, my island home and other stuff.

Hive Blockchain

  • The Hive blockchain has zero transaction fees, 3 second transaction time, and proof of stake consensus.
  • Hive also has a community DAO with a 25-million-dollar treasury, paying developers and builders to build applications for the community.
  • Maybe your project should be here.

Hive Blog is a Social Media Platform

Hive is a blockchain-based social media platform where you own your content and earn a cryptocurrency token when your readers like your content.
There are other applications on the Hive blockchain, including other social media platforms like Inleo.io, play-to-earn games like Splinterlands, and move-to-earn exercise and activity applications like Actifit.

Learn more about Hive at this Link

Join for free at this Link


Additional detailed references for further reading

Other Top Links for Prompt Engineering

Prompt Engineering Guide
Prompt engineering is a relatively new discipline for developing and optimizing prompts to efficiently use language models (LMs) for a wide variety of applications and research topics. Prompt engineering skills help to better understand the capabilities and limitations of large language models (LLMs).
Researchers use prompt engineering to improve the capacity of LLMs on a wide range of common and complex tasks such as question answering and arithmetic reasoning. Developers use prompt engineering to design robust and effective prompting techniques that interface with LLMs and other tools.
Prompt engineering is not just about designing and developing prompts. It encompasses a wide range of skills and techniques that are useful for interacting and developing with LLMs. It's an important skill to interface, build with, and understand capabilities of LLMs. You can use prompt engineering to improve safety of LLMs and build new capabilities like augmenting LLMs with domain knowledge and external tools.
Motivated by the high interest in developing with LLMs, we have created this new prompt engineering guide that contains all the latest papers, advanced prompting techniques, learning guides, model-specific prompting guides, lectures, references, new LLM capabilities, and tools related to prompt engineering.
Prompt Engineering Guide

Prompt engineering
This guide shares strategies and tactics for getting better results from large language models (sometimes referred to as GPT models) like GPT-4. The methods described here can sometimes be deployed in combination for greater effect. We encourage experimentation to find the methods that work best for you.
Some of the examples demonstrated here currently work only with our most capable model, gpt-4. In general, if you find that a model fails at a task and a more capable model is available, it's often worth trying again with the more capable model.
Prompt engineering - OpenAI API

Six strategies for getting better results

Write clear instructions
These models can’t read your mind. If outputs are too long, ask for brief replies. If outputs are too simple, ask for expert-level writing. If you dislike the format, demonstrate the format you’d like to see. The less the model has to guess at what you want, the more likely you’ll get it.
Prompt engineering - OpenAI API

  • Include details in your query to get more relevant answers
  • Ask the model to adopt a persona
  • Use delimiters to clearly indicate distinct parts of the input
  • Specify the steps required to complete a task
  • Provide examples
  • Specify the desired length of the output
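The tactics above can be combined in a single prompt. Here is a minimal Python sketch (my own illustration, not a template from OpenAI's guide; the wording and function name are assumptions) showing how a persona, explicit steps, an example, a length target, and delimiters might be assembled:

```python
# Illustrative sketch only: the template wording is an assumption,
# not an official OpenAI prompt format.

def build_prompt(persona, steps, example, user_text, max_words):
    """Assemble a prompt that applies several clear-instruction tactics."""
    step_lines = "\n".join(f"{i}. {s}" for i, s in enumerate(steps, 1))
    return (
        f"You are {persona}.\n"                      # adopt a persona
        f"Follow these steps:\n{step_lines}\n"       # specify the steps
        f"Example output:\n{example}\n"              # provide an example
        f"Answer in at most {max_words} words.\n"    # specify desired length
        # delimiters mark off the user-supplied input
        f'The input is delimited by triple quotes:\n"""{user_text}"""'
    )

prompt = build_prompt(
    persona="a veteran technical editor",
    steps=["Identify the main claim", "Suggest one improvement"],
    example="Main claim: ... Improvement: ...",
    user_text="AI is a tool, not a replacement.",
    max_words=50,
)
print(prompt)
```

The point is simply that each tactic becomes one explicit line of the prompt, so the model has less to guess at.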
Provide reference text
Language models can confidently invent fake answers, especially when asked about esoteric topics or for citations and URLs. In the same way that a sheet of notes can help a student do better on a test, providing reference text to these models can help in answering with fewer fabrications.
Prompt engineering - OpenAI API

  • Instruct the model to answer using a reference text
  • Instruct the model to answer with citations from a reference text
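Both tactics can be sketched as a small Python helper. The delimiter tags and instruction wording below are my own assumptions, not an official template:

```python
# Hedged sketch: wrap a reference passage in delimiters and instruct
# the model to answer only from it, quoting a supporting sentence.

def reference_prompt(reference: str, question: str) -> str:
    """Build a prompt that restricts the model to a supplied reference text."""
    return (
        "Answer using only the reference text delimited by <ref></ref>.\n"
        'If the answer is not in the reference, reply "I could not find an answer."\n'
        "Quote the sentence from the reference that supports your answer.\n"
        f"<ref>{reference}</ref>\n"
        f"Question: {question}"
    )

print(reference_prompt(
    "Hive has zero transaction fees.",
    "What does Hive charge per transaction?",
))
```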
Split complex tasks into simpler subtasks
Just as it is good practice in software engineering to decompose a complex system into a set of modular components, the same is true of tasks submitted to a language model. Complex tasks tend to have higher error rates than simpler tasks. Furthermore, complex tasks can often be re-defined as a workflow of simpler tasks in which the outputs of earlier tasks are used to construct the inputs to later tasks.
Prompt engineering - OpenAI API

  • Use intent classification to identify the most relevant instructions for a user query
  • For dialogue applications that require very long conversations, summarize or filter previous dialogue
  • Summarize long documents piecewise and construct a full summary recursively
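The piecewise-summary tactic is a workflow of simpler subtasks whose outputs feed later inputs. In this Python sketch, `ask` is a stub standing in for a real model call, and the chunk size and prompt wording are assumptions:

```python
# Sketch of decomposing one complex task (summarizing a long document)
# into simpler subtasks: summarize each chunk, then combine the results.

def ask(prompt: str) -> str:
    # Stub for an LLM call; a real system would query a model here.
    # It just returns the last line, truncated, so the workflow runs.
    return prompt.splitlines()[-1][:80]

def summarize(document: str, chunk_size: int = 500) -> str:
    # Subtask 1: summarize each chunk independently.
    chunks = [document[i:i + chunk_size]
              for i in range(0, len(document), chunk_size)]
    partial = [ask("Summarize this passage:\n" + c) for c in chunks]
    # Subtask 2: combine the partial summaries into one final summary.
    return ask("Combine these summaries into one:\n" + "\n".join(partial))
```

Each subtask is simpler than the whole, so each call tends to have a lower error rate than one giant request.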
Give the model time to "think"
If asked to multiply 17 by 28, you might not know it instantly, but can still work it out with time. Similarly, models make more reasoning errors when trying to answer right away, rather than taking time to work out an answer. Asking for a "chain of thought" before an answer can help the model reason its way toward correct answers more reliably.
Prompt engineering - OpenAI API

  • Instruct the model to work out its own solution before rushing to a conclusion
  • Use inner monologue or a sequence of queries to hide the model's reasoning process
  • Ask the model if it missed anything on previous passes
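The first tactic can be expressed as a prompt template. This Python sketch follows the grading example loosely; the exact wording is my own assumption:

```python
# Illustrative "reason first, judge second" template; wording assumed.

def grading_prompt(problem: str, student_answer: str) -> str:
    """Ask the model to compute its own solution before judging an answer."""
    return (
        "First, work out your own solution to the problem step by step.\n"
        "Then compare your solution to the student's answer.\n"
        "Only after both steps, say whether the student's answer is correct.\n"
        f"Problem: {problem}\n"
        f"Student's answer: {student_answer}"
    )

print(grading_prompt("What is 17 * 28?", "476"))
```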
Use external tools
Compensate for the weaknesses of the model by feeding it the outputs of other tools. For example, a text retrieval system (sometimes called RAG or retrieval augmented generation) can tell the model about relevant documents. A code execution engine like OpenAI's Code Interpreter can help the model do math and run code. If a task can be done more reliably or efficiently by a tool rather than by a language model, offload it to get the best of both.
Prompt engineering - OpenAI API

  • Use embeddings-based search to implement efficient knowledge retrieval
  • Use code execution to perform more accurate calculations or call external APIs
  • Give the model access to specific functions
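The embeddings-based retrieval idea can be sketched with toy vectors. In this Python example the three-dimensional embeddings are invented numbers purely for illustration; a real system would get high-dimensional vectors from an embedding model:

```python
import math

# Toy embeddings-based retrieval: rank documents by cosine similarity
# to a query vector and hand the best match to the model as context.
# All vectors here are fabricated for illustration.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

documents = [
    ([0.9, 0.1, 0.0], "Hive has zero transaction fees."),
    ([0.1, 0.9, 0.2], "Splinterlands is a game on Hive."),
]

def retrieve(query_vec):
    """Return the document text most similar to the query vector."""
    return max(documents, key=lambda d: cosine(query_vec, d[0]))[1]

print(retrieve([1.0, 0.0, 0.1]))  # the fees document scores highest
```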
Test changes systematically
Improving performance is easier if you can measure it. In some cases a modification to a prompt will achieve better performance on a few isolated examples but lead to worse overall performance on a more representative set of examples. Therefore to be sure that a change is net positive to performance it may be necessary to define a comprehensive test suite (also known as an "eval").
Prompt engineering - OpenAI API
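A comprehensive test suite ("eval") can be as simple as an accuracy loop over representative cases. In this Python sketch the `model` function is a stub so the example runs offline; the test cases and scoring rule are assumptions:

```python
# Minimal "eval" sketch: score a prompt variant over a whole test set
# instead of judging it on one or two isolated examples.

def model(prompt: str) -> str:
    # Stub standing in for a real LLM call: answers "yes" when the
    # word "fee" appears in the prompt, "no" otherwise.
    return "yes" if "fee" in prompt else "no"

test_suite = [
    ("Does Hive charge a transaction fee?", "yes"),
    ("Is water dry?", "no"),
]

def evaluate(prompt_template: str) -> float:
    """Return accuracy of a prompt template over the test suite."""
    correct = sum(
        model(prompt_template.format(q=q)) == expected
        for q, expected in test_suite
    )
    return correct / len(test_suite)

print(evaluate("Answer yes or no: {q}"))  # prints 1.0 for this stub
```

With a score per template, you can compare prompt changes and keep only those that are net positive across the whole set.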

Other References from this article Prompt engineering - OpenAI API
For more inspiration, visit the OpenAI Cookbook, which contains example code and also links to third-party resources such as:
Prompting libraries & tools
Prompting guides
Video courses
Papers on advanced prompting to improve reasoning

Since ChatGPT dropped in the fall of 2022, everyone and their donkey has tried their hand at prompt engineering—finding a clever way to phrase your query to a large language model (LLM) or AI art or video generator to get the best results or sidestep protections. The Internet is replete with prompt-engineering guides, cheat sheets, and advice threads to help you get the most out of an LLM.
In the commercial sector, companies are now wrangling LLMs to build product copilots, automate tedious work, create personal assistants, and more, says Austin Henley, a former Microsoft employee who conducted a series of interviews with people developing LLM-powered copilots. “Every business is trying to use it for virtually every use case that they can imagine,” Henley says.
AI Prompt Engineering Is Dead. Long live AI prompt engineering.

Prompt engineering is the practice of designing inputs for generative AI tools that will produce optimal outputs. Just as better ingredients can make for a better dinner, better inputs into a generative AI model can make for better results. These inputs are called prompts, and the practice of writing them is called prompt engineering. It’s clear that the more specific output has a greater chance of achieving the result you’re after. By creating a more detailed, specific request to the AI chatbot, you’ve just engineered a prompt.
What is prompt engineering?

Organizations are already beginning to make changes to their hiring practices that reflect their generative AI ambitions, according to McKinsey’s latest survey on AI. That includes hiring prompt engineers. The survey indicates two major shifts. First, organizations using AI are hiring roles in prompt engineering: 7 percent of respondents whose organizations have adopted AI are hiring roles in this category. Second, organizations using AI are hiring a lot fewer AI-related-software engineers than in 2022: 28 percent of organizations reported hiring for these roles, down from 39 percent last year.
What is prompt engineering?

Articles referenced by the article What is Prompt Engineering
“What’s the future of generative AI? An early view in 15 charts,” August 25, 2023
“The state of AI in 2023: Generative AI’s breakout year,” August 1, 2023, Michael Chui, Lareina Yee, Bryce Hall, Alex Singla, and Alex Sukharevsky
“Generative AI and the future of work in America,” McKinsey Global Institute, July 26, 2023, Kweilin Ellingrud, Saurabh Sanghvi, Gurneet Singh Dandona, Anu Madgavkar, Michael Chui, Olivia White, and Paige Hasebe
“Technology’s generational moment with generative AI: A CIO and CTO guide,” July 11, 2023, Aamer Baig, Sven Blumberg, Eva Li, Douglas Merrill, Adi Pradhan, Megha Singa, Alex Sukharevsky, and Stephen Xu
“Generative AI can give you ‘superpowers,’ new McKinsey research finds,” July 6, 2023
“Unleashing developer productivity with generative AI,” June 27, 2023, Begum Karaci Deniz, Chandra Gnanasambandam, Martin Harrysson, Alharith Hussin, and Shivam Srivastava
“The economic potential of generative AI: The next productivity frontier,” June 14, 2023, Michael Chui, Eric Hazan, Roger Roberts, Alex Singla, Kate Smaje, Alex Sukharevsky, Lareina Yee, and Rodney Zemmel
“What every CEO should know about generative AI,” May 12, 2023, Michael Chui, Roger Roberts, Tanya Rodchenko, Alex Singla, Alex Sukharevsky, Lareina Yee, and Delphine Zurkiya
“AI-powered marketing and sales reach new heights with generative AI,” May 11, 2023, Richelle Deveau, Sonia Joseph Griffin, and Steve Reis
“Generative AI is here: How tools like ChatGPT could change your business,” December 20, 2022, Michael Chui, Roger Roberts, and Lareina Yee
What is prompt engineering?