The Good, the Bad, and the Future of Prompt Engineering

Jan 10, 2024

Introduction

As artificial intelligence (AI) becomes an increasingly important part of our lives, we hear the term "prompt engineering" more frequently. But what is prompt engineering?

Prompt engineering is the practice of developing and optimizing prompts to efficiently use language models for a variety of applications.

In simpler terms:

  • Imagine a language model (such as GPT-3.5 or GPT-4) as a chef in a kitchen, about to cook something. You, as the customer, need to tell the chef what you want to eat (let's say a pizza).

  • Suppose you learn that the chef makes better pizzas when you give more details, like "Please make a cheese pizza with thin crust and extra mozzarella."

  • This process of figuring out the best way to give the order to get the best pizza is prompt engineering. It's about refining and optimizing your instructions (or prompts) so the chef (GPT) understands exactly what you want and can make it perfectly.
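The pizza analogy maps directly onto how prompts are written in practice. Here is a minimal sketch (the `build_prompt` helper and its fields are illustrative, not part of any specific library) contrasting a vague prompt with a refined one that spells out the task, audience, tone, and constraints:

```python
# A vague prompt leaves the model to guess at tone, length, and format.
vague_prompt = "Write about pizza."

# A refined prompt spells out the details, like ordering from the chef:
# what you want, how it should be made, and any constraints.
def build_prompt(task, audience, tone, constraints):
    """Compose a detailed prompt from explicit components (hypothetical helper)."""
    lines = [
        f"Task: {task}",
        f"Audience: {audience}",
        f"Tone: {tone}",
        "Constraints:",
    ]
    lines += [f"- {c}" for c in constraints]
    return "\n".join(lines)

refined_prompt = build_prompt(
    task="Write a short product description for a thin-crust cheese pizza.",
    audience="customers browsing a delivery app",
    tone="friendly and appetizing",
    constraints=["under 50 words", "mention the extra mozzarella"],
)
print(refined_prompt)
```

The refined version gives the model far less to guess about, which is exactly the refinement process the analogy describes.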

But when it comes to the future, people disagree: will prompt engineering remain an important skill for everybody, or will it become a specialized skill used mainly in certain niches?

Right now, prompt engineering is quite important for everyone. It's the difference between getting a useful answer from a language model and getting something below average or off-topic. But there's a lot of talk about where this is heading.

Many people think that as AI gets better, not everyone will need to learn how to come up with the right prompt. Some think this skill will always be useful, especially for jobs that need to be done in great detail or that are technically complex.

Today, we’re going to look at both sides of this conversation. We're going to see what makes prompt engineering a key part of using AI today and how it might change as AI gets even better at understanding us.

So, whether you're new to AI or you've been following it for a while, let's explore this question together.

2. The Historical Context of Prompt Engineering

Let's take a step back and see how prompt engineering started. Imagine the early days of asking computers to do things. We started with the simplest of commands, much like teaching a toddler single words. This was the dawn of what we now call 'prompt engineering'—knowing how to ask a computer to get a clear answer. Over the years, as AI got smarter, so did our questions.

We went from basic instructions to full conversations, where AI could understand not just the 'what' but also the ‘why’ behind our words. It's a fascinating story of growth, from the early, straightforward commands to the complex, nuanced prompts we use today, especially with large language models like GPT-3.5 and GPT-4, which can write poems or answer tricky questions.

The study "A Brief History of Prompt: Leveraging Language Models Through Advanced Prompting," available on ResearchGate, provides a closer look at this evolution. It traces how we've moved from simple interactions to complex dialogues with AI, shaping the way we communicate with these systems.

The role of prompt engineering may change as AI keeps getting better, which makes people wonder how important it will be in the future. Will it remain an essential skill, or will AI get smart enough that we won't need such specific knowledge?

3. The Evolution of Prompts in Relation to NLP and LLM Advancements

With early chatbots, prompts needed to be basic due to limited language capabilities. Systems like ELIZA could only respond by pattern-matching keywords, so prompts had to closely match pre-defined interaction templates. More natural prompts emerged as neural networks enabled stronger understanding.

Transformer models like BERT then enabled multi-sentence prompts. But massive models like GPT-3 required less context again, demonstrating strong generalization.

So while advances can reduce the need for prompting complexity on average, unique model strengths and use cases also influence ideal prompts.

Here are a few key trends:

  • Early systems like rule-based bots and narrow AI required very simple, directive prompts due to limited understanding capabilities. One short question was typically best.

  • The introduction of deep learning allowed for some additional context and complexity, with prompts commonly 1-2 short sentences. But ambiguity remained an issue.

  • Transformer models like BERT massively increased understanding of longer-form text. Prompts grew to include richer context and setup across multiple sentences.

  • With massive pre-trained LLMs, the trend reversed again. Models like GPT-3 demonstrated such broad abilities that simple, concise prompts often suffice on their own due to inherent context understanding.

  • While very large models like GPT-3 do have strong general understanding, newer models may not surpass earlier versions in all ways due to different training strategies or datasets.

  • As AI models get better, we're seeing more use of longer, detailed prompts for topics that need extra explanation.

While advances tend to decrease needed prompt complexity on average, individual model strengths/weaknesses and application needs are also important.

4. The Present State: The Good in Prompt Engineering

Right now, prompt engineering is proving to be quite valuable. It's like having a map in a foreign city—it guides AI to give us the answers we need. Let's say you're using a language model to write an article; prompt engineering helps you get not just any article, but one that fits exactly what you're looking for in terms of tone, style, and content.

This LinkedIn post by Jay Dang provides another insightful take on prompt engineering. It suggests that as AI models improve, the prompts we use to interact with them may become more complex.

This is somewhat counterintuitive; one might expect that better AI would understand us more easily, requiring simpler prompts, but the trend appears to be the opposite.

Jay compares this to the early days of computer programming, where advancements in technology didn't lead to shorter code, but instead allowed programmers to tackle more complex problems.

Similarly, as AI becomes more sophisticated, we are not simplifying our communication with it. Instead, we are using the enhanced capabilities to ask more detailed and deep questions, seeking more comprehensive answers.

This perspective highlights the immediate benefit of prompt engineering: it empowers us to make the most of what AI can offer.

Whether for creative projects, technical tasks, or everyday queries, the ability to craft effective prompts can unlock AI's full potential.

Despite the uncertainty about its future, prompt engineering is a powerful tool today, helping to steer AI interactions towards more meaningful outcomes.

5. The Present State: The Bad in Prompt Engineering

While prompt engineering unlocks AI's capabilities, it also comes with some downsides in its current form. Crafting good prompts can be challenging and requires skill - it's not as simple as typing a few words.

Let's look at some of the difficulties:

Time Consuming

Coming up with well-structured prompts takes time and effort. You often have to experiment with different phrasings and approaches before landing on something optimal. This makes prompt engineering less accessible for casual users who want quick answers.

Requires Additional Knowledge

To write effective prompts, you need some expertise - understanding the model's abilities, its training data, and how to frame requests logically. This creates a knowledge gap between specialists who can prompt well and regular users who struggle without this background.

Over-Reliance on Model's Interpretation

There's also a potential over-reliance on the model's interpretation and response generation capabilities. Users might assume that the model fully understands the context or nuances of a prompt, which is not always the case. This can lead to overconfidence in the responses, especially in scenarios where precision and accuracy are critical.

Cost of Ready-to-Use Prompt Templates

For those who lack the expertise or time to craft effective prompts, there are marketplaces offering pre-made prompt templates tailored for various applications. However, the cost of high-quality prompt templates is a notable concern. For a well-crafted prompt, prices can range between $1 to $5 each, meaning a bundle of 50 top-notch templates could cost as much as $250. This can be very expensive, especially for regular users who might not have a big budget.

6. The Future: Will Advancing AI Make Prompt Engineering Obsolete?

As AI grows more advanced, some believe prompt engineering will fade in importance. The idea is that models will become so conversant that they won't need carefully crafted prompts to understand requests and provide relevant responses. Do you agree with this view? There are good arguments on both sides.

The Case For Less Need

Here are some reasons future AI may require less specialized prompting:

  • Models pre-trained on exponentially more data become better at inferring context and intent from minimal input.

  • Architectures like memory networks and knowledge graphs allow models to tap into shared facts and connections between concepts.

  • Models that can ask clarifying questions enable two-way dialogue when additional context is needed.

  • Personalization allows models to learn an individual user's preferences and quirks over time.

  • Improved common sense capabilities fill in gaps more accurately based on real-world understanding.

The Case For Continued Need

However, there are also factors indicating prompting skills will remain important:

  • New capabilities unlock new complex use cases with nuanced requirements beyond surface-level requests.

  • Training limitations and biases mean models still make mistakes prompting can help avoid.

  • Individual users have unique communication styles and contexts models cannot fully generalize.

  • Creative pursuits often require detailed prompting to achieve a specific vision.

  • For safety, transparency and control, some may prioritize direction over full autonomy.

The future likely holds a bit of both - day-to-day prompts becoming simpler while prompt engineering remains key for intricate applications. As with programming, new capabilities tend to supplement rather than replace existing skills. The line between model knowledge and human guidance must be continually recalibrated.

7. To Learn or Not to Learn Prompt Engineering?

Given the uncertainty around prompting's future role, is it worth investing time now to learn prompt engineering skills?

There are good reasons both for and against:

Reasons to Learn Prompt Engineering

  • Prompting is invaluable for current AI systems and will retain value for complex tasks.

  • Skills like being exact and showing examples will always be useful.

  • It builds intuition for communicating with language models.

  • We should take advantage of what works well today, while being prepared to incorporate new strategies.
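One of those skills, showing examples, is commonly known as few-shot prompting. As a minimal sketch (the `few_shot_prompt` helper is illustrative, not from any particular SDK), a few-shot prompt can be assembled by prepending worked examples before the real query:

```python
# Few-shot prompting: prepend worked examples so the model infers the
# desired input/output pattern before seeing the real query.
def few_shot_prompt(instruction, examples, query):
    """Assemble an instruction, labeled examples, and a new query into one prompt."""
    parts = [instruction, ""]
    for inp, out in examples:
        parts.append(f"Input: {inp}")
        parts.append(f"Output: {out}")
        parts.append("")
    parts.append(f"Input: {query}")
    parts.append("Output:")
    return "\n".join(parts)

prompt = few_shot_prompt(
    instruction="Classify the sentiment of each review as positive or negative.",
    examples=[
        ("The crust was perfect and the service was fast.", "positive"),
        ("Cold food and a forty-minute wait.", "negative"),
    ],
    query="Great toppings, but the base was soggy.",
)
print(prompt)
```

The examples establish the pattern and the trailing "Output:" invites the model to complete it, a habit of being exact that transfers across model generations.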

Reasons Not to Learn Prompt Engineering

  • Your time might be better spent deepening knowledge of your own field, which in turn helps you write better prompts.

  • If AI gets better at talking like humans, using prompts might get simpler.

  • Other skills like problem solving, visual thinking or stakeholder reasoning are more future-proof.

While prompting's future is uncertain, its value today is clear. Even if its importance decreases over time, the skills learned can still improve your use of language models today.

8. Should I buy a course or a book, or learn for free from YouTube or Google?

If you feel learning this skill isn't essential for you right now, you might choose to skip this part. But if you want to get better at this, it's good to know the good and bad sides of each way to learn.

Let's look at what each choice offers.

Courses and books

Pros:

  • Structured curriculum progresses logically to build skills efficiently

  • Vetted information focuses on key concepts and best practices

  • Opportunity to ask questions and get feedback

  • Synthesized knowledge rather than fragmented information

Cons:

  • Costs money to purchase, which can be a significant amount for some people.

  • Sometimes the material might not match your specific learning style or needs.


Free YouTube Resources

Pros:

  • Completely free access. Here are some useful channels to start: AssemblyAI, @engineerprompt

  • Learn at your own schedule and pace

  • Wide range of styles and approaches from different creators

  • Can focus on specific skills as needed

Cons:

  • Information quality varies greatly

  • No structured progression or feedback

  • Harder to identify key concepts and best practices

  • Have to synthesize fragmented information yourself

Paid resources offer efficiency, structure and expert guidance, while YouTube provides flexibility, customization and free access. Evaluating your goals, needs and budget can help determine the best option or balance of approaches.

Conclusion

Let's wrap up what we've learned about prompt engineering. This skill is about how we talk to AI to get the most out of it. But, as AI gets better, will everyone still need to know how to do this?

In the future, we can think about two main groups of people using AI:

  1. General Users: These are most people who use AI. They might not need to know a lot about prompt engineering. AI will get better at understanding simple instructions, making it easier for everyone to use.

  2. Specialists or Niche Users: These are people who really want to get the most out of AI. For them, knowing how to make detailed prompts is very important. They use this skill to do much more with AI.

So, what does this mean for you? Well, if you're just using AI for everyday things, you might not need to learn much about prompt engineering. But, if you're someone who wants to explore and do more advanced stuff with AI, then understanding prompt engineering is really useful.

What is certain, however, is that its journey, controversial as it is, won't be over anytime soon, for better or for worse.

Frequently Asked Questions (FAQs)

  1. How do I become a prompt engineer?

    To become a prompt engineer, start by learning the basics of AI and natural language processing. Gain experience with AI models like ChatGPT by experimenting with different prompts. Enhance your skills through practice, online courses, and staying updated with the latest AI advancements.


  2. Is it easy to learn prompt engineering?

    Learning prompt engineering can be straightforward if you have a keen interest in AI and language. It requires understanding how AI interprets language, which can be learned through practice and study. However, mastering it takes time and experimentation.


  3. How to learn prompt engineering for ChatGPT?

    To learn prompt engineering for ChatGPT, familiarize yourself with its capabilities and limitations. Practice crafting prompts and observe ChatGPT's responses. Utilize online resources, tutorials, and community forums to improve your skills and understand advanced techniques.


  4. How does ChatGPT understand prompts?

    ChatGPT understands prompts using natural language processing algorithms. It analyzes the text, determines the intent, and generates a response based on its training data and programmed rules. The model's understanding is limited to its training and doesn't imply human-like comprehension.
