GPT-3 Tips and Tricks for Prompt Engineering

In this post I’ll briefly explain what Prompt Engineering is and why it matters, and share some tips and tricks to help you do it well. While I doubt traditional programming is going away anytime soon, I do predict that Prompt Engineering is going to be a very important part of most developers’ toolboxes. Prompt Engineering allows developers to implement natural language understanding and soft decision-making processes that would otherwise be difficult or impossible to build.

I’ve been fortunate enough to spend time integrating GPT-3 into a complex product. A significant portion of this time was spent doing “Prompt Engineering”, in which you convince a Large Language Model (LLM) like GPT-3 that it is writing a document whose structure and content cause it to perform your desired task. This document, called the “prompt”, often contains instructions and examples of what you’d like the LLM to do.

The example prompt gives some context (This is a list of startup ideas:) followed by a few-shot list of example ideas, numbered 1 through 4. The most likely token to come next in the document is a space, followed by a brilliant new startup idea involving Machine Learning, and indeed, this is what GPT-3 provides: “An online service that lets people upload a bunch of data, and then automatically builds a machine learning model based on that data.” (Ideas 1-4 are also based on ones suggested by GPT-3 from previous iterations of this prompt.)
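
To make this concrete, here is a minimal sketch of how such a prompt might be assembled and sent with the openai Python package as it existed at the time (the Completion endpoint with the davinci engine). The four example ideas and the sampling parameters below are illustrative placeholders, not the exact ones from the prompt described above.

```python
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

# Context line plus a few-shot list of example ideas.
# The ideas below are placeholders, not the ones from the original prompt.
prompt = """This is a list of startup ideas:

1. A service that automatically tags and organizes photo libraries using computer vision.
2. A tool that predicts customer churn from the text of support tickets.
3. An API that writes product descriptions from a few bullet points.
4. A browser extension that summarizes long articles with a language model.
5."""

response = openai.Completion.create(
    engine="davinci",   # base GPT-3 engine
    prompt=prompt,
    max_tokens=64,
    temperature=0.7,
    stop="\n",          # stop once the single generated idea is complete
)

print(response["choices"][0]["text"].strip())
```

Ending the prompt with “5.” invites the model to continue the numbered list, and the newline stop sequence keeps the completion to a single idea rather than letting it run on through ideas 6, 7, and so on.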
