
Commit

Chapter 4 Content Complete
nitya committed Oct 17, 2023
1 parent 1c6dd56 commit 365ffe3
Showing 29 changed files with 243 additions and 110 deletions.
1 change: 1 addition & 0 deletions .env.copy
@@ -0,0 +1 @@
OPENAI_API_KEY=
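
This change adds an empty `OPENAI_API_KEY` placeholder to `.env.copy`. As a hedged illustration (not part of this commit), a notebook could pick up the key after you copy `.env.copy` to `.env` and fill it in, assuming `python-dotenv` is installed:

```python
# Minimal sketch: load OPENAI_API_KEY from a local .env file.
# Assumes you copied .env.copy to .env and installed python-dotenv.
import os

from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from .env into the process environment
api_key = os.getenv("OPENAI_API_KEY")
if not api_key:
    raise RuntimeError("Set OPENAI_API_KEY in .env before running the notebooks.")
```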
43 changes: 26 additions & 17 deletions 4-prompt-engineering-fundamentals/4.0-introduction.md
Expand Up @@ -11,30 +11,39 @@ CODE CHALLENGE:
If provided, should have an education focus - help show how the concepts can be applied to make the lives of teachers and students easier.
-->

## 4.1 Introduction
We know that _Generative AI_ is capable of creating new content (e.g., text, images, audio, code, etc.) in response to user requests. It achieves this using _Large Language Models_ (LLMs), like OpenAI's GPT ("Generative Pre-trained Transformer") series, that are trained on natural language and code.

Our education startup is focused on ways in which we can [bring AI innovation to education](https://educationblog.microsoft.com/2023/06/collaborating-to-bring-ai-innovation-to-education) to support _administrators, educators and students_ with generative AI experiences. We're focusing on **personalized learning experiences** that can be activated with natural language (chat) interfaces so our users don't need any technical expertise to start using them.
Users can now interact with these models using familiar interfaces and paradigms ("chat") without needing technical expertise or training. The models are **prompt-based** - users send a text input (a "prompt") with a request, and the AI responds with a text output (a "completion"). Users can then "talk to the AI" iteratively, in multi-turn conversations, refining the prompt until the returned results meet their expectations.
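
As a rough sketch of this prompt/completion loop (not code from this lesson), here is how a single turn and a follow-up refinement might look with the OpenAI Python SDK (v1-style client); the model name and messages are illustrative:

```python
# Hedged sketch of the prompt -> completion loop; assumes openai>=1.0
# and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Single turn: send a prompt, print the completion.
messages = [{"role": "user", "content": "Explain photosynthesis to a 10-year-old."}]
completion = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
print(completion.choices[0].message.content)

# Multi-turn refinement: append the model's reply, then ask a follow-up
# that narrows the request until the response meets expectations.
messages.append({"role": "assistant", "content": completion.choices[0].message.content})
messages.append({"role": "user", "content": "Shorten that to three sentences and add one analogy."})
refined = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
print(refined.choices[0].message.content)
```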

Prompts are now the primary "programming interface" for generative AI apps. They tell the model what to do, shaping its behavior and the quality of its responses without changing the model's underlying weights. **Prompt Engineering** is now a fast-growing field of study focused on the _design and optimization_ of these prompts to drive improved user experiences and response quality at scale.

By now, we're familiar with these two terms:
- **Generative AI** is a category of artificial intelligence capable of _generating new content based on pre-trained models_ in response to a natural language input or "prompt".
- **Large Language Models (LLMs)** are a type of generative AI trained on massive quantities of text data to execute natural language processing (NLP) tasks at scale.

In this lesson, we're going to introduce a third term - **Prompt Engineering** - that reflects a new field of engineering focused on _more effective prompt design_ with tools and techniques to guide LLMs to deliver more relevant and consistent results for our generative AI applications.
- We'll define prompt engineering - and motivate the need to design better prompts.
- We'll explore prompt usage with examples - and understand opportunities and limitations.
- We'll examine design techniques - and learn to iterate and validate prompts to meet expectations.

We'll learn how to craft more effective prompts for a given model and objective, discuss best practices for prompt engineering, and use an interactive sandbox (Jupyter Notebooks with an OpenAI or Azure OpenAI API endpoint) to explore these techniques with real examples.

## Learning Goals
By the end of this lesson you will be able to:
1. _Define Prompt Engineering_ - Explain what it is & why it matters.
1. _Describe Prompt Components_ - Understand prompt structure & usage.
1. _Learn Best Practices_ - Know basic prompt engineering strategies & tactics.
1. _Apply Learned Techniques_ - Work through real examples in interactive exercises.

## Learning Sandbox

The key takeaway from this lesson is that **prompt engineering is more art than science**. The best way to learn is through a trial-and-error approach that combines your intuition and domain expertise with the recommended techniques and model-specific optimizations. To help you, we introduce a _sandbox environment_ with three components (see the setup sketch after this list):

- Jupyter Notebooks - Python-based prompt authoring & execution environment
- OpenAI API key - Active service endpoint for the LLM (e.g., GPT-3.5, GPT-4)
- Dev Containers - Consistent runtime in GitHub Codespaces or Docker Desktop
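
As one hedged illustration of how the three components above could fit together (the environment variable names and API version here are assumptions, not the course's exact setup), a notebook might construct a client for either an OpenAI or an Azure OpenAI endpoint, assuming the v1+ `openai` Python SDK:

```python
# Sketch only: choose an OpenAI or Azure OpenAI client from environment variables.
# The AZURE_OPENAI_* names and the api_version value are illustrative assumptions.
import os

from openai import AzureOpenAI, OpenAI


def make_client():
    """Return an LLM client, preferring Azure OpenAI when its endpoint is configured."""
    if os.getenv("AZURE_OPENAI_ENDPOINT"):
        return AzureOpenAI(
            api_key=os.environ["AZURE_OPENAI_API_KEY"],
            api_version="2023-07-01-preview",  # example API version
            azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        )
    return OpenAI(api_key=os.environ["OPENAI_API_KEY"])
```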

We provide _starter exercises_ to help you explore these techniques in practice. However, the real value of the sandbox is that you can author new prompts or modify existing ones, then test them interactively, to build your own intuition for effective prompt design.

Once we have an understanding of prompt engineering fundamentals, we can answer this question for our education startup: _How can we engineer prompts to deliver a more reliable and consistent experience for our students, educators and administrators using our generative AI app?_
## Our Startup
Before we dive into the lesson, let's think about how this topic relates to our broader mission to [bring AI innovation to education](https://educationblog.microsoft.com/2023/06/collaborating-to-bring-ai-innovation-to-education). Our startup is building a suite of AI-powered applications to drive **personalized learning experiences**, targeting three kinds of users: _administrators, educators and students_.

Think about how these users can "program" our AI-powered applications by authoring prompts focused on accomplishing a specific task or objective (see the prompt sketch after this list). For example:
- An administrator can ask the AI to _analyze curriculum data to identify gaps in coverage_. The AI can summarize results in text, or visualize them with code.
- An educator can ask the AI to _generate a lesson plan for a target audience and topic_. The AI can build the personalized plan and deliver it in a desired format.
- A student can ask the AI to _tutor them in a difficult subject_. The AI can act like a coach, guiding students with lessons, hints & examples tailored to their level.
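
As an illustration only (the model, audience and wording are assumptions), the educator scenario above might translate into a prompt like the following, using a "system" message to set the role and a "user" message to state the task:

```python
# Illustrative prompt for the educator scenario; model, audience and topic
# are assumptions for the example, not the course's prescribed prompt.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system",
         "content": "You are a curriculum assistant for middle-school science teachers."},
        {"role": "user",
         "content": "Generate a 45-minute lesson plan on the water cycle for 7th graders, "
                    "formatted as a table with objectives, activities and assessment."},
    ],
)
print(response.choices[0].message.content)
```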

And that's just the tip of the iceberg. Check out [Prompts For Education](https://github.com/microsoft/prompts-for-edu/tree/main) - an open-source repository curated by experts for use in education applications, with both prompt engineering and responsible AI practices in mind. Once you complete the code challenge section of this lesson, find more prompts in that repo and try them out in the sandbox.
