This is the repository for the LinkedIn Learning course Controlling ChatGPT with Custom Instructions or API System Messages. The full course is available from LinkedIn Learning.
Custom Instructions in ChatGPT allow you to provide additional context and even specify the format of the responses you want from the system, applied every time you interact with it. In the OpenAI Chat Completions API you can access the same functionality using the System and Assistant messages. In this short course, learn how to take advantage of these features to get more consistent and customizable responses from both systems.
This repository provides examples of how to use message roles and the instructions parameter to control the output of OpenAI's language models.
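For orientation, here is a minimal sketch of both approaches using the official `openai` Python package. The model name and prompt text are placeholders for illustration; the scripts in this repository show the exact setup used in the course.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Chat Completions API: a system message sets standing instructions
completion = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system", "content": "You are a friendly tutor. Keep answers to two sentences."},
        {"role": "user", "content": "What does a system message do?"},
    ],
)
print(completion.choices[0].message.content)

# Responses API: the instructions parameter plays the same role
response = client.responses.create(
    model="gpt-4o-mini",  # placeholder model name
    instructions="You are a friendly tutor. Keep answers to two sentences.",
    input="What does the instructions parameter do?",
)
print(response.output_text)
```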
Note
The easiest way to run and interact with the examples is by opening this repository in GitHub Codespaces.
The examples are found in two folders:
- `./CompletionsAPI` has examples using GitHub Models and the Completions API.
- `./ResponsesAPI` has examples using OpenAI's API and the Responses API.
If you're running the repository in GitHub Codespaces, the examples in ./CompletionsAPI will work out of the box without any further setup. GitHub Models authentication is handled automatically in the Codespace, so the scripts generate completions as soon as you run them.
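If you want to see what that looks like under the hood, the sketch below is roughly how a script can point the OpenAI client at GitHub Models using the `GITHUB_TOKEN` that Codespaces provides. The endpoint URL and model name here are assumptions for illustration; the scripts in ./CompletionsAPI contain the actual values.

```python
import os
from openai import OpenAI

# In a Codespace, GITHUB_TOKEN is available automatically and can be
# used to authenticate against GitHub Models (endpoint shown is illustrative).
client = OpenAI(
    base_url="https://models.inference.ai.azure.com",
    api_key=os.environ["GITHUB_TOKEN"],
)

completion = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[
        {"role": "system", "content": "Answer like a pirate."},
        {"role": "user", "content": "What is a system message?"},
    ],
)
print(completion.choices[0].message.content)
```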
To run the examples in ./ResponsesAPI you first need to add an OpenAI API key to your environment:
- Go to https://platform.openai.com and sign in or sign up
- Generate a new key at https://platform.openai.com/api-keys
- Copy the key (you only get to see it once).
- In Codespaces, open Terminal
- Set up a new environment variable like this:
```bash
export OPENAI_API_KEY="your_api_key_here"
```

The examples are self-contained and run in the Terminal with the `python` command:
- Open Terminal
- Navigate to one of the folders:
```bash
cd CompletionsAPI
```
- Run a script using Python:
```bash
python basic-response.py
```

You are free to experiment by modifying and expanding these examples in any way you like!
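For example, one direction to try is combining a system message with an assistant message that acts as a worked example of the format you want back. This is a rough sketch; the model name and wording are just examples:

```python
from openai import OpenAI

client = OpenAI()

completion = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name
    messages=[
        {"role": "system", "content": "You summarize topics as exactly three bullet points."},
        # A worked example: the assistant message shows the desired response format
        {"role": "user", "content": "Summarize what a system message is."},
        {"role": "assistant", "content": "- Sets standing instructions\n- Applies to the whole conversation\n- Shapes tone and format"},
        {"role": "user", "content": "Summarize what custom instructions are."},
    ],
)
print(completion.choices[0].message.content)
```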