Walkthrough of Using The ChatGPT API

ChatGPT, and AI in general, are absolutely everywhere in 2023. In the span of six months, ChatGPT’s gone from being a clever gimmick, on par with something like Midjourney, to being used by some of the largest corporations on Earth, including Coca-Cola, Duolingo, Shopify, and Slack.

OpenAI has since made ChatGPT available as an API, allowing developers to integrate ChatGPT into their code and applications. With that in mind, we’ve put together a walkthrough for getting started with the ChatGPT API and trying it out for yourself.

Note that, at the time of this writing, all ChatGPT API calls cost something, so you’ll need some credits in your account to complete this tutorial.

Getting Started With the ChatGPT API

Get an OpenAI API Key

To start, you’ll need to sign up for the OpenAI Platform. Once you’ve signed up, log in to your account. Once logged in, click on the Personal tab in the top-right corner. Select View API Keys from the dropdown menu. Click Create new secret key and give it a name. We called ours TestKey. Make sure to write down your API key, as you won’t be able to see it again.
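Wherever you keep the key, avoid hard-coding it into files you might commit or share. One common approach (an optional convention, not something the signup flow requires) is to store the key in an environment variable and read it at runtime, along these lines:

import os
import openai

# Assumes you've already exported the key in your shell,
# e.g. export OPENAI_API_KEY="sk-..."
openai.api_key = os.environ["OPENAI_API_KEY"]

The rest of this walkthrough uses a YOUR_API_KEY placeholder for simplicity, but the environment-variable approach works anywhere you'd otherwise paste the key.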

Using the ChatGPT API

The OpenAI API uses the GPT-3.5-turbo and GPT-4 models that drive ChatGPT and ChatGPT Plus. These models are capable of understanding and interpreting natural language. The ChatGPT API is primarily meant for chat but can also be used for text completion.

Using the ChatGPT API for Chat Completion

To start, you need to configure the API so that it’s ready to receive an API call. Here’s an example of what that looks like:

import openai

openai.api_key = "YOUR_API_KEY"

completion = openai.ChatCompletion.create(
  model = "gpt-3.5-turbo",
  temperature = 0.8,
  max_tokens = 2000,
  messages = [
    {"role": "system", "content": "You are a funny comedian who tells dad jokes."},
    {"role": "user", "content": "Write a dad joke related to numbers."},
    {"role": "assistant", "content": "Q: How do you make 7 even? A: Take away the s."},
    {"role": "user", "content": "Write one related to programmers."}
  ]
)

print(completion.choices[0].message)

Make sure you have the OpenAI Python library installed before running this script. If you don't, install it with pip install openai. Also replace YOUR_API_KEY with the API key you generated earlier.


If everything’s working as it should, you should get the following result:

{
    "content": "Q: Why do programmers prefer dark mode?\nA: Because light attracts bugs!",
    "role": "assistant"
}

There you have it, your very own programmer humor bot written in Python.

Let's look at that code a little more closely. The most important aspect of a ChatGPT API call is the role assigned to each message. These roles tell ChatGPT how it should behave, who is speaking, and the context of the question, which lets it know how to respond.

system sets the behavior and persona of the virtual assistant. user messages are the prompts, generally provided by the end user, though developers have some influence over them. assistant messages are replies the model has already given (or examples you supply yourself), and they provide the context the model draws on for its next response.
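To see how these roles fit together across turns, here's a minimal sketch (using the same library and placeholder key as above, with a hypothetical follow-up question) that feeds the assistant's first reply back into the message list so the model has the full conversation as context:

import openai

openai.api_key = "YOUR_API_KEY"

messages = [
  {"role": "system", "content": "You are a funny comedian who tells dad jokes."},
  {"role": "user", "content": "Write a dad joke related to numbers."}
]

# First turn: the model replies in the assistant role.
first = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
reply = first.choices[0].message.content

# Append the assistant's reply plus a follow-up user message,
# then call the API again with the whole history.
messages.append({"role": "assistant", "content": reply})
messages.append({"role": "user", "content": "Now write one about programmers."})

second = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
print(second.choices[0].message.content)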

max_tokens and temperature can also be used to further customize responses. max_tokens limits the length of the response. At the time of writing, GPT-3.5-turbo's limit is 4,096 tokens, and GPT-4 has a limit of 8,192 tokens. Higher temperatures result in more randomness, while lower temperatures return more controlled responses. Temperatures range from 0 to 2.
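If you want a feel for what temperature actually does, one way (a quick experiment, not part of the original example) is to send the same prompt at a low and a high temperature and compare the replies:

import openai

openai.api_key = "YOUR_API_KEY"

prompt = [{"role": "user", "content": "Write a dad joke related to numbers."}]

for temp in (0.0, 1.5):
  completion = openai.ChatCompletion.create(
    model = "gpt-3.5-turbo",
    temperature = temp,   # 0 is close to deterministic, 2 is maximum randomness
    max_tokens = 100,     # keeps the reply short
    messages = prompt
  )
  print(f"temperature={temp}: {completion.choices[0].message.content}")

At temperature 0 you'll tend to get the same joke on every run, while 1.5 wanders much more.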

Using the ChatGPT API for Text Completion

GPT-3.5-turbo is also quite good at text completion. Here's how to configure the ChatGPT API to complete text via an API call.

import openai

openai.api_key = "YOUR_API_KEY"

completion = openai.ChatCompletion.create(
  model = "gpt-3.5-turbo",
  temperature = 0.8,
  max_tokens = 2000,
  messages = [
    {"role": "system", "content": "You are a poet who creates poems that evoke emotions."},
    {"role": "user", "content": "Write a short poem for programmers."}
  ]
)

print(completion.choices[0].message.content)

For text completion, you don’t even need to provide the system role or its contents. You only need to provide the input, which could look something like this:

messages = [
  {"role": "user", "content": "Write a short poem for programmers."}
]

Running this code should return a short poem about the wonders of programming:

In lines of code, a world unfolds,
Where logic weaves and dreams take hold.
Syntax dances, elegant and precise,
Creating wonders, a digital paradise.

Variables hold stories, values untold,
Loops and branches, secrets unfold.
Bug by bug, we strive to mend,
To craft solutions, until the end.

Programmers, poets of the digital age,
With keyboards as brushes, we engage.
With each line typed, a masterpiece begins,
A symphony of bytes, where innovation wins.

Understanding ChatGPT API’s Response Format

Responses from the ChatGPT API follow this format:

{
    "choices": [
        {
            "finish_reason": "stop",
            "index": 0,
            "message": {
                "content": "Max Ernst was one of the most influential surrealists.",
                "role": "assistant"
            }
        }
    ],
    "created": 1677649420,
    "id": "chatcmpl-6p9XYPYSTTRi0xEviKjjilqrWU2Ve",
    "model": "gpt-3.5-turbo-0301",
    "object": "chat.completion",
    "usage": {
        "completion_tokens": 10,
        "prompt_tokens": 10,
        "total_tokens": 20
    }
}

The response from the ChatGPT API delivers the payload, the content, and a fair amount of metadata. This metadata includes when the response was created and which model was used. It also tells you how many tokens the call consumed, broken down into prompt_tokens, completion_tokens, and total_tokens.
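Because billing is per token, it can be worth logging the usage block on every call. Here's a minimal sketch (same library and placeholder key as the earlier examples, with a made-up prompt):

import openai

openai.api_key = "YOUR_API_KEY"

response = openai.ChatCompletion.create(
  model = "gpt-3.5-turbo",
  messages = [{"role": "user", "content": "Who painted Guernica?"}]
)

# The usage block breaks down how many tokens the prompt and the
# completion consumed, which is what you're billed for.
usage = response["usage"]
print(f"prompt: {usage['prompt_tokens']}, "
      f"completion: {usage['completion_tokens']}, "
      f"total: {usage['total_tokens']} tokens")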

How to Build an Application Using ChatGPT API

You can also integrate ChatGPT into your own applications, either by calling the API endpoint directly or by using the OpenAI Python library. Let's take a look at how each approach works.

Using an API Endpoint

You’ll need to call the /v1/chat/completions endpoint to use the GPT-3.5-turbo and GPT-4 models. Here’s some sample code to give you an idea of what that looks like:

import requests

api_key = "YOUR_API_KEY"
URL = "https://api.openai.com/v1/chat/completions"

payload = {
  "model": "gpt-3.5-turbo",
  "temperature": 1.0,
  "messages": [
    {"role": "system", "content": "You are an assistant who tells any random and very short fun fact about this world."},
    {"role": "user", "content": "Write a fun fact about programmers."},
    {"role": "assistant", "content": "Programmers drink a lot of coffee!"},
    {"role": "user", "content": "Write one related to the Python programming language."}
  ]
}

headers = {
  "Content-Type": "application/json",
  "Authorization": f"Bearer {api_key}"
}

response = requests.post(URL, headers=headers, json=payload)
response = response.json()

print(response['choices'][0]['message']['content'])

This is an example of calling the OpenAI API from a program via the requests library. First, assign your API key to the api_key variable, which is then passed in the Authorization header. You also need to specify which model you're using under the model parameter of payload. The payload also sets the temperature so that you can dictate the level of randomness of the response.

Running this script requests a random fact about the Python programming language from GPT-3.5-turbo. It should return something like the following:

Did you know that Python was named after the British comedy series "Monty Python's Flying Circus"? Guido van Rossum, the creator of Python, was a fan of the show and chose the name to pay homage to it. This unique naming choice has given Python a memorable and distinctive identity in the world of programming.
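When you're hitting the endpoint directly with requests, it's also a good idea to check the HTTP status before parsing the body, since an invalid key or a rate limit comes back as an error payload rather than a list of choices. A rough sketch, reusing the URL, headers, and payload from the example above:

response = requests.post(URL, headers=headers, json=payload)

if response.status_code != 200:
  # The API responds with an "error" object describing what went wrong.
  print("Request failed:", response.status_code, response.json().get("error"))
else:
  data = response.json()
  print(data["choices"][0]["message"]["content"])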

Using the OpenAI Library

You can also create programs specifically to interact with ChatGPT using the OpenAI library. First, you need to install the library using pip. Input the following into your terminal:

pip install openai

Now you can use the official OpenAI library in your code. Some sample code looks like this:

import openai

openai.api_key = "YOUR_API_KEY"

response = openai.ChatCompletion.create(
  model = "gpt-3.5-turbo",
  temperature = 0.2,
  max_tokens = 1000,
  messages = [
    {"role": "user", "content": "Who is Piet Mondrian?"}
  ]
)

print(response['choices'][0]['message']['content'])

Running this code should return something similar to the following:

Piet Mondrian was a Dutch painter and one of the pioneers of abstract art. He was born on March 7, 1872, in Amersfoort, the Netherlands, and passed away on February 1, 1944, in New York City, United States.

Mondrian started his artistic career as a landscape and still-life painter influenced by traditional artistic styles. However, over time, he transitioned towards abstraction and became known for his distinctive style characterized by geometric shapes, primary colors (red, blue, yellow), and a grid-like structure.
...

Final Thoughts on Using the ChatGPT API

If recent trends are any indication, AI is here to stay. ChatGPT is just the tip of the iceberg, and it’s spread unbelievably far and fast in 2023. It’s already worked its way into an impressive range of businesses and workplaces. Like any other business innovation, however, especially digital innovations, remember to tread lightly and carefully.

Do you remember the cautionary tales around social media automation, some going back almost ten years to the height of the social media feeding frenzy? Leaving automation in charge of your reputation, with no safeguards or fail-safes in place, would be the height of madness.

ChatGPT is essentially automation on steroids. It can be very useful and highly entertaining when used wisely. But remember to keep an eye on it: ChatGPT lies and confidently hallucinates, and many of its imaginings seem entirely plausible, which makes them difficult to separate from reality.

It’s important to remember ChatGPT is not a fact-checking machine — not yet. It’s simply returning the most likely response based on the data it has. ChatGPT has a lot of potential and has come a long way in a very short time. Just remember to be careful with it and use it prudently.