
Connect to OpenAI

Set up an OpenAI LLM connector.

This page provides step-by-step instructions for setting up an OpenAI connector for the first time. This connector type lets you use OpenAI's large language models (LLMs) within Kibana. You'll first select a model and create an OpenAI API key, then configure the connector in Kibana.

Configure OpenAI

Select a model

Before creating an API key, choose a model. Refer to the OpenAI docs to compare the available models, and take note of the specific model name (for example, gpt-4-turbo); you'll need it when configuring the connector in Kibana.

Note

GPT-4 Turbo offers improved performance over the earlier GPT-4 and GPT-3.5 models, which are also supported.
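
If you want to confirm the exact model name programmatically, OpenAI's models endpoint returns the model IDs available to your account. The following sketch is illustrative only: it assumes you already have an API key (created in the next section), that the key is exported as an OPENAI_API_KEY environment variable, and that the requests library is installed.

```python
# Sketch: list the model IDs available to your OpenAI account so you can
# copy the exact name (for example, gpt-4-turbo) into Kibana later.
# Assumes an API key exported as OPENAI_API_KEY and the requests library.
import os

import requests

api_key = os.environ["OPENAI_API_KEY"]

resp = requests.get(
    "https://api.openai.com/v1/models",
    headers={"Authorization": f"Bearer {api_key}"},
    timeout=30,
)
resp.raise_for_status()

for model in resp.json()["data"]:
    print(model["id"])
```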

Create an API key

To generate an API key:

  1. Log in to the OpenAI platform and navigate to API keys.
  2. Select Create new secret key.
  3. Name your key, select an OpenAI project, and set the desired permissions.
  4. Click Create secret key and then copy and securely store the key. It will not be accessible after you leave this screen.

The following video demonstrates these steps.
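
Before moving on to Kibana, you can optionally verify that the new key works by sending a minimal request to OpenAI's chat completions endpoint. This sketch is not part of the connector setup; it assumes the key is exported as an OPENAI_API_KEY environment variable and that the requests library is installed.

```python
# Sketch: send a one-off chat completion to confirm the new key works.
# Assumes the key is exported as OPENAI_API_KEY and requests is installed.
import os

import requests

api_key = os.environ["OPENAI_API_KEY"]

resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {api_key}"},
    json={
        "model": "gpt-4-turbo",  # the model name you noted earlier
        "messages": [{"role": "user", "content": "Reply with the word: ok"}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```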

Configure the OpenAI connector

To integrate with Kibana:

  1. Log in to Kibana.
  2. Navigate to Stack Management → Connectors → Create Connector → OpenAI.
  3. Provide a name for your connector, such as OpenAI (GPT-4 Turbo Preview), to help keep track of the model and version you are using.
  4. Under Select an OpenAI provider, choose OpenAI.
  5. The URL field can generally be left unchanged.
  6. Enter the API key that you previously created in the corresponding field.
  7. Click Save.

The following video demonstrates these steps.
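
If you prefer to automate this step, Kibana also exposes a Connectors API that can create the same connector. The sketch below is a starting point only: the .gen-ai connector type ID and the config and secrets field names (apiProvider, apiUrl, defaultModel, apiKey) are assumptions based on recent Kibana versions, so check the Connectors API reference for your release before relying on it.

```python
# Sketch: create the OpenAI connector through Kibana's Connectors API
# instead of the UI. The ".gen-ai" connector type ID and the field names
# below are assumptions; verify them against the Connectors API reference
# for your Kibana version.
import os

import requests

kibana_url = os.environ["KIBANA_URL"]          # base URL of your Kibana instance
kibana_api_key = os.environ["KIBANA_API_KEY"]  # API key with connector privileges
openai_api_key = os.environ["OPENAI_API_KEY"]  # the key created earlier

resp = requests.post(
    f"{kibana_url}/api/actions/connector",
    headers={
        "kbn-xsrf": "true",  # required by Kibana's HTTP APIs
        "Authorization": f"ApiKey {kibana_api_key}",
    },
    json={
        "name": "OpenAI (GPT-4 Turbo Preview)",
        "connector_type_id": ".gen-ai",
        "config": {
            "apiProvider": "OpenAI",
            "apiUrl": "https://api.openai.com/v1/chat/completions",
            "defaultModel": "gpt-4-turbo",
        },
        "secrets": {"apiKey": openai_api_key},
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["id"])  # the ID of the newly created connector
```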
