Portkey provides a robust and secure gateway to integrate various Large Language Models (LLMs) into applications, including AI21's models. With Portkey, you can take advantage of features like fast AI gateway access, observability, prompt management, and more, while securely managing your API keys through the Model Catalog.

Quick Start

Get AI21 working in 3 steps:
# 1. Install: pip install portkey-ai
# 2. Add the @ai21 provider in the Model Catalog
# 3. Use it:

from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

response = portkey.chat.completions.create(
    model="@ai21/jamba-1-5-large",
    messages=[{"role": "user", "content": "Say this is a test"}]
)

print(response.choices[0].message.content)
Tip: You can also set provider="@ai21" in Portkey() and use just model="jamba-1-5-large" in the request.
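A minimal sketch of the tip above: set the default provider on the client so the model name needs no prefix (same placeholder API key as the quick start).

from portkey_ai import Portkey

# Set the provider once on the client instead of prefixing every model name
portkey = Portkey(
    api_key="PORTKEY_API_KEY",
    provider="@ai21"
)

response = portkey.chat.completions.create(
    model="jamba-1-5-large",  # no "@ai21/" prefix needed here
    messages=[{"role": "user", "content": "Say this is a test"}]
)

print(response.choices[0].message.content)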

Add Provider in Model Catalog

  1. Go to Model Catalog → Add Provider
  2. Select AI21
  3. Choose existing credentials or create new by entering your AI21 API key
  4. Name your provider (e.g., ai21-prod); this name becomes the slug you reference in requests, as shown below

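A minimal sketch of using the named provider from step 4, assuming you called it ai21-prod; the name is prefixed to the model as a slug:

from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

# Reference the provider you created by its slug (here: ai21-prod)
response = portkey.chat.completions.create(
    model="@ai21-prod/jamba-1-5-large",
    messages=[{"role": "user", "content": "Say this is a test"}]
)

print(response.choices[0].message.content)
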
Complete Setup Guide →

See all setup options, code examples, and detailed instructions

Managing AI21 Prompts

Manage all your AI21 prompt templates in the Prompt Library. All current AI21 models are supported, and you can easily test different prompts. Use the portkey.prompts.completions.create interface to call a saved prompt from your application.
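
A minimal sketch of calling a saved prompt template, assuming a hypothetical prompt ID (YOUR_PROMPT_ID) and a user_input variable defined in the template:

from portkey_ai import Portkey

portkey = Portkey(api_key="PORTKEY_API_KEY")

# prompt_id and the variables map come from your saved template in the Prompt Library
response = portkey.prompts.completions.create(
    prompt_id="YOUR_PROMPT_ID",  # hypothetical placeholder
    variables={"user_input": "Summarize the latest sales report"}
)

print(response)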

Next Steps

For complete SDK documentation:

SDK Reference

Complete Portkey SDK documentation