OpenAI is not callable

When I try to run the following code:

import os
from langchain.llms import OpenAI
from apikey import apikey
import streamlit as st


os.environ["OPENAI_API_KEY"] = apikey
st.title("Content GPT Creator")
prompt = st.text_input('Plug in your prompt here')

llm = OpenAI(temperature = .9)

I keep getting the error that OpenAI is not callable. Has anyone ever encountered this? Thanks for any help in advance.

I thought I would be able to instantiate an LLM with an instance of OpenAI, but it keeps saying it's not callable. According to the docs for langchain, this is how you would do it.

There is 1 solution below

Answer by dmontaner

In my current version of openai (1.3.4) your code works for me. Make sure that your "apikey" variable contains the correct string...
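
For example, a quick sanity check (just a sketch; it assumes apikey.py defines a string named apikey, as in your code):

import openai
from apikey import apikey

print(openai.__version__)  # the snippets below assume a 1.x release
assert isinstance(apikey, str) and apikey, "apikey should be a non-empty string"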

That said, I have had the same issue of the global variable not working in previous versions of openai.

In general, these two alternatives have worked for me in previous versions of the library (see the sketch after this list for a usage example):

  1. paste the key into the client:
client = openai.OpenAI(api_key=apikey)
  2. paste the key into the openai module namespace:
openai.api_key = apikey
client = openai.OpenAI()
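
Put together, a minimal sketch using the first alternative might look like this (openai >= 1.x assumed; the model name is only an illustration):

import openai
from apikey import apikey

client = openai.OpenAI(api_key=apikey)  # alternative 1: pass the key explicitly

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # example model name, swap in whichever model you use
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)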

In general the environment variables are used to store the key "outside" your script for security. So if you already have the key string loaded in your session, it may make more sense to pass it to the client as above instead of sending it back to the env vars.
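
If you do keep the key in the OPENAI_API_KEY environment variable, the 1.x client picks it up automatically (a sketch, assuming the variable is set before the client is created):

import os
import openai

os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder; normally you would export this in your shell
client = openai.OpenAI()  # the 1.x client reads OPENAI_API_KEY from the environment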

A final note. If you are using streamlit, it has its own mechanism to load env variables:

If you create a file called .streamlit/secrets.toml in your project directory with a line like this:

OPENAI_API_KEY = "YOUR_API_KEY"

Then the st module has a secrets object that you can use as:

client = OpenAI(api_key=st.secrets["OPENAI_API_KEY"])

I got that from here: https://docs.streamlit.io/knowledge-base/tutorials/build-conversational-apps

Notice that st loads the secrets also as env vars... so in theory you should not need the api_key argument when initializing the client :)
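
In that case a bare client should work (again just a sketch; it relies on Streamlit having exported OPENAI_API_KEY from .streamlit/secrets.toml):

from openai import OpenAI

client = OpenAI()  # the key comes from the OPENAI_API_KEY env var that Streamlit sets from secrets.toml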