Protect OpenAI key using Firebase function


I have an app that uses OpenAI, and like many others my key was recently compromised.

I have this simple code:

const functions = require('firebase-functions');
const OpenAI = require("openai");

const openai = new OpenAI({
    apiKey: functions.config().openai.key,
});

exports.generateText = functions.https.onCall(async (data, context) => {
  try {
    const response = await openai.chat.completions.create({
      messages: [{ role: 'user', content: data.prompt }],
      model: 'gpt-3.5-turbo',
    });

    return { response: response.choices[0].message.content };
  } catch (error) {
    throw new functions.https.HttpsError('internal', 'Failed to generate text from OpenAI.');
  }
});

that I then call within my iOS app as follows:

let functions = Functions.functions()
func generateText(prompt: String, completion: @escaping (String?, Error?) -> Void) {
    
    functions.httpsCallable("generateText").call(["prompt": prompt]) { result, error in
        
        if let error = error as NSError? {
            if error.domain == FunctionsErrorDomain {
                let code = FunctionsErrorCode(rawValue: error.code)
                let message = error.localizedDescription
                let details = error.userInfo[FunctionsErrorDetailsKey]
                
                print(code as Any, message, details as Any)
            }
            // Propagate the error instead of silently returning,
            // so the caller's completion handler always fires.
            completion(nil, error)
            return
        }
        
        if let textResponse = (result?.data as? [String: Any])?["response"] as? String {
            completion(textResponse, nil)
        } else {
            completion(nil, NSError(domain: "AppErrorDomain", code: -1, userInfo: [NSLocalizedDescriptionKey: "Failed to parse function response"]))
        }
    }
}

However, as I'm not a backend engineer, I'm wondering whether this is secure, and what stops someone else from using this endpoint. For example, is it automatically restricted to being called from my app? Is there a way to enforce that?

I note other answers suggest authenticating the user via a login, but I would rather avoid that within my app.

Thank you!


There are 2 best solutions below

Frank van Puffelen (Best Answer)

The problem is not (as Greg and lorem ipsum have suggested) that your API key is not secured, as that value is only accessible to folks who have administrative access to your project.

The problem is that your generateText Cloud Function performs a pretty blanket API call to OpenAI and doesn't do anything to verify that the call is authorized. Essentially: anyone who knows the endpoint of generateText (which they can determine from your public app) can call that endpoint with whatever prompt they want to send to OpenAI, billed against your API key.
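Independent of any attestation scheme, you can also limit the damage by validating the prompt inside the function before forwarding it to OpenAI. A minimal sketch; the limits here (non-empty string, at most 500 characters) are arbitrary placeholders, not values from the question:

```javascript
// Hypothetical helper: reject anything that isn't a short, non-empty string.
// The 500-character cap is an assumed limit -- tune it to your app's needs.
function isValidPrompt(prompt) {
  return typeof prompt === 'string'
    && prompt.trim().length > 0
    && prompt.length <= 500;
}

// Inside the callable, bail out before touching OpenAI:
//   if (!isValidPrompt(data.prompt)) {
//     throw new functions.https.HttpsError('invalid-argument', 'Invalid prompt.');
//   }
```

This doesn't stop a determined abuser, but it keeps the endpoint from being a fully open proxy to your key.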

One way to protect against this sort of abuse is with App Check, which sends a special app attestation token with requests coming from your genuine code on an unhacked device, and then validates that token on the server. See the documentation on enabling enforcement of App Check on callable Cloud Functions.
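For reference, the server-side check looks roughly like this. This is a sketch based on the firebase-functions v1 callable API (where `context.app` is only populated when a valid App Check token accompanies the request); verify the details against the current documentation:

```javascript
const functions = require('firebase-functions');

exports.generateText = functions.https.onCall(async (data, context) => {
  // context.app is undefined when no valid App Check token was sent.
  if (context.app == undefined) {
    throw new functions.https.HttpsError(
      'failed-precondition',
      'The function must be called from an App Check verified app.');
  }
  // ... proceed with the OpenAI call as before ...
});

// Newer firebase-functions releases can enforce this declaratively instead:
// exports.generateText = functions
//   .runWith({ enforceAppCheck: true })
//   .https.onCall(async (data, context) => { /* ... */ });
```

You also need to register your app with App Check (App Attest or DeviceCheck on iOS) and install the client SDK so requests carry the token.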

Greg Fenton

From a Cloud Functions standpoint, that approach is fine. You have put your API key into the Configuration as an environment variable. It would be safer to put the key into Google Secret Manager, but your approach here is reasonable.

Except... the question is how you are populating the Configuration for your environment variables, and how you are storing the "source" of that information.

For example, if you put the API key into some type of shell script and are passing it via firebase functions:config:set ...., then it is the data in the shell script that is at risk, and you need to protect that file. Storing it as clear text in your version control system, for example, would be insecure.
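If you do move the key into Secret Manager, newer firebase-functions releases can bind a secret directly to a function, so it never lives in a config file or script. A sketch, assuming a secret named OPENAI_KEY (the name is an assumption; create it first with `firebase functions:secrets:set OPENAI_KEY`):

```javascript
const functions = require('firebase-functions');
const { defineSecret } = require('firebase-functions/params');
const OpenAI = require('openai');

// Assumed secret name -- must already exist in Secret Manager.
const openaiKey = defineSecret('OPENAI_KEY');

exports.generateText = functions
  .runWith({ secrets: [openaiKey] }) // grants this function access to the secret
  .https.onCall(async (data, context) => {
    // Instantiate inside the handler: the secret value is only
    // available at runtime, not at module load time during deploy.
    const openai = new OpenAI({ apiKey: openaiKey.value() });
    // ... call openai.chat.completions.create(...) as before ...
  });
```

With this approach the key is encrypted at rest, access is auditable, and rotating it doesn't require touching your source tree.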