[Feature Request] Assistants API integration/example with Azure OpenAI #701

Open
danny-avila opened this issue Mar 2, 2024 · 7 comments
Labels
documentation Improvements or additions to documentation

Comments

@danny-avila

Confirm this is a feature request for the Node library and not the underlying OpenAI API.

  • This is a feature request for the Node library

Describe the feature or improvement you're requesting

I understand there is an Azure-specific SDK, but I've built much of my app on this library, and I believe this shouldn't be too hard to integrate while maintaining parity with the OpenAI Python SDK.

From this example, it looks like the Python SDK can handle this pretty easily:

import os

from dotenv import load_dotenv
from openai import AzureOpenAI

# Load the environment variables - these are secrets.
load_dotenv()

api_endpoint = os.getenv("OPENAI_URI")
api_key = os.getenv("OPENAI_KEY")
api_version = os.getenv("OPENAI_VERSION")
deployment_name = os.getenv("OPENAI_DEPLOYMENT_NAME")

# Create an Azure OpenAI client
client = AzureOpenAI(
    api_key=api_key,
    api_version=api_version,
    azure_endpoint=api_endpoint,
)

That's all it takes, and then all the "beta" assistants methods are available:

# (required arguments added for completeness)
assistant = client.beta.assistants.create(model=deployment_name)
thread = client.beta.threads.create()
message = client.beta.threads.messages.create(
    thread_id=thread.id, role="user", content="Hello!"
)
# etc.

Source: https://github.com/Azure-Samples/azureai-samples/tree/main/scenarios/Assistants/assistants-api-in-a-box

If this can't be achieved or isn't on the roadmap for this Node library, I'd rather write my own REST methods than try to use two different libraries.

Additional context

No response

@rattrayalex rattrayalex added the documentation Improvements or additions to documentation label Mar 2, 2024
@rattrayalex
Collaborator

rattrayalex commented Mar 2, 2024

This is the example for how to use Azure with this library: https://github.com/openai/openai-node/blob/master/examples/azure.ts

You can instantiate a client that way and then do client.beta.assistants.create() exactly as you would in Python.

cc @kristapratico

@danny-avila
Author

@rattrayalex Thanks for your reply.

Unfortunately, I had already tried to initialize a client with Azure on v4.28.4.

I regularly initialize Azure clients, and the same client can use regular chat completions as expected, but it's not apparent to me how the API call gets constructed when using the Azure options.

@rattrayalex
Collaborator

What code did you use and what problems did you run into?

@danny-avila
Author

danny-avila commented Mar 3, 2024

What code did you use and what problems did you run into?

Thanks for prompting me for the code; I figured it out while trying to reproduce with a simpler approach.

I figured out that the baseURL needs to be modified to be compatible with Assistants API.

from:

baseURL: `https://${resource}.openai.azure.com/openai/deployments/${model}`,

to:

baseURL: `https://${resource}.openai.azure.com/openai`,
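The two shapes can be sketched as helpers (the helper names are mine, not part of the SDK): chat completions are scoped to a deployment, while the Assistants endpoints hang off the account-level /openai path.

```typescript
// Hypothetical helpers contrasting the two Azure base URLs discussed above.
// Chat completions are deployment-scoped; Assistants endpoints are not.
function azureDeploymentURL(resource: string, deployment: string): string {
  return `https://${resource}.openai.azure.com/openai/deployments/${deployment}`;
}

function azureAccountURL(resource: string): string {
  return `https://${resource}.openai.azure.com/openai`;
}

console.log(azureDeploymentURL('my-resource', 'gpt-4'));
// https://my-resource.openai.azure.com/openai/deployments/gpt-4
console.log(azureAccountURL('my-resource'));
// https://my-resource.openai.azure.com/openai
```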

Can I make a PR to update the Azure example?

import OpenAI from 'openai';

(async () => {
  const model = process.env.DEPLOYMENT_NAME;
  const resource = process.env.API_RESOURCE;
  const apiVersion = process.env.API_VERSION;
  const apiKey = process.env.ASSISTANTS_API_KEY;

  const options = {
    defaultQuery: {
      'api-version': apiVersion,
    },
    defaultHeaders: {
      'api-key': apiKey,
    },
    apiKey: apiKey, // error without this
    baseURL: `https://${resource}.openai.azure.com/openai/deployments/${model}`,
  };
  const openai = new OpenAI(options);

  console.log('Streaming:');
  const stream = await openai.chat.completions.create({
    model,
    messages: [{ role: 'user', content: 'Say hello!' }],
    stream: true,
  });

  for await (const part of stream) {
    process.stdout.write(part.choices[0]?.delta?.content ?? '');
  }
  process.stdout.write('\n');

  try {
    openai.baseURL = `https://${resource}.openai.azure.com/openai`;
    const response = await openai.beta.assistants.list({
      order: 'desc',
      limit: 20,
    });
    console.log('List:', response);
  } catch (error) {
    console.error('Error listing Assistants:', error);
  }

  // Raw fetch equivalent of the Assistants list request above:
  const url = `https://${resource}.openai.azure.com/openai/assistants?order=desc&limit=20&api-version=${apiVersion}`;
  const fetchOptions = {
    method: 'GET',
    headers: {
      'Content-Type': 'application/json',
      'OpenAI-Beta': 'assistants=v1',
      'api-key': apiKey,
    },
  };

  fetch(url, fetchOptions)
    .then(response => response.json()) // Convert the response to JSON
    .then(data => console.log(data)) // Log the data
    .catch(error => console.error('Error:', error)); // Log any errors
})();
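For comparison with the raw fetch at the end of the snippet, the final request URL after the baseURL override can be reconstructed like this (a sketch only; the function name and the api-version value are placeholders, not the SDK's internals):

```typescript
// Hypothetical reconstruction of the Assistants list URL that results from
// the overridden baseURL plus the `defaultQuery` api-version parameter.
function assistantsListURL(resource: string, apiVersion: string, order: string, limit: number): string {
  const base = `https://${resource}.openai.azure.com/openai`;
  const query = new URLSearchParams({ order, limit: String(limit), 'api-version': apiVersion });
  return `${base}/assistants?${query}`;
}

console.log(assistantsListURL('my-resource', '2024-02-15-preview', 'desc', 20));
// https://my-resource.openai.azure.com/openai/assistants?order=desc&limit=20&api-version=2024-02-15-preview
```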

@rattrayalex
Collaborator

Ah, I see. A PR adding an azure-assistants.ts file would be welcome.

@rattrayalex rattrayalex reopened this Mar 3, 2024
@naichalun

Hello friends, can we add proxy support to the client? Similar to Python's:
os.environ["http_proxy"] = "http://xxx.xxxxxx:port"

@rattrayalex
Collaborator

@naichalun that is not on-topic for this thread. I'd ask you to open another, but the answer is here: https://github.com/openai/openai-node?tab=readme-ov-file#configuring-an-https-agent-eg-for-proxies
