[Feature Request] Assistants API integration/example with Azure OpenAI #701
Comments
This is the example for how to use Azure with this library: https://github.com/openai/openai-node/blob/master/examples/azure.ts You can instantiate a client that way and then do …
@rattrayalex Thanks for your reply. Unfortunately, I already tried to initialize a client with Azure on v4.28.4. I regularly initialize Azure clients, and the same client is able to use regular chat completions as expected, but it's not apparent to me how the API call gets constructed when using the Azure options.
What code did you use and what problems did you run into?
Thanks for prompting me to code, as I tried to reproduce with a simpler approach. I figured out that the baseURL needs to be modified to be compatible with the Assistants API, from the deployment-scoped URL (Line 24 in 6175eca) to:

```ts
baseURL = `https://${resource}.openai.azure.com/openai`;
```

Can I make a PR to update the Azure example?

```ts
import OpenAI from 'openai';

(async () => {
  const model = process.env.DEPLOYMENT_NAME;
  const resource = process.env.API_RESOURCE;
  const apiVersion = process.env.API_VERSION;
  const apiKey = process.env.ASSISTANTS_API_KEY;

  const options = {
    defaultQuery: {
      'api-version': apiVersion,
    },
    defaultHeaders: {
      'api-key': apiKey,
    },
    apiKey: apiKey, // error without this
    // Deployment-scoped base URL: works for chat completions.
    baseURL: `https://${resource}.openai.azure.com/openai/deployments/${model}`,
  };
  const openai = new OpenAI(options);

  console.log('Streaming:');
  const stream = await openai.chat.completions.create({
    model,
    messages: [{ role: 'user', content: 'Say hello!' }],
    stream: true,
  });
  for await (const part of stream) {
    process.stdout.write(part.choices[0]?.delta?.content ?? '');
  }
  process.stdout.write('\n');

  try {
    // Assistants live at the resource level, not under /deployments/{model}.
    openai.baseURL = `https://${resource}.openai.azure.com/openai`;
    const response = await openai.beta.assistants.list({
      order: 'desc',
      limit: 20,
    });
    console.log('List:', response);
  } catch (error) {
    console.error('Error listing Assistants:', error);
  }

  // Equivalent raw REST call, for comparison.
  const url = `https://${resource}.openai.azure.com/openai/assistants?order=desc&limit=20&api-version=${apiVersion}`;
  const fetchOptions = {
    method: 'GET',
    headers: {
      'Content-Type': 'application/json',
      'OpenAI-Beta': 'assistants=v1',
      'api-key': apiKey,
    },
  };
  fetch(url, fetchOptions)
    .then((response) => response.json()) // Convert the response to JSON
    .then((data) => console.log(data)) // Log the data
    .catch((error) => console.error('Error:', error)); // Log any errors
})();
```
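The URL difference driving the workaround above can be sketched as a few pure helpers (a sketch, not part of the library; `resource`, `model`, and `apiVersion` follow the names in the snippet above):

```javascript
// Chat completions are addressed per deployment; the Assistants API
// (at the time of this thread) lives at the resource level.

function deploymentBaseURL(resource, model) {
  // Works for openai.chat.completions.create with the deployment name as `model`.
  return `https://${resource}.openai.azure.com/openai/deployments/${model}`;
}

function resourceBaseURL(resource) {
  // Needed for openai.beta.assistants.* calls.
  return `https://${resource}.openai.azure.com/openai`;
}

function assistantsListURL(resource, apiVersion, { order = 'desc', limit = 20 } = {}) {
  // Full REST URL equivalent to the raw fetch() call above.
  const query = new URLSearchParams({ order, limit: String(limit), 'api-version': apiVersion });
  return `${resourceBaseURL(resource)}/assistants?${query}`;
}
```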
Ah, I see. A PR adding an …
Hello, friends. Can we add a proxy interface to this library, similar to what the Python SDK supports?
@naichalun that is not on-topic for this thread. I'd ask you to open another, but the answer is here: https://github.com/openai/openai-node?tab=readme-ov-file#configuring-an-https-agent-eg-for-proxies |
Confirm this is a feature request for the Node library and not the underlying OpenAI API.
Describe the feature or improvement you're requesting
I understand there is an Azure-specific SDK, but I've built a lot of my app using this library, believe this shouldn't be too hard to integrate, and it would maintain parity with the `openai` Python SDK. From this example, it looks like the Python SDK can handle this pretty easily:
That's all it takes and then all the "beta" assistant methods are available:
Source: https://github.com/Azure-Samples/azureai-samples/tree/main/scenarios/Assistants/assistants-api-in-a-box
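The Python `AzureOpenAI` client in the linked sample takes roughly three Azure inputs (resource endpoint, API key, API version). Those map onto the generic options this library's `OpenAI` constructor already accepts, as in the workaround earlier in the thread; a sketch (the helper name `azureOptions` is hypothetical):

```javascript
// Sketch: map the Azure inputs onto openai-node's generic client options.
function azureOptions({ resource, apiKey, apiVersion }) {
  return {
    apiKey, // the constructor still requires this
    // Resource-level base URL, as needed for the Assistants endpoints.
    baseURL: `https://${resource}.openai.azure.com/openai`,
    // Azure routes auth and versioning via query param + header.
    defaultQuery: { 'api-version': apiVersion },
    defaultHeaders: { 'api-key': apiKey },
  };
}
```

Usage would be `new OpenAI(azureOptions({ resource, apiKey, apiVersion }))`, so the ask is essentially first-class sugar for this mapping.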
If this can't be achieved or isn't on the roadmap for this Node library, I'd rather write my own REST methods than try to use two different libraries.
Additional context
No response