Passing global context into tools called by the runTools helper #597

Open
tday opened this issue Dec 21, 2023 · 11 comments

tday commented Dec 21, 2023

Confirm this is a feature request for the Node library and not the underlying OpenAI API.

  • This is a feature request for the Node library

Describe the feature or improvement you're requesting

I currently have a pattern where I need to pass context to my tools to allow them to act on my app. For example:

async function updateEvent(context: { eventId: string }, args: ArgsFromOpenAi) {
  const event = await fetchEvent(context.eventId);
}

It'd be great if there were some way to pass a global context to the runner since the runner is passed into each function call. Then, I could do something like this:

async function updateEvent(args: ArgsFromOpenAi, runner: ChatCompletionStreamingRunner<EventContext>) {
  const { eventId } = runner.context;
  const event = await fetchEvent(eventId);
}

Additional context

A workaround is to build my own runner that leverages the existing helpers. However, this is complicated because of the types integration.

tday changed the title from "Passing global context into runTools helper" to "Passing global context into tools called by the runTools helper" on Dec 21, 2023
rattrayalex added the "enhancement (New feature or request)" label on Dec 21, 2023
rattrayalex (Collaborator) commented:

Could you use a closure for this?

async function callFunctions() {
  const context = {};

  async function updateEvent(args: ArgsFromOpenAi) {
    const { eventId } = context;
    const event = await fetchEvent(eventId);
  }

  await client.beta.chat.completions.runTools({ tools: [{ type: 'function', function: { function: updateEvent } }] })
}


tday commented Dec 21, 2023

I considered a closure like you wrote, but it would require merging tools defined across multiple files into a single (very) large file. I'm currently leveraging an inherited class to provide the closure, but I'm running into some typing limitations that I work around with casting and any.

LMK if I'm missing a simpler solution 🙏


rattrayalex commented Dec 21, 2023 via email


tday commented Dec 22, 2023

I'm imagining something that matches the lifecycle of runTools. The context should stay the same throughout that run. It is probably simplest to pass down a context through the runTools interface. E.g.

runTools(... context)

My implementation is kind of a hack that I worked up after realizing that the runner was passed to each tool call. I simply added a context class variable to a class that extends ChatCompletionStreamingRunner. This doesn't work out well because other references to the tool types don't expect my custom class.
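
Roughly, the hack looks like this (illustrative sketch only; the import path and any generics may differ across openai-node versions, and the type names are placeholders):

import { ChatCompletionStreamingRunner } from 'openai/lib/ChatCompletionStreamingRunner';

interface EventContext {
  eventId: string;
}

interface UpdateEventArgs {
  numDays?: number;
}

// Carry app context on the runner itself.
class ContextualStreamingRunner extends ChatCompletionStreamingRunner {
  context!: EventContext; // assigned before the run starts
}

// Every tool now has to be typed against the subclass, which the library's
// tool types don't expect, hence the casting and any.
async function updateEvent(args: UpdateEventArgs, runner: ContextualStreamingRunner) {
  const { eventId } = runner.context;
  // ...act on the event...
}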

rattrayalex (Collaborator) commented:

Thanks. Could you provide a more complete code sample of what you're trying to do / how you're trying to use this? Including how you update and reference the context?

rattrayalex (Collaborator) commented:

Have you tried using .bind(context) on the functions before passing them in, and referencing this for context? Or even a pattern like this?:

// in one file
const updateEvent = (context: Context) => async function updateEvent(args: ArgsFromOpenAI) {
  const { eventId } = context;
  const event = await fetchEvent(eventId);
}

// in another
const context = {};
await client.beta.chat.completions.runTools({ tools: [{
  type: 'function', function: {
    function: updateEvent(context),
    name: 'updateEvent'
  }
}]})
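
And, for completeness, a minimal sketch of the .bind(context) variant (placeholder names; client and fetchEvent are assumed from the earlier snippets):

// in one file
interface Context {
  eventId: string;
}

async function updateEvent(this: Context, args: ArgsFromOpenAI) {
  const { eventId } = this;
  const event = await fetchEvent(eventId);
}

// in another
const context: Context = { eventId: 'evt_123' };
await client.beta.chat.completions.runTools({ tools: [{
  type: 'function', function: {
    // .bind fixes `this` to the context; pass the name explicitly, since
    // bound functions are named "bound updateEvent".
    function: updateEvent.bind(context),
    name: 'updateEvent'
  }
}]})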


tday commented Jan 3, 2024

That pattern works! Thanks for the suggestion.

The caveat is that the tool definition would have to be managed within the scope of the context, which requires a good bit of refactoring for me.

You can close this issue if you think it's best that providing context not be built into the library!


rattrayalex commented Jan 4, 2024

Thanks!

Hmm, it might be optimal, but I'd like to provide the best possible experience. Would you be willing to share a more complete code sample of what you'd ideally like to see, including how you update & read from context?


tday commented Jan 18, 2024

Sorry for the late response-- I've been pushing to get the feature launched and left this as tech debt.

I was able to circle back to clean it up.


Here's how my implementation looks with the function closure:

// types.ts
interface ToolContext {
  eventId: string
}


// eventManagerTools.ts
const updateEvent = (context: ToolContext) =>
  async function updateEvent(eventDetails: UpdateEventArgs) {
    const { eventId } = context;

    // ...
  };

export const eventManagerTools: Record<
  EventManagerToolNames,
  LLMFunctionWithContext<UpdateEventArgs | SetPrimaryVendorArgs>
> = {
  [EventManagerToolNames.UPDATE_EVENT]: {
    name: EventManagerToolNames.UPDATE_EVENT,
    description: 'Updates event given one or more event details from customer. Only call when values have changed',
    function: updateEvent,
    parse: JSON.parse,
    parameters: {
      type: 'object',
      properties: {
        maxBudgetPerGuest: {
          type: 'number',
          description:
            'Sets maximum budget per guest. This should only include numeric values. If math is required, think through it and provide the output',
        },
        numDays: {
          type: 'number',
          description: 'Duration (in days) of the event',
        },
        // ...
      },
    },
  },
};

// llmFacade.ts
export type LLMFunctionWithContext<Args extends object | string> = Omit<RunnableFunction<Args>, 'function'> & {
  function: (context: ToolContext) => RunnableFunction<Args>['function'];
};

export function toTools(llmFunctions: LLMFunctionWithContext<any>[], context: ToolContext) {
  // helper to convert function-like definitions into runnable tools

  return llmFunctions.map(
    (llmFunction) =>
      ({
        type: 'function' as const,
        function: {
          ...llmFunction,
          function: llmFunction.function(context),
        },
      })
  );
}

export async function completionStreamWithTools(systemPrompt: string, tools: RunnableToolFunction<any>[]) {
  // simplified as an example
  const runner = ChatCompletionStreamingRunner.runTools(openai.chat.completions, {
    messages,
    model,
    tools,
    temperature,
    stream: true,
  });
}

// llmOrchestrator.ts
async function orchestrateResponse() {
  const agent = {
    tools: Object.values(eventManagerTools),
  };

  // pass context scoped to this single stream call
  await completionStreamWithTools('Plan an event', toTools(agent.tools, { eventId }));
}

This feels fairly good. The only caveat is that I have to define and maintain my own type and wrapper to convert to the function type that RunnableFunction expects. The abstraction gets a bit leaky.


It would be great to simplify the above by changing completionStreamWithTools to a function signature like this:

export async function completionStreamWithTools(systemPrompt: string, tools: RunnableToolFunction<any>[], globalToolContext: ToolContext) 

Then, I would call runTools like this:

  const runner = ChatCompletionStreamingRunner.runTools(openai.chat.completions, {
    messages,
    model,
    tools,

    toolContext: globalToolContext, 

    temperature,
    stream: true,
  });

I imagine that the tool context would then be provided to each tool with something like this:

updateEvent({ ... }: Args, runner: Runner, toolContext: ToolContext)

Alternatively, the tool context could be destructured into Args, but that might be more complicated than it is worth.
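
For illustration, one way to approximate that today is to merge the context into the parsed arguments via the tool's parse hook (rough sketch only; types are simplified, the context value is a placeholder, and ToolContext is the interface from above):

const toolContext: ToolContext = { eventId: 'evt_123' };

interface UpdateEventArgs {
  numDays?: number;
}

const updateEventTool = {
  type: 'function' as const,
  function: {
    name: 'updateEvent',
    description: 'Updates event details',
    parameters: { type: 'object', properties: { numDays: { type: 'number' } } },
    // Fold app context into the args the model provides, before the call.
    parse: (input: string): UpdateEventArgs & ToolContext => ({
      ...JSON.parse(input),
      ...toolContext,
    }),
    function: async ({ eventId, numDays }: UpdateEventArgs & ToolContext) => {
      // eventId comes from app context, numDays from the model
    },
  },
};
// then pass [updateEventTool] as the tools array to runTools as usual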

rattrayalex (Collaborator) commented:

Interesting. Thank you very much for sharing; this is quite helpful. The toolContext suggestion is interesting, and we'll take that back to the team.

What do you think about something like this, so you don't have to subclass or write toTools?

export const eventManagerTools: Record<
  EventManagerToolNames,
  LLMFunctionWithContext<UpdateEventArgs | SetPrimaryVendorArgs>
> = {
  [EventManagerToolNames.UPDATE_EVENT]: new RunnableFunction({
    description: 'Updates event given one or more event details from customer. Only call when values have changed',
    function: updateEvent(context),
// …

  const runner = openai.beta.chat.completions.runTools({
    messages,
    model,
    tools,
//…


tday commented Jan 24, 2024

Oh, that's a nice suggestion! Though I think that might still lead to folks wanting to DRY up the RunnableFunction instantiation with a helper like toTools to reduce boilerplate.

e.g. to DRY up this

export const eventManagerTools: Record<
  EventManagerToolNames,
  LLMFunctionWithContext<UpdateEventArgs | SetPrimaryVendorArgs>
> = {
  [EventManagerToolNames.TOOL_A]: new RunnableFunction(...),
  [EventManagerToolNames.TOOL_B]: new RunnableFunction(...),
  [EventManagerToolNames.TOOL_C]: new RunnableFunction(...),
  [EventManagerToolNames.TOOL_D]: new RunnableFunction(...),
  // ...
}
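
For example, I could imagine folks writing something like this on top of the proposed class (purely hypothetical sketch; toRunnableTools and RunnableFunctionOptions don't exist, they're just illustrating the boilerplate I'd want to collapse):

// Hypothetical helper, assuming the proposed RunnableFunction class existed:
// wrap each plain definition so call sites don't repeat new RunnableFunction(...).
function toRunnableTools(defs: Record<string, RunnableFunctionOptions>) {
  return Object.fromEntries(
    Object.entries(defs).map(([name, def]) => [name, new RunnableFunction({ name, ...def })]),
  );
}

export const eventManagerTools = toRunnableTools({
  [EventManagerToolNames.TOOL_A]: { /* description, parameters, function, ... */ },
  [EventManagerToolNames.TOOL_B]: { /* ... */ },
});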
