
language-chatgpt adaptor #378

Closed
wants to merge 4 commits into from
Conversation

haftamuk

@haftamuk haftamuk commented Sep 3, 2023

Summary

language-chatgpt adaptor added

Details

A promptChatGpt(prompt) function is created to interact with the OpenAI gpt-3.5-turbo model.

Issues

API Reference

Review Checklist

Before merging, the reviewer should check the following items:

  • Does the PR do what it claims to do?
  • If this is a new adaptor, has the migration tool been run and the
    migration guide followed?
  • Are there any unit tests? Should there be?
  • Is there a changeset associated with this PR? Should there be? Note that
    dev only changes don't need a changeset.

Collaborator

@haftamuk please update this to chat gpt 521x200.png

Collaborator

@haftamuk please update this to chatgpt 256x256 logo

} from '@openfn/language-common';
import { expandReferences } from '@openfn/language-common/util';
import { request } from './Utils';
import 'dotenv/config';
Collaborator

We use state.configuration for setting up credentials, please remove this library
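A minimal sketch of the pattern the reviewer is describing: credentials are read from state.configuration at runtime rather than loaded from a .env file via dotenv. The apiKey field name is an assumption here, following the credential-schema comment elsewhere in this review.

```javascript
// Sketch only: read the credential from state.configuration instead of process.env.
// OpenFn injects credentials into state.configuration before the job runs.
function getApiKey(state) {
  const { apiKey } = state.configuration ?? {};
  if (!apiKey) {
    throw new Error('No apiKey found in state.configuration');
  }
  return apiKey;
}

// Usage: no dotenv import is needed anywhere in the adaptor.
const state = { configuration: { apiKey: 'sk-test' } };
console.log(getApiKey(state)); // 'sk-test'
```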

"$schema": "http://json-schema.org/draft-07/schema#",
"properties": {
"OPENAI_SECRET_KEY": {
"title": "OPENAI_SECRET_KEY",
Collaborator

the title should be apiKey

* @param {string} prompt - User textual prompt
* @returns {Operation}
*/
export function promptChatGpt(prompt) {
Collaborator

@mtuchi mtuchi Sep 5, 2023

A slight suggestion on the function signature that will give users more flexibility

/**
 * @param {string} prompt - User textual prompt
 * @param {object} options - Options for the prompt, e.g. model, role
 */
export function promptChatGpt(prompt, options) {
  return state => {
    const [resolvedPrompt, resolvedOptions] = expandReferences(
      state,
      prompt,
      options
    );
    // Everything else
  };
}
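To make the suggested signature concrete, here is a runnable sketch. The expandReferences below is a simplified stand-in for the helper in @openfn/language-common (it only resolves plain function arguments against state), so the real library's behaviour may differ.

```javascript
// Simplified stand-in for @openfn/language-common's expandReferences:
// resolves any function arguments against state and passes values through.
const expandReferences = (state, ...args) =>
  args.map(arg => (typeof arg === 'function' ? arg(state) : arg));

// The suggested signature: `options` lets callers override model, role, etc.
function promptChatGpt(prompt, options) {
  return state => {
    const [resolvedPrompt, resolvedOptions] = expandReferences(
      state,
      prompt,
      options
    );
    // The real operation would call the OpenAI API here; this sketch just
    // records what would be sent, so the resolution logic can be inspected.
    return { ...state, data: { prompt: resolvedPrompt, options: resolvedOptions } };
  };
}

// Usage: a literal prompt plus an option resolved lazily from state.
const state = { model: 'gpt-3.5-turbo' };
const next = promptChatGpt('Hello', s => ({ model: s.model }))(state);
console.log(next.data.options.model); // 'gpt-3.5-turbo'
```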

Collaborator

In addition to @mtuchi's comment (which is correct and wise), I'd like to think more about the name of this function.

We wouldn't normally include the service name in function names, so I'd prefer prompt to promptChatGpt (it's shorter, and ChatGPT is implied by the adaptor package itself).

I would also question whether prompt is the correct function name. I'm not an ML engineer, so maybe it is. But I wonder if ask(prompt), chat(prompt), or question(prompt) would be semantically richer and more useful alternatives? Maybe even completion(prompt), which aligns with the npm library and so I guess will be familiar to OpenAI engineers.

I welcome your thoughts on this!

@@ -0,0 +1,104 @@
# @openfn/language-chatgpt

## 3.0.1
Collaborator

The changelog probably needs updating :)

@josephjclark
Collaborator

Hi @haftamuk - thank you for this contribution!

We're very keen to get it merged in. Are you able to respond to the feedback this week? If not, do you mind if we take it over?

@haftamuk
Author

haftamuk commented Nov 1, 2023

Hi @haftamuk - thank you for this contribution!

We're very keen to get it merged in. Are you able to respond to the feedback this week? If not, do you mind if we take it over?

Hey.
I will spend enough time to address the feedback this week. Also, please feel free to dive in on any other comments. I will be pleased to see the work get merged too.

@josephjclark
Collaborator

Thank you @haftamuk! We'll give you a few days and maybe take this over next week. We'll let you know if we start working on anything so that we're not duplicating effort.

Collaborator

@josephjclark josephjclark left a comment

Whoops, just realised I had two review comments still pending - posting those now!

*/
export function promptChatGpt(prompt) {
return async state => {
const openai = new OpenAI({
Collaborator

This will create a new openai client for every prompt submitted, which presumably means context is lost in between calls.

Is this the desired behaviour? Or should we create one shared client for the job (and workflow)?
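One way to address this, sketched under the assumption that a module-level singleton is acceptable for the lifetime of a job: create the client lazily on first use and reuse it for every subsequent prompt. The createClient factory below is a hypothetical stand-in for new OpenAI(...).

```javascript
// Lazily-created shared client: constructed once per job and reused across
// prompts, instead of building a new OpenAI client inside every operation.
let client;

function getClient(createClient, config) {
  if (!client) {
    client = createClient(config); // runs only on the first call
  }
  return client;
}

// Usage: both calls return the same instance, so any per-client context
// is preserved between prompts.
const factory = config => ({ config, createdAt: Date.now() });
const first = getClient(factory, { apiKey: 'sk-test' });
const second = getClient(factory, { apiKey: 'sk-test' });
console.log(first === second); // true
```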

@josephjclark
Collaborator

Hi @haftamuk - we'd like to merge this, make a few adjustments and put out a release. Is that OK with you?

@josephjclark
Copy link
Collaborator

Hi @haftamuk,

I am really sorry but I'm going to close this PR for now.

There was a minute there when we were going to take it over and release it ourselves. But we missed that boat and we've got other priorities at the moment.

I'd welcome a new PR which addresses the comments above and updates the library version. But until then the PR must be closed 🙏
