
PYDANTIC_DEFER_BUILD causes issues #1306

Open
1 task done
radoshi opened this issue Apr 9, 2024 · 8 comments
Labels
bug Something isn't working

Comments

@radoshi

radoshi commented Apr 9, 2024

Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

Release 1.16.1 broke clients by deferring the build of pydantic models. As a result, calling model_dump() on an object returned by the library can fail.

bc6866e

While this speeds up build times, it should be considered a breaking change and ideally, avoided.

If the speedup is significant, we would request that the opt-out not use a generic envvar name like PYDANTIC_DEFER_BUILD, but instead something namespaced like OPENAI_PYDANTIC_DEFER_BUILD.

We noticed this issue with a streaming tool call message, and are not sure whether other deeply nested classes exhibit the same behaviour.

To Reproduce

  1. Create a streaming tool call chat completion message.
  2. Call model_dump() on the object.
  3. Observe stack trace in pydantic serializer code.
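Since the root cause lies in pydantic's deferred build, the failure mode in the steps above can be illustrated without the OpenAI API, using plain pydantic stand-in models (ToolCall and FunctionCall below are hypothetical illustrations, not the library's actual classes):

```python
from pydantic import BaseModel, ConfigDict

# Hypothetical stand-ins for deeply nested response models such as
# ChoiceDeltaToolCall; defer_build=True mimics what openai 1.16.1 enables.
class FunctionCall(BaseModel):
    model_config = ConfigDict(defer_build=True)
    name: str
    arguments: str

class ToolCall(BaseModel):
    model_config = ConfigDict(defer_build=True)
    id: str
    function: FunctionCall

tool_call = ToolCall(
    id="call_1", function=FunctionCall(name="get_weather", arguments="{}")
)
# On affected pydantic versions the serializer is still a MockValSer
# placeholder at this point, and this line raises:
#   TypeError: 'MockValSer' object cannot be converted to 'SchemaSerializer'
print(tool_call.model_dump())
```

On pydantic versions where the underlying bug is fixed, the deferred schema is built lazily on first use and model_dump() succeeds.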

Code snippets

No response

OS

macOS, Linux

Python version

Python v3.11.8

Library version

openai v1.16.1

@radoshi radoshi added the bug Something isn't working label Apr 9, 2024
@RobertCraigie
Collaborator

Can you share a full example script to reproduce this? I haven't been able to reproduce it myself.

@radoshi
Author

radoshi commented Apr 9, 2024

Yep, let me try and pry it loose from our code and create a standalone repro.

@RobertCraigie
Collaborator

Thanks, even just a full stack trace would be helpful :)

@radoshi
Author

radoshi commented Apr 9, 2024

That's easier. I manually elided some file paths to non-library code.

Traceback (most recent call last):
  File "/opt/homebrew/Cellar/python@3.11/3.11.8/Frameworks/Python.framework/Versions/3.11/lib/python3.11/wsgiref/handlers.py", line 138, in run
    self.finish_response()
  File "/Users/.../Library/Caches/pypoetry/virtualenvs/backend--rBupBVn-py3.11/lib/python3.11/site-packages/django/core/servers/basehttp.py", line 173, in finish_response
    super().finish_response()
  File "/opt/homebrew/Cellar/python@3.11/3.11.8/Frameworks/Python.framework/Versions/3.11/lib/python3.11/wsgiref/handlers.py", line 183, in finish_response
    for data in self.result:
  File "/.../backend/chat/views.py", line 91, in message_save_wrapper
    for message_chunk in generator:
  File "/.../backend/llm/handlers/openaifunction.py", line 285, in chat_streaming
    for chunk in response:
  File "/.../backend/llm/handlers/openai.py", line 73, in _traced_completion_create_streaming
    call_params["messages"] = [m.model_dump() for m in params["messages"]]
                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/.../backend/llm/handlers/openai.py", line 73, in <listcomp>
    call_params["messages"] = [m.model_dump() for m in params["messages"]]
                               ^^^^^^^^^^^^^^
  File "/Users/.../Library/Caches/pypoetry/virtualenvs/backend--rBupBVn-py3.11/lib/python3.11/site-packages/pydantic/main.py", line 314, in model_dump
    return self.__pydantic_serializer__.to_python(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: 'MockValSer' object cannot be converted to 'SchemaSerializer'

params["messages"] is a list[ChoiceDeltaToolCall] or list[ChatCompletionMessageToolCall], which blows up when model_dump() is called on its elements.

We got around this by rewriting our code to avoid calling model_dump() directly on the returned objects, but that was pretty unintuitive behavior.

Let me know if you'd like me to try and isolate a repro for you.

@RobertCraigie
Collaborator

What version of pydantic are you using? Looks like this is a pydantic bug that's been reported a couple of times: pydantic/pydantic#7713

In the meantime you could probably fix this by setting PYDANTIC_DEFER_BUILD to 0 or by calling ChatCompletionChunk.model_rebuild().
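The model_rebuild() workaround can be sketched with a stand-in pydantic model (Chunk below is hypothetical; in the real case you would import ChatCompletionChunk from openai.types.chat and call its model_rebuild()):

```python
from pydantic import BaseModel, ConfigDict

# Hypothetical stand-in for a deferred-build response model
# such as ChatCompletionChunk.
class Chunk(BaseModel):
    model_config = ConfigDict(defer_build=True)
    content: str

# Force the schema and serializer to be built eagerly, so no
# model_dump() call can hit the MockValSer placeholder later.
Chunk.model_rebuild(force=True)

chunk = Chunk(content="hello")
assert chunk.model_dump() == {"content": "hello"}
```

Calling model_rebuild(force=True) once at import time is a one-line mitigation that doesn't require changing the environment or the serialization call sites.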

@radoshi
Author

radoshi commented Apr 9, 2024

Ooh, quite possible. We're on pydantic 2.6.4.

We did initially fix this by setting PYDANTIC_DEFER_BUILD, which is what gave us confidence about where the bug was. We then backed that out and fixed it the "right way". Our claim is that this is an unanticipated breaking change (it briefly broke production for us) in a minor semver update.

Ultimately, the bug doesn't block us, but it's possible that others will run into the same issue and start to see strange stacks in production.

FWIW, the name PYDANTIC_DEFER_BUILD is unfortunate. It is specific to the openai library, yet sounds like it would apply to all pydantic defer_build=True situations.

@RobertCraigie
Collaborator

Sure, it was never intended to be public; it was just planned for internal use. But renaming it makes sense.

@radoshi
Author

radoshi commented Apr 10, 2024 via email
