Tokens_per_message issue #3165

Open
LG920716 opened this issue May 9, 2024 · 3 comments · Fixed by #3067
Labels: enhancement (New feature or request)
LG920716 commented May 9, 2024

Description

  • We created a text_only flow that previously worked perfectly fine locally, but when we recently tried to execute it, it failed with the error below.
Bot: Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\_cli\_pf\entry.py", line 157, in <module>
    main()
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\_cli\_pf\entry.py", line 153, in main
    entry(command_args)
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\_cli\_pf\entry.py", line 134, in entry
    cli_exception_and_telemetry_handler(run_command, activity_name)(args)
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\_cli\_utils.py", line 294, in wrapper
    raise e
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\_cli\_utils.py", line 282, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\_cli\_pf\entry.py", line 85, in run_command
    raise ex
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\_cli\_pf\entry.py", line 59, in run_command
    dispatch_flow_commands(args)
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\_cli\_pf\_flow.py", line 78, in dispatch_flow_commands
    test_flow(args)
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\_cli\_pf\_flow.py", line 462, in test_flow
    _test_flow_interactive(args, pf_client, inputs, environment_variables)
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\_cli\_pf\_flow.py", line 534, in _test_flow_interactive
    pf_client.flows._chat(
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\_sdk\_telemetry\activity.py", line 265, in wrapper
    return f(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\_sdk\operations\_flow_operations.py", line 360, in _chat
    submitter._chat_flow(
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\_sdk\_orchestrator\test_submitter.py", line 573, in _chat_flow
    print_chat_output(
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\_sdk\_orchestrator\utils.py", line 383, in print_chat_output
    for event in resolve_generator_output_with_cache(output, generator_record, generator_key=generator_key):
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\_sdk\_orchestrator\utils.py", line 420, in resolve_generator_output_with_cache
    generator_record[generator_key] = list(output)
                                      ^^^^^^^^^^^^
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\tracing\contracts\generator_proxy.py", line 34, in generate_from_proxy
    yield from proxy
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\tracing\contracts\generator_proxy.py", line 19, in __next__
    item = next(self._iterator)
           ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\tracing\_trace.py", line 196, in traced_generator
    yield from generator_proxy
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\tracing\contracts\generator_proxy.py", line 19, in __next__
    item = next(self._iterator)
           ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\tools\common.py", line 472, in generator
    for chunk in completion:
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\tracing\contracts\generator_proxy.py", line 34, in generate_from_proxy
    yield from proxy
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\tracing\contracts\generator_proxy.py", line 19, in __next__
    item = next(self._iterator)
           ^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\tracing\_trace.py", line 198, in traced_generator
    enrich_span_with_llm_if_needed(span, original_span, inputs, generator_output)
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\tracing\_trace.py", line 178, in enrich_span_with_llm_if_needed
    token_collector.collect_openai_tokens_for_streaming(span, inputs, generator_output, parser.is_chat)
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\tracing\_trace.py", line 60, in collect_openai_tokens_for_streaming
    tokens = calculator.get_openai_metrics_for_chat_api(inputs, output)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\tracing\_openai_utils.py", line 101, in get_openai_metrics_for_chat_api
    enc, tokens_per_message, tokens_per_name = self._get_encoding_for_chat_api(self._try_get_model(inputs, output))
                                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\USER\AppData\Local\Programs\Python\Python311\Lib\site-packages\promptflow\tracing\_openai_utils.py", line 130, in _get_encoding_for_chat_api
    return enc, tokens_per_message, tokens_per_name
                ^^^^^^^^^^^^^^^^^^
UnboundLocalError: cannot access local variable 'tokens_per_message' where it is not associated with a value
You can view the trace detail from the following URL:
http://localhost:23333/v1.0/ui/traces/?#collection=flow&uiTraceId=0x4f6ab012772402b1abe8d4fc0d5c8f5b
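For context, the UnboundLocalError at the bottom of this traceback is a classic Python failure mode: a variable such as tokens_per_message is assigned only inside branches that recognize specific model names, so an unrecognized model name (for example an Azure deployment alias) reaches the return statement with the variable never bound. The sketch below is hypothetical, not the actual promptflow source; the function names and token values merely illustrate the pattern and a defensive fix.

```python
# Hypothetical sketch of the failure pattern behind the UnboundLocalError.
# Names mirror the traceback, but this is NOT the actual promptflow code.

def get_encoding_for_chat_api(model: str):
    # tokens_per_message is only bound inside the recognized branches.
    if model in ("gpt-3.5-turbo-0301",):
        tokens_per_message = 4
        tokens_per_name = -1
    elif model.startswith(("gpt-3.5-turbo", "gpt-4")):
        tokens_per_message = 3
        tokens_per_name = 1
    # If the model matches neither branch, the next line raises
    # UnboundLocalError, exactly as in the traceback above.
    return tokens_per_message, tokens_per_name


def get_encoding_for_chat_api_fixed(model: str):
    # Defensive variant: bind sensible defaults first, so an
    # unrecognized model name degrades gracefully instead of crashing.
    tokens_per_message, tokens_per_name = 3, 1
    if model in ("gpt-3.5-turbo-0301",):
        tokens_per_message, tokens_per_name = 4, -1
    return tokens_per_message, tokens_per_name
```

If this is indeed the cause, the error would surface whenever the model name passed through the tracing layer (e.g. a custom deployment name) does not match any of the hard-coded model prefixes.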

Describe the solution you'd like
Make it work as it did before, and if possible, please explain what exactly causes this error so that we can fix it ourselves if it happens again.

Describe alternatives you've considered

  • We looked it up on Google and saw someone with a similar problem; they suggested reinstalling all the packages from requirements.txt as well as Python. We did that, but the error remains the same.
    • The following is the requirements.txt file we use:
azure-identity
azure-search-documents==11.4.0b6
openai
promptflow
promptflow-tools
tenacity==8.2.2
LG920716 added the enhancement (New feature or request) label on May 9, 2024
lumoslnt (Contributor) commented May 9, 2024

Hello @LG920716, thanks for reaching out to us. We will fix this issue in the upcoming release.

LG920716 (Author) commented May 9, 2024

Hi @lumoslnt, first of all, thanks for your fast reply. We're a five-person group of students working on our graduation project, and our deadline is May 25 (UTC+8), so we need a fix as soon as possible. Sorry for the inconvenience and the tight situation we're in. Thank you for your time, and we hope to receive your reply or a solution soon. Thanks!

lumoslnt (Contributor) commented
Hi @LG920716, the fix PR for this issue has been merged, you can download the main branch and install the promptflow package by running python scripts/dev-setup/main.py.

lumoslnt linked a pull request on May 10, 2024 that will close this issue.