
An error occurred that Vector couldn't handle: failed to encode record: BufferTooSmall #20488

Closed
Abhinav04 opened this issue May 13, 2024 · 1 comment
Labels
type: bug A code related bug.

Comments


Abhinav04 commented May 13, 2024

A note for the community

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

Problem

Vector aggregator crashes with the following error:

2024-05-13T09:01:41.304287Z ERROR transform{component_kind="transform" component_id=application_logs component_type=filter component_name=application_logs}: vector_buffers::topology::channel::sender: Disk buffer writer has encountered an unrecoverable error.
2024-05-13T09:01:41.318165Z ERROR transform{component_kind="transform" component_id=application_logs component_type=filter component_name=application_logs}: vector::topology: An error occurred that Vector couldn't handle: failed to encode record: BufferTooSmall.
2024-05-13T09:01:41.318265Z  INFO vector: Vector has stopped.
2024-05-13T09:01:41.321088Z  INFO vector::topology::running: Shutting down... 

Configuration

customConfig: 
  data_dir: /vector-data-dir
  api:
    enabled: true
    address: 0.0.0.0:8686
    playground: false
  sources:
    agent:
      address: 0.0.0.0:6000
      type: vector
    vector_aggregator_logs:
      type: "internal_logs"
  transforms:
    agent_remap_logs:
      type: remap
      inputs:
        - agent
      source: |-
        structured, err = parse_json(.message)
        if err == null {
          ., err = merge(., structured)
          if err == null {
            del(.message)
          }
        }
        if !exists(.name_id) || .name_id == "" {
          .name_id = "default"
        }
    application_logs:
      type: filter
      inputs:
        - agent_remap_logs
      condition:
        type: "vrl"
        source: " .source_type == \"kubernetes_logs\""
    vector_agent_logs:
      type: filter
      inputs:
        - application_logs
      condition:
        type: "vrl"
        source: " .source_type == \"internal_logs\""
  sinks:
    cloud_storage:
      type: gcp_cloud_storage
      inputs: [ application_logs ]
      compression: gzip
      buffer:
        max_size: 4294967296
        type: disk
        when_full: block
      batch:
        max_bytes: 209715200
        timeout_secs: 60
      request:
        concurrency: adaptive
        rate_limit_duration_secs: 60
      encoding:
        codec: json
      framing:
        method: "newline_delimited"
      metadata:
        Content-Encoding: ''
      bucket: abhinav-new-logs-bucket
      key_prefix: 'k8s/year=%Y/month=%m/day=%d/hour=%H/%M_'
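The crash originates in the disk buffer writer, which the log above reports as an unrecoverable encoding failure. As a diagnostic sketch only (an assumption on my part, not a confirmed fix for this report), the same sink's buffer could be switched from `block` to the documented `drop_newest` mode so the topology sheds load instead of crashing while the oversized records are tracked down:

```yaml
# Sketch (assumption): the cloud_storage buffer from above, but shedding
# load instead of blocking while the oversized-record issue is investigated.
# Note that drop_newest discards events once the buffer is full.
buffer:
  type: disk
  max_size: 4294967296   # 4 GiB, unchanged from the original config
  when_full: drop_newest # documented alternative to `block`
```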

Version

0.32.0

Additional Context


Flow:
Kubernetes Pods --> Vector Agent --> Vector Aggregator --> Google Cloud Storage Bucket

Disk size per Vector aggregator pod is 10 GiB. The error appears to originate in the transform stage. Any guidance on next steps would be appreciated.
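For context on the flow above, the agent side of the pipeline is not shown in this issue. A minimal sketch of what it presumably looks like (the source name and aggregator hostname here are assumptions) is a `vector` sink pointed at the aggregator's `vector` source on port 6000:

```yaml
# Agent-side sketch (assumptions: source name and aggregator hostname).
sources:
  kubernetes_logs:
    type: kubernetes_logs
sinks:
  to_aggregator:
    type: vector
    inputs: [kubernetes_logs]
    address: "vector-aggregator:6000" # must match the aggregator's `agent` source address
```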

@Abhinav04 Abhinav04 added the type: bug A code related bug. label May 13, 2024
@jszwedko
Member

Hi @Abhinav04 ! I think this is a duplicate of #18346. I'll close this one, but let me know if you disagree.
