[Bug] server slow log, support loader import & client IP #2468

Open · 1 task done
SunnyBoy-WYH opened this issue Mar 1, 2024 · 1 comment · May be fixed by #2466
Labels
bug (Something isn't working)

Comments

SunnyBoy-WYH (Contributor) commented Mar 1, 2024

Bug Type

others (please edit later)

Before submit

  • I have confirmed and searched that there are no similar or duplicate problems in the existing issues and FAQ.

Environment

latest branch

Expected & Actual behavior

We supported a slow log before, but it caused a bug when the loader batch-imported data; see the feature PR #2327.

We later reverted the feature; see PR #2347.

The bug was caused by the following:

1. We need to read the POST body from the request and then set it back on the request, so the entity stream is changed.
2. The loader request uses the GZIP Content-Encoding header, and after the body has been read for logging, the server can no longer read it.

So we plan to resolve this by using a BufferedInputStream to cache the stream:

```java
BufferedInputStream bufferedStream = new BufferedInputStream(context.getEntityStream());
// Mark the stream so the whole body can be re-read after it is captured for the slow log
bufferedStream.mark(Integer.MAX_VALUE);
context.setProperty(REQUEST_PARAMS_JSON, IOUtils.toString(bufferedStream, Charsets.toCharset(CHARSET)));
// Rewind and hand the same stream back so downstream readers still see the body
bufferedStream.reset();
context.setEntityStream(bufferedStream);
```
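For context, here is a minimal sketch of how that fragment could sit inside a JAX-RS ContainerRequestFilter; the class name, @Provider usage, property key, and UTF-8 charset below are illustrative assumptions, not the actual HugeGraph code:

```java
import java.io.BufferedInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

import javax.ws.rs.container.ContainerRequestContext;
import javax.ws.rs.container.ContainerRequestFilter;
import javax.ws.rs.ext.Provider;

import org.apache.commons.io.IOUtils;

// Hypothetical filter showing where the buffering fragment would run
@Provider
public class SlowLogRequestFilter implements ContainerRequestFilter {

    // Illustrative property key; the real constant lives in HugeGraph's API filter
    private static final String REQUEST_PARAMS_JSON = "request.params.json";

    @Override
    public void filter(ContainerRequestContext context) throws IOException {
        BufferedInputStream bufferedStream =
                new BufferedInputStream(context.getEntityStream());
        // Mark so the full body can be re-read after it is captured for the slow log
        bufferedStream.mark(Integer.MAX_VALUE);
        context.setProperty(REQUEST_PARAMS_JSON,
                            IOUtils.toString(bufferedStream, StandardCharsets.UTF_8));
        // Rewind and give the same stream back so the request body can still be consumed
        bufferedStream.reset();
        context.setEntityStream(bufferedStream);
    }
}
```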

Vertex/Edge example

No response

Schema [VertexLabel, EdgeLabel, IndexLabel]

No response

SunnyBoy-WYH added the bug label on Mar 1, 2024

dosubot bot commented Mar 1, 2024

Based on the information you've provided, it seems like the issue is related to handling GZIP compressed requests. Here are some potential solutions and insights:

  1. Ensure Proper GZIP Handling: The server should properly decompress the request body before processing it and recompress the response if necessary. This might involve wrapping the request input stream in a GZIPInputStream for decompression and similarly using a GZIPOutputStream for response compression.

  2. Stream Resetting: If the bug involves changes to the request stream that prevent it from being read again, consider buffering the input stream so it can be read multiple times, for example by copying it into a ByteArrayInputStream that supports resetting and can be re-read without losing data (a rough sketch follows after this list).

  3. Optimize Compression and Decompression: Review the compression and decompression logic for efficiency. Depending on the data characteristics, you might optimize these operations by adjusting the buffer size or using different compression levels.

  4. Concurrency and Resource Management: Ensure that the compression and decompression operations do not become bottlenecks under high load. This might involve profiling the server under load to identify hotspots and optimizing or parallelizing operations where possible. Additionally, ensure that resources like Deflater and Inflater instances are properly managed and released after use to avoid memory leaks.

  5. Error Handling and Logging: Improve error handling and logging around the compression and decompression operations. Specifically, ensure that any exceptions thrown during these operations are logged with sufficient detail to diagnose issues. This might help in identifying and resolving the root cause of the server slow log issue.

  6. Review Client-Side Implementation: If the issue is related to how the loader sends GZIP compressed requests, review the client-side implementation to ensure it correctly sets the Content-Encoding: gzip header and properly compresses the request body. Misconfiguration or bugs on the client side could lead to issues on the server.

  7. Server Configuration: Check the server configuration for settings related to request body size limits, timeout settings, and compression. It's possible that server configuration might need adjustment to better handle large or compressed requests.
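As a rough illustration of points 1 and 2 above (not HugeGraph's actual filter; the class name, property key, and header handling are assumptions), a request filter could copy the raw bytes once, decompress a copy for logging when Content-Encoding: gzip is set, and hand the untouched bytes back to the request:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;

import javax.ws.rs.container.ContainerRequestContext;
import javax.ws.rs.container.ContainerRequestFilter;

import org.apache.commons.io.IOUtils;

// Hypothetical filter sketch: buffer the body once, decode a gzip copy only for logging
public class BodyLoggingFilter implements ContainerRequestFilter {

    private static final String REQUEST_BODY_JSON = "request.body.json"; // illustrative key

    @Override
    public void filter(ContainerRequestContext context) throws IOException {
        // Copy the raw (possibly gzip-compressed) body into memory once
        byte[] rawBody = IOUtils.toByteArray(context.getEntityStream());

        // Decompress a copy for logging when the loader sends Content-Encoding: gzip
        byte[] readable = rawBody;
        if ("gzip".equalsIgnoreCase(context.getHeaderString("Content-Encoding"))) {
            try (InputStream gzip = new GZIPInputStream(new ByteArrayInputStream(rawBody))) {
                readable = IOUtils.toByteArray(gzip);
            }
        }
        context.setProperty(REQUEST_BODY_JSON, new String(readable, StandardCharsets.UTF_8));

        // Hand the untouched bytes back so downstream gzip decoding still works
        context.setEntityStream(new ByteArrayInputStream(rawBody));
    }
}
```

Unlike the BufferedInputStream mark/reset approach in the issue body, this copies the bytes up front, trading extra memory for a stream that can be reset any number of times.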

Without specific details on how the server handles incoming requests and integrates with GZipUtil, these are general insights and potential solutions. Further investigation and debugging would be necessary to pinpoint the exact cause of the issue and implement the most appropriate solution.

For more details, you can refer to the source code of the GZipUtil class and the DecompressInterceptor class in the Apache HugeGraph repository.

I hope this helps! If you have any further questions or need more clarification, feel free to ask.


SunnyBoy-WYH changed the title from "[Bug] describe the main problem" to "[Bug] server slow log, support loader import & client IP" on Mar 1, 2024