
Feat: increase data serialization speed #85

Open
wants to merge 1 commit into main
Conversation

luke321321

Use pandas to_json() to serialize the dataframe instead of the custom serialization.

On a test dataframe of length 10_000 this sped up the conversion to JSON by 10x.

I've tested the outputs and they are consistent with the previous get_row_data() function for most use cases. One slight change with nested dataframes: the nested dataframe is already in JSON format, so it doesn't have to be unpacked in JS. This actually makes things easier for users, as the previous behavior was confusing.

Test results

%timeit test_df.to_json(orient="records", date_format="iso")
20 ms ± 980 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
%timeit get_row_data(test_df)
210 ms ± 23.5 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
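For reference, a minimal, self-contained sketch of the comparison. The test frame and the `get_row_data_like` stand-in are illustrative assumptions, not the project's actual code:

```python
import json

import numpy as np
import pandas as pd

# Illustrative test frame; the real benchmark used a 10_000-row dataframe.
test_df = pd.DataFrame({
    "a": np.random.rand(10_000),
    "b": np.random.randint(0, 100, 10_000),
    "ts": pd.date_range("2022-01-01", periods=10_000, freq="min", tz="UTC"),
})

def get_row_data_like(df: pd.DataFrame) -> str:
    """Stand-in for the previous custom serializer: one dict per row, then dumps."""
    rows = df.to_dict(orient="records")
    # default=str covers Timestamps and numpy scalars json can't handle natively
    return json.dumps(rows, default=str)

# Proposed path: let pandas emit the JSON in one call.
fast_json = test_df.to_json(orient="records", date_format="iso")
slow_json = get_row_data_like(test_df)

# In IPython/Jupyter, compare with:
#   %timeit test_df.to_json(orient="records", date_format="iso")
#   %timeit get_row_data_like(test_df)
```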

Use pandas to_json to serialize the dataframe.  On a test dataframe of
10_000 x 3 this sped up the whole rendering process by over 10x,
from 2.5 sec to 0.22 sec.

This also reduces the complexity and browser processing of nested grids
as there is no need to parse the json for the row details.
@cmayoracurzio

@thunderbug1 tagging you as this is similar to a pull request of yours that is still open. Have you or @luke321321 heard back from PablocFonseca on this performance improvement? Looks great tbh :)

@lukedyer-peak

@marduk2 ah I didn't realise there was an almost duplicate PR in #62. I haven't heard from @PablocFonseca

@smartm13

Hey,
I was looking for some speed-ups over here and tried to apply your fix.
The wait time for showing the dataframe has come down 🥳 to about 8 s from the 15-20 s I originally had (shape: ~6000x30).

However, I've now run into a new issue: my JSON/dict-based column is not rendered correctly out of the box.
Before the fix:
[screenshot]

After the fix (notice the attributes column):
[screenshot]

Any clue what we could do so that this speed-up doesn't break default rendering for json kind of columns?
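Not the maintainers' fix, just a possible stopgap on the caller's side: if the column holds Python dicts or lists, pre-serializing it to JSON strings before handing the frame to the grid should let those cells render as plain text again.

```python
import json

import pandas as pd

df = pd.DataFrame({
    "id": [1, 2],
    "attributes": [{"color": "red"}, {"color": "blue", "size": 3}],
})

# Turn dict/list cells into JSON strings so the grid displays readable text.
df["attributes"] = df["attributes"].apply(
    lambda v: json.dumps(v) if isinstance(v, (dict, list)) else v
)
```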

@PablocFonseca
Owner

I've been busy over the past months, with little time to work on ag-grid.
I remember I had to use custom serialization because of an issue with how tz-aware timestamps were serialized.
I'll check this again and update it in the next version.

@PablocFonseca
Owner

Just checked.
The issue is that pandas converts tz-aware timestamps to UTC when serializing to JSON; it's a known issue, pandas-dev/pandas#46730.
That first implementation was a workaround. I've now changed it to use a new method: _cast_tz_aware_date_columns_to_iso8601. Below are the initial results I got when serializing a 1_000_000 x 8 df on my modest computer.
[screenshot: benchmark results, 2022-07-05]
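For context, a hedged sketch of what such a cast might look like (the repo's actual _cast_tz_aware_date_columns_to_iso8601 may differ): convert tz-aware datetime columns to ISO 8601 strings before calling to_json, so the original UTC offsets survive instead of being normalized to UTC.

```python
import pandas as pd

def cast_tz_aware_date_columns_to_iso8601(df: pd.DataFrame) -> pd.DataFrame:
    """Return a copy whose tz-aware datetime columns are ISO 8601 strings."""
    out = df.copy()
    for col in out.columns:
        if isinstance(out[col].dtype, pd.DatetimeTZDtype):
            # Timestamp.isoformat() keeps the original offset, e.g. "-03:00"
            out[col] = out[col].apply(
                lambda ts: ts.isoformat() if pd.notna(ts) else None
            )
    return out

df = pd.DataFrame({
    "when": pd.date_range("2022-07-05", periods=3, freq="h", tz="America/Sao_Paulo"),
    "value": [1, 2, 3],
})
# Offsets are preserved in the output instead of being shifted to UTC.
print(cast_tz_aware_date_columns_to_iso8601(df).to_json(orient="records"))
```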

@luke321321
Author

luke321321 commented Jul 6, 2022 via email

@mkleinbort-ic

Any updates on this?
