Discussion: Approaches to data migration #13

Open
psychedelicious opened this issue Jan 7, 2024 · 3 comments

I'm exploring adding data migrations to our persisted state and have a very rough draft for handling this on a per-reducer basis.

I'm having trouble imagining an implementation for whole-store migration, though. I suppose it would need some support from redux-remember, integrated into its enhancer?

That said, does whole-store data migration even make sense with modern Redux, given its emphasis on and first-class support for slices? Maybe a tidy migration abstraction for individual slices is the more sensible approach, especially in an app with lots of slices.

Do you have any thoughts or experience with migrations? Thanks!
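
For concreteness, something like this is what I mean by a per-slice abstraction (a rough hypothetical sketch; the version field, migrate signature, and example slice are only illustrative, not my actual draft):

// Hypothetical per-slice persistence config: each slice declares its current
// initial state, a schema version, and a migrate function that upgrades
// whatever shape was previously persisted.
interface SlicePersistConfig<T> {
  initialState: T;
  version: number;
  migrate: (persisted: unknown, persistedVersion: number) => T;
}

interface ExampleState {
  brushSize: number;
  layers: string[];
}

const examplePersistConfig: SlicePersistConfig<ExampleState> = {
  initialState: { brushSize: 50, layers: [] },
  version: 2,
  migrate: (persisted, persistedVersion) => {
    const old = persisted as Partial<ExampleState> & { layer?: string };
    if (persistedVersion < 2) {
      // v1 persisted a single `layer` string; v2 persists `layers: string[]`
      return {
        brushSize: old.brushSize ?? 50,
        layers: old.layer ? [old.layer] : [],
      };
    }
    return { brushSize: old.brushSize ?? 50, layers: old.layers ?? [] };
  },
};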

@psychedelicious (Contributor, Author) commented Jan 8, 2024

My rough draft was way too convoluted and needed complicated types that are a bit beyond me. Also, I was getting a race condition where, as multiple reducers migrated themselves, unmigrated data reached the UI between reducer calls, causing runtime errors. I think I was doing something wrong, but couldn't figure it out.

Anyway, I realized there is a place to do whole-store migration: the unserialize callback.

// Assumed imports: lodash utilities, serialize-error, and redux-remember's
// UnserializeFunction type; `logger`, `sliceConfigs`, `diff`, and `JsonObject`
// are app-specific helpers/types.
import { defaultsDeep, keys, pick } from 'lodash-es';
import { serializeError } from 'serialize-error';
import type { UnserializeFunction } from 'redux-remember';

const unserialize: UnserializeFunction = (data, key) => {
  const log = logger('system');
  const config = sliceConfigs[key as keyof typeof sliceConfigs];
  if (!config) {
    throw new Error(`No unserialize config for slice "${key}"`);
  }
  const parsed = JSON.parse(data);

  // strip out old keys
  const stripped = pick(parsed, keys(config.initialState));

  try {
    // merge in initial state as default values...
    const transformed = defaultsDeep(
      // ...after migrating, if a migration exists
      config.migrate ? config.migrate(stripped) : stripped,
      config.initialState
    );
    log.debug(
      {
        persistedData: parsed,
        rehydratedData: transformed,
        diff: diff(parsed, transformed) as JsonObject, // this is always serializable
      },
      `Rehydrated slice "${key}"`
    );
    return transformed;
  } catch (err) {
    log.warn(
      { error: serializeError(err) },
      `Error rehydrating slice "${key}", falling back to default initial state`
    );
    return config.initialState;
  }
};

This works just fine, and I suppose it's a logical place to do data migration, since we always want migration to run when unserializing.

Edit: I don't know why I thought the unserialize function would be any different from handling the migration within the reducer, and I don't know why I was getting that weird race condition. It should work the same either way. I must have been doing something wrong.
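
For anyone wiring this up, the callback plugs into redux-remember via the enhancer's options. A minimal sketch, assuming redux-remember's rememberReducer/rememberEnhancer API, RTK 2's enhancers callback, and localStorage as the driver; the slice here is just a placeholder:

import { configureStore, createSlice } from '@reduxjs/toolkit';
import { rememberReducer, rememberEnhancer } from 'redux-remember';

// a minimal example slice; a real app would list every persisted slice
const systemSlice = createSlice({
  name: 'system',
  initialState: { language: 'en' },
  reducers: {},
});

const rememberedKeys = ['system'];

const store = configureStore({
  reducer: rememberReducer({ system: systemSlice.reducer }),
  enhancers: (getDefaultEnhancers) =>
    getDefaultEnhancers().concat(
      rememberEnhancer(window.localStorage, rememberedKeys, {
        // migration + default-merging runs on every rehydration
        unserialize, // the callback from above
      })
    ),
});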

@zewish (Owner) commented Jan 8, 2024

I'm glad you figured it out! I think it probably makes sense to add this example to the docs for anyone who would be interested in doing the same thing 😉

zewish added the docs label Jan 8, 2024
@psychedelicious (Contributor, Author)

Could you assign this to me @zewish? I'll make the example a bit more generalized.
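
For what it's worth, a more generalized version for the docs could drop the app-specific logging and take the slice configs as a parameter, roughly like this (a hypothetical sketch of the idea, not the eventual docs example):

import { defaultsDeep, keys, pick } from 'lodash-es';
import type { UnserializeFunction } from 'redux-remember';

type SliceConfig<T = unknown> = {
  initialState: T;
  migrate?: (state: unknown) => unknown;
};

// Build an unserialize callback from a map of per-slice configs.
const buildUnserialize =
  (sliceConfigs: Record<string, SliceConfig>): UnserializeFunction =>
  (data, key) => {
    const config = sliceConfigs[key];
    if (!config) {
      throw new Error(`No unserialize config for slice "${key}"`);
    }
    try {
      const parsed = JSON.parse(data);
      // drop keys that no longer exist in the current shape
      const stripped = pick(parsed, keys(config.initialState));
      // migrate (if a migration exists), then fill in defaults
      const migrated = config.migrate ? config.migrate(stripped) : stripped;
      return defaultsDeep(migrated, config.initialState);
    } catch {
      // fall back to defaults on any parse/migration error
      return config.initialState;
    }
  };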
