Strong Migrations

Catch unsafe migrations in development

  ✓  Detects potentially dangerous operations
  ✓  Prevents them from running by default
  ✓  Provides instructions on safer ways to do what you want

Supports PostgreSQL, MySQL, and MariaDB

🍊 Battle-tested at Instacart


Installation

Add this line to your application’s Gemfile:

gem 'strong_migrations'

And run:

bundle install
rails generate strong_migrations:install

Checks

Checks cover potentially dangerous operations, Postgres-specific pitfalls, and general best practices, each detailed in the sections below.

You can also add custom checks or disable specific checks.

Removing a column

Bad

ActiveRecord caches database columns at runtime, so if you drop a column, it can cause exceptions until your app reboots.

class RemoveSomeColumnFromUsers < ActiveRecord::Migration[6.0]
  def change
    remove_column :users, :some_column
  end
end

Good

  1. Tell ActiveRecord to ignore the column from its cache

class User < ApplicationRecord
  self.ignored_columns = ["some_column"]
end

  2. Deploy code
  3. Write a migration to remove the column (wrap in a safety_assured block)

class RemoveSomeColumnFromUsers < ActiveRecord::Migration[6.0]
  def change
    safety_assured { remove_column :users, :some_column }
  end
end

  4. Deploy and run the migration

Adding a column with a default value

Note: This operation is safe in Postgres 11+, MySQL 8.0.12+, and MariaDB 10.3.2+.

Bad

Adding a column with a default value to an existing table causes the entire table to be rewritten.

class AddSomeColumnToUsers < ActiveRecord::Migration[6.0]
  def change
    add_column :users, :some_column, :text, default: "default_value"
  end
end

Good

Instead, add the column without a default value, then change the default.

class AddSomeColumnToUsers < ActiveRecord::Migration[6.0]
  def up
    add_column :users, :some_column, :text
    change_column_default :users, :some_column, "default_value"
  end

  def down
    remove_column :users, :some_column
  end
end

See the next section for how to backfill.

Backfilling data

Bad

Backfilling in the same transaction that alters a table locks the table for the duration of the backfill.

class AddSomeColumnToUsers < ActiveRecord::Migration[6.0]
  def change
    add_column :users, :some_column, :text
    User.update_all some_column: "default_value"
  end
end

Also, running a single query to update data can cause issues for large tables.

Good

There are three keys to backfilling safely: batching, throttling, and running it outside a transaction. Use the Rails console or a separate migration with disable_ddl_transaction!.

class BackfillSomeColumn < ActiveRecord::Migration[6.0]
  disable_ddl_transaction!

  def up
    User.unscoped.in_batches do |relation|
      relation.update_all some_column: "default_value"
      sleep(0.01) # throttle
    end
  end
end

Renaming or changing the type of a column

Bad

class RenameSomeColumn < ActiveRecord::Migration[6.0]
  def change
    rename_column :users, :some_column, :new_name
  end
end

or

class ChangeSomeColumnType < ActiveRecord::Migration[6.0]
  def change
    change_column :users, :some_column, :new_type
  end
end

A few changes are safe in Postgres:

  • Changing between varchar and text columns
  • Increasing the precision of a decimal or numeric column
  • Making a decimal or numeric column unconstrained
  • Changing between timestamp and timestamptz columns when session time zone is UTC in Postgres 12+

And a few in MySQL and MariaDB:

  • Increasing the length of a varchar column from under 255 up to 255
  • Increasing the length of a varchar column over 255

Good

A safer approach is to:

  1. Create a new column
  2. Write to both columns
  3. Backfill data from the old column to the new column
  4. Move reads from the old column to the new column
  5. Stop writing to the old column
  6. Drop the old column
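
As a sketch of steps 1 and 2 (the names new_name and some_column are hypothetical), the new column is added in one migration, and the model then writes to both columns until reads have been moved over:

```ruby
# Hypothetical sketch of steps 1-2 for renaming some_column to new_name.

# Step 1: create the new column (no default value, so no table rewrite)
class AddNewNameToUsers < ActiveRecord::Migration[6.0]
  def change
    add_column :users, :new_name, :string
  end
end

# Step 2: keep both columns in sync while old readers still exist
class User < ApplicationRecord
  before_save :write_both_columns

  private

  def write_both_columns
    self.new_name = some_column if will_save_change_to_some_column?
  end
end
```

Once reads use new_name exclusively (steps 4-5), drop the callback along with the old column.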

Renaming a table

Bad

class RenameUsersToCustomers < ActiveRecord::Migration[6.0]
  def change
    rename_table :users, :customers
  end
end

Good

A safer approach is to:

  1. Create a new table
  2. Write to both tables
  3. Backfill data from the old table to new table
  4. Move reads from the old table to the new table
  5. Stop writing to the old table
  6. Drop the old table
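
A sketch of the backfill step (step 3), assuming Postgres and an id primary key shared by both tables; the ON CONFLICT clause skips rows that the dual writes already copied:

```ruby
# Hypothetical sketch of step 3: copy rows from users to customers in batches.
class BackfillCustomers < ActiveRecord::Migration[6.0]
  disable_ddl_transaction!

  def up
    User.unscoped.in_batches do |relation|
      # copy this batch from the old table, skipping rows already present
      safety_assured do
        execute <<~SQL
          INSERT INTO customers
          SELECT * FROM users WHERE id IN (#{relation.select(:id).to_sql})
          ON CONFLICT (id) DO NOTHING
        SQL
      end
      sleep(0.01) # throttle
    end
  end
end
```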

Creating a table with the force option

Bad

The force option can drop an existing table.

class CreateUsers < ActiveRecord::Migration[6.0]
  def change
    create_table :users, force: true do |t|
      # ...
    end
  end
end

Good

Create tables without the force option.

class CreateUsers < ActiveRecord::Migration[6.0]
  def change
    create_table :users do |t|
      # ...
    end
  end
end

If you intend to drop an existing table, run drop_table first.

Using change_column_null with a default value

Bad

This generates a single UPDATE statement to set the default value.

class ChangeSomeColumnNull < ActiveRecord::Migration[6.0]
  def change
    change_column_null :users, :some_column, false, "default_value"
  end
end

Good

Backfill the column safely. Then use:

class ChangeSomeColumnNull < ActiveRecord::Migration[6.0]
  def change
    change_column_null :users, :some_column, false
  end
end

Note: In Postgres, change_column_null is still not safe with this method; see “Setting NOT NULL on an existing column” below.

Executing SQL directly

Strong Migrations can’t ensure safety for raw SQL statements. Make really sure that what you’re doing is safe, then use:

class ExecuteSQL < ActiveRecord::Migration[6.0]
  def change
    safety_assured { execute "..." }
  end
end

Adding an index non-concurrently

Bad

In Postgres, adding an index non-concurrently locks the table.

class AddSomeIndexToUsers < ActiveRecord::Migration[6.0]
  def change
    add_index :users, :some_column
  end
end

Good

Add indexes concurrently.

class AddSomeIndexToUsers < ActiveRecord::Migration[6.0]
  disable_ddl_transaction!

  def change
    add_index :users, :some_column, algorithm: :concurrently
  end
end

If you forget disable_ddl_transaction!, the migration will fail. Also, note that indexes on new tables (those created in the same migration) don’t require this.

With gindex, you can generate an index migration instantly with:

rails g index table column

Adding a reference

Bad

Rails adds an index non-concurrently to references by default, which is problematic for Postgres.

class AddReferenceToUsers < ActiveRecord::Migration[6.0]
  def change
    add_reference :users, :city
  end
end

Good

Make sure the index is added concurrently.

class AddReferenceToUsers < ActiveRecord::Migration[6.0]
  disable_ddl_transaction!

  def change
    add_reference :users, :city, index: {algorithm: :concurrently}
  end
end

Adding a foreign key

Bad

In Postgres, new foreign keys are validated by default, which acquires a ShareRowExclusiveLock that can be expensive on large tables.

class AddForeignKeyOnUsers < ActiveRecord::Migration[6.0]
  def change
    add_foreign_key :users, :orders
  end
end

or

class AddReferenceToUsers < ActiveRecord::Migration[6.0]
  def change
    add_reference :users, :order, foreign_key: true
  end
end

Good

Instead, validate it in a separate migration with a more agreeable RowShareLock. This approach is documented by Postgres to have “the least impact on other work.”

For Rails 5.2+, use:

class AddForeignKeyOnUsers < ActiveRecord::Migration[6.0]
  def change
    add_foreign_key :users, :orders, validate: false
  end
end

Then validate it in a separate migration.

class ValidateForeignKeyOnUsers < ActiveRecord::Migration[6.0]
  def change
    validate_foreign_key :users, :orders
  end
end

For Rails < 5.2, use:

class AddForeignKeyOnUsers < ActiveRecord::Migration[5.1]
  def change
    safety_assured do
      execute 'ALTER TABLE "users" ADD CONSTRAINT "fk_rails_c1e9b98e31" FOREIGN KEY ("order_id") REFERENCES "orders" ("id") NOT VALID'
    end
  end
end

Then validate it in a separate migration.

class ValidateForeignKeyOnUsers < ActiveRecord::Migration[5.1]
  def change
    safety_assured do
      execute 'ALTER TABLE "users" VALIDATE CONSTRAINT "fk_rails_c1e9b98e31"'
    end
  end
end

Adding a json column

Bad

In Postgres, there’s no equality operator for the json column type, which can cause errors for existing SELECT DISTINCT queries.

class AddPropertiesToUsers < ActiveRecord::Migration[6.0]
  def change
    add_column :users, :properties, :json
  end
end

Good

Use jsonb instead.

class AddPropertiesToUsers < ActiveRecord::Migration[6.0]
  def change
    add_column :users, :properties, :jsonb
  end
end

Setting NOT NULL on an existing column

Bad

In Postgres, setting NOT NULL on an existing column requires an AccessExclusiveLock, which is expensive on large tables.

class SetSomeColumnNotNull < ActiveRecord::Migration[6.0]
  def change
    change_column_null :users, :some_column, false
  end
end

Good

Instead, add a constraint:

class SetSomeColumnNotNull < ActiveRecord::Migration[6.0]
  def change
    safety_assured do
      execute 'ALTER TABLE "users" ADD CONSTRAINT "users_some_column_null" CHECK ("some_column" IS NOT NULL) NOT VALID'
    end
  end
end

Then validate it in a separate migration.

class ValidateSomeColumnNotNull < ActiveRecord::Migration[6.0]
  def change
    safety_assured do
      execute 'ALTER TABLE "users" VALIDATE CONSTRAINT "users_some_column_null"'
    end

    # in Postgres 12+, you can safely turn this into a traditional column constraint
    change_column_null :users, :some_column, false
    safety_assured do
      execute 'ALTER TABLE "users" DROP CONSTRAINT "users_some_column_null"'
    end
  end
end

Note: Before Postgres 12, a validated CHECK constraint is not 100% equivalent to a NOT NULL column constraint.

Keeping non-unique indexes to three columns or less

Bad

Adding a non-unique index with more than three columns rarely improves performance.

class AddSomeIndexToUsers < ActiveRecord::Migration[6.0]
  def change
    add_index :users, [:a, :b, :c, :d]
  end
end

Good

Instead, start an index with columns that narrow down the results the most.

class AddSomeIndexToUsers < ActiveRecord::Migration[6.0]
  def change
    add_index :users, [:b, :d]
  end
end

For Postgres, be sure to add them concurrently.

Assuring Safety

To mark a step in the migration as safe, despite using a method that might otherwise be dangerous, wrap it in a safety_assured block.

class MySafeMigration < ActiveRecord::Migration[6.0]
  def change
    safety_assured { remove_column :users, :some_column }
  end
end

Certain methods like execute and change_table cannot be inspected and are prevented from running by default. Make sure what you’re doing is really safe and use this pattern.

Custom Checks

Add your own custom checks with:

StrongMigrations.add_check do |method, args|
  if method == :add_index && args[0].to_s == "users"
    stop! "No more indexes on the users table"
  end
end

Use the stop! method to stop migrations.

Note: Since remove_column always requires a safety_assured block, it’s not possible to add a custom check for remove_column operations.

Opt-in Checks

Removing an index non-concurrently

Postgres supports removing indexes concurrently, but removing them non-concurrently shouldn’t be an issue for most applications. You can enable this check with:

StrongMigrations.enable_check(:remove_index)
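
With the check enabled, a sketch of removing an index concurrently (Postgres; the algorithm option to remove_index is supported in recent Rails versions):

```ruby
class RemoveSomeIndexFromUsers < ActiveRecord::Migration[6.0]
  disable_ddl_transaction!

  def change
    remove_index :users, column: :some_column, algorithm: :concurrently
  end
end
```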

Disable Checks

Disable specific checks with:

StrongMigrations.disable_check(:add_index)

Check the source code for the list of keys.

Custom Messages

To customize specific messages, create an initializer with:

StrongMigrations.error_messages[:add_column_default] = "Your custom instructions"

Check the source code for the list of keys.

Timeouts

It’s extremely important to set a short lock timeout for migrations. This way, if a migration can’t acquire a lock in a timely manner, other statements won’t be stuck behind it. We also recommend setting a long statement timeout so migrations can run for a while.

Create config/initializers/strong_migrations.rb with:

StrongMigrations.lock_timeout = 10.seconds
StrongMigrations.statement_timeout = 1.hour

Or set the timeouts directly on the database user that runs migrations. For Postgres, use:

ALTER ROLE myuser SET lock_timeout = '10s';
ALTER ROLE myuser SET statement_timeout = '1h';

Note: If you use PgBouncer in transaction mode, you must set timeouts on the database user.

Existing Migrations

To mark migrations as safe that were created before installing this gem, create an initializer with:

StrongMigrations.start_after = 20170101000000

Use the version from your latest migration.

Target Version

If your development database version is different from production, you can specify the production version so the right checks run in development.

StrongMigrations.target_postgresql_version = "10"
StrongMigrations.target_mysql_version = "8.0.12"
StrongMigrations.target_mariadb_version = "10.3.2"

For safety, this option only affects development and test environments. In other environments, the actual server version is always used.

Analyze Tables

Analyze tables automatically (to update planner statistics) after an index is added. Create an initializer with:

StrongMigrations.auto_analyze = true

Faster Migrations

Only dump the schema when adding a new migration. If you use Git, create an initializer with:

ActiveRecord::Base.dump_schema_after_migration = Rails.env.development? &&
  `git status db/migrate/ --porcelain`.present?

Schema Sanity

Columns can flip order in db/schema.rb when you have multiple developers. One way to prevent this is to alphabetize them. Add to the end of your Rakefile:

task "db:schema:dump": "strong_migrations:alphabetize_columns"

Dangerous Tasks

For safety, dangerous database tasks are disabled in production: db:drop, db:reset, db:schema:load, and db:structure:load. To get around this, use:

SAFETY_ASSURED=1 rails db:drop

Permissions

We recommend using a separate database user for migrations when possible so you don’t need to grant your app user permission to alter tables.

Credits

Thanks to Bob Remeika and David Waller for the original code and Sean Huber for the bad/good readme format.

Contributing

Everyone is encouraged to help improve this project. Here are a few ways you can help:

  • Report bugs
  • Fix bugs and submit pull requests
  • Write, clarify, or fix documentation
  • Suggest or add new features

To get started with development:

git clone https://github.com/ankane/strong_migrations.git
cd strong_migrations
bundle install

# Postgres
createdb strong_migrations_test
bundle exec rake test

# MySQL and MariaDB
mysqladmin create strong_migrations_test
ADAPTER=mysql2 bundle exec rake test
