README.md in strong_migrations-0.2.0 vs README.md in strong_migrations-0.2.1
- old
+ new
@@ -12,89 +12,91 @@
```ruby
gem 'strong_migrations'
```
+## How It Works
+
+Strong Migrations detects potentially dangerous operations in migrations, prevents them from running, and gives instructions on safer ways to do what you want.
+
+```
+ __ __ _____ _______ _
+ \ \ / /\ |_ _|__ __| |
+ \ \ /\ / / \ | | | | | |
+ \ \/ \/ / /\ \ | | | | | |
+ \ /\ / ____ \ _| |_ | | |_|
+ \/ \/_/ \_\_____| |_| (_)
+
+ActiveRecord caches attributes which causes problems
+when removing columns. Be sure to ignore the column:
+
+class User < ApplicationRecord
+ self.ignored_columns = %w(some_column)
+end
+
+Once that's deployed, wrap this step in a safety_assured { ... } block.
+
+More info: https://github.com/ankane/strong_migrations#removing-a-column
+```
+
## Dangerous Operations
The following operations can cause downtime or errors:
- adding a column with a non-null default value to an existing table
+- removing a column
- changing the type of a column
-- renaming a table
- renaming a column
-- removing a column
+- renaming a table
- adding an index non-concurrently (Postgres only)
- adding a `json` column to an existing table (Postgres only)
-For more info, check out:
-
-- [Rails Migrations with No Downtime](http://pedro.herokuapp.com/past/2011/7/13/rails_migrations_with_no_downtime/)
-- [Safe Operations For High Volume PostgreSQL](https://www.braintreepayments.com/blog/safe-operations-for-high-volume-postgresql/) (if it’s relevant)
-
It also checks for best practices:
-- keeping indexes to three columns or less
+- keeping non-unique indexes to three columns or less
## The Zero Downtime Way
### Adding a column with a default value
-1. Add the column without a default value
-2. Add the default value
-3. Commit the transaction - **extremely important if you are backfilling in the migration**
-4. Backfill the column
+Adding a column with a non-null default causes the entire table to be rewritten.
+Instead, add the column without a default value, then change the default.
+
```ruby
-class AddSomeColumnToUsers < ActiveRecord::Migration
+class AddSomeColumnToUsers < ActiveRecord::Migration[5.1]
def up
- # 1
add_column :users, :some_column, :text
-
- # 2
change_column_default :users, :some_column, "default_value"
-
- # 3
- commit_db_transaction
-
- # 4.a (Rails 5+)
- User.in_batches.update_all some_column: "default_value"
-
- # 4.b (Rails < 5)
- User.find_in_batches do |users|
- User.where(id: users.map(&:id)).update_all some_column: "default_value"
- end
end
def down
remove_column :users, :some_column
end
end
```
-### Renaming or changing the type of a column
+### Backfilling data
-If you really have to:
+To backfill data, use the Rails console or a separate migration with `disable_ddl_transaction!`. Avoid backfilling in a transaction, especially one that alters a table. See [this great article](https://wework.github.io/data/2015/11/05/add-columns-with-default-values-to-large-tables-in-rails-postgres/) on why.
-1. Create a new column
-2. Write to both columns
-3. Backfill data from the old column to the new column
-4. Move reads from the old column to the new column
-5. Stop writing to the old column
-6. Drop the old column
+```ruby
+class BackfillSomeColumn < ActiveRecord::Migration[5.1]
+ disable_ddl_transaction!
-### Renaming a table
+ def change
+ # Rails 5+
+ User.in_batches.update_all some_column: "default_value"
-If you really have to:
+ # Rails < 5
+ User.find_in_batches do |users|
+ User.where(id: users.map(&:id)).update_all some_column: "default_value"
+ end
+ end
+end
+```
-1. Create a new table
-2. Write to both tables
-3. Backfill data from the old table to new table
-4. Move reads from the old table to the new table
-5. Stop writing to the old table
-6. Drop the old table
-
### Removing a column
ActiveRecord caches database columns at runtime, so if you drop a column, it can cause exceptions until your app reboots. To prevent this:
1. Tell ActiveRecord to ignore the column from its cache
@@ -115,46 +117,91 @@
2. Deploy code
3. Write a migration to remove the column (wrap in `safety_assured` block)
```ruby
- class RemoveSomeColumnFromUsers < ActiveRecord::Migration
+ class RemoveSomeColumnFromUsers < ActiveRecord::Migration[5.1]
def change
safety_assured { remove_column :users, :some_column }
end
end
```
4. Deploy and run migration
+### Renaming or changing the type of a column
+
+If you really have to:
+
+1. Create a new column
+2. Write to both columns
+3. Backfill data from the old column to the new column
+4. Move reads from the old column to the new column
+5. Stop writing to the old column
+6. Drop the old column
+
+One exception is changing a `varchar` column to `text`, which is safe in Postgres 9.1+.
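The dual-write phase (steps 1 and 2) can be sketched as follows. This is a minimal illustration, not code from the gem; the `User` model, `some_column`, and `some_column_new` names and the `:text` type are assumptions:

```ruby
# Step 1: add the new column alongside the old one
class AddSomeColumnNewToUsers < ActiveRecord::Migration[5.1]
  def change
    add_column :users, :some_column_new, :text
  end
end

# Step 2: write to both columns until the backfill and the
# read switch (steps 3-4) are complete
class User < ApplicationRecord
  before_save :copy_some_column

  private

  def copy_some_column
    self.some_column_new = some_column
  end
end
```

Once reads have moved to the new column (step 4), the callback and then the old column can be removed.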
+
+### Renaming a table
+
+If you really have to:
+
+1. Create a new table
+2. Write to both tables
+3. Backfill data from the old table to the new table
+4. Move reads from the old table to the new table
+5. Stop writing to the old table
+6. Drop the old table
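Steps 1 and 2 might look something like this. It is only a sketch under assumed names (renaming a hypothetical `users` table to `accounts`), and the mirroring callback is one of several possible dual-write approaches:

```ruby
# Step 1: create the new table with the same structure as the old one
class CreateAccounts < ActiveRecord::Migration[5.1]
  def change
    create_table :accounts do |t|
      t.text :some_column
      t.timestamps
    end
  end
end

# Step 2: mirror writes into the new table until reads move over
class User < ApplicationRecord
  after_save :copy_to_account

  private

  def copy_to_account
    Account.find_or_initialize_by(id: id)
           .update(attributes.except("id"))
  end
end
```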
+
### Adding an index (Postgres)
Add indexes concurrently.
```ruby
-class AddSomeIndexToUsers < ActiveRecord::Migration
+class AddSomeIndexToUsers < ActiveRecord::Migration[5.1]
+ disable_ddl_transaction!
+
def change
- commit_db_transaction
add_index :users, :some_index, algorithm: :concurrently
end
end
```
+If you forget `disable_ddl_transaction!`, the migration will fail.
+
+Also, note that indexes on new tables (those created in the same migration) don’t require this.
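For example, an index created in the same migration as its table is safe to add non-concurrently, since nothing reads the table yet (the `posts` table and `slug` column here are illustrative):

```ruby
class CreatePosts < ActiveRecord::Migration[5.1]
  def change
    create_table :posts do |t|
      t.string :slug
    end

    # Safe without disable_ddl_transaction!: the table was just
    # created, so the index build can't block any traffic
    add_index :posts, :slug
  end
end
```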
+
### Adding a json column (Postgres)
-There’s no equality operator for the `json` column type, which causes issues for `SELECT DISTINCT` queries. Replace all calls to `uniq` with a custom scope.
+There’s no equality operator for the `json` column type, which causes issues for `SELECT DISTINCT` queries.
+If you’re on Postgres 9.4+, use `jsonb` instead.
+
+If you must use `json`, replace all calls to `uniq` with a custom scope.
+
```ruby
-scope :uniq_on_id, -> { select("DISTINCT ON (your_table.id) your_table.*") }
+class User < ApplicationRecord
+ scope :uniq_on_id, -> { select("DISTINCT ON (users.id) users.*") }
+end
```
+Then add the column:
+
+```ruby
+class AddJsonColumnToUsers < ActiveRecord::Migration[5.1]
+ def change
+ safety_assured { add_column :users, :some_column, :json }
+ end
+end
+```
+
## Assuring Safety
To mark a step in the migration as safe, despite using a method that might otherwise be dangerous, wrap it in a `safety_assured` block.
```ruby
-class MySafeMigration < ActiveRecord::Migration
+class MySafeMigration < ActiveRecord::Migration[5.1]
def change
safety_assured { remove_column :users, :some_column }
end
end
```
@@ -192,19 +239,19 @@
```ruby
task "db:schema:dump": "strong_migrations:alphabetize_columns"
```
-## Custom Error Messages
+## Custom Messages
-To customize specific error messages, create an initializer with:
+To customize specific messages, create an initializer with:
```ruby
StrongMigrations.error_messages[:add_column_default] = "Your custom instructions"
```
-Check the source code for the list of keys.
+Check the [source code](https://github.com/ankane/strong_migrations/blob/master/lib/strong_migrations.rb) for the list of keys.
## Analyze Tables (Postgres)
Analyze tables automatically (to update planner statistics) after an index is added. Create an initializer with:
@@ -219,9 +266,14 @@
```sql
ALTER ROLE myuser SET lock_timeout = '10s';
```
There’s also [a gem](https://github.com/gocardless/activerecord-safer_migrations) you can use for this.
+
+## Additional Reading
+
+- [Rails Migrations with No Downtime](http://pedro.herokuapp.com/past/2011/7/13/rails_migrations_with_no_downtime/)
+- [Safe Operations For High Volume PostgreSQL](https://www.braintreepayments.com/blog/safe-operations-for-high-volume-postgresql/)
## Credits
Thanks to Bob Remeika and David Waller for the [original code](https://github.com/foobarfighter/safe-migrations).