This blog is part of our Rails 6 series. Rails 6.0 was recently released.

Rails 6 has added support for bulk inserts, similar to how bulk update is supported using update_all and bulk delete is supported using delete_all.

Bulk inserts can be performed using the newly added methods insert_all, insert_all! and upsert_all.

All of these new methods allow the insertion of multiple records of the same model into the database. They prepare a single INSERT SQL statement and send it to the database in one query, without instantiating models or invoking Active Record callbacks or validations.

During bulk insertion, records may violate the primary key, a unique index, or another unique constraint. Rails leverages database-specific features to either skip or upsert such duplicates, depending on the case.

Let’s discuss the insert_all, insert_all! and upsert_all methods in detail.

We will create an articles table with two unique indexes.

```ruby
create_table :articles do |t|
  t.string :title, null: false
  t.string :slug, null: false
  t.string :author, null: false
  t.text :description

  t.index :slug, unique: true
  t.index [:title, :author], unique: true
end
```

Note that we do not allow duplicate slug columns. We also prevent records from having duplicate title and author columns together.

To try out the examples provided in this blog post, make sure to clean up the articles table before running each example.

1. Performing bulk inserts by skipping duplicates

Let’s say we want to insert multiple articles at once into the database. It is possible that certain records may violate the unique constraint(s) of the table. Such records are considered duplicates.

In other words, by default, every unique index and unique constraint on the table is used to determine whether a record is a duplicate.

To skip the duplicates and insert the remaining records in one statement, we can use the ActiveRecord::Persistence#insert_all method.

1.1 Behavior with PostgreSQL

Let’s run the following example on a PostgreSQL database.

```ruby
result = Article.insert_all(
  [
    { id: 1, title: 'Handling 1M Requests Per Second', author: 'John', slug: '1m-req-per-second' },
    { id: 1, title: 'Type Safety in Elm', author: 'George', slug: 'elm-type-safety' }, # duplicate 'id' here
    { id: 2, title: 'Authentication with Devise - Part 1', author: 'Laura', slug: 'devise-auth-1' },
    { id: 3, title: 'Authentication with Devise - Part 1', author: 'Laura', slug: 'devise-auth-2' }, # duplicate 'title' & 'author' here
    { id: 4, title: 'Dockerizing and Deploying Rails App to Kubernetes', author: 'Paul', slug: 'rails-on-k8s' },
    { id: 5, title: 'Elm on Rails', author: 'Amanda', slug: '1m-req-per-second' }, # duplicate 'slug' here
    { id: 6, title: 'Working Remotely', author: 'Greg', slug: 'working-remotely' }
  ]
)
# Bulk Insert (2.3ms) INSERT INTO "articles"("id","title","author","slug")
#   VALUES (1, 'Handling 1M Requests [...snip...] 'working-remotely')
#   ON CONFLICT DO NOTHING RETURNING "id"

puts result.inspect
#<ActiveRecord::Result:0x00007fb6612a1ad8 @columns=["id"], @rows=[[1], [2], [4], [6]],
# @hash_rows=nil, @column_types={"id"=>#<ActiveModel::Type::Integer:0x00007fb65f420078
# @precision=nil, @scale=nil, @limit=8, @range=-9223372036854775808...9223372036854775808>}>

puts Article.count
# 4
```

The insert_all method accepts a mandatory argument: an array of hashes holding attributes of the same model. All hashes must have the same keys.

Notice the ON CONFLICT DO NOTHING clause in the INSERT query. This clause is supported by the PostgreSQL and SQLite databases. It instructs the database to silently skip any record that causes a conflict or unique key constraint violation during the bulk insert operation, and to proceed with inserting the next record.

In the above example, we have exactly 3 records which violate various unique constraints defined on the articles table.

One of the records being inserted has a duplicate id: 1 attribute, which violates the unique primary key constraint. Another record, with the duplicate title: 'Authentication with Devise - Part 1' and author: 'Laura' attributes, violates the multi-column unique index defined on the title and author columns. A third record, with the duplicate slug: '1m-req-per-second' attribute, violates the unique index defined on the slug column.

All of these records that violate any unique constraint or unique index are skipped and are not inserted into the database.

If successful, ActiveRecord::Persistence#insert_all returns an instance of ActiveRecord::Result. The contents of the result vary per database. In the case of PostgreSQL, this result instance holds information about the successfully inserted records, such as the chosen column names and the values of those columns in each successfully inserted row.

For PostgreSQL, by default, the insert_all method appends a RETURNING "id" clause to the SQL query, where id is the primary key. This clause instructs the database to return the id of every successfully inserted record. By inspecting the result, especially the @columns=["id"], @rows=[[1], [2], [4], [6]] attributes of the result instance, we can see that the records having the id values 1, 2, 4 and 6 were successfully inserted.

What if we want to see more attributes and not just the id attribute of the successfully inserted records in the result?

We can use the optional returning option, which accepts an array of attribute names to return for every successfully inserted record.

```ruby
result = Article.insert_all(
  [
    { id: 1, title: 'Handling 1M Requests Per Second', author: 'John', slug: '1m-req-per-second' },
    #...snip...
  ],
  returning: %w[id title]
)
# Bulk Insert (2.3ms) INSERT INTO "articles"("id","title","author","slug")
#   VALUES (1, 'Handling 1M Requests [...snip...] 'working-remotely')
#   ON CONFLICT DO NOTHING RETURNING "id","title"

puts result.inspect
#<ActiveRecord::Result:0x00007f902a1196f0 @columns=["id", "title"],
# @rows=[[1, "Handling 1M Requests Per Second"], [2, "Authentication with Devise - Part 1"],
# [4, "Dockerizing and Deploying Rails App to Kubernetes"], [6, "Working Remotely"]],
# @hash_rows=nil, @column_types={"id"=>#<ActiveModel::Type::Integer:0x00007f90290ca8d0
# @precision=nil, @scale=nil, @limit=8, @range=-9223372036854775808...9223372036854775808>,
# "title"=>#<ActiveModel::Type::String:0x00007f9029978298 @precision=nil, @scale=nil, @limit=nil>}>

puts result.pluck("id", "title").inspect
#[[1, "Handling 1M Requests Per Second"], [2, "Authentication with Devise - Part 1"],
# [4, "Dockerizing and Deploying Rails App to Kubernetes"], [6, "Working Remotely"]]
```

Notice how the INSERT query now appends a RETURNING "id","title" clause, and the result holds both the id and title attributes of the successfully inserted records.

1.2 Behavior with SQLite

Similar to PostgreSQL, the violating records are skipped during the bulk insert operation performed using insert_all when we run our example on a SQLite database.

```ruby
result = Article.insert_all(
  [
    { id: 1, title: 'Handling 1M Requests Per Second', author: 'John', slug: '1m-req-per-second' },
    { id: 1, title: 'Type Safety in Elm', author: 'George', slug: 'elm-type-safety' }, # duplicate 'id' here
    #...snip...
  ]
)
# Bulk Insert (1.6ms) INSERT INTO "articles"("id","title","author","slug")
#   VALUES (1, 'Handling 1M Requests [...snip...] 'working-remotely')
#   ON CONFLICT DO NOTHING

puts result.inspect
#<ActiveRecord::Result:0x00007fa9df448ff0 @columns=[], @rows=[], @hash_rows=nil, @column_types={}>

puts Article.pluck(:id, :title)
#[[1, "Handling 1M Requests Per Second"], [2, "Authentication with Devise - Part 1"],
# [4, "Dockerizing and Deploying Rails App to Kubernetes"], [6, "Working Remotely"]]

puts Article.count
# 4
```

Note that since SQLite does not support the RETURNING clause, it is not added to the SQL query. Therefore, the returned ActiveRecord::Result instance does not contain any useful information.

If we try to explicitly use the returning option when the database being used is SQLite, then the insert_all method throws an error.

```ruby
Article.insert_all(
  [
    { id: 1, title: 'Handling 1M Requests Per Second', author: 'John', slug: '1m-req-per-second' },
    #...snip...
  ],
  returning: %w[id title]
)
# ActiveRecord::ConnectionAdapters::SQLite3Adapter does not support :returning (ArgumentError)
```

1.3 Behavior with MySQL

The records that violate primary key, unique key constraints, or unique indexes are skipped during bulk insert operation performed using insert_all on a MySQL database.

```ruby
result = Article.insert_all(
  [
    { id: 1, title: 'Handling 1M Requests Per Second', author: 'John', slug: '1m-req-per-second' },
    { id: 1, title: 'Type Safety in Elm', author: 'George', slug: 'elm-type-safety' }, # duplicate 'id' here
    #...snip...
  ]
)
# Bulk Insert (20.3ms) INSERT INTO `articles`(`id`,`title`,`author`,`slug`)
#   VALUES (1, 'Handling 1M Requests [...snip...] 'working-remotely')
#   ON DUPLICATE KEY UPDATE `id`=`id`

puts result.inspect
#<ActiveRecord::Result:0x000055d6cfea7580 @columns=[], @rows=[], @hash_rows=nil, @column_types={}>

puts Article.pluck(:id, :title)
#[[1, "Handling 1M Requests Per Second"], [2, "Authentication with Devise - Part 1"],
# [4, "Dockerizing and Deploying Rails App to Kubernetes"], [6, "Working Remotely"]]

puts Article.count
# 4
```

Here, the ON DUPLICATE KEY UPDATE `id`=`id` clause in the INSERT query is essentially doing the same thing as the ON CONFLICT DO NOTHING clause supported by PostgreSQL and SQLite.

Since MySQL does not support the RETURNING clause, it is not included in the SQL query, and therefore the result doesn’t contain any useful information.

Explicitly trying to use the returning option with the insert_all method on a MySQL database throws an ActiveRecord::ConnectionAdapters::Mysql2Adapter does not support :returning error.

2. Performing bulk inserts by skipping duplicates on a specified unique constraint, but raising an exception if records violate other unique constraints

In the previous case, we were skipping all records that violated any unique constraint. In some cases, we may want to skip only the duplicates caused by a specific unique index, but abort the whole statement if any record violates any other unique constraint.

The optional unique_by option of the insert_all method allows us to specify such a unique constraint.

2.1 Behavior with PostgreSQL and SQLite

Let’s see an example that uses the unique_by option to skip only those duplicate records that violate the specified unique index :index_articles_on_title_and_author. Duplicates that violate any other unique constraint are not skipped, and therefore cause an error.

```ruby
result = Article.insert_all(
  [
    { .... },
    { .... }, # duplicate 'id' here
    { .... },
    { .... }, # duplicate 'title' and 'author' here
    { .... },
    { .... }, # duplicate 'slug' here
    { .... }
  ],
  unique_by: :index_articles_on_title_and_author
)
# PG::UniqueViolation: ERROR: duplicate key value violates unique constraint "articles_pkey" (ActiveRecord::RecordNotUnique)
# DETAIL: Key (id)=(1) already exists.
```

In case of SQLite, the error appears as shown below.

```ruby
# SQLite3::ConstraintException: UNIQUE constraint failed: articles.id (ActiveRecord::RecordNotUnique)
```

In this case, we get an ActiveRecord::RecordNotUnique error caused by the violation of the primary key constraint on the id column.

The second record in the example above, which violated the primary key constraint on id, was not skipped, because the unique_by option specified a different unique index.

When such an exception occurs, no records are persisted to the database, since insert_all executes just a single SQL statement.

The unique_by option accepts either column name(s) or a unique index name.

```ruby
unique_by: :index_articles_on_title_and_author
# is same as
unique_by: %i[ title author ]

# Also,
unique_by: :slug
# is same as
unique_by: %i[ slug ]
# and also same as
unique_by: :index_articles_on_slug
```

Let’s remove (or fix) the record that has duplicate primary key and re-run the above example.

```ruby
result = Article.insert_all(
  [
    { .... },
    { .... },
    { .... },
    { .... }, # duplicate 'title' and 'author' here
    { .... },
    { .... }, # duplicate 'slug' here
    { .... }
  ],
  unique_by: :index_articles_on_title_and_author
)
# PG::UniqueViolation: ERROR: duplicate key value violates unique constraint "index_articles_on_slug" (ActiveRecord::RecordNotUnique)
# DETAIL: Key (slug)=(1m-req-per-second) already exists.
```

In case of SQLite, the error appears as shown below.

```ruby
# SQLite3::ConstraintException: UNIQUE constraint failed: articles.slug (ActiveRecord::RecordNotUnique)
```

The ActiveRecord::RecordNotUnique error in the example above now says the index_articles_on_slug unique constraint is violated. Note how it intentionally didn’t raise an error for the unique constraint violated on the title and author columns by the fourth record in the example above.

Now we will remove (or fix) the record that has same slug.

```ruby
result = Article.insert_all(
  [
    { .... },
    { .... },
    { .... },
    { .... }, # duplicate 'title' and 'author' here
    { .... },
    { .... },
    { .... }
  ],
  unique_by: :index_articles_on_title_and_author
)
# Bulk Insert (2.5ms) INSERT INTO "articles"("id","title","author","slug")
#   VALUES (1, 'Handling 1M Requests Per Second', [...snip...] 'working-remotely')
#   ON CONFLICT ("title","author") DO NOTHING RETURNING "id"

puts result.inspect
#<ActiveRecord::Result:0x00007fada2069828 @columns=["id"], @rows=[[1], [7], [2], [4], [5], [6]],
# @hash_rows=nil, @column_types={"id"=>#<ActiveModel::Type::Integer:0x00007fad9fdb9df0
# @precision=nil, @scale=nil, @limit=8, @range=-9223372036854775808...9223372036854775808>}>
```

Here, the fourth record was skipped since that record violates the unique index index_articles_on_title_and_author specified by the unique_by option.

Similarly, we can specify a different unique index using the unique_by option. For example, if we specify unique_by: :slug option then the records containing duplicate slug columns will be skipped, but would raise ActiveRecord::RecordNotUnique exception if any record violates other unique constraints.

2.2 Behavior with MySQL

The unique_by option is not supported when the database is MySQL.

3. Raising exception if any of the records being bulk inserted violate any unique constraints

The insert_all! method (the bang version) never skips a duplicate record. If any record violates a unique constraint, insert_all! simply raises an ActiveRecord::RecordNotUnique error.

When the database is PostgreSQL, the insert_all! method also accepts the optional returning option, which we discussed in depth in section 1.1 above.

The unique_by option is not supported by the insert_all! method.

4. Performing bulk upserts

So far, in sections 1, 2 and 3 above, we discussed either skipping duplicates or raising an exception when a duplicate is encountered during bulk inserts. Sometimes, we want to update the existing record when a duplicate occurs, and insert a new record otherwise. This operation is called an upsert: it tries to update the record, and if there is no record to update, it inserts one.

The upsert_all method in Rails 6 allows performing bulk upserts.

Let’s see its usage and behavior with different database systems.

4.1 upsert_all in MySQL

Let’s try to bulk upsert multiple articles containing some duplicates.

```ruby
result = Article.upsert_all(
  [
    { id: 1, title: 'Handling 1M Requests Per Second', author: 'John', slug: '1m-req-per-second' },
    { id: 1, .... }, # duplicate 'id' here
    { id: 2, .... },
    { id: 3, .... }, # duplicate 'title' and 'author' here
    { id: 4, .... },
    { id: 5, .... }, # duplicate 'slug' here
    { id: 6, .... }
  ]
)
# Bulk Insert (26.3ms) INSERT INTO `articles`(`id`,`title`,`author`,`slug`)
#   VALUES (1, 'Handling 1M Requests Per Second', 'John', [...snip...] 'working-remotely')
#   ON DUPLICATE KEY UPDATE `title`=VALUES(`title`),`author`=VALUES(`author`),`slug`=VALUES(`slug`)

puts result.inspect
#<ActiveRecord::Result:0x000055a43c1fae10 @columns=[], @rows=[], @hash_rows=nil, @column_types={}>

puts Article.count
# 5

puts Article.all
#<ActiveRecord::Relation [
#  #<Article id: 1, title: "Type Safety in Elm", slug: "elm-type-safety", author: "George", description: nil>,
#  #<Article id: 2, title: "Authentication with Devise - Part 1", slug: "devise-auth-2", author: "Laura", description: nil>,
#  #<Article id: 4, title: "Dockerizing and Deploying Rails App to Kubernetes", slug: "rails-on-k8s", author: "Paul", description: nil>,
#  #<Article id: 5, title: "Elm on Rails", slug: "1m-req-per-second", author: "Amanda", description: nil>,
#  #<Article id: 6, title: "Working Remotely", slug: "working-remotely", author: "Greg", description: nil>]>
```

The persisted records in the database look exactly as intended. Let’s discuss it in detail.

The second row in the input array, which had the duplicate id: 1 attribute, replaced the first row.

The fourth row (id: 3) replaced the attributes of the third row, since both had the same title and author attributes.

The rest of the rows either were not duplicates, or were no longer duplicates after the earlier updates, and were therefore inserted without any issues.

Note that the returning and unique_by options are not supported in the upsert_all method when the database is MySQL.

4.2 upsert_all in SQLite

Let’s try to execute the same example from section 4.1 above when the database is SQLite.

```ruby
result = Article.upsert_all(
  [
    { id: 1, title: 'Handling 1M Requests Per Second', author: 'John', slug: '1m-req-per-second' },
    { id: 1, title: 'Type Safety in Elm', author: 'George', slug: 'elm-type-safety' }, # duplicate 'id' here
    { id: 2, title: 'Authentication with Devise - Part 1', author: 'Laura', slug: 'devise-auth-1' },
    { id: 3, title: 'Authentication with Devise - Part 1', author: 'Laura', slug: 'devise-auth-2' }, # duplicate 'title' and 'author' here
    { id: 4, title: 'Dockerizing and Deploying Rails App to Kubernetes', author: 'Paul', slug: 'rails-on-k8s' },
    { id: 5, title: 'Elm on Rails', author: 'Amanda', slug: '1m-req-per-second' }, # duplicate 'slug' here
    { id: 6, title: 'Working Remotely', author: 'Greg', slug: 'working-remotely' }
  ]
)
# Bulk Insert (4.0ms) INSERT INTO "articles"("id","title","author","slug")
#   VALUES (1, 'Handling 1M Requests Per Second', [...snip...] 'working-remotely')
#   ON CONFLICT ("id") DO UPDATE SET "title"=excluded."title","author"=excluded."author","slug"=excluded."slug"
# SQLite3::ConstraintException: UNIQUE constraint failed: articles.title, articles.author (ActiveRecord::RecordNotUnique)
```

The bulk upsert operation failed in the above example with an ActiveRecord::RecordNotUnique exception.

Why didn’t it work the same as it did with MySQL?

As per the documentation of MySQL, an upsert operation takes place if a new record violates any unique constraint.

In SQLite, on the other hand, a new record replaces an existing record by default only when both have the same primary key. If a record violates any unique constraint other than the primary key, an ActiveRecord::RecordNotUnique exception is raised.

The ON CONFLICT ("id") DO UPDATE clause in the SQL query above conveys the same intent.

Therefore, upsert_all in SQLite doesn’t behave exactly the same as in MySQL.

As a workaround, we need to split the operation into multiple upsert_all calls, using the unique_by option.

When a duplicate record encountered during the upsert operation violates the unique index specified by the unique_by option, it replaces the attributes of the existing matching record instead of raising an exception.

Let’s try to understand this workaround with another example.

```ruby
Article.upsert_all(
  [
    { id: 1, title: 'Handling 1M Requests Per Second', author: 'John', slug: '1m-req-per-second' },
    { id: 1, title: 'Type Safety in Elm', author: 'George', slug: 'elm-type-safety' } # duplicate 'id' here
  ],
  unique_by: :id
)

Article.upsert_all(
  [
    { id: 2, title: 'Authentication with Devise - Part 1', author: 'Laura', slug: 'devise-auth-1' },
    { id: 3, title: 'Authentication with Devise - Part 1', author: 'Laura', slug: 'devise-auth-2' }, # duplicate 'title' and 'author' here
    { id: 4, title: 'Dockerizing and Deploying Rails App to Kubernetes', author: 'Paul', slug: 'rails-on-k8s' },
    { id: 5, title: 'Elm on Rails', author: 'Amanda', slug: '1m-req-per-second' }, # duplicate 'slug' here
    { id: 6, title: 'Working Remotely', author: 'Greg', slug: 'working-remotely' }
  ],
  unique_by: %i[ title author ]
)

puts Article.count
# 5

puts Article.all
#<ActiveRecord::Relation [
#  #<Article id: 1, title: "Type Safety in Elm", slug: "elm-type-safety", author: "George", description: nil>,
#  #<Article id: 2, title: "Authentication with Devise - Part 1", slug: "devise-auth-2", author: "Laura", description: nil>,
#  #<Article id: 4, title: "Dockerizing and Deploying Rails App to Kubernetes", slug: "rails-on-k8s", author: "Paul", description: nil>,
#  #<Article id: 5, title: "Elm on Rails", slug: "1m-req-per-second", author: "Amanda", description: nil>,
#  #<Article id: 6, title: "Working Remotely", slug: "working-remotely", author: "Greg", description: nil>]>
```

Here, we first upserted the records that conflicted on the primary key id. Then we successfully upserted the remaining records, which conflicted on the unique index on the title and author columns.

Note that the second-to-last record (id: 5) didn’t raise an exception for its duplicate slug, because the first record’s slug attribute had already been replaced with the second record’s slug attribute.

4.3 upsert_all in PostgreSQL

Let’s run the same example from section 4.1 above on a PostgreSQL database.

```ruby
result = Article.upsert_all(
  [
    { id: 1, title: 'Handling 1M Requests Per Second', author: 'John', slug: '1m-req-per-second' },
    { id: 1, title: 'Type Safety in Elm', author: 'George', slug: 'elm-type-safety' }, # duplicate 'id' here
    { id: 2, title: 'Authentication with Devise - Part 1', author: 'Laura', slug: 'devise-auth-1' },
    { id: 3, title: 'Authentication with Devise - Part 1', author: 'Laura', slug: 'devise-auth-2' }, # duplicate 'title' and 'author' here
    { id: 4, title: 'Dockerizing and Deploying Rails App to Kubernetes', author: 'Paul', slug: 'rails-on-k8s' },
    { id: 5, title: 'Elm on Rails', author: 'Amanda', slug: '1m-req-per-second' }, # duplicate 'slug' here
    { id: 6, title: 'Working Remotely', author: 'Greg', slug: 'working-remotely' }
  ]
)
# PG::CardinalityViolation: ERROR: ON CONFLICT DO UPDATE command cannot affect row a second time (ActiveRecord::StatementInvalid)
# HINT: Ensure that no rows proposed for insertion within the same command have duplicate constrained values.
```

The bulk upsert operation failed in the above example with an ActiveRecord::StatementInvalid exception, caused by an underlying PG::CardinalityViolation error.

The PG::CardinalityViolation exception originates from PostgreSQL itself.

PostgreSQL raises this error when a single ON CONFLICT DO UPDATE statement attempts to update the same row more than once. Allowing it would mean the same row gets updated multiple times within one statement, in an unspecified, non-deterministic order.

PostgreSQL makes it the developer’s responsibility to prevent this situation from occurring.

Here’s more discussion about this issue - rails/rails#35519.

Therefore, the upsert_all method doesn’t work as intended due to the above limitation in PostgreSQL.

As a workaround, we can split the single upsert_all call into multiple upsert_all calls, using the unique_by option, similar to what we did for SQLite in section 4.2 above.

```ruby
Article.insert_all(
  [
    { id: 1, title: 'Handling 1M requests per second', author: 'John', slug: '1m-req-per-second' },
    { id: 2, title: 'Authentication with Devise - Part 1', author: 'Laura', slug: 'devise-auth-1' },
    { id: 4, title: 'Dockerizing and deploy Rails app to Kubernetes', author: 'Paul', slug: 'rails-on-k8s' },
    { id: 6, title: 'Working Remotely', author: 'Greg', slug: 'working-remotely' }
  ]
)

Article.upsert_all(
  [
    { id: 1, title: 'Type Safety in Elm', author: 'George', slug: 'elm-type-safety' } # duplicate 'id' here
  ]
)

Article.upsert_all(
  [
    { id: 3, title: 'Authentication with Devise - Part 1', author: 'Laura', slug: 'devise-auth-2' } # duplicate 'title' and 'author' here
  ],
  unique_by: :index_articles_on_title_and_author
)

Article.upsert_all(
  [
    { id: 5, title: 'Elm on Rails', author: 'Amanda', slug: '1m-req-per-second' } # duplicate 'slug' here
  ]
)

puts Article.count
# 5

puts Article.all
#<ActiveRecord::Relation [
#  #<Article id: 1, title: "Type Safety in Elm", slug: "elm-type-safety", author: "George", description: nil>,
#  #<Article id: 2, title: "Authentication with Devise - Part 1", slug: "devise-auth-2", author: "Laura", description: nil>,
#  #<Article id: 4, title: "Dockerizing and deploy Rails app to Kubernetes", slug: "rails-on-k8s", author: "Paul", description: nil>,
#  #<Article id: 5, title: "Elm on Rails", slug: "1m-req-per-second", author: "Amanda", description: nil>,
#  #<Article id: 6, title: "Working Remotely", slug: "working-remotely", author: "Greg", description: nil>]>
```

For reference, note that with PostgreSQL the upsert_all method also accepts the returning option, which we have already discussed in section 1.1 above.

5. insert, insert! and upsert

Rails 6 has also introduced three more methods for convenience: insert, insert! and upsert.

The insert method inserts a single record into the database. If that record violates a uniqueness constraint, the insert method skips inserting it without raising an exception.

Similarly, the insert! method also inserts a single record into the database, but raises an ActiveRecord::RecordNotUnique exception if that record violates a uniqueness constraint.

The upsert method inserts or updates a single record into the database similar to how upsert_all does.

The methods insert , insert! and upsert are wrappers around insert_all , insert_all! and upsert_all respectively.

Let’s see some examples to understand the usage of these methods.

```ruby
Article.insert(
  { id: 5, title: 'Elm on Rails', author: 'Amanda', slug: '1m-req-per-second' },
  unique_by: :slug
)
# is same as
Article.insert_all(
  [{ id: 5, title: 'Elm on Rails', author: 'Amanda', slug: '1m-req-per-second' }],
  unique_by: :slug
)
```

```ruby
Article.insert!(
  { id: 5, title: 'Elm on Rails', author: 'Amanda', slug: '1m-req-per-second' },
  returning: %w[id title]
)
# is same as
Article.insert_all!(
  [{ id: 5, title: 'Elm on Rails', author: 'Amanda', slug: '1m-req-per-second' }],
  returning: %w[id title]
)
```

```ruby
Article.upsert(
  { id: 5, title: 'Elm on Rails', author: 'Amanda', slug: '1m-req-per-second' },
  unique_by: :slug, returning: %w[id title]
)
# is same as
Article.upsert_all(
  [{ id: 5, title: 'Elm on Rails', author: 'Amanda', slug: '1m-req-per-second' }],
  unique_by: :slug, returning: %w[id title]
)
```

To learn more about the bulk insert feature and its implementation, please check rails/rails#35077, rails/rails#35546 and rails/rails#35854.