Importing data quickly in Ruby on Rails applications
15-Sep-2020 13:02
Application frameworks often provide their own out-of-the-box services for interacting with databases. This works great when processing data in a onesie-twosie fashion, but what about large batches of data?

I ran into this problem over a decade ago with the not-yet-1.0 Ruby on Rails framework. I had a lot of data that I needed to routinely integrate into a local application, but ActiveRecord's mechanism for creating records was too slow: the job was taking several hours. At that time I wrote a library for importing large batches of data with ActiveRecord in an efficient manner. The job that had taken several hours now took under two minutes. That library is known today as activerecord-import, and it's available as a RubyGem.

Over the years activerecord-import has become the de facto standard for efficiently importing large batches of data with ActiveRecord. It has been maintained by an awesome community of contributors. Not only has it kept up with Rails development, but it works with several databases, the big three being MySQL, SQLite3, and PostgreSQL. Oh, and it runs on JRuby too. Plus, if you're using any ActiveRecord adapter compatible with these databases – e.g. mysql2_makara, mysql2spatial, postgis, postgresql_makara, seamless_database_pool, spatialite – then those all work too. If you don't see an adapter you're using listed, let us know.
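For reference, here's a minimal sketch of the difference in practice, assuming a hypothetical Book model with title and author columns (the model and data are made up for illustration; the import method comes from activerecord-import):

    require "activerecord-import"

    rows = [
      ["The Pragmatic Programmer", "Andrew Hunt and David Thomas"],
      ["Eloquent Ruby",            "Russ Olsen"]
    ]

    # Plain ActiveRecord: one INSERT statement per record, each with its
    # own round trip, callbacks, and validations; fine for a handful of
    # rows, painfully slow for large batches.
    rows.each do |title, author|
      Book.create!(title: title, author: author)
    end

    # activerecord-import: the same rows imported as a single multi-row
    # INSERT statement.
    Book.import [:title, :author], rows, validate: true

The import method also accepts an array of model instances instead of columns and values, and supports options such as validate: false to skip validations or on_duplicate_key_update for upsert-style imports; the gem's README covers the full set.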