Optimising Bulk Imports in Rails with PostgreSQL

03-Jun-2022

Processing masses of data is always tricky, and it's tempting to reach for specialised tools the moment we hit problems like this. But it's my firm belief that pushing forward with the tools you're already using has compounding benefits.

In the presentation above, we deep-dive into some Rails and PostgreSQL APIs and features that allowed me to optimise bulk imports in a large-scale Rails application down to less than 1% of its original processing time.

If you'd like to follow along in code, the exercise is available on GitHub – you can fork it to your own account and push changes to see the performance impact in GitHub Actions. It could make a good training exercise for your team, or even something to pair on in an interview.

The final optimised version is also available on GitHub. You'll see it takes import time from 25 seconds down to a few hundred milliseconds, memory from 10 megabytes to less than half a megabyte, and allocations from over 140,000 to fewer than 7,000 – enormous scope for scalability.
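As a rough illustration of one family of techniques the talk covers – batching rows for a single bulk `INSERT` (e.g. via ActiveRecord's `insert_all`) rather than creating records one at a time – here's a minimal sketch. The CSV columns, model name, and batch size below are invented for the example, not taken from the exercise repository:

```ruby
require "csv"

# Hypothetical helper: parse CSV text into batches of row hashes
# shaped for ActiveRecord's insert_all, which issues one multi-row
# INSERT per batch instead of one INSERT (plus callbacks and
# validations) per record.
def import_batches(csv_text, batch_size: 500)
  now = Time.now
  CSV.parse(csv_text, headers: true)
     .map { |row| row.to_h.merge("created_at" => now, "updated_at" => now) }
     .each_slice(batch_size)
     .to_a
end

csv = "name,price\nWidget,9.99\nGadget,19.99\n"
batches = import_batches(csv, batch_size: 500)
# In a Rails app, each batch would then be passed to something like:
#   Product.insert_all(batch)
puts batches.first.length
```

Keeping the batch size bounded matters: it caps both the size of any single SQL statement and the number of row hashes held in memory at once, which is where the memory and allocation savings come from.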
