Well, it's trivial in ActiveRecord to avoid N+1s by preloading associations. As with any technology, if you use it wrong and inefficiently then things will be incorrect and slow.
Here's an example I remember: if I want to update a field in a million records in my database, I can't just send one update statement to the database to run. Instead, an ActiveRecord-style ORM will try to load all the records into the application, make the change in memory on each record object, and then persist each record.
If I'm remembering correctly, that is a fundamentally poor approach. Instead of telling the database to do the work once, the database ends up doing more work (serving a million reads plus a million individual writes) and the application does work it never needed to do at all. One mitigation is to batch the work[0]. Another is to special-case updates[1], which bypasses all the ActiveRecord pre/post-save logic. In either case you aren't holding the tool wrong. The tool is wrong.
[0] https://apidock.com/rails/ActiveRecord/Batches/find_each - you'll note that the batches are "subject to race conditions" - i.e. each batch is its own transaction! And you're still loading the records into your application pointlessly; you're just limiting how many you hold in memory at once.
[1] From the Django documentation for QuerySet.update() — Rails' update_all carries the same caveats:

> Be aware that the update() method is converted directly to an SQL statement. It is a bulk operation for direct updates. It doesn’t run any save() methods on your models, or emit the pre_save or post_save signals (which are a consequence of calling save()), or honor the auto_now field option. If you want to save every item in a QuerySet and make sure that the save() method is called on each instance, you don’t need any special function to handle that. Loop over them and call save()