I think the fog require may still be at issue (though it is less obvious now). Since fog contains so many different things, we chose long ago to defer loading many of its dependencies until they are actually needed. This speeds up requiring 'fog' itself, but it can slow down the first time certain things happen. I'm not sure how I forgot about this, but in doing some quick benchmarking on my machine I can definitely see a slowdown once this is taken into account.
To get around this, you can change the benchmark above to warm up a connection first, something like:
require 'benchmark'
require 'fog'

# Creating a connection up front forces fog to load the S3-specific code
# now, so the measured block below doesn't pay that one-time cost.
Fog::Storage.new(
  provider: 'AWS',
  aws_access_key_id: ENV['AWS_ACCESS_KEY_ID'],
  aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
)

Benchmark.measure { ... }
It may seem a bit odd that you never use that connection, but I set things up to defer loading S3 specifics until a storage connection is created (so that, for instance, you don't have to load S3 code just to use EC2). By initializing a connection earlier, you pay that one-time cost outside the measured block. Hopefully that will at least get you closer to where you want to be.
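For anyone curious what that deferral looks like in isolation, here is a minimal sketch of the pattern (the module and method names are hypothetical, and a cheap stdlib require stands in for fog's provider-specific files): the expensive require happens on first use and is memoized, so only the first call pays it.

```ruby
require 'benchmark'

# Hypothetical illustration of the deferred-loading pattern described
# above -- NOT fog's actual internals. The provider-specific require
# happens on the first call, not when the library itself is loaded.
module LazyCloud
  def self.storage
    @storage ||= begin
      require 'ostruct' # stands in for an expensive provider-specific require
      OpenStruct.new(provider: 'AWS')
    end
  end
end

first_call  = Benchmark.realtime { LazyCloud.storage }
second_call = Benchmark.realtime { LazyCloud.storage }
# The first call pays the loading cost; later calls reuse the cached object.
```

Warming up the connection before `Benchmark.measure`, as in the snippet above, moves that first-call cost outside the timed block.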