Assuming you keep this very simple scenario, you could use a small custom class to store the information and expose thread-safe methods. (It is not clear from your question where exactly your problem resides, but this particular problem will arise regardless.)
```ruby
require 'json'
require 'sinatra'
require 'date'
require 'thread'
require 'twitter'

set :server, 'webrick'
set :haml, :format => :html5

class MyCache
  def initialize
    @mutex = Mutex.new
    @last_update = DateTime.new # by default, -4712-01-01 (Julian day zero)
    @client = Twitter::REST::Client.new do |config|
      config.consumer_key        = ""
      config.consumer_secret     = ""
      config.access_token        = ""
      config.access_token_secret = ""
    end
  end

  def get_cache
    @mutex.synchronize do
      # refresh the cache if it is more than 10 seconds old
      if DateTime.now - @last_update > 10.0 / (3600 * 24)
        @last_update = DateTime.now
        arr = []
        retweeters = @client.retweeters_of(429627812459593728)
        retweeters.each do |retweeter|
          ob = {}
          ob[:name] = retweeter.name
          ob[:followers_count] = retweeter.followers_count
          arr.push(ob)
        end
        # sort on the users with the most followers, keep the top ten
        sorted_influencers = arr.sort_by { |hsh| hsh[:followers_count] }
        sorted_influencers.reverse!
        @cache = sorted_influencers[0..9].to_s
      end
      @cache
    end
  end
end

my_cache = MyCache.new

get '/' do
  content_type :json
  my_cache.get_cache
end
```
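As a side note, the `Mutex` is what guarantees that concurrent requests do not all trigger the expensive Twitter call: the first one in refreshes the cache, the rest reuse it. A minimal sketch of that behavior, with the Twitter call replaced by a hypothetical stub (`StubCache` is not part of the answer's code, just an illustration):

```ruby
# Stand-in for MyCache that counts how often the expensive fetch really runs.
class StubCache
  attr_reader :fetches

  def initialize
    @mutex = Mutex.new
    @fetches = 0
    @cache = nil
  end

  def get_cache
    @mutex.synchronize do
      if @cache.nil?        # stands in for the 10-second staleness check
        @fetches += 1       # pretend this is the slow Twitter API call
        @cache = "expensive result"
      end
      @cache
    end
  end
end

cache = StubCache.new
threads = 10.times.map { Thread.new { cache.get_cache } }
threads.each(&:join)
cache.fetches # => 1 -- only one thread did the fetch, the others reused it
```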
This version now includes everything needed. I use `@client` to store the Twitter client instance (I assume it is reusable). Also note how the whole fetch-and-sort code sits inside the `if` statement, and that we update `@cache` as its last step. If you are unfamiliar with Ruby: the value of a block is its last evaluated expression, so writing `@cache` alone is equivalent to writing `return @cache`.
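To make that last point concrete, `Mutex#synchronize` itself returns the value of its block, which is why `get_cache` can simply end with `@cache`:

```ruby
mutex = Mutex.new
cache = "cached value"

result = mutex.synchronize do
  cache # last expression of the block: synchronize returns it
end

result == cache # => true
```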