Hi,
I'm looking for a reliable way to fetch data from an HTTP server while
guaranteeing the whole attempt won't exceed a specified time.
I tried the most common solution (5-second timeouts):
#____________________________________________________________________
require 'net/http'

uri = URI('http://some.query.url/path?args')
Net::HTTP.start(uri.host, uri.port, open_timeout: 5, read_timeout: 5,
                ssl_timeout: 5) do |http|
  request = Net::HTTP::Get.new uri
  response = http.request request
end
#____________________________________________________________________
Unfortunately, these timeouts do not cover hostname resolution. If DNS
resolution is slow or the DNS server stops responding, the code above
won't finish sooner than after a minute or so.
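To show where the time goes, here is a minimal sketch (same placeholder
hostname as above); even a bare socket connect hangs in the name lookup:
#____________________________________________________________________
require 'socket'
require 'benchmark'

# Sketch only: with a slow or unresponsive DNS server, the time is
# spent resolving the hostname inside TCPSocket.new (the C-level
# getaddrinfo call), which is the part the Net::HTTP timeouts don't
# cover.
elapsed = Benchmark.realtime do
  begin
    sock = TCPSocket.new('some.query.url', 80)  # placeholder host
    sock.close
  rescue SocketError, SystemCallError => e
    puts "connect failed: #{e.class}: #{e.message}"
  end
end
puts format('name lookup + connect took %.1fs', elapsed)
#____________________________________________________________________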
I've also tried wrapping the code in a Timeout.timeout(5) { … } block,
but that does not help either: the underlying IO won't allow the
timeout exception to be raised.
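For completeness, this is roughly how I wrapped it (same placeholder
URL as above):
#____________________________________________________________________
require 'net/http'
require 'timeout'

uri = URI('http://some.query.url/path?args')  # placeholder

begin
  Timeout.timeout(5) do
    Net::HTTP.start(uri.host, uri.port,
                    open_timeout: 5, read_timeout: 5) do |http|
      http.request(Net::HTTP::Get.new(uri))
    end
  end
rescue Timeout::Error
  # Never reached while the process is stuck in the name lookup;
  # the exception is only delivered after the blocking call returns.
  warn 'request timed out'
end
#____________________________________________________________________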
Spawning a thread for an unspecified amount of time, or using the
remote machine's IP address directly, is not an option for this task.
It's fine if the code fails to retrieve the data within the given
timeout; I just need to guarantee that the attempt won't last any longer.
Any idea how to make this more bulletproof?