I’ve been facing the following symptoms for some time.
I have links coded as :post or :put, so I can make sure that bots
aren’t hitting particular links.
But something is hitting them as :get, either through an error I’ve
made (like link_to not working well in some browsers?), or because one
or more plugins pre-load URLs, or because I have scrapers.
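For what it’s worth, this degradation is built into how link_to with a :method option works: it emits a plain anchor whose href is the URL, plus onclick JavaScript that builds and submits a form with the real verb. Any client that ignores JavaScript just follows the href as a GET. A minimal sketch (the markup below is illustrative, not the exact Rails output):

```ruby
# Roughly what link_to "Delete", item_path, :method => :post emits: an
# <a> tag whose onclick builds a POST form. (Illustrative markup only.)
html = %(<a href="/items/1" onclick="/* build form, submit as POST */; return false;">Delete</a>)

# A bot, pre-fetcher, or NoScript user never runs the onclick; it only
# sees the href and requests it with GET -- hence the routing errors.
href = html[/href="([^"]+)"/, 1]
```

So any GET on those URLs means a JavaScript-less client of some kind, not necessarily a bug in your views.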
Each day I’ll get 50-100 error messages of this nature, where routes
aren’t found.
When I get many of these hits from the same IP address, I usually
assume a scraper and block that IP address…but I don’t want to do
this in every case, since it’s possible that something legitimate (a
pre-loader browser plugin) is causing it.
Does anybody else see this kind of behaviour? How do you handle it?
A browser with JS turned off would also do this (or one using a
Firefox plugin like NoScript to enable it only for certain websites).
I don’t mind pre-loaders, so I think I’ll see if I can find a pattern
identifying the plugins involved…and if I don’t find a plugin, then
this could work as a honeypot.
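The honeypot check might then look something like this. It treats any GET on a POST-only URL as suspicious unless the user agent looks like a known pre-fetcher. The patterns below are placeholders, not a vetted list; some pre-fetchers are better identified by request headers (Firefox’s link prefetching, for instance, is generally said to send an "X-moz: prefetch" header), so you may want to check those too:

```ruby
# Hypothetical sketch: a GET on a POST-only URL counts as a honeypot
# hit unless the client looks like a link pre-fetcher. These user-agent
# patterns are assumptions for illustration only.
PREFETCHER_PATTERNS = [/Fasterfox/i, /prefetch/i]

def honeypot_hit?(http_method, user_agent)
  return false unless http_method == "GET"
  # No pre-fetcher pattern matched => likely a scraper or broken client.
  PREFETCHER_PATTERNS.none? { |p| p =~ user_agent.to_s }
end
```

Logging the user agent and headers alongside each routing error would make it much easier to build up the real pattern list.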
I guess I don’t have a specific question - merely symptoms - in the
hope that someone may have faced such a thing.
Jodi