Jason A. wrote:
I am building an API and I was wondering if there are any best
practices out there for logging and monitoring the calls. Ideally I’d
like to track:
If I log to a database, it will grow very quickly
Of course a text file would also grow very quickly if you log every
single call to an application that gets hit a lot, so I’m not sure how
this is an argument for or against writing your log to a database table
rather than to a plain text file.
You might find RRDtool [http://oss.oetiker.ch/rrdtool/] useful for at
least some of your needs. You could write all calls to the db and then
“move” this information out of there once it is older than x weeks.
RRDtool is well suited to aggregating this information into more general
stats such as “API key XYZ hit the application 40 times on one day,
380 times on another”, etc.
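The “write everything, then expire the old detail” step could be sketched like this, with SQLite standing in for whatever database you actually use (the table name, column names, and two-week cutoff are all made-up illustrations, not anything from your setup):

```python
import sqlite3
import time

# Hypothetical schema: one row per API call.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE api_log (
        api_key   TEXT,
        resource  TEXT,
        duration  REAL,     -- seconds the call took
        called_at INTEGER   -- unix timestamp
    )
""")

TWO_WEEKS = 14 * 24 * 3600

def expire_old_calls(conn, now):
    """Drop per-call rows older than two weeks, after their
    aggregates have been pushed into RRDtool (or similar)."""
    cutoff = now - TWO_WEEKS
    cur = conn.execute("DELETE FROM api_log WHERE called_at < ?", (cutoff,))
    conn.commit()
    return cur.rowcount  # number of rows removed
```

You would run the aggregation first and only then delete, so nothing is lost if the roll-up fails.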
At some point you will either have to strip the per-call detail out of
records like “API key XYZ accessed user profile FOO at 5:15pm, the
request was processed in 0.24 seconds” in order to aggregate multiple
calls together à la “API key XYZ accessed 50 user profiles on the 10th
of March, the calls were processed on avg in 0.12s”, OR keep that level
of detail and face an ever-growing mountain of logs.
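That roll-up — collapsing per-call records into per-key, per-day counts and average durations — might look something like this pure-Python sketch (the record fields are assumptions for illustration):

```python
from collections import defaultdict
from datetime import datetime, timezone

def aggregate_daily(calls):
    """Collapse raw call records into (api_key, date) -> stats.

    Each call is a dict with 'api_key', 'duration' (seconds),
    and 'called_at' (unix timestamp)."""
    # (api_key, iso date) -> [call count, total duration]
    buckets = defaultdict(lambda: [0, 0.0])
    for call in calls:
        day = datetime.fromtimestamp(
            call["called_at"], tz=timezone.utc).date().isoformat()
        bucket = buckets[(call["api_key"], day)]
        bucket[0] += 1
        bucket[1] += call["duration"]
    return {
        key: {"calls": n, "avg_duration": total / n}
        for key, (n, total) in buckets.items()
    }
```

Once a day’s raw rows have been folded into their bucket, the detailed rows can be deleted and only the tiny summary kept.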
Should you use MySQL, you might also be interested in the ARCHIVE
storage engine. This engine compresses data as it is inserted. Of course
this means it needs to be decompressed whenever you want to read it back
out, so this is clearly a storage/CPU trade-off. It does not seem to
suit your scenario, in which there will presumably be quite a lot of reads.
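For what it’s worth, switching a log table to ARCHIVE is just an ENGINE clause; the schema below is purely illustrative. One caveat worth knowing: ARCHIVE only supports INSERT and SELECT, so you cannot UPDATE or DELETE individual rows from it.

```sql
-- Illustrative schema; ARCHIVE supports INSERT and SELECT only.
CREATE TABLE api_log_archive (
    api_key    VARCHAR(64)  NOT NULL,
    resource   VARCHAR(255) NOT NULL,
    duration   DECIMAL(8,4) NOT NULL,  -- seconds
    called_at  DATETIME     NOT NULL
) ENGINE=ARCHIVE;
```

That restriction actually fits a “write once, read rarely” archive of old calls, but not a hot table you query all the time.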