SQLite as a Log?


#1

In a current project we’re discussing using SQLite for all application
logging.

This is a heavy multiprocessing environment, which was one of our
reasons for considering it. We also love the idea of structured
queries helping us debug issues: show me any logged errors between
these times.
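For instance, with a hypothetical table layout (these names are just an
illustration, nothing is decided yet):

```sql
-- Hypothetical schema: one row per log entry
CREATE TABLE log (
  id      INTEGER PRIMARY KEY,
  time    TEXT NOT NULL,   -- ISO-8601 timestamp
  level   TEXT NOT NULL,   -- 'DEBUG', 'INFO', 'ERROR', ...
  message TEXT NOT NULL
);

-- "show me any logged errors between these times"
SELECT time, message
  FROM log
 WHERE level = 'ERROR'
   AND time BETWEEN '2009-02-12T00:00:00' AND '2009-02-12T23:59:59';
```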

The downside, of course, is some loss of transparency. For example,
you can’t just tail a log file whenever you want.

Anyway, I would love to hear from anyone who has done this. Did you
like it? What were the downsides I am missing?

Thanks in advance for any information you can provide.

James Edward G. II


#2

James G. wrote:

Anyway, I would love to hear from anyone who has done this. Did you
like it? What were the downsides I am missing?

IIRC, one writer locks the entire SQLite database. You could avoid this
by routing all log writes through a gatekeeper server.
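The gatekeeper idea can be sketched with nothing but a queue and a single
writer thread (the class name and the INSERT are my inventions; over DRb
you would expose #log to other processes):

```ruby
# Single-writer gatekeeper: every log entry funnels through one queue,
# so only one thread ever touches the SQLite handle. The actual INSERT
# is injected as a block to keep this sketch database-agnostic.
class LogGatekeeper
  def initialize(&writer)
    @queue  = Queue.new
    @writer = writer   # e.g. ->(e) { db.execute("INSERT INTO log ...", e) }
    @thread = Thread.new do
      # A nil entry is the shutdown sentinel.
      while (entry = @queue.pop)
        @writer.call(entry)
      end
    end
  end

  def log(entry)
    @queue << entry    # cheap and thread-safe; callers never block on the db
  end

  def shutdown
    @queue << nil
    @thread.join
  end
end
```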

Re tailing log files: I avoid that anyway, even with file logging,
because it doesn’t know about log rotation. The alternative I’ve found
helpful is to set up a drb service that lets a process listen for log
entries.
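A minimal shape for such a drb service might be (all class and method
names here are invented for the sketch):

```ruby
require 'drb/drb'

# A log broadcaster exposed over DRb. The writing process calls #log;
# any number of processes can subscribe a listener to receive entries.
class LogBroadcaster
  def initialize
    @listeners = []
    @mutex = Mutex.new
  end

  # A listener is any (possibly remote) object responding to #receive.
  def subscribe(listener)
    @mutex.synchronize { @listeners << listener }
  end

  def log(entry)
    @mutex.synchronize do
      # Drop listeners that have gone away instead of failing the write.
      @listeners.delete_if do |l|
        begin
          l.receive(entry)
          false
        rescue DRb::DRbConnError
          true
        end
      end
    end
  end
end

# Server:   DRb.start_service('druby://localhost:8787', LogBroadcaster.new)
# Listener: DRbObject.new_with_uri('druby://localhost:8787').subscribe(self)
```

A remote listener would need to mix DRbUndumped into its listener object
(and run its own DRb service) so it is passed by reference and can be
called back.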


#3

On Thu, 12 Feb 2009 18:17:34 -0500, Joel VanderWerf wrote:

Re tailing log files: I avoid that anyway, even with file logging,
because it doesn’t know about log rotation. The alternative I’ve found
helpful is to set up a drb service that lets a process listen for log
entries.

tail -F is useful for following a log across log rotations. It follows
the file by filename, not by file descriptor. (I learned this trick
debugging an embedded system that rotated its logs every 5 minutes or so.)

–Ken


#4

On Feb 12, 2009, at 3:59 PM, James G. wrote:

Anyway, I would love to hear from anyone who has done this. Did you
like it? What were the downsides I am missing?

Have you looked into using CouchDB as the logging destination instead?

I’ve started a branch of my logging gem that allows you to do this,
but I’ve had to table it due to other obligations.

The goal is to just dump everything to CouchDB with some mnemonics on
the messages. Then you create views to see all messages for a given
application. All error messages in the past week, etc.

Blessings,
TwP


#5

On Thu, 12 Feb 2009 17:59:53 -0500, James G. wrote:

In a current project we’re discussing using SQLite for all application
logging.

Consider a concurrent RDBMS instead if you can deploy it in your
environment.

If you want to tail the log, then it shouldn’t be too hard to write a
script to do it. For a project I run, to look at the last 15 log entries,
I have a shell function:

atlog ()
{
    $sql atman -t -e 'select * from atman_log order by time desc limit 15'
}

(Where $sql starts with "mysql" and then adds login information)

If you want to do continuous monitoring, you could use a loop (maybe
write a Ruby script) that checks every 2 seconds or so and selects all
of the log entries that are more recent than the last check.


#6

On Fri, 2009-02-13 at 07:59 +0900, James G. wrote:

In a current project we’re discussing using SQLite for all application
logging.

Have you considered splunk? That way you can do queries and event
correlation easily.

They have a community version which allows you to collect 500MB of logs
a day…


#7

On Feb 12, 2009, at 5:17 PM, Joel VanderWerf wrote:

The alternative I’ve found helpful is to set up a drb service that
lets a process listen for log entries.

That’s very interesting. Thanks for the tip.

James Edward G. II


#8

On Feb 12, 2009, at 8:26 PM, Matt Williams wrote:

They have a community version which allows you to collect 500MB of logs
a day…

I need a local solution. Thanks for the idea though.

James Edward G. II


#9

On Feb 12, 2009, at 5:27 PM, Tim P. wrote:

Have you looked into using CouchDB as the logging destination instead?

On Feb 12, 2009, at 6:08 PM, Ken B. wrote:

Consider a concurrent RDBMS instead if you can deploy it in your
environment.

I appreciate the suggestions but this is for installed software, so I
want to keep the dependencies pretty light. That’s why I’m thinking
of SQLite.

James Edward G. II


#10

On Fri, 2009-02-13 at 12:22 +0900, James G. wrote:

They have a community version which allows you to collect 500MB of logs
a day…

I need a local solution. Thanks for the idea though.

I saw you wanted to keep it light, so yeah, it is probably not the best
solution.
However, you can log to a local database – as I understand it, that’s
what we’re doing at work…

Another option would be to use syslog & batch load (every N minutes)
into sqlite…
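The loader for that mostly comes down to splitting syslog lines into
columns; a rough sketch (the regexp targets the classic BSD syslog line
format, and the actual INSERT is left as a comment):

```ruby
# Parse classic BSD-syslog lines ("Feb 13 09:50:01 host prog[123]: msg")
# into structured rows ready for a batch INSERT into SQLite.
SYSLOG_LINE = /\A(\w{3}\s+\d+ \d\d:\d\d:\d\d) (\S+) ([^:\[]+)(?:\[(\d+)\])?: (.*)\z/

def parse_syslog(line)
  return nil unless (m = SYSLOG_LINE.match(line.chomp))
  { time: m[1], host: m[2], program: m[3], pid: m[4], message: m[5] }
end

# Batch load (sketch): collect parsed rows, then every N minutes do
#   db.transaction { rows.each { |r| db.execute("INSERT INTO log ...", r) } }
```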

Good Luck!

Matt


#11

On Fri, Feb 13, 2009 at 08:27:42AM +0900, Tim P. wrote:

I’ve started a branch of my logging gem that allows you to do this, but
I’ve had to table it due to other obligations.

The goal is to just dump everything to CouchDB with some mnemonics on the
messages. Then you create views to see all messages for a given
application. All error messages in the past week, etc.

I’ll have to take a look at that and possibly do something similar with
Amalgalite. Maybe an Amalgalite Appender? That might also fill what
James is thinking about.

enjoy,

-jeremy


#12

On Feb 13, 2009, at 9:50 AM, Jeremy H. wrote:

I’ll have to take a look at that and possibly do something similar with
Amalgalite. Maybe an Amalgalite Appender? That might also fill what
James is thinking about.

Yeah, that would be really neat.

James Edward G. II