Multiple processes logging to single file

When using the native Ruby Logger, is it safe to have multiple
processes writing to the same log file?

For instance, I have the following ruby source example:

test.rb

require 'logger'

log = Logger.new('/tmp/RLOG')

20.times do
  log.error "PID: #{$$}"
  sleep 1
end

log.close

end test.rb

Then I execute like so:

$ ruby test.rb & ruby test.rb & ruby test.rb

After inspection of the log file, each process logged 20 times, all
messages interleaved of course. It looks like it works, but am I
delusional? I’m on an OpenBSD box. Does the OS handle caching/writing
from multiple sources?

-pachl


pachl wrote:

| After inspection of the log file, each process logged 20 times, all
| messages interleaved of course. It looks like it works, but am I
| delusional? I’m on an OpenBSD box. Does the OS handle caching/writing
| from multiple sources?

Yes, the OS should manage that (it is file-system level, rather than
userspace level). However, you can tweak your code to have
‘transactional’ logs (i.e. that every application logs a meaningful
chunk of information in one go).
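
A minimal sketch of that idea (assuming the same /tmp/RLOG path as the
original example): assemble the whole record first, then hand it to the
logger in a single call, so each entry hits the file as one line rather
than being built up across several writes.

```ruby
require 'logger'

log = Logger.new('/tmp/RLOG')

# Build the complete, meaningful chunk before logging it, so the whole
# record goes out in one log call instead of several partial writes.
record = "PID: #{$$} status: ok"
log.error(record)

log.close
```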

Anyway, the file system and drivers take care of the actual writing.

That said, having everything log into one logfile may result in
fragmented files, or odd race conditions and/or deadlocks. It might be
saner to log each process to its own file.
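
A minimal sketch of the per-process approach, keying a hypothetical
/tmp/RLOG.<pid> filename on the process ID so writes from different
processes can never interleave in the same file:

```ruby
require 'logger'

# Each process opens its own log file, named after its PID, so no two
# processes ever write to the same file.
log = Logger.new("/tmp/RLOG.#{$$}")

log.error "PID: #{$$}"

log.close
```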


Phillip G.
Twitter: twitter.com/cynicalryan
Blog: http://justarubyist.blogspot.com

Don’t over-comment.
~ - The Elements of Programming Style (Kernighan & Plauger)
