Unix Philosophy in Ruby Programming

I have used Linux for about 5 years, but it was only this year that I started to really “use” Linux.
Now I understand a little more about the Unix philosophy and some other
tricky things Windows doesn’t do/have.
This changed the way I program. Ruby was built to help manage Linux
systems, right?

So, this is the question: what do you guys think about the way Ruby
programs are made?
Why build a gem/program to send mail if you can send with sendmail?
Why build a process monitor if you can use monit?
And so on…

A nice post about this is: http://tomayko.com/writings/unicorn-is-unix

Because not all OSes are Unix, not all Unixes have sendmail, and
configuring sendmail is difficult.

Do you use ed as your editor or something else? (“$ cat > app.rb” for
example)

People use gems so that they can get the job done, the same reason
that they use Ruby.

The Unix philosophy is not about being hardcore and using things like
sendmail; it’s about simple tools that do simple things well and
joining them all together.
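A tiny sketch of that joining, driven from Ruby. The child filter here is hypothetical - any real tool such as tr, grep, or sort could stand in its place; I use the ruby binary itself as the child so the sketch runs anywhere:

```ruby
# Ruby joins a "small tool" into a pipeline through a pipe, just as the
# shell joins commands with `|`. The child upcases every input line,
# standing in for any real Unix filter.
require 'open3'

output, status = Open3.capture2(
  'ruby', '-ne', 'print $_.upcase',
  stdin_data: "small tools\njoined together\n"
)

raise 'filter failed' unless status.success?
puts output   # the upcased lines
```

The same Open3 call works unchanged for any argument vector, which is what makes small tools composable from Ruby.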

It is about being “good enough” rather than doing things the “proper
way”. This is the ‘MIT approach’ vs the ‘New Jersey approach’:

http://www.xent.com/FoRK-archive/summer96/0591.html

Excerpts from Diego B.'s message of Mon Aug 16 14:34:19 +0200
2010:

And so…
Because Ruby applications also run on Mac, Windows, …
And because Ruby can send emails even if there is no sendmail installed,
etc. In contrast to the various Windows versions, there is no single
“Linux”: there are thousands of different distributions which are all
called “Linux”.

In the end you don’t have to use the Ruby packages. You can just use
sendmail…

Another reason for duplicating work is that Ruby programmers may find it
easier to extend a Ruby program. Coding C takes more time.

I don’t think there is an easy answer to your question.

Marc W.

Gems, or any other reusable library structure/components, provide the
brilliance of the community. You are leveraging the minds of many.

As you implement systems you will always find better ways to do what you
have already done. Chances are the wheel may already be written for what
you need. Using the wheel gets you where you’re trying to go, and if you
are studious enough, improving the wheel benefits the community.

I would be interested in revisiting your question at your 10-year mark
and having you answer your own question. That would be valuable!

Your question is gracious and thought-provoking, and that too benefits
the community. It makes us think.

I do use sendmail in many of my scripts to fire off quick messages to me
about system alerts, job completion, etc. However, in my client-facing
sites I integrate tightly with community needs, employing ActionMailer
and the likes of DelayedJob to handle mass mailings, with formatting of
emails, adaptable names, subjects and personalized content…

So, you have a correct thought here, which leads me to this comment:
never get caught in one language and one solution. Any language, system,
or utility can be used for the job if it fits your requirements well.
Choosing the right application is what we intend to master.

On Mon, 16 Aug 2010, Diego B. wrote:

And so…
Why write sendmail when you could send with UUCP? Why write monit if you
could use Big Brother or NetSaint (now Nagios) or mon? :) Times change,
needs change, tastes differ.

In the specific case of sendmail as a mailing interface, there are
obviously the issues with portability and availability that others have
brought up, but it’s also not a particularly flexible or well-defined
interface for sending mail. For example, you might have an application
where you want verification that mail was delivered to a remote MTA
rather than just added to the outgoing queue.

There are also a number of bugs and inconsistencies in the behaviour of
the interface as implemented by some MTAs; one in particular that bit me
is that postfix errors out if you invoke sendmail with a -t and there
are no To: headers in the message, but qmail silently ignores the flag.
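One way around that particular inconsistency - a sketch only, with made-up paths and addresses - is to skip header parsing entirely and pass the recipients as explicit arguments, which sendmail-compatible binaries accept regardless of how they treat -t:

```ruby
# Build the argument vector with explicit recipients and no -t, so we
# never depend on how a given MTA's sendmail parses To: headers.
def sendmail_argv(recipients)
  ['/usr/sbin/sendmail', '-i'] + recipients   # -i: a lone "." doesn't end input
end

# Assemble a minimal RFC 2822 message by hand (illustrative only).
def build_message(from, to, subject, body)
  "From: #{from}\r\nTo: #{to.join(', ')}\r\nSubject: #{subject}\r\n\r\n#{body}"
end

to  = ['you@example.com']
msg = build_message('me@example.com', to, 'ping', 'hello')

# Actual delivery would be the line below, commented out so the sketch
# runs even on a box with no MTA installed:
# IO.popen(sendmail_argv(to), 'w') { |io| io.write(msg) }
puts sendmail_argv(to).join(' ')
```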

Matt.

@ [email protected]

Thanks for the link…

http://www.xent.com/FoRK-archive/summer96/0591.html

Good read.

Now that I reflect on this, I find that the point where I understood
the Unix philosophy was when I read the book “Software Tools” by
Kernighan and Plauger - don’t bother with the Pascal edition of the
book :( I’m not sure that you get to learn a lot of programming
techniques from that book these days (in the old days, the 70s in my
case, there were few usable programming texts). But the underlying
“small tools joined together like Lego to build something bigger”
outlook was an eye-opener for me - being as I was a mainframe programmer.

Unix was so cool in how it didn’t care about things it didn’t care
about.

On Mon, Aug 16, 2010 at 9:43 AM, Peter H. <
[email protected]> wrote:

What’s neat is that the core Unix philosophy of small tools working
together with clear interfaces is a great example of clean
object-oriented principles. Modularity, well-defined roles and
interfaces, each section of code doing one thing with no side effects -
it’s all just great design at the end of the day.

On 16 Aug 2010, at 13:52, Marc W. wrote:

Why build process monitor if you can use monit?
Another reason for duplicating work is that Ruby programmers may find it
easier to extend a Ruby program. Coding C takes more time.

I don’t think there is an easy answer to your question.

Marc W.

The number of platforms for Ruby seems to be increasing, particularly
through JRuby (AppEngine, Android, IBM mainframe things), so it’s
probably increasingly unsafe to assume a UNIX-y host environment.

On the other side, John L. gave a very good presentation about taking
advantage of UNIX features in custom applications:

http://video2010.scottishrubyconference.com/show_video/6/1


Stuart E.
[email protected]

2010/8/16 James [email protected]:

eye opener for my - being as I was a mainframe programmer.

Unix was so cool in how it didn’t care about things it didn’t care about.

What’s neat is that the core unix philosophy of small tools working together
with clear interfaces is a great example of clean object oriented
principles. Modularity, well-defined roles and interfaces, each section of
code doing one thing with no side effects, it’s all just great design at the
end of the day.

Not all Unix tools are free of side effects or can have side effects
under certain conditions: tee, mail, awk variants, perl, python, ruby,
dd, find, xargs, rm…

Cheers

robert

Robert K. wrote:

2010/8/16 James [email protected]:

What’s neat is that the core unix philosophy of small tools working together
Not all Unix tools are free of side effects or can have side effects
under certain conditions: tee, mail, awk variants, perl, python, ruby,
dd, find, xargs, rm…

Unix was considered to have significantly departed from this philosophy
as early as 1983. If you’ve never read it, this paper (derived from Rob
Pike’s talk “cat -v Considered Harmful”) is worth a read:

http://harmful.cat-v.org/cat-v/

Clifford H…

‘The cheapest, fastest, and most reliable components are those that
aren’t there.’
– Gordon Bell

On Monday, August 16, 2010 07:52:40 am Marc W. wrote:

Excerpts from Diego B.'s message of Mon Aug 16 14:34:19 +0200 2010:

Why build a gem/program to send mail if you can send with sendmail?
Why build process monitor if you can use monit?
And so…

[snip]

…because Ruby can send emails even if there is no sendmail installed
etc.

I don’t really buy this as an argument. I’ll respond directly in a
separate email, but what if there’s no Ruby interpreter installed? No
rubygems? What if there’s no mail gem installed?

The answer is, of course, you install it – and systems like dpkg and
rpm have been around a lot longer than rubygems.

Think about it – most Rubyists are on some sort of Unix, and most
modern unices have these sorts of commands – my servers have postfix,
my laptop appears to have mailx. Why is it that we don’t use the
sendmail binary, whatever it happens to be pointing to?

There is a reason, but it’s got nothing to do with “Oh noes, what if we
have *gasp* dependencies! Whatever shall we do?”

On Monday, August 16, 2010 09:41:49 am [email protected] wrote:

On Mon, 16 Aug 2010, Diego B. wrote:

Why build a gem/program to send mail if you can send with sendmail?
Why build process monitor if you can use monit?
And so…

In the specific case of sendmail as a mailing interface, there are
obviously the issues with portability and availability that others have
brought up, but it’s also not a particularly flexible or well-defined
interface for sending mail.

Bingo. This is probably the second biggest reason for doing it this way.

There are also a number of bugs and inconsistencies in the behaviour of
the interface as implemented by some MTAs;

But the same can be said about SMTP, so I don’t really buy this.

On Monday, August 16, 2010 07:34:19 am Diego B. wrote:

Ruby was built to help manage Linux
systems, right?

I don’t think so. Perl, maybe, but Wikipedia has this quote from Matz:

“I wanted a scripting language that was more powerful than Perl, and
more object-oriented than Python. That’s why I decided to design my own
language.”

Why build a gem/program to send mail if you can send with sendmail?

[email protected] pretty much nailed this one – sendmail isn’t a
particularly good interface for sending mail. Maybe it’s a flaw in
Ruby, but it really is more difficult to invoke an external command
(with appropriate arguments), communicate with it, and capture the
result than it is to use something more native to Ruby.

I haven’t sent email in a while, but let’s use a more direct example:
grabbing the contents of a URL. I could do it with wget:

system('wget', some_url) || raise("wget exited with error code #{$?}!")
contents = File.read '/wherever/wget/put/the/file'

Or I could do it with the ‘curl’ binary (or a more complex wget
command):

contents = `curl #{some_url}`

…but then, how do I detect errors? What if some_url needs to be
shell-escaped? Hmm. Maybe I could use popen, but that’s more complicated…

Or I could do this in pure Ruby:

require 'open-uri'
contents = open(some_url) { |x| x.read }

To me, this is both more natively Ruby and more in keeping with the
Unix philosophy – open-uri is a small library that does one job, and
does it well. I then don’t have to repeat myself with tons of
error-handling crap all over the place, as I do with system, or risk
silent errors, as my backticks example above might do – I can assume
open-uri will raise an error if something goes wrong.

If I need more control, I can always use Net::HTTP, or one of the many
other libraries available. Aside from having comprehensive
documentation, it’s, again, more native. Here’s an example from the
standard library docs:

res = Net::HTTP.post_form(URI.parse('http://www.example.com/search.cgi'),
                          {'q' => 'ruby', 'max' => '50'})

And here’s that same example with Curl:

curl -F q=ruby -F max=50 http://www.example.com/search.cgi

Here it is calling Curl from Ruby:

uri = 'http://www.example.com/search.cgi'
q = 'ruby'
max = 50
res = `curl #{uri} -F q=#{q} -F max=#{max}`

But what if I want to generate those query options on the fly, from a
hash? It’s trivial to do with Net::HTTP.post_form, but with curl?

uri = 'http://www.example.com/search.cgi'
options = {'q' => 'ruby', 'max' => 50}
options_string = options.each_pair.map { |k, v| "-F #{k}=#{v}" }.join(' ')
res = `curl #{uri} #{options_string}`

And the best part? In each of the two examples above, I again may have
to escape things for the shell, if I don’t know where those options are
coming from. Curl is sane enough that I probably don’t have to escape
them for Curl itself, at least. And I’ve again done no error checking –
I’m just getting a giant string back.
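For what it’s worth, the escaping step worried about above is exactly what the stdlib Shellwords module handles; a sketch with a deliberately hostile value (the URL and options are the same illustrative ones as before):

```ruby
require 'shellwords'

uri = 'http://www.example.com/search.cgi'
options = { 'q' => 'ruby; rm -rf /', 'max' => 50 }

# Escape each value before it reaches the shell; without this, the ";"
# in the query would terminate the curl command and run the rest as a
# second command.
options_string = options.map { |k, v| "-F #{k}=#{Shellwords.escape(v.to_s)}" }.join(' ')
cmd = "curl #{uri} #{options_string}"
puts cmd
```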

To be fair, I didn’t do any error checking in my earlier example from
Net::HTTP, but that’s easy to do:

res = Net::HTTP.post_form(URI.parse('http://www.example.com/search.cgi'),
                          {'q' => 'ruby', 'max' => '50'})

res_value = res.value   # raises an exception unless it was successful

And if there is an error, I can tell exactly what kind of error:

res.kind_of? Net::HTTPNotFound   # true
res.code                         # "404"
res.body                         # contents of the 404 page

With Curl, I have to pass some extra commandline options to ensure I
get that information, probably specifying a custom output format so I
can get the error code, and then I have to parse that output format.
Yuck – I may as well parse raw HTTP.

I didn’t really answer your question about Sendmail, then, but I
suspect the answer would be similar to the answer to the same question
about Curl – maybe more so. The Mail gem, for instance, handles things
like file attachments – I don’t think the sendmail interface does that,
at least not directly, so you’d still need something like Mail to wrap
Sendmail if you wanted to use it sanely.

Now, you might well ask why we don’t just wrap these existing binaries
in a gem. It’s not clear that there would be a huge benefit, though.
Most of the time, properly-engineered Unix binaries also have C
libraries, and it’s probably easier and cleaner to write a Ruby binding
for a C library than to write a Ruby wrapper around a Unix executable –
for one, you don’t have to deal with escaping commandline arguments or
passing text back and forth through a pipe just to communicate; you’ve
got an actual API to work with.

There are a number of side benefits to the current approach. For one,
it’s ridiculously portable – the JRuby guys have Rails running on
AppEngine and IRB running on Android, in environments which wouldn’t
make it easy (if it were allowed at all) to use C extensions or
binaries. I can write my HTML parsing once, using Nokogiri, and it will
use libxml on MRI and (I think) the standard Java XML libraries on
JRuby.

And, even more fringe benefits – not forking off a separate process per
request is likely faster, and passing around Ruby data structures, or
even converting from Ruby strings to C strings, seems cheaper than
converting everything to giant blobs of text, squeezing it through a
pipe, and parsing it in another process.

But you’ll notice, I call these “fringe benefits”, and I use a lot of
weasel words. For all I know, separate processes might be faster. I do
use RESTful services as components of a larger app, the Unix philosophy
taken to an extreme – this could be one physical server that does one
job and does it well.

I think it just comes down to this: Ruby semantics are richer than Unix
semantics, and where Unix makes sense, HTTP might make even more sense.

I’m not saying “system considered harmful” – far from it, I’d be the
first to call Unix commands when appropriate (xdg-open, for example),
or even to write a quick shell script in sh or bash instead of trying
to learn something like Rush.

Why build process monitor if you can use monit?

Well, or God, or any number of other things. I think the answer to this
one is: choice is good, and this isn’t quite a solved problem yet.

On Tuesday, August 17, 2010 04:44:10 am R… Kumar 1.9.1 OSX wrote:

David M. wrote:

Another issue with calling Unix commands is portability. I’ve finally
begun converting my shell scripts to Ruby since the parameters and
output across unices do not match, e.g. OS X’s BSD commands differ a
lot from the GNU coreutils ones such as ‘date’, ‘expr’, and ‘sleep’.

This is solvable – there is enough in common, and the GNU coreutils are
themselves portable. I’m not convinced that this is more of a problem
than porting Ruby code, particularly Ruby C extensions, between
versions and implementations of Ruby.

If Nokogiri can use either libxml (on MRI) or Apache Xerces (on JRuby),
I’d think we could write Ruby abstraction layers for different shell
commands, especially when the inconsistencies are often minor – but we
don’t do that.

So, as I said, portability is a nice side benefit of gems over
commands, but I don’t think it’s why we shy away from using commands.

I could be wrong, though – I think grit is a counterexample. Git is
portable, and there’s no libgit, so grit seems to just call git. I
would guess this is done because:

  • Writing a libgit would be much harder than speaking Unix to Git.
  • The Git commandline interface is particularly well-designed.
  • There’s only one implementation of the ‘git’ command.

So portability does factor into it, but I would also guess that if a
libgit existed, someone would port grit to that instead.
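A generic sketch of that “speak Unix to the command” pattern - not grit’s actual code - using argument-vector invocation (no shell, so no escaping worries) plus an explicit exit-status check; the git call is shown in a comment, and a portable command is executed instead so the sketch runs without git installed:

```ruby
require 'open3'

# Run a command given as an argument vector, capturing stdout and
# raising with the captured stderr on failure.
def run(*argv)
  out, err, status = Open3.capture3(*argv)
  raise "#{argv.first} failed: #{err}" unless status.success?
  out
end

# With git installed, a grit-style wrapper would call e.g.:
#   run('git', 'rev-parse', 'HEAD')
# Demonstrated here with the ruby binary so it runs anywhere:
puts run('ruby', '-e', 'puts RUBY_VERSION')
```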

Anyway, for more of the Unix philosophy in Ruby, there is a
presentation on the “GLI” project - it helps to create applications
with multiple subcommands, such as github.

That’s not a Unix philosophy, that’s a Unix commandline interface.

Diego B. [email protected] wrote:

So, this is the question, what you guys think about the way ruby
programs are made?
Why build a gem/program to send mail if you can send with sendmail?
Why build process monitor if you can use monit?
And so…

People build libraries/applications to fill particular niches/needs that
aren’t as easily filled by existing ones. Ruby makes that easier than
before. Complex configuration files for existing C applications are
either no match in power for a programming language such as Ruby, or
much harder to read and maintain than Ruby.

Technology is about evolution; some older technologies are gradually
replaced/superseded by superior (or sometimes, unfortunately, just
better-marketed) ones. To me, Ruby is just another piece of evolution.

With Ruby, I don’t have to worry about the boring grunt work that I did
in C applications. I have a language that wraps most of the Unix API in
a pretty language with GC and exceptions[1]. I don’t think I’d
appreciate Ruby nearly as much as I do if I didn’t have years of Unix
programming experience in C and Perl before it.

However, I just see Ruby as yet another tool in the Unix toolkit. I
mostly use Ruby in places where I’d previously used Perl[2] or C. I
still use a mix of Bourne shell, awk, sed, GNU make, Perl and C in
places where I feel they’re more appropriate than Ruby (and of course
I freely combine them).

The most important part is that they’re all part of the Unix ecosystem
and information flows freely between them via pipes, sockets, files,
signals, message queues and what not. If I feel an AWK component isn’t
powerful enough to do what I need to do, I can just swap it out for
a Ruby one. And if Ruby is too slow[4], I’ll swap in a C replacement.
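As an illustration of that swap (my own sketch, not from the post): the classic awk one-liner `awk '{ sum += $2 } END { print sum }'` rewritten as a Ruby filter that could drop into the same pipeline:

```ruby
require 'stringio'

# Sum the second whitespace-separated column of each line, like the awk
# one-liner above. Takes any IO, so a real pipeline would pass $stdin.
def sum_second_column(io)
  io.each_line.sum { |line| line.split[1].to_f }
end

# Simulated pipe input; in a shell pipeline this would instead be:
#   ... | ruby this_script.rb
input = StringIO.new("alpha 1.5\nbeta 2.5\ngamma 4\n")
puts sum_second_column(input)   # 8.0
```

Because the component reads an IO and writes to stdout, it is interchangeable with the AWK version from the pipeline’s point of view.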

Scripting languages continually blur the line between configuration and
programming languages and enable less-experienced (or just lazier :)
folks to do more. The ever-increasing performance of scripting
languages combined with advances in hardware continues to drive
higher-level things like Ruby forward and uglier things away.

A nice post about this is: http://tomayko.com/writings/unicorn-is-unix

Fwiw, I’m the primary author of the Unicorn server mentioned in that
article. I also work exclusively with Unix-like platforms and refuse to
work with anything I feel has too much corporate control/influence over
it. Maybe the corporate trolls will one day kill off dinosaurs like me,
or we’ll just eat them alive :)

[1] - exceptions-by-default is the biggest reason I favor Ruby over
Perl/C. I don’t need to write error checking code for every single
function call I make; Ruby will just raise an exception and fail early
and loudly[3].

[2] - the other big reason I favor Ruby is that I can easily drop down
to C (the language of Unix) and use any exotic system calls that Ruby
doesn’t already provide for me. Many Ruby programmers did not have
C programming backgrounds, so they may feel differently.

[3] - an important tenet in Unix programming :)

[4] - less and less these days since Ruby 1.9.2 + tcmalloc

David M. wrote:

On Monday, August 16, 2010 07:34:19 am Diego B. wrote:

Now, you might well ask why we don’t just wrap these existing binaries
in a gem. It’s not clear that there would be a huge benefit, though.
Most of the time, properly-engineered Unix binaries also have C
libraries, and it’s probably easier and cleaner to write a Ruby binding
for a C library than to write a Ruby wrapper around a Unix executable –
for one, you don’t have to deal with escaping commandline arguments or
passing text back and forth through a pipe just to communicate; you’ve
got an actual API to work with.

Another issue with calling Unix commands is portability. I’ve finally
begun converting my shell scripts to Ruby since the parameters and
output across unices do not match, e.g. OS X’s BSD commands differ a
lot from the GNU coreutils ones such as ‘date’, ‘expr’, and ‘sleep’.

Anyway, for more of the Unix philosophy in Ruby, there is a
presentation on the “GLI” project - it helps to create applications
with multiple subcommands, such as github. http://awesome-cli-ruby.heroku.com/

David M. wrote:

On Tuesday, August 17, 2010 04:44:10 am R… Kumar 1.9.1 OSX wrote:

David M. wrote:

Another issue with calling Unix commands is portability. I’ve finally
begun converting my shell scripts to Ruby since the parameters and
output across unices do not match, e.g. OS X’s BSD commands differ a
lot from the GNU coreutils ones such as ‘date’, ‘expr’, and ‘sleep’.

This is solvable – there is enough in common, and GNU coreutils are
themselves portable. I’m not convinced that this is more of a problem

If you’ve tried to make your software run, say, the date command for
computing the next date, where the user could have BSD or GNU date,
you’d know what I mean. What’s more, you don’t know whether a user on
OS X has installed GNU coreutils with “+default_names” or not. And you
cannot force him to use one of these for your software. What guarantee
is there that there won’t be changes in the future? Some of these
differences may not even be clearly documented - such as the whitespace
in the output of “wc”, which made one script fail, or differences in
whether you need to escape “+” in regexes and whether some commands
even take a “+” in a regex at all. There are far too many differences
in behavior for us to solve at the Ruby level. And we haven’t even
looked at other flavors of Unix yet.

Anyway, I guess this is more of an opinion; I personally reached a point
where I realized that anything more than a simple script needed to go
into Ruby.

Anyway, for more of the Unix philosophy in Ruby, there is a
presentation on the “GLI” project - it helps to create applications
with multiple subcommands, such as github.

That’s not a Unix philosophy, that’s a Unix commandline interface.

If you read what I wrote, I am talking about a presentation, not the
software itself.

Eric W. wrote:

A nice post about this is: http://tomayko.com/writings/unicorn-is-unix

Fwiw, I’m the primary author of the Unicorn server mentioned in that
article. I also work exclusively with Unix-like platforms and refuse to

Eric, I was reading that page. The link to the file Tomayko has been
“studying” is broken. I have been searching around.
Is this the new link:
unicorn.rb « lib - unicorn.git - Rack HTTP server for Unix and fast clients ?

thx.

Diego B. wrote:

I have used Linux for about 5 years, but it was only this year that I started to really “use” Linux.
Now I understand a little more about the Unix philosophy and some other
tricky things Windows doesn’t do/have.
This changed the way I program. Ruby was built to help manage Linux
systems, right?

So, this is the question: what do you guys think about the way Ruby
programs are made?
Why build a gem/program to send mail if you can send with sendmail?
Why build a process monitor if you can use monit?
And so on…

A nice post about this is: http://tomayko.com/writings/unicorn-is-unix

I think you are making the wrong comparison: comparing Ruby gems to
Unix commands. After all, Unix commands themselves use libraries
internally. So gems are the libraries that Ruby programs use
internally. Externally, Unix commands can be piped and glued together.

Ruby programs to be run on the command line can also be written with the
same philosophy, e.g.:

  • return 0 for success or 1 for error.
  • allow user to get output which can be filtered or grepped or awked
    etc.
  • do one job and do it well
  • use standard input and output appropriately
  • use traps and cleanup if aborted
  • use OptionParser so users can enter --long-options or short options
    (-f)
  • accept input as a parameter or as standard input where appropriate
  • allow for colored output, but give plain output if output is being
    piped
  • have verbose and quiet options
  • have user interaction or no-interaction modes
  • use similar flags to most existing unix programs (-v, -f, -i, -o, -r)

I am sure there are a lot more… these are just off the top of my
head.
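Several of those points can be sketched with the stdlib OptionParser; the flag names and banner below are illustrative, not from any particular tool:

```ruby
require 'optparse'

# Parse argv in the conventional Unix style: long and short flags,
# verbose/quiet modes, and a nonzero exit on bad usage. Positional
# arguments (e.g. an input FILE) are left in argv after parse!.
def parse_options(argv)
  options = { verbose: false, quiet: false }
  parser = OptionParser.new do |o|
    o.banner = 'Usage: tool [options] [FILE]'
    o.on('-v', '--verbose', 'Print progress information') { options[:verbose] = true }
    o.on('-q', '--quiet',   'Suppress normal output')     { options[:quiet] = true }
    o.on('-o', '--output FILE', 'Write output to FILE')   { |f| options[:output] = f }
  end
  parser.parse!(argv)
  options
rescue OptionParser::ParseError => e
  warn e.message       # errors go to stderr, output stays pipeable
  exit 1               # signal failure via exit status
end

opts = parse_options(['-v', '--output', 'out.txt', 'input.txt'])
puts opts.inspect
```

Reading from a FILE argument when present and standard input otherwise (another bullet above) is then just a matter of handing the leftover argv to ARGF-style processing.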