Forum: Ruby Ruby OpenGL Gears example that uses shiny and velvet GLSL shaders

Announcement (2017-05-07): www.ruby-forum.com is now read-only since I unfortunately do not have the time to support and maintain the forum any more. Please see rubyonrails.org/community and ruby-lang.org/en/community for other Rails- and Ruby-related community platforms.
Michael Brooks (Guest)
on 2009-03-09 02:55
(Received via mailing list)
Hello:

For your entertainment, I've created a modified version of the
original "gears.rb" OpenGL Ruby program (by Arto Bendiken) which
uses GLSL shaders to color the gears with velvet and shiny
materials.

You'll need Ruby 1.8.6, GLUT and an OpenGL 2.0 (or greater) driver
to run it.  Even if you don't run it, you might find it educational
because it's, in my opinion, fairly easy to read and will help anyone
trying to understand how OpenGL programs can implement GLSL.
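(Editor's note: the demo's actual loader is in the zip; purely as a hedged sketch of the compile/link sequence such a program performs, assuming ruby-opengl's global-function style API — the helper name `build_program` is illustrative, not from the zip:)

```ruby
# Hypothetical sketch of the GLSL setup a demo like this performs -- not
# the actual code from the zip. Assumes ruby-opengl's global-function
# bindings (glCreateShader, glShaderSource, ...).
def build_program(vertex_src, fragment_src)
  shaders = { GL_VERTEX_SHADER   => vertex_src,
              GL_FRAGMENT_SHADER => fragment_src }.map do |type, src|
    shader = glCreateShader(type)  # allocate a shader object
    glShaderSource(shader, src)    # hand the GLSL text to the driver
    glCompileShader(shader)        # compile it driver-side
    shader
  end
  program = glCreateProgram()
  shaders.each { |s| glAttachShader(program, s) }
  glLinkProgram(program)           # link vertex + fragment stages
  program                          # pass to glUseProgram each frame
end
```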

The zip file containing the Ruby and GLSL code (which may disappear
in the future if I change providers) is available here
http://members.shaw.ca/michael.brooks/gears_using_...

Please look at the readme.txt file in the zip for more details.

Michael
Matthias Reitinger (reima)
on 2009-03-09 08:29
Michael Brooks wrote:
> For your entertainment, I've created a modified version of the
> original "gears.rb" OpenGL Ruby program (by Arto Bendiken) which
> uses GLSL shaders to color the gears with velvet and shiny
> materials.

Thank you for providing it to the public! It runs perfectly on my linux
box (GeForce Go 7600, Ubuntu 8.10, Ruby 1.8.7, ruby-opengl 0.60.0). I
always toyed with the idea of writing a small GLSL shader development
IDE in Ruby, maybe this will get me started after all ;-)

One question though:

>  def get_shader_source(file_name)
>    # Read all the text from the file and return it in the file_content variable.
>    file_content = ''
>    File.open(file_name) do |file|
>      while file_line = file.gets()
>        file_content += file_line
>      end
>    end
>    return(file_content)
>  end

You know there is File.read, right?
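(Editor's note: for reference, `File.read` collapses the whole gets loop into a single call; a minimal sketch of the same method rewritten that way:)

```ruby
# File.read returns the entire file contents as one String, replacing
# the manual line-by-line gets loop in the original.
def get_shader_source(file_name)
  File.read(file_name)
end
```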

-Matthias
Eleanor McHugh (Guest)
on 2009-03-09 17:54
(Received via mailing list)
On 9 Mar 2009, at 07:28, Matthias Reitinger wrote:
> IDE in Ruby, maybe this will get me started after all ;-)
It also works nicely under 1.9.1 on my MacBook with Nvidia 9400M.


Ellie

Eleanor McHugh
Games With Brains
http://slides.games-with-brains.net
----
raise ArgumentError unless @reality.responds_to? :reason
Michael Brooks (Guest)
on 2009-03-10 06:20
(Received via mailing list)
Hello Matthias

>>    return(file_content)
>>  end
>
> You know there is File.read, right?
>
> -Matthias

Actually I didn't give it much thought... that was the least of my
worries in making the demo program :)  However, I appreciate you taking
the time to mention it.  I'll investigate and update the code to use
File#read instead.

Michael
Michael Brooks (Guest)
on 2009-03-10 06:40
(Received via mailing list)
Hello Matthias (again):

> worries in making the demo program :)  However, I appreciate you taking
> the time to mention it.  I'll investigate and update the code to use
> File#read instead.
>
> Michael

I just changed the code to use File#read and re-posted it.  I feel
silly for doing it the way I originally did.  I cut-and-paste some code
from another program and didn't think to look for a better Ruby way.
Thanks for pointing out the area for improvement.

On another topic, what kind of performance / fps are you getting?  My
past experience has been that Nvidia has better OpenGL drivers than ATI.

Michael
Matthias Reitinger (reima)
on 2009-03-10 08:57
Michael Brooks wrote:
> On another topic, what kind of performance / fps are you getting?  My
> past experience has been that Nvidia has better OpenGL drivers than ATI.

reima@marvin:/tmp$ uname -a
Linux marvin 2.6.27-11-generic #1 SMP Thu Jan 29 19:24:39 UTC 2009 i686
GNU/Linux
reima@marvin:/tmp$ ruby -v
ruby 1.8.7 (2008-08-11 patchlevel 72) [i486-linux]
reima@marvin:/tmp$ head -n 12 /proc/cpuinfo
processor  : 0
vendor_id  : AuthenticAMD
cpu family  : 15
model    : 72
model name  : AMD Turion(tm) 64 X2 Mobile Technology TL-52
stepping  : 2
cpu MHz    : 1600.000
cache size  : 512 KB
physical id  : 0
siblings  : 2
core id    : 0
cpu cores  : 2
reima@marvin:/tmp$ lspci | grep VGA
02:00.0 VGA compatible controller: nVidia Corporation G70 [GeForce Go
7600] (rev a1)
reima@marvin:/tmp$ ruby gears_using_shaders.rb
14146 frames in  5.000 seconds = 2829.200 FPS
15088 frames in  5.000 seconds = 3017.600 FPS
15100 frames in  5.000 seconds = 3020.000 FPS
15100 frames in  5.000 seconds = 3020.000 FPS
15091 frames in  5.000 seconds = 3018.200 FPS
reima@marvin:/tmp$ ruby gears.rb
23585 frames in  5.000 seconds = 4717.000 FPS
24379 frames in  5.000 seconds = 4875.800 FPS
24410 frames in  5.000 seconds = 4882.000 FPS
24396 frames in  5.000 seconds = 4879.200 FPS
24314 frames in  5.000 seconds = 4862.800 FPS
reima@marvin:/tmp$ glxgears
25291 frames in 5.0 seconds = 5058.096 FPS
25284 frames in 5.0 seconds = 5056.642 FPS
25206 frames in 5.0 seconds = 5041.085 FPS
25111 frames in 5.0 seconds = 5022.135 FPS
25252 frames in 5.0 seconds = 5050.322 FPS

-Matthias
David Masover (Guest)
on 2009-03-10 09:35
(Received via mailing list)
Matthias Reitinger wrote:
> Michael Brooks wrote:
>
>> On another topic, what kind of performance / fps are you getting?  My
>> past experience has been that Nvidia has better OpenGL drivers than ATI.
>>
>
> reima@marvin:/tmp$ ruby gears_using_shaders.rb
> 14146 frames in  5.000 seconds = 2829.200 FPS
>

My results were frighteningly worse, at first:

dave@SERENITY ~/tmp> uname -a
Linux serenity 2.6.27-11-generic #1 SMP Thu Jan 29 19:28:32 UTC 2009
x86_64
GNU/Linux

dave@SERENITY ~/tmp> 18 ruby -v
ruby 1.8.7 (2008-08-11 patchlevel 72) [x86_64-linux]
dave@SERENITY ~/tmp> 19 ruby -v
ruby 1.9.1p0 (2009-01-30 revision 21907) [x86_64-linux]
dave@SERENITY ~/tmp> head -n 12 /proc/cpuinfo
processor       : 0
vendor_id       : GenuineIntel
cpu family      : 6
model           : 23
model name      : Intel(R) Core(TM)2 Duo CPU     T9300  @ 2.50GHz
stepping        : 6
cpu MHz         : 800.000
cache size      : 6144 KB
physical id     : 0
siblings        : 2
core id         : 0
cpu cores       : 2
dave@SERENITY ~/tmp> lspci | grep VGA
01:00.0 VGA compatible controller: nVidia Corporation GeForce 8600M GT
(rev a1)
dave@SERENITY ~/tmp> 18 ruby -rubygems gears_using_shaders.rb
9232 frames in  5.022 seconds = 1838.311 FPS
9982 frames in  5.000 seconds = 1996.400 FPS
9962 frames in  5.000 seconds = 1992.400 FPS
9923 frames in  5.000 seconds = 1984.600 FPS
9981 frames in  5.000 seconds = 1996.200 FPS
dave@SERENITY ~/tmp> 19 ruby gears_using_shaders.rb
9261 frames in  5.000 seconds = 1852.200 FPS
9835 frames in  5.000 seconds = 1967.000 FPS
9880 frames in  5.000 seconds = 1976.000 FPS
9901 frames in  5.000 seconds = 1980.200 FPS
9906 frames in  5.000 seconds = 1981.200 FPS
dave@SERENITY ~/tmp> glxgears
11864 frames in 5.0 seconds = 2372.706 FPS
12272 frames in 5.0 seconds = 2453.951 FPS
12457 frames in 5.0 seconds = 2491.129 FPS
11393 frames in 5.0 seconds = 2278.451 FPS
11255 frames in 5.0 seconds = 2246.413 FPS

Then I remembered I'm running KDE4, with all kinds of effects turned on.
Quickly toggling them off reveals:

dave@SERENITY ~/tmp> 18 ruby -rubygems gears_using_shaders.rb
21374 frames in  5.000 seconds = 4274.800 FPS
21964 frames in  5.000 seconds = 4392.800 FPS
22223 frames in  5.005 seconds = 4440.160 FPS
21960 frames in  5.000 seconds = 4392.000 FPS
22011 frames in  5.002 seconds = 4400.440 FPS
dave@SERENITY ~/tmp> 19 ruby gears_using_shaders.rb
20571 frames in  5.000 seconds = 4114.200 FPS
21922 frames in  5.000 seconds = 4384.400 FPS
22217 frames in  5.000 seconds = 4443.400 FPS
22022 frames in  5.000 seconds = 4404.400 FPS
21693 frames in  5.000 seconds = 4338.600 FPS
dave@SERENITY ~/tmp> glxgears
26268 frames in 5.0 seconds = 5253.481 FPS
27987 frames in 5.0 seconds = 5597.365 FPS
27844 frames in 5.0 seconds = 5568.736 FPS
28008 frames in 5.0 seconds = 5601.509 FPS
28031 frames in 5.0 seconds = 5600.459 FPS

Looks like it is GPU-bound -- no real difference between Ruby 1.8.7 and
1.9.1, but a huge difference between compositing and no compositing. For
those with KDE4, the default keystroke to toggle desktop effects is
shift+alt+f12. Can't live with 'em, can't live without 'em.
Albert Schlef (alby)
on 2009-03-10 11:30
David Masover wrote:
> For those with KDE4, the default keystroke to toggle
> desktop effects is [snip]

I understand nothing in these things but I wonder: why do desktop
effects affect your OpenGL performance? I mean, the animation is run in
a window of its own, so why has KDE anything to do with it?
David Masover (Guest)
on 2009-03-10 11:55
(Received via mailing list)
Albert Schlef wrote:
> David Masover wrote:
>
>> For those with KDE4, the default keystroke to toggle
>> desktop effects is [snip]
>>
>
> I understand nothing in these things but I wonder: why do desktop
> effects affect your OpenGL performance?

Because desktop effects -- at least the shiny, new ones -- use OpenGL.

> I mean, the animation is run in
> a window of its own, so why has KDE anything to do with it?
>

Here, do a quick demo and find out: Run KDE4 or Compiz, with wobbly
windows on. Launch glxgears, or this gears demo, or anything else
OpenGL. Now drag it around.

See how that "window of its own" wobbles?

I bet if you do the fancy raindrop effect in Compiz, that'll send
ripples all over those spinning gears.

That's because with compositing, every window on the screen is going to
be fed through some KDE effects, and through OpenGL, at least once. And
it's not just the wobbliness -- how else would you apply a smooth drop
shadow to a window, for example? You can do it in software, but it will
probably be slower.

Of course, it's important to keep in mind that glxgears is not a
benchmark (or at least, not a good one), but I could show you similar
results even when trying to play a fullscreen game -- KDE apparently
isn't smart enough to figure out that the game is fullscreen, so it
doesn't automatically disable compositing.

So, KDE4 has a keystroke that toggles compositing. Compiz, I believe,
will detect fullscreen windows and disable compositing, at least on that
window.
Michael Brooks (Guest)
on 2009-03-11 03:55
(Received via mailing list)
David Masover wrote:
> Matthias Reitinger wrote:
>> Michael Brooks wrote:
>>
>>> On another topic, what kind of performance / fps are you getting?  My
>>> past experience has been that Nvidia has better OpenGL drivers than ATI.
>>>
>> reima@marvin:/tmp$ ruby gears_using_shaders.rb
>> 14146 frames in  5.000 seconds = 2829.200 FPS
> dave@SERENITY ~/tmp> 18 ruby -rubygems gears_using_shaders.rb
> 22223 frames in  5.005 seconds = 4440.160 FPS
> dave@SERENITY ~/tmp> 19 ruby gears_using_shaders.rb
> 22217 frames in  5.000 seconds = 4443.400 FPS
> dave@SERENITY ~/tmp> glxgears
> 28008 frames in 5.0 seconds = 5601.509 FPS

Thanks for the feedback.

Wow... I need a new computer.  You guys are killing my scores.  My Intel
P4 3 GHz and overclocked AGP ATI 3850 only get about 1926.8 fps.  I
believe some of this is due to running under Windows XP, AGP and because
the ATI drivers + card don't run OpenGL as efficiently as Nvidia.  I'd
try this in Linux (Mint is my favorite right now) but I can't get my ATI
card to work under Linux.

Michael
David Masover (Guest)
on 2009-03-13 23:56
(Received via mailing list)
Michael Brooks wrote:
> Wow... I need a new computer.  You guys are killing my scores.

Not really... At one point, glxgears would refuse to report FPS unless
you sent in a flag that was something like:
--i-acknowledge-that-this-is-not-a-benchmark

Remember, running several thousands of frames per second is irrelevant,
if your monitor is some 60 hz. What matters is if it can stay at 60 fps
when rendering more interesting scenes than three spinning gears.
Sometimes there's a correlation, sometimes not.

The only real use of glxgears as a benchmark is to test, definitively,
whether your OpenGL is running in a hardware-accelerated mode, because
the numbers are usually in the hundreds in software, versus thousands
for hardware. But with CPUs getting faster, I'm not sure how long that
will be the case -- Intel's new video card is supposed to be just a ton
of x86 cores.

Now, if you want to compare FPS in Nexuiz, or Quake 4...