hello,
i’m trying to understand how to connect ‘playbin2’ to other elements.
i can connect a pipeline with the ‘mad’ decoder to the
‘equalizer-10bands’ plugin like this:
…
@pipeline = Gst::Pipeline.new
filesrc = Gst::ElementFactory.make("filesrc")
filesrc.location = "./test.mp3"
decoder = Gst::ElementFactory.make("mad")
audioconvert = Gst::ElementFactory.make("audioconvert")
sink = Gst::ElementFactory.make("autoaudiosink")
@plug = Gst::ElementFactory.make("equalizer-10bands")
@pipeline.add(filesrc, decoder, audioconvert, @plug, sink)
filesrc >> decoder >> audioconvert >> @plug >> sink
@pipeline.ready
@pipeline.play
…
everything works nicely, and with a simple gui i can control the
eq with no problems.
i understand that the ‘playbin2’ element is quite different, in that
it is a pipeline in and of itself. from what i’ve read (
http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-base-plugins/html/gst-plugins-base-plugins-playbin2.html
) you must watch the bus for playbin2’s pads to be created, and then
connect them.
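(as an aside, if i were driving ‘uridecodebin’ directly instead of
playbin2, i believe the usual pattern is a “pad-added” signal handler
rather than a bus watch, since the decode pads only appear once the
stream type is known. the sketch below is my untested guess at the ruby
binding API — the method names may not be exact:)
…
require 'gst'

pipeline = Gst::Pipeline.new
src  = Gst::ElementFactory.make("uridecodebin")
src.uri = GLib.filename_to_uri("/home/jk/ruby/audio tests/test.flac")
conv = Gst::ElementFactory.make("audioconvert")
eq   = Gst::ElementFactory.make("equalizer-10bands")
sink = Gst::ElementFactory.make("autoaudiosink")

pipeline.add(src, conv, eq, sink)
# the static part of the chain can be linked up front
conv >> eq >> sink

# uridecodebin's source pads don't exist until the stream is typed,
# so link them to audioconvert from the pad-added handler
src.signal_connect("pad-added") do |element, pad|
  pad.link(conv.get_pad("sink"))
end

pipeline.play
…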
i get plenty of messages watching the bus like this:
…
pipeline = Gst::ElementFactory.make("playbin2")
audio = "/home/jk/ruby/audio tests/test.flac"
pipeline.uri = GLib.filename_to_uri(audio)
bus = pipeline.bus
bus.add_watch { |bus, message|
  puts message.type.name
  puts message.source
  puts message.structure
  puts
  true
}
…
it looks like lots of elements get automatically created and linked
along the way, like the ‘audioconvert’ and ‘autoaudiosink’ elements that
i create and link manually in the first example. i can also see with
"puts pipeline.sources" that a Gst::ElementURIDecodeBin seems to be
doing the decoding of mp3, ogg, wav, and flac files.
my question is: how / when / where do i connect playbin2 to the
equalizer? i imagine it's a matter of watching the bus for a specific
event and then linking elements at that point. what event in
particular am i looking for? and when i see it, how do i link things
correctly?
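for reference, here is the direction i'm currently leaning: the playbin2
docs mention an "audio-sink" property, so perhaps i can wrap the
equalizer and a sink in a bin, ghost the equalizer's sink pad so the bin
itself has a sink pad, and hand that bin to playbin2. the ruby method
names below (Gst::Bin, GhostPad, add_pad, audio_sink=) are my untested
guesses at the binding API:
…
require 'gst'

playbin = Gst::ElementFactory.make("playbin2")
playbin.uri = GLib.filename_to_uri("/home/jk/ruby/audio tests/test.flac")

eq   = Gst::ElementFactory.make("equalizer-10bands")
sink = Gst::ElementFactory.make("autoaudiosink")

# build a bin containing equalizer -> sink
bin = Gst::Bin.new("audiobin")
bin.add(eq, sink)
eq >> sink

# ghost the equalizer's sink pad so the bin exposes a sink pad of its own
bin.add_pad(Gst::GhostPad.new("sink", eq.get_pad("sink")))

# playbin2 then links its decoded audio into the bin for us
playbin.audio_sink = bin
playbin.play
…
if this works, no bus-watching would be needed for the linking at all —
playbin2 would do the dynamic linking internally and simply treat the
whole bin as its audio sink.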
thanks in advance for any ideas,
-j