well, GNU Radio is not really based on a “requesting data” model – it’s
mainly buffer driven.
This means that as long as there’s space in your block’s output buffer,
GNU Radio might (and will) call your work() function. Typical buffer
sizes are e.g. 8192 items. So, GNU Radio uses backpressure to limit the
rate at which things run.
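The backpressure mechanism can be illustrated with a toy model (plain Python, not actual GNU Radio code, with a made-up 8-item buffer standing in for an 8192-item one): a producer keeps writing as long as there is buffer space, and the slow consumer ends up pacing the whole chain.

```python
# Toy model of GNU Radio's backpressure (NOT real GNU Radio code):
# a producer may only write while there is space in the downstream
# buffer, so the slower consumer sets the overall rate.
import queue
import threading
import time

BUFFER_ITEMS = 8            # stands in for e.g. an 8192-item buffer
buf = queue.Queue(maxsize=BUFFER_ITEMS)
produced = []

def producer():
    # "work()" runs whenever there's output buffer space;
    # put() blocks once the buffer is full -- that's the backpressure.
    for item in range(32):
        buf.put(item)
        produced.append(item)

def consumer():
    for _ in range(32):
        time.sleep(0.001)   # slow sink, like SDR hardware pacing playback
        buf.get()
        buf.task_done()

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(len(produced))  # all 32 items flowed, paced by the consumer
```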
You can try gr::block::set_max_output_buffer, but please heed the
warnings in the manual: this kind of change breaks a lot of assumptions
about "good" buffer sizes and scheduling behavior.
On a different note, this sounds like a hint of an architectural problem
in your transmitter. I assume 1200 Bd means that you take x bits, generate
y symbols out of these, and package these into z samples. These z
samples are then fed to SDR hardware, consuming at a constant sampling
rate. Now, since the ratio of y/x is fixed, and so is the ratio z/y,
normally the hardware sampling rate simply defines how many bits are
consumed in a given time; GNU Radio then just makes sure this happens as
soon as possible (i.e. to keep the items flowing as far as possible).
That is typically very good, because a) transmit samples are computed as
early as possible, so that "hiccups" upstream are hidden by the full
buffers, and b) the slowest part in your signal processing chain (which
should be the SDR hardware) defines the speed at which things happen.
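The fixed-ratio arithmetic above can be sketched with some illustrative numbers (assumptions on my part: 1200 Bd, 1 bit per symbol as in e.g. 2-FSK, and a 48 kHz hardware sampling rate):

```python
# Back-of-the-envelope for the fixed x -> y -> z ratios described above.
# All numbers are illustrative assumptions, not from the original post.
symbol_rate = 1200        # symbols per second (1200 Bd)
bits_per_symbol = 1       # the y/x ratio
samp_rate = 48_000        # samples/s consumed by the SDR hardware

samples_per_symbol = samp_rate / symbol_rate      # the z/y ratio
bits_per_second = symbol_rate * bits_per_symbol   # rate set by the sink

print(samples_per_symbol)  # 40.0 samples per symbol
print(bits_per_second)     # 1200 bits/s
```

The point: once the ratios are fixed, the hardware's constant sample consumption alone determines how many bits per second the flowgraph eats.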
Now, there are some cases where that doesn’t apply, mainly “bursty” data
– I don’t know whether yours is one of these, but in case it is:
Have a look at Tim O’Shea’s gr-eventstream; it doesn’t change the
way GNU Radio works (i.e. your buffer sizes define the latency between a
burst of samples being injected into the otherwise continuous 0-sample
stream and that burst reaching your SDR hardware), but it makes working
with bursty data more intuitive and allows for non-continuous data to be
fed into GNU Radio.
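To make the "burst injected into an otherwise continuous 0-sample stream" idea concrete, here is a minimal sketch of the concept in plain Python (this is not gr-eventstream's actual API; `inject_burst` is a hypothetical helper for illustration):

```python
# Sketch of burst scheduling: place a finite burst at a given sample
# offset inside an otherwise continuous stream of zero samples.
# NOT gr-eventstream's API -- just the underlying idea.
def inject_burst(stream_len, burst, offset):
    """Return stream_len zero samples with `burst` copied in at `offset`."""
    out = [0.0] * stream_len
    out[offset:offset + len(burst)] = burst
    return out

stream = inject_burst(10, [1.0, 1.0, 1.0], 4)
print(stream)  # [0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0]
```

Everything before the burst's offset is zero padding, which is exactly why the buffer sizes translate into latency between injecting a burst and it reaching the hardware.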