gr_delay is intended to delay the input signal by n samples.
I’ve been doing some testing with that block and I have observed that
the delay is applied to the chunk of data that the block receives as
input (in my case that was some 8000 samples) and applied again to
every subsequent chunk, rather than once to the whole stream.
I see two problems with that.
- I’m not able to apply the delay only once (to the very beginning of
the signal). This is actually a fairly trivial problem that could be
solved by building a custom block.
- If the delay I want to apply is larger than the size of the chunks
the block receives (say, a delay of 10000 samples when the block
receives data in chunks of 8000 samples), the output is all zeros,
as if an infinite delay were applied.
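For what it's worth, here is how I understand the two behaviors in plain
Python/NumPy (a sketch of the logic only, not actual GNU Radio code;
`per_chunk_delay` and `OneShotDelay` are names I made up for
illustration). The first function models the per-chunk shifting I
observe; the class models the once-only delay I expected:

```python
import numpy as np

def per_chunk_delay(chunk, n):
    """Shift one chunk right by n samples, zero-filling the front and
    keeping no state between calls (the behavior I observe)."""
    out = np.zeros_like(chunk)
    if n < len(chunk):
        out[n:] = chunk[:len(chunk) - n]
    return out  # all zeros whenever n >= len(chunk)

class OneShotDelay:
    """Delay the stream by n samples exactly once, at the very
    beginning, by carrying leftover samples between calls
    (the behavior I expected)."""
    def __init__(self, n):
        self.pending = np.zeros(n)  # zeros emitted before the signal

    def work(self, chunk):
        # Prepend whatever is buffered, keep the tail for the next call.
        buf = np.concatenate([self.pending, chunk])
        out, self.pending = buf[:len(chunk)], buf[len(chunk):]
        return out
```

With chunks of 8000 samples and n = 10000, `per_chunk_delay` returns
nothing but zeros, while `OneShotDelay` emits 10000 zeros and then the
signal, regardless of chunk size.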
Is this the expected behavior of the block, or am I missing something
in my observations?
Thanks in advance for your answers.