Bug in block::declare_sample_delay

Hello,
I think declare_sample_delay (and/or the way it is used) is
implemented incorrectly for interpolator blocks. I am using GR 3.7.6.

The delay should be added after scaling the tag offset by the
interpolation ratio, not before. For example, using
filter.interp_fir_filter(4, [1, 2, 3, 4]):
if a tag is on sample 2 and the sample delay is declared as 1, the
output tag should land on sample 2*4+1 = 9. Instead, it falls on
sample (2+1)*4 = 12.
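The two orderings can be sketched in plain arithmetic (variable names here are my own, just for illustration):

```python
# Sketch of the two possible offset computations for a tag
# propagated through an interpolator with declared sample delay.
interp = 4      # interpolation ratio of the block
in_offset = 2   # input sample the tag sits on
delay = 1       # declared sample delay

# Expected: scale the offset first, then add the delay.
expected = in_offset * interp + delay    # 9

# Observed: the delay is added before scaling.
observed = (in_offset + delay) * interp  # 12

print(expected, observed)
```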

Here’s an example to demonstrate the error:

from gnuradio import gr, blocks, filter
import pmt

top = gr.top_block()
tag = gr.tag_utils.python_to_tag((2,
                                  pmt.to_pmt('A'),
                                  pmt.PMT_NIL,
                                  pmt.PMT_NIL))

src = blocks.vector_source_f(list(range(0, 50, 10)), tags=[tag])
dut = filter.interp_fir_filter_fff(4, [1, 2, 3, 4])
dut.declare_sample_delay(1)
snk = blocks.vector_sink_f()

top.connect(src, dut, snk)
top.run()

result = snk.data()
tags = snk.tags()
print(tags[0].offset)

Regards.
