Hi Guys,

If I use interp_fir_filter_ccc(3, [1, 2, 3, 4, 5, 6]) on a vector source containing [1, 0, 0, ...] then I get [1, 2, 3, 4, 5, 6, 0, 0, ...] as expected. However, if I use interp_fir_filter_ccc(3, [1, 2, 3, 4, 5]) then I get [0, 1, 2, 3, 4, 5, 0, 0, ...], so the samples are shifted to the right by one. For taps whose length is divisible by 3 (the interpolation factor) there is no delay, but for other lengths the interpolation filter introduces a delay of 1 or 2 samples. Why is this?

Also, maybe related to this, I see filter.declare_sample_delay(0) in the generated code, and I do not know what it does or whether it is strictly necessary.

Miklos