Re: How to "declare_sample_delay" in C++?

Hi Martin,

Thanks for the answer. My confusion is that when you generate, for example,
a multi-stage polyphase decimator in GRC, a ‘declare_sample_delay(0)’ call
is generated for each decimation block. When moving from Python to C++, I
thought I should do the same. Now, since the default is zero, does that
mean the ‘declare_sample_delay(0)’ calls generated by GRC are redundant?
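
Just to make sure I understand what "doing the same" would look like, here
is a minimal C++ sketch of one decimation stage. I'm assuming the
fir_filter_ccf factory and the header names, which may differ between
GNU Radio versions:

    #include <gnuradio/filter/firdes.h>
    #include <gnuradio/filter/fir_filter_blk.h>  // header name assumed; may vary by version

    // One stage of the decimator, mirroring the GRC-generated Python:
    // taps via firdes, then declare_sample_delay(0) on the block pointer.
    auto taps = gr::filter::firdes::low_pass(1.0, 1.0e6, 100e3, 10e3);
    auto fir  = gr::filter::fir_filter_ccf::make(4 /* decimation */, taps);
    fir->declare_sample_delay(0);  // same line GRC emits; 0 is also the default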

Thank you.

Best wishes,
Khalid