Good morning,
I have a general question about processing a fixed number of samples.
Faced with the need to process N samples (N = 8192 items) in a
synchronous block, I developed a mechanism like this:
    if (noutput_items >= N)
    {
        // processing of N elements
        consume(N);
        return N;  // this way I process the required N elements
    }
    else
    {
        // here I'm waiting until the input buffer reaches
        // the required number of elements
        consume(0);
        return 0;
    }
This is probably not the conventional way to do it, but for now it works
well.
My questions are:
- Is there a conventional way to do this?
- It seems that N cannot be larger than 8192; how can I enlarge the input
  buffer?
Is there a conventional way to do this?
Yes, there is! Since in a sync block the number of samples you consume
on your input and the number you produce on your output are identical,
you can use set_output_multiple(8192) [1].
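To illustrate the difference this makes, here is a minimal sketch of what a work() function can look like once set_output_multiple(N) has been called in the block's constructor. It is plain Python (it does not import gnuradio, so the chunking logic can be run standalone); in a real block the class would derive from gr.sync_block and the constructor would call self.set_output_multiple(N). The class name and the doubling "processing" are placeholders, not anything from the original thread.

```python
import numpy as np

N = 8192

class FixedChunkBlock:
    """Plain-Python sketch of a sync block's work() after
    set_output_multiple(N): the scheduler then only calls work() with
    noutput_items that is a whole multiple of N, so the manual
    consume(0)/return(0) waiting from the question is no longer needed."""

    def work(self, input_items, output_items):
        in0, out0 = input_items[0], output_items[0]
        # Guaranteed by set_output_multiple(N): len(out0) == k * N, k >= 1.
        assert len(out0) % N == 0
        for start in range(0, len(out0), N):
            chunk = in0[start:start + N]
            out0[start:start + N] = chunk * 2.0  # placeholder per-chunk processing
        return len(out0)
```

The point is that the per-N-chunk processing can be written unconditionally; partial chunks simply never reach work().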
If what you’re doing feels more like an operation on vectors of samples
than an operation on a stream of samples, you could also use an
input signature with an input and output item size of 8192 times the
original item size, and use stream_to_vector [2] before and
vector_to_stream after to convert between streams and vectors. A typical
example (and generally a very nice block) is the fft_vcc [3] block, which
encapsulates the FFT, which is mathematically a vector operation.
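Conceptually, the stream_to_vector -> fft_vcc -> vector_to_stream chain is just a reshape, a per-row FFT, and a reshape back. A numpy analogy (not GNU Radio code, just a sketch of what the item-size change means):

```python
import numpy as np

N = 8192

# A stream of 2*N complex samples...
samples = np.arange(2 * N, dtype=np.complex64)

# stream_to_vector(itemsize, N): each output item is one N-sample vector.
vectors = samples.reshape(-1, N)

# fft_vcc operates on each N-sample vector item independently.
spectra = np.fft.fft(vectors, axis=1)

# vector_to_stream: flatten the vector items back into a sample stream.
stream_again = spectra.reshape(-1)
```

Each "item" on the vector connection is a whole row here, which is why the item size becomes 8192 times the original one.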
There is also a set_min_noutput_items (and if you’re in a synchronous block,
or if you implement forecast, you can be sure that ninput_items >=
noutput_items).
- It seems that N cannot be larger than 8192; how can I enlarge the input
  buffer?
Essentially, you can’t … if you need a fixed large size at your input,
your only option is to keep your own buffer internally.
I know this sucks; I had the same issue several times.
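The "own buffer internally" workaround can be sketched like this: consume whatever the scheduler offers, append it to a private buffer, and only process a full N-sample batch once enough has accumulated. Again plain Python for clarity; in a real GNU Radio general block this logic would live in general_work(), consuming everything that arrives each call. The class and method names are made up for this sketch.

```python
import numpy as np

N = 8192

class InternalBuffer:
    """Accumulate arbitrarily sized input chunks into a private buffer and
    emit complete N-sample batches as soon as they are available."""

    def __init__(self):
        self._buf = np.empty(0, dtype=np.complex64)
        self.batches = []  # completed N-sample batches, ready for processing

    def push(self, in0):
        # Take everything offered (a general block would consume() it all),
        # then peel off as many full N-sample batches as possible.
        self._buf = np.concatenate([self._buf, in0])
        while len(self._buf) >= N:
            self.batches.append(self._buf[:N].copy())
            self._buf = self._buf[N:]
```

The scheduler's buffer size then no longer limits N; only your own memory does.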