The 2.x gr-atsc seems to be closer to working; at least the flow control
(forecast, consume, etc.) creates the exact, to-the-byte correct quantity of
output with no loss of sync (since it keys on the easy-to-discern -5/+5
sync levels), on 120 MB output files.
However, the quality of the output is only about 65 percent correct,
which is not surprising if you compare it with the output of a working
(0.9) equalizer.
Any hints or wild guesses from the atsc experts on where to look? It
must be something in the bit_timing_loop or the equalizer, but both are
largely just copied from the 0.9 stuff.
--Chuck
Did you port the “right” bit timing loop?
As I recall, I tried 2 or 3 different approaches. Without checking
the old code, I’m not sure which one I ended up using.
This is probably an issue in atsc_equalizer - the 0.9 stuff has what
looks like compensation for an offset between data and tags:
  inputs[0].index = output.index;             // the equalizer data
  inputs[0].size  = output.size + ntaps - 1;  // history on data

  // FIXME if there's a problem, it's probably on the next line...
  int offset = ntaps - npretaps - 1;
  assert (offset >= 0 && offset < ntaps);

  inputs[1].index = output.index + offset;    // align equalizer tags
  inputs[1].size  = output.size;              // N.B., no extra history on tags

  return 0;
}
I'm not sure how to do that in the 2.x system. It looks like we need to
delay the tag stream relative to the data stream by an amount determined
by the number of filter taps (the ntaps - npretaps - 1 offset above),
since the data input carries filter history that the tag input does not.
A sketch of one possible way to express this follows.