On Thu, 2015-04-23 at 12:01 -0400, [email protected] wrote:
Date: Thu, 23 Apr 2015 10:23:31 -0400
From: Tom R. <[email protected]>
On Wed, Apr 22, 2015 at 11:17 PM, Nick F. <[email protected]> wrote:
Short answer: use corr_est as your tag. The corr_start tag is not delayed
by the matched filter length, and is intended for other purposes
(data-aided equalizers, etc.). Andy W. can say more, but it’s enough to say
the latter three tags are the ones that match the peak of the correlation
in the output symbol stream. See corr_est_cc_impl.cc lines 206-214.
Ummm, well, to clarify:
“corr_start” is marked at exactly the length of the matched filter
before the detected/declared correlation peak. I like to think of this
tag as “the preamble I was looking for starts here”. It is useful as an
indicator to reset/restart downstream Data-Aided blocks, like an LMS DA
equalizer for example.
“corr_est”, “time_est”, and “phase_est” are always marked at exactly the
user-provided mark delay after “corr_start”:

    tag offset = i + mark_delay

(i is the index of the “corr_start” tag)
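To make the placement arithmetic concrete, here is a tiny plain-Python sketch (the function name is mine, not anything from the block) of where the peak and the three value-carrying tags land relative to “corr_start”:

```python
def tag_offsets(corr_start_index, filter_len, mark_delay):
    # "corr_start" is marked filter_len samples before the declared
    # correlation peak; "corr_est"/"time_est"/"phase_est" are marked
    # mark_delay samples after "corr_start".
    peak = corr_start_index + filter_len
    est_tags = corr_start_index + mark_delay
    return peak, est_tags

# e.g. a 64-tap matched filter with a mark delay of 32:
peak, est = tag_offsets(1000, 64, 32)
```

With mark_delay = filter_len the three tags land right on the peak; with mark_delay = 0 they land on “corr_start” itself.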
“corr_est”, “time_est”, and “phase_est” have values that tell you
something about the signal at the detected/declared correlation peak
(which is the matched filter’s length in samples after “corr_start”). As the
engineer, you have to decide where you want that information placed
relative to corr_start. Traditionally that is the center of the first
symbol in the preamble, to reset/restart timing recovery blocks
downstream for example. Since the corr_est block has no idea if you are
FSK or PSK and how much ISI your symbols have, it is up to you as the
engineer to set this tag marking delay. 0 is a valid value; negative
values and values longer than your matched filter length are not valid.
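The valid range described above can be written as a one-line sanity check (illustrative helper, not part of the block’s API):

```python
def valid_mark_delay(mark_delay, filter_len):
    # 0 is a valid value; negative values, or values longer than the
    # matched filter length, are not.
    return 0 <= mark_delay <= filter_len
```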
I’ve attached a screenshot showing this behavior. In the same running
flow graph the tag placement will go back and forth between being on the
same sample or being off by one sample.
I can’t honestly see how the corr_est block could be emitting this
directly, unless something internal to the block is trashing the
d_mark_delay class variable.
Some questions for you:
Is this the direct output of the corr_est block, or, as Tom R. implies, is
it after processing by some downstream blocks that decimate?
What is the tag marking delay you set on the corr_est block?
What is the samples/symbol you are using and providing as input to the
corr_est block?
What is the length of your matched filter (how many samples/taps)?
As long as the corr_start tag is always on the correct sample …
“corr_start” is always the length of your matched filter samples ahead
of the detected/declared correlation peak.
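As a toy illustration of that relationship (a brute-force plain-Python correlation, nothing like the block’s actual implementation), the peak falls where the preamble ends, and “corr_start” sits one filter length ahead of it:

```python
def find_corr_start(signal, preamble):
    # Slide an FIR whose taps are the time-reversed preamble over the
    # signal; the output peak lands at the *end* of the preamble, and
    # "corr_start" is marked the filter length ahead of that peak.
    L = len(preamble)
    best_k, best_v = None, float("-inf")
    for k in range(L - 1, len(signal)):
        v = sum(signal[k - j] * preamble[L - 1 - j] for j in range(L))
        if v > best_v:
            best_k, best_v = k, v
    peak = best_k
    corr_start = peak - L  # filter-length samples ahead of the peak
    return corr_start, peak

# Preamble embedded at sample 5 of an otherwise-quiet signal:
pre = [1.0, -1.0, 1.0, 1.0]
sig = [0.0] * 5 + pre + [0.0] * 5
```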
Whether or not that is “right” depends on your downstream receiver
design and the probability of false alarm with declaring a correlation.
Aside from what Nick said, another issue that we’ve seen in the scheduler
is propagation of tags through blocks that have changing sample rates. The
clock sync blocks (either the PFB or M&M) for example don’t have a strict
N:M input to output ratio, but can have N:M+/-d depending on the state of
the signal. Jeff L. and I worked out a mediocre solution to help with
this, but it’s not perfect and we can still see the problem occurring on
occasion. Generally, this settles down quickly and provides a constant
offset of where the tag is relative to where it really should be, and it
might move plus or minus 1 sample every now and then.
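For a block that really does keep a fixed rate, the scheduler’s tag propagation is just an offset rescaling; a sketch of that arithmetic (mine, not the scheduler’s code) shows why a wobbling N:M ratio breaks it:

```python
def propagate_tag_offset(in_offset, interp, decim):
    # For a fixed interp:decim rate block, the scheduler maps an input
    # tag at sample in_offset to roughly in_offset * interp / decim on
    # the output.  Clock-sync blocks whose ratio is N:M +/- d violate
    # the fixed-ratio assumption, so the placement can drift a sample.
    return (in_offset * interp) // decim
```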
IMO, if Richard set the tag marking delay to 0 and is not showing us
decimated output, then I think the above factors shouldn’t matter.
The corr_est block has the “Tag marking delay”, which among other reasons
can also help with this.
Tag marking delay is specifically there because the block can’t guess
about parameters of the demodulation, but a human can easily empirically
set the parameter for the target demodulation.
The problem that I’m referring to doesn’t seem to have a simple answer,
either, and I’ve challenged a few people who are good at this sort of thing
to try and sort it out, but we still don’t have a solution. The “good
enough” solution that’s in there so far has indeed proved to be good enough
for most purposes.
A little OT, but:
The best I could do myself was to time-tag certain critical samples
before they go through these variable-ish rate blocks and then use the
time tags to sort things out later (e.g. TOA).
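That trick can be sketched in a few lines (hypothetical helper name): a time tag pairs a sample offset with a timestamp taken before the variable-rate blocks, so a later event’s time of arrival follows from the sample distance alone:

```python
def toa_seconds(tag_offset, tag_time, event_offset, samp_rate):
    # A time tag pairs a sample offset with a timestamp.  The TOA of a
    # later event at event_offset is the tag's time plus the sample
    # distance converted to seconds -- independent of what any
    # variable-rate blocks do downstream of the tagging point.
    return tag_time + (event_offset - tag_offset) / samp_rate
```

For example, an event 32000 samples past a tag stamped at t = 10 s, at 32 ksps, arrives at t = 11 s.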