Buffer size with timed packets

Hi all,

I have a question regarding timed packets with UHD. Basically, I'm
sending packets from a burst tagger into my UHD sink with a tx_time tag,
but after a while I'm getting L's (late packets, if I'm not mistaken) at
the output.

I tried changing both the packet size and the sample rate of my flowgraph
and, even though the effect is reduced, the L's still come up
occasionally. More than finding an exact solution (which I'd clearly like
to get), I want to understand what is happening here. The time that I
apply to my tx_time tag is calculated as follows:

future_time = first_UHD_time + delay + relative_time

where the first UHD time is captured by a parallel rx_time tag, the
relative time is the actual sample position (or offset) divided by my
sample rate, and the delay is user-defined. I have used sample rates of up
to 400 kS/s, but the same behavior is still present.
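
In Python the calculation looks roughly like this (just a minimal sketch;
the function and variable names are placeholders rather than my exact
code):

    import pmt

    def tx_time_tag_value(first_uhd_time, delay, sample_offset, samp_rate):
        # Build the PMT value for a tx_time tag from the captured start
        # time, a user-defined delay and the sample offset of the burst.
        relative_time = sample_offset / samp_rate   # offset in seconds
        future_time = first_uhd_time + delay + relative_time

        full_secs = int(future_time)                # whole seconds
        frac_secs = future_time - full_secs         # fractional seconds

        # gr-uhd expects the tx_time value as a (uint64 secs, double frac)
        # tuple
        return pmt.make_tuple(pmt.from_uint64(full_secs),
                              pmt.from_double(frac_secs))

    # illustrative numbers only: burst 123456 samples in, 2 s delay, 400 kS/s
    value = tx_time_tag_value(first_uhd_time=100.0, delay=2.0,
                              sample_offset=123456, samp_rate=400e3)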

Our first idea here in the lab to overcome these late packets was to
change the buffer size of the burst tagger (or of the UHD sink) so that it
would be filled with just one packet and then sent right away. However,
from what I've read in other questions, this can reduce the "horsepower"
of our application and perhaps also the flexibility of future
implementations. I would gladly hear some opinions about this idea.
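
What I pictured was something along these lines (only a sketch with
placeholder blocks and a made-up packet_len; whether this is actually a
sensible thing to do is exactly what I'm asking):

    from gnuradio import gr, blocks

    # Placeholder graph: in the real flowgraph the limited block would be
    # the burst tagger feeding the UHD sink.
    tb = gr.top_block()
    src = blocks.null_source(gr.sizeof_gr_complex)
    snk = blocks.null_sink(gr.sizeof_gr_complex)

    packet_len = 1024  # samples per packet, example value only

    # Has to be set before tb.start(); GNU Radio may round the size up to
    # a page-aligned number of items.
    src.set_min_output_buffer(packet_len)
    src.set_max_output_buffer(packet_len)

    tb.connect(src, snk)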

I can see that the device waits for the times given in the tag before
sending, so that part is working fine. However, even with delays of up to
2 seconds, the L's still appear. That is where the "stacked" timed packets
hypothesis came from. The ideal delay is yet to be calculated (I'm
experimenting first), but so far a smaller delay leads to more L's at the
output.

Any comment on how to overcome the lateness is welcome.

Thank you in advance!

Nicolas.