usrp_spectrum_sense and average dBm power levels

I’m having trouble implementing an energy detector in GNU Radio.
I want to find empty channels whose signal power is at or below the
FCC’s -114 dBm detection threshold. I’ve been modifying the
usrp_spectrum_sense code that ships with GNU Radio, and converting the
magnitude-squared values from the FFT block into proper dBm values has
been a headache. I am using a USRP N200 with the WBX daughterboard.

I am simply trying to get the averaged values to match up with the
values in uhd_fft, but they seem to be off. I’ve got tune delay and
dwell delay both set to 0.05 seconds, and changing these options along
with the sampling rate and FFT size seems to give drastically different
results.
How does uhd_fft arrive at the dBm values it displays? Right now I
compute 10 * log10(bin[i]) - 20 * log10(fft_size) - 10 * log10(tb.power /
fft_size) for each bin, but the results don’t seem to match up.
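To make the formula above concrete, here is the conversion as I currently apply it, written out as a small function. The `window_power` argument stands in for `tb.power` (which, as I understand it, is the sum of the squared window coefficients); that interpretation is my assumption, not something I have confirmed against the uhd_fft source.

```python
import numpy as np

def bin_to_db(mag_sq, fft_size, window_power):
    """Convert a magnitude-squared FFT bin value to relative dB.

    mag_sq:       |X[k]|^2 from the FFT block
    fft_size:     FFT length N
    window_power: sum of squared window taps (tb.power in my flowgraph)

    Note: this normalizes to dB relative to full scale, NOT absolute
    dBm; a calibration offset for the USRP front end would still be
    needed to get true dBm.
    """
    return (10.0 * np.log10(mag_sq)
            - 20.0 * np.log10(fft_size)
            - 10.0 * np.log10(window_power / fft_size))
```

Algebraically this collapses to 10 * log10(mag_sq / (fft_size * window_power)), so a full-scale bin with mag_sq = fft_size * window_power comes out at exactly 0 dB.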

Should I even bother using usrp_spectrum_sense? My understanding is that
its bin-statistics approach is meant for scanning spectrum ranges wider
than the USRP’s maximum instantaneous bandwidth, whereas here I only
want to look at 6 MHz TV channels consecutively.

I have attached the code that I’ve been working on.