gr-qtgui Waterfall Time Axis

I have some large spectrum records and I would like to use GR for
analysis in a sort of “off-line” mode. I want to visualize the data
using the spectrogram and then identify subsets of interesting features
(e.g. WiFi Packets in 2.4 GHz) by frequency and time coordinates.
Essentially, I want to know what index in the file certain features
correspond with so I can save them as separate files. To this end, I’ve
found gr_spectrogram_plot very handy, but it doesn’t quite do what I
want because the y-axis which displays time isn’t showing up correctly.
My y-axis is always 0.0, and the cursor highlight also reads 0.0
(although the frequency is correct). However, the qt spectrogram
appears to have correct time information when I make a simple file
source -> qt spectrogram sink.

I’d like to fix this issue, so I’ve been poring over the code in
gr-qtgui. I don’t have much experience with QWT or Python GUIs in
general. From what I can tell, there are methods for frequency axis and
intensity scaling, but not time. Is this true or am I missing something
in the API? The historyExtent parameter in the WaterfallData
constructor appears to get hardcoded to 200 in WaterfallDisplayPlot.cc
(gr-qtgui/lib). It would also be nice to toggle between the time and
sample number.

I’m happy to do the work to implement this. Could someone point me in
the right direction here? Thanks!

PWG

I’ve solved this and wanted to share my solutions, as well as
improvements (to me, anyways) to GR’s off-line plotting tools
(gr_spectrogram_plot). My overall goal here is to be able to do some
analysis on wideband (25MSPS) data using GR since MATLAB isn’t really
efficient for that.

I’ve seen some interesting updates regarding gr-qtgui on the git commit
log – I’d be very interested in comments from those folks so I know
what’s in the pipeline and perhaps what you would want to pull into the
baseline.

Below is a list of issues I’ve encountered and what I’ve done to fix
them.

Problem: In gr_spectrogram_plot_c, the time scale always reads 0.

Solution: In waterfalldisplayform.cc, d_update_time is passed as the
timePerFFT argument to plotNewData (WaterfallDisplayPlot object). This
happens in WaterfallDisplayForm::newData(). However, gr_spectrogram_plot
sets this to zero by calling set_update_time(0) on the
waterfall_sink_c_impl object. In waterfall_sink_c_impl.cc it calls the
postEvent() function with d_last_time, which is a variable representing
the time between waterfall updates. Ultimately, this is regulated by
d_update_time, which is set to zero by the set_update_time(0) call. The
net result is that timePerFFT is always zero, resulting in an improperly
set timescale. The proper thing to do here is to set timePerFFT based on
the FFT size and sample rate in off-line mode. Obviously, this assumes
you know the sample rate, but since I’ve modified gr_spectrogram_plot to
read files with headers and autofill stuff (sample rate, frequency, data
type, etc) appropriately, it always does. It appears that you won’t
necessarily process every sample unless you set d_update_time=0, which
is what you want when running off-line.
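
As a rough sketch of that calculation (example numbers only, nothing
read from the code or a file):

    #include <cstdio>

    int main()
    {
      // Off-line case: timePerFFT follows from the known sample rate and
      // FFT size rather than from the wall-clock gap between GUI updates,
      // which set_update_time(0) forces to zero.
      const double samp_rate  = 25e6;                 // e.g. 25 MSPS record
      const int    fft_size   = 1024;
      const double timePerFFT = fft_size / samp_rate; // seconds per row

      std::printf("timePerFFT = %e s\n", timePerFFT); // 4.096000e-05 s
      return 0;
    }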

Problem: In gr_spectrogram_plot, the cursor pointer only gives readings
to a hundredth of a second. For analysis, it would be more useful to have
scientific notation, which scales with higher sample rates.

Solution: In WaterfallDisplayPlot.cc, change .arg(secs, 0, 'f', 2) to
.arg(secs, 0, 'e', 2)

Problem: In gr_spectrogram_plot, the time scale is in hundredths of a
second. It may be more useful to have the plot in scientific notation.

Solution: In WaterfallDisplayPlot.cc, change
QwtText(QString("").sprintf("%.1f", secs)) to
QwtText(QString("").sprintf("%e", secs))
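
To illustrate the difference, a standalone QtCore snippet (not the
actual plot code):

    #include <QString>
    #include <cstdio>

    int main()
    {
      // With 'f' and 2 digits, small time offsets at high sample rates
      // all collapse to "0.00"; the 'e' conversion stays readable at any
      // scale.
      const double secs = 4.096e-05;
      const QString fixed = QString("%1 s").arg(secs, 0, 'f', 2); // "0.00 s"
      const QString sci   = QString("%1 s").arg(secs, 0, 'e', 2); // "4.10e-05 s"
      std::printf("%s vs %s\n", qPrintable(fixed), qPrintable(sci));
      return 0;
    }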

Additional tasks:

Problem: In gr_spectrogram_plot, the spectrogram is hardcoded to 200
FFTs. Make this a configurable parameter.

Problem: In gr_spectrogram_plot, the timescale does not update properly
with the file seek widget.



On Wed, Oct 29, 2014 at 1:23 PM, Garver, Paul W [email protected]
wrote:

Thanks for the feedback. Comments below.

result is that timePerFFT is always zero, resulting in an improperly set
timescale. The proper thing to do here is to set timePerFFT based on the
FFT size and sample rate in off-line mode. Obviously, this assumes you know
the sample rate, but since I’ve modified gr_spectrogram_plot to read files
with headers and autofill stuff (sample rate, frequency, data type, etc)
appropriately, it always does. It appears that you won’t necessarily
process every sample unless you set d_update_time=0, which is what you
want when running off-line.

That sounds fair. The update time is set to zero so it plots
immediately. As for the autofill, are you using the file metadata format
for this or your own? I’ve been wanting to make versions of these
programs that read this data from our metadata files for just this
reason. And if nothing is provided for the sample rate (the -r sets this
on the command line) it defaults to 1.0, which should still operate well
with what you are suggesting here, possibly with just meaningless
numbers.

Problem: In gr_spectrogram_plot, the cursor pointer only gives readings
to a hundredth of a second. For analysis, it would be more useful to have
scientific notation, which scales with higher sample rates.

Solution: In WaterfallDisplayPlot.cc, change .arg(secs, 0, 'f', 2) to
.arg(secs, 0, 'e', 2)

I’d like to see what that really looks like, but I’m not opposed to it.

Problem: In gr_spectrogram_plot, the time scale is in hundredths of a
second. It may be more useful to have the plot in scientific notation.

Solution: In WaterfallDisplayPlot.cc, change
QwtText(QString("").sprintf("%.1f", secs)) to
QwtText(QString("").sprintf("%e", secs))

Same again.

Additional tasks:

Problem: In gr_spectrogram_plot, the spectrogram is hardcoded to 200 FFTs.
Make this a configurable parameter.

Yep, that seems wrong.

Problem: In gr_spectrogram_plot, the timescale does not update properly
with the file seek widget.

Ah, interesting. That might be a bit more difficult fix.

Can you put this work together into a git branch, probably on github
forked off gnuradio/gnuradio.git, so we can see what the commits would
look like? I think these off-line analysis tools are useful, but they
definitely need work. Glad someone’s interested in it!

Tom

On Wed, Nov 5, 2014 at 10:24 PM, Garver, Paul W [email protected]
wrote:

Hey Tom,

I’ll pull together the git branch and let you know when it’s ready. As
for gr_spectrogram_plot, we do use the standard GR header format, with
some “extras” such as center frequency. I think it would be useful in
the metadata format to have center frequency, seconds/sample, and file
length in seconds for the next metadata version. I should also mention I
have some simple scripts to make headers for header-less data with
command line args. I’m working on expanding them so they can do data
type conversion and modification of headers as well.

Does that mean you’ve changed the format of the header? These things are
really only useful if we’re all using the same structure. The “extras”
part of the header is designed to hold stuff like that, too, or is that
what you meant? As for the extra parameters, you have samples/second, so
why would you need seconds/sample? And with the samples/second and item
count in the headers, you can calculate the length in seconds easily.

The time axis scaling is next on my list to do. Any suggestions on where
to begin? I was thinking of adding some functions to the spectrogram
plot to do this. Then, in the plot reset function, manually set the
scale properly. There are probably more elegant ways to do this, but I
don’t want to break existing functionality with the on-line mode. I
think the y-axis zero point is based on the timestamp, which makes total
sense in “on-line” mode. I suppose we could keep that convention and
generate fake timestamps for off-line mode. That would also work well
when we have real timestamps in the headers. I also think it would be
good to have a key toggle to switch between time and sample number. I
want to identify interesting features in the spectrogram and then be
able to cut out a section of a much larger file for analysis.

PWG

It doesn’t get the time info from a time tag, just from the system’s
timer itself using the gr::high_res_timer_type. So we’d have to pass
this information through to the plotting widget.

Tom
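
One way the manual y-axis scaling suggested above could look, assuming a
QwtPlot-based waterfall and no FFT overlap (the helper and its arguments
are hypothetical, not existing gr-qtgui API):

    #include <qwt_plot.h>

    // Hypothetical helper: span the y-axis over the nrows FFTs held in
    // the waterfall history, anchored at the time (or the sample index,
    // if samp_rate is 1) of the current file position.
    void set_offline_time_axis(QwtPlot *plot, double samp_rate,
                               int fft_size, int nrows, double start_sec)
    {
      const double sec_per_fft = fft_size / samp_rate;
      const double span        = nrows * sec_per_fft;
      // Swap the two limits if the plot draws time increasing downward.
      plot->setAxisScale(QwtPlot::yLeft, start_sec, start_sec + span);
    }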


Sorry my response was really confusing after re-reading it – let me try
again. We use the standard metadata headers and simply store the center
frequency in the ‘extras’ part. The metadata format has not been
changed.

I am not suggesting that we add redundant information to the header but
rather have gr_read_file_metadata calculate these things for us (based
on the header) and display them. Of course, they are easily calculated
by hand but it does simplify the analysis workflow. To that point,
perhaps we should define combinations of item types and real/complex.
For example, rather than displaying Complex T/F and item type float,
just say Data Type: Complex Floats (CF) and have a table clearly
identifying the types, sizes, enum values, etc. in the documentation. I
think a few refinements like this (and enhancing the plotting tools)
could really help make GR a powerful analysis platform.
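
The arithmetic itself is simple; a standalone example (placeholder
values, not the actual header field names):

    #include <cstdio>
    #include <cstdint>

    int main()
    {
      // Derived read-out computed from values already stored per segment.
      const double   samp_rate = 25e6;       // samples/second
      const uint64_t nitems    = 250000000;  // items in the segment

      std::printf("seconds/sample: %e\n", 1.0 / samp_rate);       // 4e-08
      std::printf("segment length: %f s\n", nitems / samp_rate);  // 10 s
      return 0;
    }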

PWG


On Mon, Nov 10, 2014 at 12:43 PM, Garver, Paul W [email protected]
wrote:

Sorry my response was really confusing after re-reading it – let me try
again. We use the standard metadata headers and simply store the center
frequency in the ‘extras’ part. The metadata format has not been
changed.

Gotcha, thanks for clearing that up.

Ah, ok. Misunderstood that. The gr_read_file_metadata spits out the
header information for every segment, since the metadata files are
segmented by default (for better reconstruction if a file error occurs
and to make reading out smaller bits of large files easier). But I see
your point, and I think it might be useful to have this tool keep track
of all items throughout the file and have a summary dump at the end of
the print out with the total number of items read in and possibly even
the total length of time of the file, since each segment might have a
different sample rate. Also info like the total number of data segments
in the file, etc.

Good suggestions on the data type combining (right now, it’s just
translating from what’s in the header to a print out), and the
documentation.

Can you roll these feature requests up into an Issue on gnuradio.org?
(or better yet, send us a pull request on github…)

Thanks,
Tom

Hey Tom,

I did a git pull request for my “olplottweaks” branch. Here’s the link
for anyone else on the list who is interested:
https://github.com/garverp/gnuradio/tree/olplottweaks

Let me know if you like the direction this is going. I pulled numffts
through the constructors of the spectrogram because it seems like a
constructor-type parameter to me. I wasn’t planning on supporting a
change in the spectrogram length via a widget.
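
The shape of that change in miniature (class and member names are only
illustrative, not the real gr-qtgui ones):

    // Illustrative only: the history depth becomes a constructor argument
    // with the old hardcoded value as its default, so existing callers
    // keep building 200-row waterfalls.
    class WaterfallHistory {
    public:
      explicit WaterfallHistory(int nrows = 200) : d_nrows(nrows) {}
      int rows() const { return d_nrows; }
    private:
      int d_nrows;
    };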

I can add some more stuff, particularly:

  • Have gr_spectrogram_plot respect the -r sample rate flag (currently
    it cannot tell whether the rate is the default of 1 or simply wasn’t
    passed from optparse).
  • Add a box/key to toggle between sample number (e.g. rate=1) and time
    on the y-axis in gr_spectrogram_plot.
  • Update the y-axis time with the proper time/sample indices when using
    the file slider.
  • Update the y-axis time on the spectrogram plot appropriately when the
    FFT size or sample rate changes.

Is there currently any way to give the spectrogram an overlap to reduce
the noise variance (Welch’s method)? I see there is an averaging
setting, but the overlap would give us better plots too.
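
For reference, the kind of bookkeeping I have in mind; a standalone
sketch of Welch-style segmentation, not GNU Radio code:

    #include <cstdio>

    int main()
    {
      // With 50% overlap each hop advances fft_size/2 samples, so roughly
      // twice as many periodograms get averaged from the same record,
      // lowering the variance of the spectral estimate.
      const long   nsamples = 1000000;  // samples in the record
      const int    fft_size = 1024;
      const double overlap  = 0.5;      // fraction of each segment reused

      const int  hop   = static_cast<int>(fft_size * (1.0 - overlap));
      const long nsegs = (nsamples - fft_size) / hop + 1;

      std::printf("hop = %d samples, %ld segments averaged\n", hop, nsegs);
      return 0;
    }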

On another note, some of the other features, such as a more succinct
gr_read_file_metadata output, are currently being worked on, so I hope
to be able to push those upstream once we have a more finished product.

PWG
