USRP2 abnormal behavior with spectrum_sense code

Hi, folks. Happy New Year!

With the help of Firas’ nice ‘usrp_spectrum_sense.py explanation’
post, I slightly modified the code to work with my USRP2 box.

  1. When I interrupt the execution of the code with ^C, I get the following
    error message:

    thread-per-block[4]: <gr_block bin_statistics_f (5)>]: caught
    unrecognized exception

    I guess this is because I forcibly terminated the flow graph in
    the middle of its execution. I wonder if there is any ‘nice way’ to
    stop the execution cleanly so that I don’t get the error above or
    problem #2 below (see the sketch after this list for the kind of
    shutdown I have in mind).

  2. After I get the error message above, my USRP2 won’t work properly
    when I re-run the code. The following is the message it returns:

    Traceback (most recent call last):
    File "./usrp2_spectrum_sense.py", line 192, in <module>
    tb = my_top_block()
    File "./usrp2_spectrum_sense.py", line 128, in __init__
    self.u = usrp2.source_32fc(options.interface, options.MAC_addr)
    File "/usr/local/lib/python2.5/site-packages/gnuradio/usrp2.py",
    line 449, in source_32fc
    return _usrp2.source_32fc(*args)
    RuntimeError: Unable to retrieve daughterboard info

    When I cold reset the box, it works fine again, just like before. I
    monitored the serial port of the box, and no messages come out while it
    is misbehaving. However, when I disconnect the GbE line, it
    recognizes the disconnect and sends out a message saying something like
    ‘connection speed changed to 0’. When I re-connect the ethernet, it
    says something like ‘connection speed changed to 1000’.
    So, my guess is that the USRP2 box is probably still busy sending
    samples to the PC after I terminated the program and is not
    listening to control messages from the PC.
    Does anybody know whether this is a problem with the USRP2
    firmware (the MicroBlaze soft core?) or a problem in my Python
    code?

  3. I also experimented with the code using a pre-saved sample file. When I
    run the code with ‘usrp2_spectrum_sense.py -i SomePreSavedSamples.dat’ and
    terminate the execution with ^C, I get ‘Segmentation fault’ and a large
    core file is dumped.
    What is the usual way to debug the core file to track down the
    problem? Is gdb still the best tool? Can anybody share debugging
    tricks or tips?
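
To make question #1 concrete, this is the kind of clean shutdown I have in
mind (a minimal sketch only, assuming tb.stop() and tb.wait() are the right
calls on gr.top_block; I don't know whether this is enough to make the USRP2
stop streaming):

    if __name__ == '__main__':
        tb = my_top_block()
        try:
            tb.start()              # run the flow graph in its own thread...
            main_loop(tb)
        except KeyboardInterrupt:
            pass
        finally:
            tb.stop()               # ask every block to stop
            tb.wait()               # wait for the scheduler threads to exit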

FYI, the following is the code I used.
-----------------------------------------------------------------------------------------------------------------------------------------------------------
#!/usr/bin/env python
#
# Copyright 2005,2007 Free Software Foundation, Inc.
#
# This file is part of GNU Radio
#
# GNU Radio is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 3, or (at your option)
# any later version.
#
# GNU Radio is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with GNU Radio; see the file COPYING.  If not, write to
# the Free Software Foundation, Inc., 51 Franklin Street,
# Boston, MA 02110-1301, USA.
#

from gnuradio import gr, gru, eng_notation, window
from gnuradio import usrp2
from gnuradio.eng_option import eng_option
from optparse import OptionParser
import sys
import math
import struct


class tune(gr.feval_dd):
    """
    This class allows C++ code to callback into python.
    """
    def __init__(self, tb):
        gr.feval_dd.__init__(self)
        self.tb = tb

    def eval(self, ignore):
        """
        This method is called from gr.bin_statistics_f when it wants to change
        the center frequency.  This method tunes the front end to the new
        center frequency, and returns the new frequency as its result.
        """
        try:
            # We use this try block so that if something goes wrong from here
            # down, at least we'll have a prayer of knowing what went wrong.
            # Without this, you get a very mysterious:
            #
            #   terminate called after throwing an instance of 'Swig::DirectorMethodException'
            #   Aborted
            #
            # message on stderr.  Not exactly helpful ;)

            new_freq = self.tb.set_next_freq()
            return new_freq

        except Exception, e:
            print "tune: Exception: ", e


class parse_msg(object):
    def __init__(self, msg):
        self.center_freq = msg.arg1()
        self.vlen = int(msg.arg2())
        assert(msg.length() == self.vlen * gr.sizeof_float)

        # FIXME consider using Numarray or NumPy vector
        t = msg.to_string()
        self.raw_data = t
        self.data = struct.unpack('%df' % (self.vlen,), t)


class my_top_block(gr.top_block):

    def __init__(self):
        gr.top_block.__init__(self)

        parser = OptionParser(option_class=eng_option)
        parser.add_option("-e", "--interface", type="string", default="eth0",
                          help="Select ethernet interface. Default is eth0")
        parser.add_option("-m", "--MAC_addr", type="string", default="",
                          help="Select USRP2 by its MAC address. Default is auto-select")
        parser.add_option("", "--start", type="eng_float", default=1e7,
                          help="Start frequency [default = %default]")
        parser.add_option("", "--stop", type="eng_float", default=1e8,
                          help="Stop frequency [default = %default]")
        parser.add_option("", "--tune-delay", type="eng_float", default=1e-3, metavar="SECS",
                          help="time to delay (in seconds) after changing frequency [default=%default]")
        parser.add_option("", "--dwell-delay", type="eng_float", default=10e-3, metavar="SECS",
                          help="time to dwell (in seconds) at a given frequency [default=%default]")
        parser.add_option("-s", "--fft-size", type="int", default=256,
                          help="specify number of FFT bins [default=%default]")
        parser.add_option("-d", "--decim", type="intx", default=16,
                          help="set decimation to DECIM [default=%default]")
        parser.add_option("-i", "--input_file", default="",
                          help="radio input file", metavar="FILE")

        (options, args) = parser.parse_args()

        if options.input_file == "":
            self.IS_USRP2 = True
        else:
            self.IS_USRP2 = False

        self.min_freq = options.start
        self.max_freq = options.stop

        if self.min_freq > self.max_freq:
            self.min_freq, self.max_freq = self.max_freq, self.min_freq  # swap them
            print "Start and stop frequencies order swapped!"

        self.fft_size = options.fft_size

        # build graph

        s2v = gr.stream_to_vector(gr.sizeof_gr_complex, self.fft_size)

        mywindow = window.blackmanharris(self.fft_size)
        fft = gr.fft_vcc(self.fft_size, True, mywindow)

        c2mag = gr.complex_to_mag_squared(self.fft_size)

        # Set the usrp2 dependent parts
        if self.IS_USRP2:
            self.u = usrp2.source_32fc(options.interface, options.MAC_addr)
            self.u.set_decim(options.decim)
            samp_rate = self.u.adc_rate() / self.u.decim()
        else:
            self.u = gr.file_source(gr.sizeof_gr_complex, options.input_file, True)
            samp_rate = 64e6 / options.decim

        # Set the freq_step to 75% of the actual data throughput.
        # This allows us to discard the bins on both ends of the spectrum.
        self.freq_step = 0.75 * samp_rate
        self.min_center_freq = self.min_freq + self.freq_step/2
        nsteps = math.ceil((self.max_freq - self.min_freq) / self.freq_step)
        self.max_center_freq = self.min_center_freq + (nsteps * self.freq_step)

        self.next_freq = self.min_center_freq

        tune_delay  = max(0, int(round(options.tune_delay * samp_rate / self.fft_size)))   # in fft_frames
        dwell_delay = max(1, int(round(options.dwell_delay * samp_rate / self.fft_size)))  # in fft_frames

        self.msgq = gr.msg_queue(16)
        self._tune_callback = tune(self)   # hang on to this to keep it from being GC'd
        stats = gr.bin_statistics_f(self.fft_size, self.msgq,
                                    self._tune_callback, tune_delay,
                                    dwell_delay)

        # Now, connect them all
        self.connect(self.u, s2v, fft, c2mag, stats)

    def set_next_freq(self):
        target_freq = self.next_freq
        self.next_freq = self.next_freq + self.freq_step

        if self.next_freq >= self.max_center_freq:
            self.next_freq = self.min_center_freq

        if self.IS_USRP2:
            if not self.set_freq(target_freq):
                print "Failed to set frequency to ", target_freq, "Hz"

        return target_freq

    def set_freq(self, target_freq):
        return self.u.set_center_freq(target_freq)


def main_loop(tb):
    while 1:
        # Get the next message sent from the C++ code (blocking call).
        # It contains the center frequency and the mag squared of the fft.
        m = parse_msg(tb.msgq.delete_head())

        # Print center freq so we know that something is happening...
        print "Center Frequency :", m.center_freq, "Hz"
        print m.data

        # FIXME do something useful with the data...

        # m.data are the mag_squared of the fft output (they are in the
        # standard order.  I.e., bin 0 == DC.)
        # You'll probably want to do the equivalent of "fftshift" on them.
        # m.raw_data is a string that contains the binary floats.
        # You could write this as binary to a file.


if __name__ == '__main__':
    tb = my_top_block()
    try:
        tb.start()              # start executing flow graph in another thread...
        main_loop(tb)

    except KeyboardInterrupt:
        pass
-----------------------------------------------------------------------------------------------------------------------------------------------------------
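
In case it clarifies what I intend to do with the messages, here is a rough
sketch of the post-processing hinted at in the main_loop() comments. It is
only my own guess, assuming the bins really arrive in standard FFT order with
bin 0 == DC, as the comments say:

    def fftshift_bins(data):
        # Reorder the mag-squared bins so that DC ends up in the middle,
        # i.e. the equivalent of an fftshift.
        n = len(data)
        half = (n + 1) // 2
        return data[half:] + data[:half]

    def save_raw(filename, m):
        # Append the raw binary floats for one sweep step to a file.
        f = open(filename, 'ab')
        f.write(m.raw_data)
        f.close()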

Thank you for reading the long post and sharing your valuable time. :-)

Best regards,

Ilkyoung.

On Sat, Jan 03, 2009 at 03:45:22PM +0900, ILKYOUNG KWOUN wrote:

------------------------------------------------------------------
I guess this is because I forcibly terminated the flow graph in
the middle of its execution. I wonder if there is any ‘nice way’ to
stop the execution cleanly so that I don’t get the error above or
problem #2 below.

Thanks for posting the problem and the code that reproduces it.
We’ve got some control-C handling problems that need attention.

Because of the incorrect shutdown, the USRP2 isn’t getting shut down
properly and is probably still streaming samples.
That’s http://gnuradio.org/trac/ticket/276

  3. I also experimented with the code using a pre-saved sample file. When I
    run the code with ‘usrp2_spectrum_sense.py -i SomePreSavedSamples.dat’ and
    terminate the execution with ^C, I get ‘Segmentation fault’ and a large
    core file is dumped.
    What is the usual way to debug the core file to track down the
    problem? Is gdb still the best tool? Can anybody share debugging
    tricks or tips?

Use gdb. See these GNU Radio specific instructions on connecting to
the python process that has all the GR stuff loaded:
http://www.gnu.org/software/gnuradio/doc/howto-write-a-block.html#debugging

Once the problem occurs try:

bt

or

thread apply all bt
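
If you prefer to look at the core file after the crash instead of attaching
to the live process, something along these lines should work (assuming the
script was run with the system python and the core file was written to the
current directory as "core"):

gdb python core
(gdb) bt
(gdb) thread apply all bt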

Eric

Hey! I tried your script “usrp2_spectrum_sense.py”, but it’s not working:
every time I run the code it gives the error “Failed to set frequency”.
If you have updated or fixed it, please give some suggestions.

thanks,
Adnan