Transmitter receiving its own packet

Hi,
I have two USRP N210s. The sender’s “Tx/Rx” port is connected to the
receiver’s “Rx2” port, and vice-versa, via an RF cable. I have a
Python app running on the sender and the receiver; the sender sends a
REQ packet to the receiver, and I see that the sender receives its own
packet. In this app, the receiver is supposed to send an ACK packet
when it receives a REQ packet, so the RX and TX paths are set up on
both USRPs.

Is the problem occurring because the sender is transmitting on the
“Tx/Rx” port while its RX path is still active? If so, how do I switch
off the RX path while transmitting (is there a UHD API routine for
that)?

Any insight will be really helpful.

thanks and regards

–Anirudh Sahoo
Advanced Network Technology Div.
National Institute of Standards and Technology (NIST)
100 Bureau Drive,
Gaithersburg, MD - 20878
Room - B230, bldg.- 222
Phone- 301-975-4439

This seems like a common scenario, and it seems like it would make sense
to push the behaviour as low as possible in the stack. Is there a
configuration parameter for the USRP “device driver” that instructs the
driver to drop self-routed signals?


Well, assuming you have a TX thread and an RX thread, you can signal to
the RX thread, via a shared variable or some such, that TX is currently
in progress, and that it should ignore any samples for a while.
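That shared-flag scheme can be sketched in plain Python. This is not a
UHD API, just ordinary threading; the helper names (`transmit`,
`rx_filter`, `send_burst`) are hypothetical, and a real RX loop would
pull its samples from the UHD streamer:

```python
import threading

# Shared flag raised while one of our own bursts is in flight.
tx_in_progress = threading.Event()

def transmit(send_burst, burst):
    """TX thread: raise the flag for the duration of the burst."""
    tx_in_progress.set()
    try:
        send_burst(burst)          # whatever actually transmits
    finally:
        tx_in_progress.clear()

def rx_filter(samples):
    """RX thread: drop samples that arrive while we are transmitting."""
    if tx_in_progress.is_set():
        return None                # self-transmission: ignore it
    return samples
```

Note the blanked-out window is only approximate, since the flag is not
aligned with the samples’ actual timestamps.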

I’ll make the general comment that nobody can be really successful in
SDR if the ‘S’ part of it is a mystery to them.
Real-world signal-processing solutions require things beyond the
strictly-mathematical treatments of DSP you find in textbooks.
In real systems, you need to implement algorithms that aren’t
strictly mathematical in nature, and that require a non-trivial
understanding of familiar CS and computer-programming concepts.

On 04/09/2013 02:14 PM, M. Ranganathan wrote:

This seems like a common scenario, and it seems like it would make sense
to push the behaviour as low as possible in the stack. Is there a
configuration parameter for the USRP “device driver” that instructs the
driver to drop self-routed signals?

So it would be possible to mute the ADC samples when the device is
transmitting. However, that might only be desirable in a couple of use
cases, and there are probably a dozen ways to solve this, from all the
way down at the physical layer up to the MAC layer:

  • mute the RX ADC when transmitting (probably an FPGA mod)
  • tune RX and TX to two different center frequencies
  • mute the samples going into the demodulator block when transmitting
  • use the issue_stream_cmd API to avoid receiving when transmitting
  • use a different preamble on the packets so you can’t correlate yourself
  • use an identifier in the header of the packet to differentiate nodes

I believe precog addresses this issue by using the last option. Your MAC
layer has to be able to identify what a packet is and who it is destined
to after all: https://github.com/jmalsbury/pre-cog/wiki
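As a rough illustration of that last option, a MAC can simply drop any
frame whose source address is its own. The frame layout below is made
up for the example (it is not pre-cog’s actual format), and the
addresses are placeholders:

```python
import struct

MY_ADDR = 0x01          # this node's address (placeholder)
BROADCAST = 0xFF

def make_frame(src, dst, payload):
    """Prepend a 1-byte source and 1-byte destination address."""
    return struct.pack("BB", src, dst) + payload

def accept(frame, my_addr=MY_ADDR):
    """Return the payload if the frame is for us, else None."""
    src, dst = struct.unpack("BB", frame[:2])
    if src == my_addr:
        return None      # our own transmission: ignore it
    if dst not in (my_addr, BROADCAST):
        return None      # addressed to some other node
    return frame[2:]
```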

-josh

This seems like a common scenario, and it seems like it would make
sense to push the behaviour as low as possible in the stack. Is there
a configuration parameter for the USRP “device driver” that instructs
the driver to drop self-routed signals?

It could do that, and I’ll let Josh comment on how awkward it would be
for UHD to arrange to do that.

But the RX and TX chains on the hardware are essentially
entirely-independent units, with no cross-unit semantics implied
anywhere in the hardware (except for antenna switching).

The problem of “downloading” operational semantics into hardware that
should be as “semantic free” as possible is that it adds to:

 o FPGA bloat
 o loss of generality

And while I agree this particular semantic may be justified in some
circumstances, it’s a slippery slope. One might argue that
‘nFSK’ is such a common requirement that it should be done at very low
layers, or ‘OFDM’, or whatever else is popular this week. That turns
the hardware, and the software stack above it, into not so much an
SDR as an SCR, with pre-defined purposes.

Furthermore, if you want to make the hardware implement part of your
operational semantic, you’re free to do that in the FPGA code
yourself. The codebase is freely available, and there’s room in the
FPGAs of the various USRP devices to do tweaks like that.


Marcus L.
Principal Investigator
Shirleys Bay Radio Astronomy Consortium

You suggested arranging for the RX chain to simply ignore the RX
samples while I’m transmitting on the “tx/rx” port. Is that possible?
Can this be done in the Python layer, or do I have to touch the C++
layer? I am using OFDM as the physical layer (ofdm.py etc.).

Sorry if all this sounds incoherent; I am still a newbie in this
field.

The receive chain is connected either to the TX/RX port or to the RX2
port. When you set up the USRP object, you can choose which port the
receive chain is connected to; this is an analog thing. In GRC, you
can specify this when you create a SINK/SOURCE.

If you’re transmitting, the hardware will switch the RX path
away from the TX/RX port, to prevent zapping the RX hardware.
But you can arrange for your RX path to be connected full-time to the
RX2 port, just by specifying that when you create the object.

Again, in GNU Radio, you can easily specify the antenna ports when
the device is created.
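A minimal sketch of that in the gr-uhd Python API (usrp_source and
set_antenna() are real gr-uhd calls; the device address and the helper
names are placeholders for this example):

```python
VALID_RX_PORTS = ("TX/RX", "RX2")   # RX-capable ports on WBX/SBX-class boards

def pick_rx_port(port):
    """Sanity-check the antenna name before handing it to the driver."""
    if port not in VALID_RX_PORTS:
        raise ValueError("unknown RX port: %r" % (port,))
    return port

def make_rx2_source(addr="addr=192.168.10.2"):   # placeholder address
    # gr-uhd source block; set_antenna() keeps the RX chain on the
    # RX2 connector at all times, independent of TX activity.
    from gnuradio import uhd
    src = uhd.usrp_source(addr, uhd.stream_args(cpu_format="fc32"))
    src.set_antenna(pick_rx_port("RX2"), 0)
    return src
```

In GRC the same thing is just the “Ant” field of the UHD source block.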

Shutting down the streamer and re-starting it is more awkward, and I
believe Josh has already commented with other useful suggestions.

Hi Josh,

So it would be possible to mute the ADC samples when the device is
transmitting. However, that might only be desirable in a couple of use
cases, and there are probably a dozen ways to solve this, from all the
way down at the physical layer up to the MAC layer:

  • mute the RX ADC when transmitting (probably an FPGA mod)

We are interested in operating our USRP N200 radios in TDM. In other
words, we want each of our radios to transmit and receive at the same
frequency. Since we expect very high self-interference in this case, we
would be interested in playing a little bit with the ADC mute option.
Could you provide us some pointers on where to modify the FPGA code to
do this? Also, how realistic/feasible does it sound if we want to
control ADC (on/off) switching from within the GRC flow graph (at the
very least, we expect some latency associated with this operation)?

Thanks in anticipation,
Mahboob

On 04/10/2013 12:27 AM, Rahman, Muhammad Mahboob Ur wrote:

We are interested in operating our USRP N200 radios in TDM. In other
words, we want each of our radios to transmit and receive at the same
frequency. Since we expect very high self-interference in this case,
we would be interested in playing a little bit with the ADC mute
option. Could you provide us some pointers on where to modify the
FPGA code to do this? Also, how realistic/feasible does it sound if we want to

See the section at line 614? Use the run_tx signal to conditionally
zero out the values going to .adc_a and .adc_b.

control ADC (on/off) switching from within the GRC flow graph (at the
very least, we expect some latency associated with this operation)?

If you are running continuous receive, then by knowing the transmit
time of a packet (or based on some concept of a valid transmit window),
you can choose which RX samples to ignore based on the RX samples’
timestamps. That would be the most exact way to do it.
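That timestamp test can be sketched in plain Python. overlaps_tx is a
hypothetical helper, and in practice rx_start would come from the
stream’s rx_time metadata rather than being passed in by hand:

```python
def overlaps_tx(rx_start, nsamps, samp_rate, tx_windows):
    """True if a received block overlaps one of our own TX bursts.

    rx_start    -- timestamp (seconds) of the first sample in the block
    nsamps      -- number of samples in the block
    samp_rate   -- sample rate in samples/second
    tx_windows  -- list of (tx_start_sec, tx_duration_sec) tuples
    """
    rx_end = rx_start + nsamps / samp_rate
    for tx_start, tx_dur in tx_windows:
        # Standard interval-overlap test.
        if rx_start < tx_start + tx_dur and tx_start < rx_end:
            return True            # this block contains our own signal
    return False
```

The RX side would then simply discard any block for which this returns
True, instead of handing it to the demodulator.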

-josh