Eric B. wrote:
> > - Is ppio_ppdev fast enough to keep up with this bitrate?
> I doubt it. It just bit bangs the serial port. Thus you’ve got no control
> of the data rate, etc.
I assume you meant the parallel port. I suspected as much. I was just
hoping that it would keep up with my low data rate, which is being
throttled elsewhere.
> > - Are there other, better or easier ways to do this, such as using one
> > of the debug pins on a USRP daughterboard?
> If you’re willing to hack verilog, you can get it out the debug pins.
I’m willing, but I don’t have the time/funding to learn verilog. I’m
working under a very small, focused research grant.
> If the output needs to be synchronous serial output, the easiest way to
> get it out may be to use a serial card with a USART on it. E.g.,
> something that’s capable of synchronous serial transmission. Look for
> a card that’ll do HDLC, then see if there’s a “raw” or “unframed”
> mode. Generally the hardware that can do HDLC can also do
> synchronous.
Yes, I’m very familiar with HDLC synchronous serial cards, having
written several Linux network device drivers for them. Again, this is
doable, but probably not under the tight time/money constraints I have.
Another thought occurred to me. Since the LFTX daughterboard goes down to
DC, shouldn’t it be possible to use it to output the bitstream? It’s a
bit of a kludge, but it should be easy. Comments?
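If the LFTX route works out, generating the baseband waveform is just a
matter of holding each bit’s level for some number of samples. A rough
sketch of that mapping in plain Python (the samples-per-bit and amplitude
values are placeholders, and the actual USRP sink hookup is omitted):

```python
def bits_to_baseband(bits, samples_per_bit=8, amplitude=1.0):
    """Map a bit sequence to a bipolar NRZ sample stream.

    Each bit becomes +amplitude (1) or -amplitude (0), repeated for
    samples_per_bit samples. The resulting real-valued waveform could
    then be fed to a sink block driving the LFTX; the parameter values
    here are illustrative, not tied to any particular hardware rate.
    """
    samples = []
    for b in bits:
        level = amplitude if b else -amplitude
        samples.extend([level] * samples_per_bit)
    return samples
```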
> What’s the protocol for talking to the equipment? Is there any
> handshaking, etc, or do you just need a raw synchronous stream?
For my immediate purpose, I just need a raw clock and data synchronous
bitstream with no handshaking. In reality, the bitstream is carrying
NRZI-encoded HDLC/Frame-relay, with IP packet payloads that conform to
the Multi-Protocol-over-Frame-Relay (MPoFR) standard. This gets fed
into a COTS router which is performing all sorts of statistics gathering
on the HDLC. I know that I could do all this link layer stuff in
software, but I’d essentially be “re-inventing the wheel” when I can’t
afford to.
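For reference, the NRZI variant commonly used with HDLC inverts the line
level on a 0 bit and holds it on a 1 bit, which is why long runs of the
same bit can shift the waveform’s average. A minimal encode/decode sketch
(function names are my own):

```python
def nrzi_encode(bits, level=0):
    """NRZI (invert-on-zero): a 0 bit toggles the line level,
    a 1 bit leaves it unchanged."""
    out = []
    for b in bits:
        if b == 0:
            level ^= 1
        out.append(level)
    return out


def nrzi_decode(levels, prev=0):
    """Recover bits: no transition means 1, a transition means 0."""
    bits = []
    for lvl in levels:
        bits.append(1 if lvl == prev else 0)
        prev = lvl
    return bits
```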
Let me give you a little background on what I’m trying to do. I’m
working with the US Naval Academy’s MidStar-1 spacecraft. Due to an
unfortunate design choice, the average of the digital waveform that gets
FSK modulated onto the carrier has a non-zero, varying DC offset,
dependent on the data content. As a result, once the waveform is
demodulated on the ground, a standard 0 volt bitslicer will not work.
We were forced to build hardware for an adaptive-level bitslicer, based
on a 1970s-era design which uses a pair of sample-and-hold circuits to
track the min and max levels in the signal, and slices halfway between
them (the midpoint of the extremes rather than the running mean). But this
circuit performs poorly, especially at points where the DC offset
changes abruptly by a large amount. Unfortunately, this is a common
occurrence in NRZI-encoded HDLC. My goal is to use GNU Radio to implement
a smarter adaptive-level bitslicer that will outperform our hardware
one.
@(^.^)@ Ed