Streaming content generated on the fly with nginx

Hi all,

I’m working on a project for which I need to render on the fly and
serve an “endless” mp3 stream (think of a webradio… in which the
audio content is generated automatically). The use case would be:

an HTTP GET on /create_stream returns a token id
an HTTP GET on /stream?id=token serves an endless chain of audio buffers
an HTTP GET on /set_param?id=token&key=value alters one of the audio
rendering settings of the stream.
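
To make the intended usage concrete, a session could look roughly like
this (the token value and the "tempo" parameter are made up):

    GET /create_stream               ->  200 OK, body: 42 (the new token)
    GET /stream?id=42                ->  200 OK, audio/mpeg, endless body
    GET /set_param?id=42&tempo=120   ->  200 OK, adjusts the "tempo" knob
                                         of stream 42 on the fly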

Latency/pre-buffering should be in the 0.5s-2s ballpark. So ideally,
every 0.5s (or every time we know the client has consumed 50% of what
was generated during the previous call), some rendering code should be
called.
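To give an order of magnitude: assuming, say, a 128 kbit/s CBR stream,
0.5s of audio is roughly 8 KB (about 19 mp3 frames at 44.1 kHz), so
each of those calls would have to produce a buffer of that order.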

I’d rather rely on an existing network I/O infrastructure than roll my
own socket server, so I was exploring the possibility of doing this
with an nginx module. However, I’m not sure how to go about it. It
doesn’t seem to fit the handler model well since, from what I
understand, the handler sets up a chain of output buffers, returns
immediately, and has nothing more to say about that chain afterwards.
What I would like to do instead is generate, in the handler for the
“/stream” request, a chain with a couple of buffers, and also register
a callback that would be invoked every time the last-but-one buffer of
the chain has been sent to the client. Is there an easy way of
achieving that?
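
To make this more concrete, here is the kind of thing I have in mind:
a very rough, untested sketch in which a timer fires every 500ms and
pushes one freshly rendered buffer down the filter chain (a poor man’s
substitute for the “buffer has been sent” callback I couldn’t find).
generate_mp3_chunk() is just a placeholder for my rendering code, and
I’m not at all sure that keeping the request alive this way
(r->main->count++ plus returning NGX_DONE) is the blessed way of doing
it:

    #include <ngx_config.h>
    #include <ngx_core.h>
    #include <ngx_http.h>

    /* placeholder for the actual rendering code */
    extern size_t generate_mp3_chunk(u_char *buf, size_t len);

    static void
    my_stream_push(ngx_event_t *ev)
    {
        ngx_http_request_t  *r = ev->data;
        ngx_buf_t           *b;
        ngx_chain_t          out;

        if (r->connection->error) {
            /* client is gone: stop rendering, release the request */
            ngx_http_finalize_request(r, NGX_ERROR);
            return;
        }

        b = ngx_create_temp_buf(r->pool, 8192);
        b->last += generate_mp3_chunk(b->pos, 8192);  /* ~0.5s of audio */
        b->flush = 1;

        out.buf = b;
        out.next = NULL;

        ngx_http_output_filter(r, &out);

        ngx_add_timer(ev, 500);                 /* schedule the next chunk */
    }

    static ngx_int_t
    my_stream_handler(ngx_http_request_t *r)
    {
        ngx_event_t  *ev;

        r->headers_out.status = NGX_HTTP_OK;
        r->headers_out.content_type_len = sizeof("audio/mpeg") - 1;
        ngx_str_set(&r->headers_out.content_type, "audio/mpeg");
        r->headers_out.content_length_n = -1;   /* endless body */
        ngx_http_send_header(r);

        ev = ngx_pcalloc(r->pool, sizeof(ngx_event_t));
        if (ev == NULL) {
            return NGX_HTTP_INTERNAL_SERVER_ERROR;
        }

        ev->data = r;
        ev->handler = my_stream_push;
        ev->log = r->connection->log;

        r->main->count++;                       /* keep the request alive */
        ngx_add_timer(ev, 500);

        return NGX_DONE;                        /* finalized later, by the timer */
    }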

Another option I thought of would be to reuse something like
ngx_http_static_module.c: generate a memory buffer with the mp3 header,
plus a file buffer referencing the fd of a named pipe, with a bunch of
processes running in parallel, scanning all the open named pipes and
filling them up. If things go well, when the time comes to write the
“file” to the socket, nginx would read and stream as much as possible
from the FIFO and move on to something else until the FIFO becomes
readable again? In that scenario, how would I handle a dropped
connection? It doesn’t sound like a good idea to me…
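
For what it’s worth, the filler side of that idea would look roughly
like the untested sketch below (render_mp3_frames() is a placeholder,
and the FIFO path is made up); as far as I can tell, a failed write()
on the FIFO would be the only hint that the reader went away:

    #include <fcntl.h>
    #include <signal.h>
    #include <unistd.h>

    /* placeholder for the actual rendering code */
    extern ssize_t render_mp3_frames(unsigned char *buf, size_t len);

    int main(void)
    {
        unsigned char buf[8192];
        int fd;
        ssize_t n;

        signal(SIGPIPE, SIG_IGN);      /* get EPIPE instead of dying */

        /* hypothetical per-stream FIFO; open() blocks until a reader appears */
        fd = open("/tmp/stream-TOKEN.fifo", O_WRONLY);
        if (fd < 0)
            return 1;

        for (;;) {
            n = render_mp3_frames(buf, sizeof(buf));
            if (n <= 0)
                break;
            if (write(fd, buf, n) < 0)
                break;                 /* EPIPE: the reader closed the FIFO */
            /* the FIFO's limited capacity throttles us to roughly real
               time once the reader stops consuming */
        }

        close(fd);
        return 0;
    }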

Or would it be possible to achieve this by abusing the “upstream”
plug-in infrastructure?
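
(The simplest variant of that being, I suppose, to not write a module
at all: run the renderer as a small local HTTP daemon and let nginx
proxy to it with buffering turned off, along the lines of

    location /stream {
        proxy_pass http://127.0.0.1:8001;   # hypothetical rendering daemon
        proxy_buffering off;                # pass data on as it arrives
    }

but that just moves the “keep generating and pushing data” problem into
the daemon.)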

I’m really looking for any solution to this “server continually sends
a packet every 0.5s over an HTTP connection kept open” problem.

Best,
Olivier