Does Starlight support UDP Streaming?

Jul 29, 2009 at 3:57 PM

Hi All,

Does Project Starlight support UDP streaming? Our client uses "Digital Rapids" for multicast media streaming in a CDN network, and I want to use Starlight (Silverlight with the Qumu plugin).

I have integrated Starlight with our Silverlight player, but it doesn't work with a UDP streaming URL.

Waiting for your reply

Adi

Coordinator
Jul 29, 2009 at 4:03 PM

I am not sure what you mean by UDP streaming.  UDP is a low-level protocol much like TCP.  There are protocols such as RTSPU which run over UDP, much like HTTP Streaming runs over TCP.  What protocol does Digital Rapids use?  Project Starlight only supports Microsoft's MSB protocol.

Jul 29, 2009 at 4:32 PM
Edited Jul 30, 2009 at 2:25 PM

I don't know much about the infrastructure and the protocols they are using; all the info is available at

http://www.digital-rapids.com/Products/Expertise2_LiveStream.aspx

http://www.digital-rapids.com/News/PressArchive/NAB2006%20Stream22.aspx

I got a UDP streaming address, i.e. UDP://123.121.23.33, and I just want to capture it in our Silverlight player. This URL is not streamed over HTTP, so how can I capture this type of stream in the Silverlight player (with Starlight integrated)?

Coordinator
Jul 30, 2009 at 7:33 PM

I suggest you research the protocols supported by Digital Rapids.  Based on the URL you provided, I think it is most likely not using MSB and multicast, but I don't know enough about Digital Rapids to say for sure.  Generally when using MSB, there will be an NSC file that is fed to Starlight that specifies the multicast parameters to use.  This NSC file is generated by the Windows Media Server originating the broadcast.  You need to determine whether Digital Rapids is broadcasting an MSB stream, and if so, figure out how to get the NSC file that describes the stream.  You should then supply the NSC file to Starlight.  If Digital Rapids is not using MSB and generating an NSC file, Starlight will be unable to play back the stream.

Aug 5, 2009 at 9:43 PM

From this conversation, would it be correct to infer that Starlight does support:

1. UDP (because it is used by MSB)

BUT... it does not support the following combinations:

1. RTP over UDP (for multicast)
2. RTSP/RTP over UDP (for unicast)
3. RTSP/RTP over TCP (for unicast)

???

thanks in advance...

 

Coordinator
Aug 5, 2009 at 10:24 PM

UDP: Sort of.  The code explicitly joins a multicast group, so it cannot be used to receive unicast UDP packets.

RTP over UDP multicast:  Not out of the box.  If the RTP packets contain ASF data, you could implement an alternate PacketSource that knows how to extract the ASF stream from RTP packets.  There is a PacketSource implementation, provided as part of the project, that extracts an ASF stream from MSB packets.

RTSP/RTP over UDP: No, due to the lack of support for unicast mentioned above.

RTSP/RTP over TCP: This could be implemented with a PacketSource, as mentioned above, built on top of the standard Silverlight TCP support.  You would have to obey the usual restrictions of Silverlight TCP, so that would probably mean running the RTSP server on an alternate port, but in theory it would be possible.  This would most likely be a great deal more work than the multicast scenario, since it involves multiple exchanges with the server, while multicast is quite simple since there is no direct communication with the server.
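For anyone weighing the RTP route described above: the first job of such an alternate PacketSource would be stripping the RTP framing (RFC 3550) to recover the payload bytes before handing them to the demuxer. A rough Python sketch of just that step — the actual Starlight PacketSource interface is managed .NET code, and nothing below comes from the project itself:

```python
import struct

def rtp_payload(packet: bytes) -> bytes:
    """Strip the RTP fixed header (RFC 3550) and return the payload bytes."""
    if len(packet) < 12:
        raise ValueError("too short to be an RTP packet")
    b0 = packet[0]
    if b0 >> 6 != 2:                       # version field must be 2
        raise ValueError("not RTP version 2")
    has_padding = bool(b0 & 0x20)
    has_extension = bool(b0 & 0x10)
    csrc_count = b0 & 0x0F
    offset = 12 + 4 * csrc_count           # fixed header + CSRC list
    if has_extension:
        # extension header: 16-bit profile id + 16-bit length in 32-bit words
        ext_len_words = struct.unpack_from("!HH", packet, offset)[1]
        offset += 4 + 4 * ext_len_words
    end = len(packet)
    if has_padding:
        end -= packet[-1]                  # last byte = number of padding octets
    return packet[offset:end]
```

A real PacketSource would also have to reorder packets by sequence number and honor whatever RTP payload format the encoder uses before the bytes could be treated as a continuous ASF stream.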

 

Aug 5, 2009 at 10:36 PM

thanks! a couple more questions:

1. any plans or chances for adding support for any of the given scenarios?

2. any info on the streaming appliances/vendors that you think can be used to accomplish live multicast streaming with silverlight/qumu at the user end? in other words, which streaming appliances/vendors support MSB over UDP (other than Microsoft itself)? :)

highly appreciate your help with this!

thanks again...

Coordinator
Aug 5, 2009 at 11:16 PM

1.)  Currently there are no plans to enable any of the scenarios listed above.  Any multicast or TCP scenario should be possible to implement solely in managed code, if you wanted to tackle developing it.  Unicast would require native C++ code in addition to the managed code components.  All the code for everything except the ASF MediaStreamSource is open source, and we've tried very hard to ensure that nothing in the ASF MediaStreamSource is tied to any network protocol.

2.)  This is a bit of a tricky one for me to answer, since it steps out of the purely technical realm.  If you're looking for an all-in-one encoder/streaming server in appliance form, I'm not aware of any off the top of my head (not to say they don't exist).  Other than that, I'll just shamelessly plug Qumu's product line.  We provide all the components you would need to accomplish live streaming using MSB, from encoders to streaming server appliances, as well as our Video Control Center, which can tie all these components together.  There are other vendors who do some or all of this, but I'm not familiar enough with any of them to provide intelligent commentary on the subject.  If you'd like, I could put you in touch with someone from Qumu who would be more familiar with this sort of information.  Sorry I can't be more helpful on this point.

Aug 7, 2009 at 3:36 PM

 

I've been working on a project that requires UDP multicast video as well.  Here's the current state of things as I understand it:

 

The Starlight plugin -only- supports multicast from Windows Media Services.  WMS multicast streams are "published" by generating a proprietary .nsc pointer file (similar to an SDP file, but with encrypted content).  This file is what's served over HTTP to the WMP & Starlight players, and is what your playlist file points to.

WMS is the only streaming server product I've found that outputs .nsc files (Helix 10 might, but I'm not going there!)

 

Outside of the WMS world, there are a few common ways to multicast video streams.  UDP and RTP are used to transport the streams; sometimes RTSP servers or SDP files are used to provide pointers to the multicast data.  The multicast content itself can either be elementary streams (video and audio as separate streams, no container) or "muxed" streams (where the A and V are combined into a container, such as mpeg-ts).  There is no "streaming server" involved or required, since the "client" never talks to the "server."  The encoder puts the multicast content on the network, and the network infrastructure handles distribution.  The "client" only ever talks to its connected network gear (via IGMP).  (In a unicast world, every viewer connects directly to the stream source, so streaming servers are needed to handle that load.)
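To make the IGMP point concrete: receiving a multicast stream is just a socket bind plus one group-membership socket option; there is no handshake with any server at any point. A minimal Python sketch, where the group address and port are made-up examples:

```python
import socket
import struct

def make_membership_request(group: str) -> bytes:
    # struct ip_mreq: multicast group address + local interface (INADDR_ANY)
    return struct.pack("4s4s", socket.inet_aton(group),
                       socket.inet_aton("0.0.0.0"))

def open_multicast_socket(group: str, port: int) -> socket.socket:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))  # accept datagrams addressed to the group on this port
    # The IGMP join: the OS asks the connected network gear to start
    # forwarding this group's traffic; the encoder is never contacted.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    make_membership_request(group))
    return sock

# Usage (blocks until a packet from the group arrives):
#   sock = open_multicast_socket("230.0.0.1", 1234)
#   data, sender = sock.recvfrom(65535)
```

This is the whole "transport" side; everything after `recvfrom` is demuxing, which is the part Starlight's managed code handles for ASF.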

The set top box side of the world uses this setup.   Since I need to support STBs, I can't stop streaming this way.  

 

From what I've seen of WMS so far for multicast support, it will -only- receive unicast http:// pull streams for its multicast publishing points.  That means I need to add another whole set of encoder outputs in all of my bitrates to feed WMS.  Not to mention having to find ways to set up WMS in a high-availability environment, and adding more hardware and software to maintain.  With my current encoder setup, I also have to -double- the number of encoder boxes required.

 

If there were a way to bypass the .nsc file required by Starlight, and directly provide it the multicast addresses (udp://230.0.0.1:1234, for example), that would be a lifesaver for my project.  I'm not a developer and I can't even begin to wrap my head around the Starlight code tree, but it seems to me that wherever the "output" of the NSCParser module goes is where we'd want to feed our addresses.  The .nsc file contains additional information about video size, container and codec information, etc.  I'm thinking/hoping that that information is optional... I can hand any supported video file to the Silverlight player without having to predefine what it is, and it's able to recognize and play it... I'm assuming the logic is there to do that with a live stream, too.

My end goal is to switch our platform from mpeg1 and mpeg4p2 elementary streams to h264 muxed streams, feeding new STB hardware and using a Silverlight-based web player which could handle both our VOD and our live platforms.  As it stands, there are -no- UDP multicast capable web players that are not 10+ year old Java - and none of those support H.264.  On the desktop side VLC can handle it, but WMP can't.  QT can't either (it requires an SDP file, and even then the support is buggy and flawed).

I -suspect- that WMS uses a more or less standard muxed UDP multicast stream for output, and I've had some success getting VLC to play the UDP stream.

Coordinator
Aug 7, 2009 at 3:57 PM

The Starlight plugin -only- supports multicast from Windows Media Services. 

Correct

WMS multicast streams are "published" by generating a proprietary .nsc pointer file (similar to an SDP file, but with encrypted content).  This file is what's served over HTTP to the WMP & Starlight players, and is what your playlist file points to.

Correct, although encrypted is a strong term.  It's essentially a modified base64 encoding.

If there were a way to bypass the .nsc file required by Starlight, and directly provide it the multicast addresses (udp://230.0.0.1:1234, for example), that would be a lifesaver for my project.  I'm not a developer and I can't even begin to wrap my head around the Starlight code tree, but it seems to me that wherever the "output" of the NSCParser module goes is where we'd want to feed our addresses.  The .nsc file contains additional information about video size, container and codec information, etc.  I'm thinking/hoping that that information is optional... I can hand any supported video file to the Silverlight player without having to predefine what it is, and it's able to recognize and play it... I'm assuming the logic is there to do that with a live stream, too.

The problem with this is the format of the stream.  The Starlight code expects to get an ASF stream (Microsoft's proprietary muxed stream format).  The other problem is that the Silverlight MediaElement needs to know certain information about the video before it starts playing it (codec, size, audio format, etc.).  Usually this information is contained in a file header, and that's how Silverlight auto-detects how to play the file.  When multicasting, this information generally appears in some sort of SDP/NSC file that can be parsed ahead of time to set up the media pipeline with the correct codecs, since it would be inefficient to send this information with every multicast packet.  I don't think this information is ever optional; it's just that usually Silverlight can parse it out itself since the file headers are present.
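For a sense of what that ahead-of-time parsing looks like, here is a small Python sketch that pulls the pre-play parameters (media type, port, codec) out of SDP-style `m=` and `a=rtpmap:` lines as defined in RFC 4566. The dictionary field names are my own, not anything from Starlight or Silverlight:

```python
def parse_sdp_media(sdp: str) -> list:
    """Extract per-stream pre-play parameters from an SDP description:
    which media streams exist, on which port, and with which codec."""
    streams = []
    for line in sdp.splitlines():
        line = line.strip()
        if line.startswith("m="):
            # m=<media> <port> <proto> <fmt> ...
            media, port, proto, *fmts = line[2:].split()
            streams.append({"media": media, "port": int(port),
                            "proto": proto, "codec": None})
        elif line.startswith("a=rtpmap:") and streams:
            # a=rtpmap:<payload type> <encoding>/<clock rate>[/<channels>]
            _, enc = line.split(" ", 1)
            streams[-1]["codec"] = enc.split("/")[0]
    return streams
```

An NSC file plays the same role for an MSB broadcast, just in Microsoft's encoded format, which is why Starlight can set up the pipeline before the first multicast packet arrives.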

My end goal is to switch our platform from mpeg1 and mpeg4p2 elementary streams to h264 muxed streams, feeding new STB hardware and using a Silverlight-based web player which could handle both our VOD and our live platforms.  As it stands, there are -no- UDP multicast capable web players that are not 10+ year old Java - and none of those support H.264.  On the desktop side VLC can handle it, but WMP can't.  QT can't either (it requires an SDP file, and even then the support is buggy and flawed).

Now that Silverlight supports H.264, what you describe should in theory be possible, but you would need to write (or find someone to write) a MediaStreamSource that can demux mpeg-ts streams.
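To give a feel for the scale of that job: the bottom layer of an mpeg-ts demuxer just slices the stream into 188-byte packets and filters by PID (per ISO/IEC 13818-1). A rough Python sketch of only that layer; a usable demuxer would additionally need PAT/PMT table parsing and PES reassembly on top of it:

```python
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def parse_ts_header(packet: bytes) -> dict:
    """Decode the 4-byte MPEG-TS packet header (ISO/IEC 13818-1)."""
    if len(packet) != TS_PACKET_SIZE or packet[0] != SYNC_BYTE:
        raise ValueError("not an aligned 188-byte TS packet")
    return {
        "payload_unit_start": bool(packet[1] & 0x40),
        "pid": ((packet[1] & 0x1F) << 8) | packet[2],
        "has_adaptation": bool(packet[3] & 0x20),
        "has_payload": bool(packet[3] & 0x10),
        "continuity": packet[3] & 0x0F,
    }

def demux_pid(stream: bytes, pid: int) -> bytes:
    """Concatenate the payloads of one PID - the first step of a demuxer."""
    out = bytearray()
    for i in range(0, len(stream) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        pkt = stream[i:i + TS_PACKET_SIZE]
        hdr = parse_ts_header(pkt)
        if hdr["pid"] != pid or not hdr["has_payload"]:
            continue
        offset = 4
        if hdr["has_adaptation"]:
            offset += 1 + pkt[4]  # adaptation_field_length byte + the field
        out += pkt[offset:]
    return bytes(out)
```

A Silverlight MediaStreamSource doing this would be C# rather than Python, but the packet-level logic is the same.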

 

 

Aug 7, 2009 at 4:09 PM
Edited Aug 7, 2009 at 4:11 PM

The problem with this is the format of the stream.  The Starlight code expects to get an ASF stream (Microsoft's proprietary muxed stream format).

Ahhh.  This wasn't clear in any of the documentation or release information that I've seen.  It seemed to be implied that Starlight only handled the transport connection, and let -any- supported content through.

So this would -never- work for h.264 content as it stands, coming from WMS or otherwise.  And since my STBs don't even know that ASF exists...

 

The other problem is that the Silverlight MediaElement needs to know certain information about the video before it starts playing it (codec, size, audio format, etc.).  Usually this information is contained in a file header, and that's how Silverlight auto-detects how to play the file.  When multicasting, this information generally appears in some sort of SDP/NSC file that can be parsed ahead of time to set up the media pipeline with the correct codecs, since it would be inefficient to send this information with every multicast packet.  I don't think this information is ever optional; it's just that usually Silverlight can parse it out itself since the file headers are present.

 

Hmm.  I wonder how VLC handles this? It can, somehow, determine the codec information just by looking at the stream.

 
Thanks for your help and information.
Coordinator
Aug 7, 2009 at 4:17 PM

The Starlight native code only handles the transport connection.  Unfortunately, we can't just hand that connection to the Silverlight code and let it figure out what's going on; we have to demux the stream for it.  (Silverlight can basically work in two modes: fully native, where it handles transport and demuxing, or MediaStreamSource, where it handles neither... to implement your own transport you have to do your own demuxing too.)  The demuxing of the stream is what the managed code within Starlight is doing.  We've attempted to make it modular enough that alternate demuxing can be implemented on top of the same transport connection, but we don't provide support for anything other than ASF out of the box, since that's what our customers will be using this for.

Aug 7, 2009 at 4:30 PM

got it.  that makes sense now.

 

thanks