Example Live Feed Setups

On this page we describe a couple of concrete setups for broadcasting live from different video
sources, such as a DV camera, a DVB-T receiver or a simple webcam.

Via Windows

In general, the simplest way of getting a feed up is via Windows and the VideoLAN Client (VLC).
VLC can stream both DirectShow-capable devices such as webcams or DV cameras, and
DVB-T cards that support the Windows Broadcast Driver Architecture (BDA),
e.g. the Hauppauge WinTV-HVR 900H. For BDA support you need VLC 0.9.2 or above.

We'll explain how to set up VLC so that it offers a ready-to-broadcast stream for both classes:

  1. Attach any DirectShow capable device to the machine.
  2. Install and run VLC.
  3. Select "Open Capture Device" from the File menu (called Media in vlc 0.9)

DirectShow

  1. Select the video device from the "Video device name" dropdown box (you may need to press "Refresh list" first)
  2. Check the "Stream/Save" box at the bottom (or use the dropdown box next to the play button in VLC 0.9)
  3. Click the "Settings..." button at the bottom
  4. Check "Play locally" and "HTTP"
  5. Select encapsulation method "MPEGTS"
  6. Check and select the video and audio encoding you want (we successfully used mp4v at 512 kbps, scale 0.5, no audio)
  7. Click OK twice (a rough command-line equivalent is sketched below this list)
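
The GUI steps above correspond roughly to a single VLC command line. The following is only an untested sketch (mp4v at 512 kbps, scale 0.5, no audio, HTTP output on the default port 1234); you may need an option such as --dshow-vdev to pick a specific capture device, and exact option names can differ between VLC versions:

vlc dshow:// --sout "#transcode{vcodec=mp4v,vb=512,scale=0.5,acodec=none}:duplicate{dst=display,dst=std{access=http,mux=ts,dst=:1234}}"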

BDA / DVB-T

  1. Change the "Video capture mode" from DirectShow to "DVB DirectShow"
  2. Select the DVB-type: DVB-T
  3. Set the Transponder/multiplex frequency. For the Netherlands, use the "Zenderlokatie" table on this page. The frequency must be entered in kHz, so e.g. 618000 for Amsterdam (RAI).
  4. Use the dropdown box next to the play button to select "Stream"
  5. Check "Play locally" and "HTTP"
  6. Select encapsulation method "MPEGTS"
  7. Check and select the video and audio encoding you want.
  8. Click Stream

It's unclear how to select another channel. In GUI mode you can change it via
"Playback/Programs". You may also need to run VLC from the command line with the composed
command string, rather than use the GUI. TODO: investigate the "--programs=" option.

Broadcasting

After these steps VLC offers the video, transcoded according to your settings, on
http://yourhostname:1234/. This HTTP source can be used directly by the Tribler tools to
broadcast, as follows. You can do this on Windows, Mac or Linux; once VLC offers the video,
broadcasting it is platform independent.

  1. Get the source:
    svn co http://svn.tribler.org/abc/branches/player-release-1.0
    
    
  2. Install the required additional libraries as described in Tribler/readme.txt
  3. Set the PYTHONPATH shell variable to the current dir
    set PYTHONPATH=.  (Windows)
    
    export PYTHONPATH=. (Linux)
    
    
  4. Start the Tribler broadcast:
    python Tribler/Tools/createlivestream.py --name streamname.mpegts --source http://yourhostname:1234/ --destdir . 
    
    
  5. If you have a bitrate other than the default (512 kbps), use the --bitrate xyz option, where xyz is the bitrate in bytes per second. For high bitrates you may also want to change the size at which we transmit data using the --piecesize xyz option; more info in the "P2P Parameters" section below. An example is sketched after this list.

  6. The createlivestream.py program will now write a streamname.mpegts.tstream file that you should distribute to your users. They should start their SwarmPlayers using this torrent-like file.
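
As an illustration, a higher-bitrate stream might be started as follows (the numbers are only examples: 128000 bytes/s corresponds to roughly 1 Mbit/s, and --piecesize is assumed to take a value in bytes):

python Tribler/Tools/createlivestream.py --name streamname.mpegts --source http://yourhostname:1234/ --destdir . --bitrate 128000 --piecesize 65536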

We use this setup to broadcast from a DV camera.

Note that the .tstream file does not depend on the content, so you can reuse the same .tstream
file for broadcasting different streams at different times, as long as its parameters
(bitrate, piecesize, name) do not change. A special case is authenticated streams (the
default): here the .tstream remains the same only if the public/private keypair also remains
the same (in addition to the parameters above). In practice, keeping the keypair the same
means you should not delete the streamname.sauth file that the script writes the first time
it runs. You can disable our default source-authentication scheme to reduce the CPU usage
during playback; for that, use the Tribler/Tools/createlivestream-noauth.py script (see the
example below).
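
A minimal sketch, assuming createlivestream-noauth.py accepts the same options as createlivestream.py:

python Tribler/Tools/createlivestream-noauth.py --name streamname.mpegts --source http://yourhostname:1234/ --destdir .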

DVB-T on Linux

Our first setup is a DVB-T receiver attached to a Linux box. The receiver is a Hauppauge
WinTV Nova-T USB stick that is supported by Linux. We assume you have set up the stick
correctly and can watch TV from it (e.g. cat /dev/dvb/adapter0/dvr0 | vlc -). You can now
broadcast the TV signal as follows. We assume the channel information needed by the stick
is in a file called channels.conf.

  1. Tune in to the desired channel (we do this in the background)
    (tzap -c channels.conf -r "SomeChannel" > /dev/null 2>&1 &)
    
    
  2. Start createlivestream.py:
    python Tribler/Tools/createlivestream.py --name streamname.mpegts --source pipe:generate-video.sh --duration 1:00:00 --port 7765 --destdir . > /tmp/dvb-t-live.log 2>&1
    
    
  3. This command requires a generate-video.sh script in the current directory that looks like:
    ffmpeg -f mpegts -vsync 1 -map 0.0:0.1 -map 0.1 -i /dev/dvb/adapter0/dvr0 -vcodec mpeg4 -vb 428288 -s 320x240 -acodec libmp3lame -ab 96000 -ac 1 -f mpegts -
    
    
  4. In other words, ffmpeg takes the raw MPEG transport stream from the DVB-T receiver and transcodes it to 512 kbps of MPEG-4 video and MP3 audio, synchronized on audio, and outputs this on stdout, so createlivestream.py can read it.

  5. We have experimented with H.264 via the libx264 library, but somehow the combination of libx264 and current ffmpeg encoding in a pipeline gives motion-prediction errors. Try for yourself:
    ffmpeg -f mpegts -vsync 1 -map 0.0:0.1 -map 0.1 -i /dev/dvb/adapter0/dvr0 -vcodec libx264 -vb 428288 -g 16 -s 320x240 -acodec libfaac -ab 96000 -ac 1 -deinterlace -f mpegts -
    
    We set the GOP size (-g) to 16 to speed up tuning in at playback; with the default ffmpeg setting this takes a long time.

  6. We did manage to get H.264 working with mencoder:
    mencoder -cache 8192 -ovc x264 -x264encopts bitrate=1024 -nosound  -of lavf -lavfopts i_certify_that_my_video_stream_does_not_use_b_frames:format=mpegts -vf scale=640:360 -quiet -o /dev/stdout /dev/dvb/adapter0/dvr0
    
    

VLC on Linux

For some Linux sources (e.g. IP multicast) you can use VLC to do the transcoding.
Unfortunately, there appears to be something wrong when we try to read directly from a
DVB-T device and offer it via HTTP: VLC starts as it should, but its HTTP server
produces no data with the MPEGTS demux enabled. Here's an example for 512 kbps H.264 from
an IP multicast source:

vlc --daemon udp://@224.0.0.1:2001 --sout='#transcode{vcodec=h264,venc=x264{vbv-maxrate=512,vbv-minrate=512,qcomp=0,ratetol=0,keyint=20},vb=512,width=640,height=360,acodec=none,scodec=none,me=umh}:duplicate{dst=std{access=http,mux=ts,dst=:8081}}' :sout-all
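
Once VLC offers the stream on port 8081, it can be handed to createlivestream.py just like the HTTP feeds above (assuming the script runs on the same machine; the stream name is only an example):

python Tribler/Tools/createlivestream.py --name multicast.mpegts --source http://localhost:8081/ --destdir .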

DV Camera on Linux

For our first public trial we used a DV camera connected directly to a remote Linux box.
The remote box would grab the IEEE 1394 stream, transcode it to the right format and
transmit it to the broadcast box via SSH. This setup is described on another page.

Looping a File

Newer versions of Tribler support the "--fileloop True" option for createlivestream.py that allows you to loop a file as if it were a live broadcast.
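
A sketch, assuming the file to loop is simply passed via --source and the other options work as for a live source:

python Tribler/Tools/createlivestream.py --name looped.mpegts --source recording.mpegts --fileloop True --destdir .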

Ogg Live Streaming

For the new SwarmPlayer you need a feed in Ogg format with the Theora+Vorbis codecs. The feed also needs to replay the Ogg headers each time you tune in, as players cannot simply tune in midstream as with MPEG-TS. The software that offers this functionality is the IceCast server. It acts as a relay server, so you need to inject a feed into it, which is then read by createlivestream.py for P2P rebroadcast. There are several clients that can inject; we have tested with VLC. So there are three steps to creating a P2P Ogg live stream:

  1. Start a default IceCast server (just change the passwords in icecast.xml; tested with 2.3.2):
    cd  "\Program Files\Icecast2 Win32"
    
    icecast2console.exe -c icecast.xml
    
    
  2. Inject a feed, e.g. a camera feed transcoded to Ogg/Theora+Vorbis using VLC:
    vlc dshow:// --sout "#transcode{vcodec=theo,vb=800,scale=1,acodec=vorb,ab=128,channels=2,samplerate=44100}:duplicate{dst=std{access=shout,mux=ogg,dst=source:hackme@127.0.0.1:8000/cam3.ogg},dst=display}" --no-sout-rtp-sap --no-sout-standard-sap --sout-keep
    
    
  3. Rebroadcast using P2P:
    python Tribler\Tools\createlivestream.py --name MacCam3.ogg --source http://127.0.0.1:8000/cam3.ogg --bitrate 132000
    
    

Unfortunately, this software combination is not very stable. The Ogg that VLC 1.1.2 generates cannot be replayed by Firefox 3.6.8. Older VLC versions (e.g., 0.9.8) did generate playable Ogg, but they could not reliably grab a built-in camera.

P2P Parameters

There are several parameters you can play with to optimize performance:

  • Piecesize
  • Subpiecesize aka chunksize
  • "Duration"
  • Live-source Authentication method
  • Disk space allocation policy
  • Source parameters: number of upload slots, auxiliary seeders and bandwidth limits

The piecesize determines the highest-level blocks into which the stream is divided. Simulations have shown (Fig. 5) that a piecesize of 32 KB is preferred for live streaming. Above 64 KB it can have a negative impact on the swarm, with peers not being able to watch the video. But using the larger 64 KB value will result in fewer pieces per second to process and thus lower resource usage.

Another parameter is the chunksize. A piece is requested from a peer in a number of chunks, currently 16 KB each. If we increase the chunksize, there are fewer REQUEST+PIECE packets per second. Normally this parameter is fixed in BitTorrent, but you can play around with it; the wire protocol can handle it. See DownloadConfig.set_download_slice_size(). A microtest showed little impact.

"duration" is the amount of time a live stream plays before it starts reusing piece numbers. As such, it together with the piece size determines the range of piece numbers, following the simple formula:

npieces = (avg. bitrate * duration) / piecesize

The range of piece numbers is an important performance factor. A large range costs CPU, as some code needs to loop over the entire range. A good performance optimization we have used is therefore to increase the piece size and to reduce the duration from 1 hour to 0.5 hours. Making the duration too small may cause problems, as peers that drift a little in playback time may simultaneously use the same piece numbers but from different epochs.
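
As a rough illustration, treating the default 512 kbps as 64 KB/s:

  1 hour with 32 KB pieces:    npieces = (64 KB/s * 3600 s) / 32 KB = 7200
  0.5 hours with 64 KB pieces: npieces = (64 KB/s * 1800 s) / 64 KB = 1800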

Another parameter is the live-source authentication method to use. We support 3 methods:

  • No auth
  • ECDSA auth with arbitrary key-length
  • RSA auth with arbitrary key-length

Using no authentication is very unwise, because it makes the swarm susceptible to malfunctioning and malicious clients. Malfunctioning clients that get stuck will advertise pieces with old content (from a different epoch, i.e. a previous loop over the piece range). Malicious clients can replace the content of pieces. The performance of the public-key based methods depends highly on the cryptosystem and key length. Investigations show that RSA with small key lengths (e.g. 768 bits) performs best in terms of the CPU required to verify the signatures. If that is still too expensive, turn off only the signature verification at the client, so that the protection against malfunctioning clients remains in place.

The final parameter is the disk-allocation policy. We haven't tested its effect yet, but using the DISKALLOC_SPARSE policy on UNIX systems could theoretically increase performance. A client hooks into a live stream at a certain piece number and then starts downloading pieces in its proximity. If this piece number is high, it could be beneficial to create a hole at the beginning of the file using sparse-file support. Update: the SPARSE policy is now the default on *NIX.

The source, that is, the createlivestream.py script, can also be tuned. Source parameters are the number of upload slots, the auxiliary seeders and bandwidth limits. We used 32 upload slots for our large live-streaming trial. In addition, we had 5 auxiliary seeders (DownloadStartupConfig.set_live_aux_seeders()), that is, peers that get preference at the source, meaning they are always unchoked and thus help to distribute the signal faster.