p2psp team mailing list archive

Re: Current status of the Android Streaming Source

 

Hi.

I've uploaded the current version as promised. There were some mistakes in
my previous mail, though. The H264+Audio (better) version is in the 'app'
module while the H263 one is in 'sprinkler'. Please note that there is also
a heavily modified 'kickflip-android-sdk' subproject, which currently does
all the work for the 'app' module.

As I said before, there are some bugs:

   - Stopping the recording makes the app crash. It's nothing
   serious, as it can still run, but the force close dialog appears anyway.
   - '.ts' files aren't deleted when the recording ends, taking up some
   space for every recording.
   - There is a small notification when the user starts recording - the
   little 'buffering' view at the top-left corner - but there isn't one when
   the broadcast starts, as no method uses the callback interface to show that.

So we have these tasks to do:

   1. Fix those bugs.
   2. Modify kickflip-android-sdk so it uploads the video using a chunked
   HTTP request to the webservice (there is a small sketch of the request
   after this list).
   3. Improve the webservice, as it lacks a lot of functionality - mostly on
   the peer side.
   4. Add some kind of wrapper to the peer on the P2PSP code so it can
   correctly use the webservice.
   5. Try to modify the KickFlip SDK - and the FFmpegWrapper it includes,
   which can be found at https://github.com/OpenWatch/FFmpegWrapper - so it
   redirects the video output to a pipe or socket which we can upload, saving
   us a lot of delay.
   6. Find out if WebM can be live-streamed using FFMPEG.
   7. Try to build a Vorbis-capable FFMPEG version with NDK.
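
For task 2, this is roughly the shape of the request I have in mind - a
minimal Python sketch; the endpoint, session id and file name are made up,
and on the phone the same thing would be done from the Java side:

    # Sketch: send an MPEG-TS stream as a chunked HTTP request.
    # Passing a generator as the body makes 'requests' use
    # Transfer-Encoding: chunked automatically.
    import requests

    def ts_chunks(path, chunk_size=188 * 64):
        # 188 bytes is the MPEG-TS packet size; send whole packets.
        with open(path, 'rb') as stream:
            while True:
                chunk = stream.read(chunk_size)
                if not chunk:
                    break
                yield chunk

    # Hypothetical webservice endpoint for one streaming session.
    requests.post('http://example.org:8080/bocast/session/1/video',
                  data=ts_chunks('recording.ts'))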


On Tue, Mar 10, 2015 at 14:14, Jorge Martín Espinosa (<
jorgemartinespinosa@xxxxxxxxx>) wrote:

> Hi everyone!
>
> I've been working on an app which could solve this problem for the last
> few months, so I'm going to share with you what I've found out so far:
>
>    - There is no native support for Vorbis encoding on Android - you
>    probably already knew this.
>    - Although video can be encoded with VP8 on the latest versions, Android
>    will refuse to do it in a way that can be streamed - it writes 5 empty
>    bytes to the pipe that would later be read and streamed, then stops
>    completely.
>    - H263 video can be recorded and streamed directly to VLC (and only
>    VLC) as long as there is no audio present; otherwise, VLC won't know how
>    to handle it.
>    - H264 can't be directly streamed to VLC, as the recording doesn't have
>    the required MOOV atom yet (see the atom-listing sketch after this list).
>    - While you could use MPEG-TS to wrap the H264 video, and Android has
>    basic support for it, it won't encode things properly if the SDK methods
>    are used.
>    - AFAIK WebM can't be live-streamed using FFMPEG, only gstreamer - we
>    should check this again as I'm not really sure.
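>
> About the MOOV atom: something like this lists the top-level atoms of a
> recording (the file name is just an example); a file that is still being
> written by MediaRecorder won't show a 'moov' entry at all:
>
>    # Sketch: list the top-level MP4 atoms of a file.
>    import struct
>
>    def top_level_atoms(path):
>        atoms = []
>        with open(path, 'rb') as f:
>            while True:
>                header = f.read(8)
>                if len(header) < 8:
>                    break
>                size, kind = struct.unpack('>I4s', header)
>                atoms.append(kind.decode('ascii', 'replace'))
>                if size == 0:       # box extends to the end of the file
>                    break
>                if size == 1:       # 64-bit size follows the type field
>                    size = struct.unpack('>Q', f.read(8))[0] - 8
>                f.seek(size - 8, 1)
>        return atoms
>
>    # A streamable MP4 shows 'moov' before 'mdat'; an unfinished
>    # recording shows no 'moov' at all.
>    print(top_level_atoms('recording.mp4'))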
>
> So I'll tell you a few things that we've tried:
>
>    - It is possible to use a custom NDK-compiled version of FFMPEG to
>    encode video. I couldn't build it with Vorbis support due to compilation
>    errors, but maybe you could.
>    - The encoded video can be saved in HLS format - small .ts files with
>    MPEG-TS-wrapped H264 video - and streamed, but this adds delay, as we
>    have to wait for each .ts file to be completely written before we stream
>    it. It would be great if we could make FFMPEG encode the raw video and
>    redirect it to some socket or pipe which we could read on Android (there
>    is a sketch of the idea after this list).
>    - Using KickFlip SDK <https://github.com/Kickflip/kickflip-android-sdk>
>    looks really promising, as it uses the previous method, and a few tests
>    have been run showing that we can actually stream video and audio with a
>    2-3 s delay to a PC without using RTP or any other protocol, just TCP or
>    HTTP. Also, it has an Apache 2.0 license, so modifying it shouldn't be an
>    issue.
>    - The KickFlip SDK code is tightly tied to their web services and API,
>    so modifying it wasn't a simple task, but I have a version that no longer
>    depends on them - although I haven't uploaded it yet, as I wanted to fix
>    a couple of bugs.
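>
> One way the pipe/socket idea could look, sketched in Python on a PC - the
> ffmpeg binary, input file and codec flags are placeholders and may need
> adjusting for the local build; on Android this would go through the
> NDK-built FFmpegWrapper instead:
>
>    # Sketch: have ffmpeg write MPEG-TS to stdout instead of .ts files
>    # and read the stream from the pipe as it is produced.
>    import subprocess
>
>    ffmpeg = subprocess.Popen(
>        ['ffmpeg', '-re', '-i', 'camera.mp4',
>         '-c:v', 'libx264', '-c:a', 'aac',
>         '-f', 'mpegts', 'pipe:1'],
>        stdout=subprocess.PIPE)
>
>    while True:
>        chunk = ffmpeg.stdout.read(188 * 64)   # whole MPEG-TS packets
>        if not chunk:
>            break
>        # ...forward 'chunk' to a socket or a chunked HTTP request...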
>
> So the *current status* would be:
>
>    - A working version that streams only H263 video, with no audio,
>    through a chunked HTTP request to the splitter, and which has been
>    lightly tested with P2PSP.
>    - Working video streaming with the modified KickFlip SDK over TCP using
>    MPEG-TS, with the H264 video and AAC audio codecs; not tested with P2PSP,
>    but it should work fine.
>    - There are some bugs when using the KickFlip SDK, and it needs better
>    documentation.
>    - We should check if FFMPEG can redirect the encoded video to something
>    that we can read on Android, such as a pipe or a socket, which could
>    reduce the delay to a better value (<1 s maybe?).
>    - We should check if WebM video can be live-streamed too.
>    - We may try to build FFMPEG using NDK and adding Vorbis support.
>    - To handle the streaming sessions we thought of making a webservice
>    which would store the live-video sessions - mostly their TCP ports on
>    P2PSP - and allow peers to connect to them (see the small registry sketch
>    after this list). I can elaborate more later. There is some preliminary
>    work done in this area, but it needs a lot more.
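>
> A rough idea of what that registry could look like, as a Flask sketch (the
> endpoint names and fields are made up; this part only handles small JSON
> messages, so Flask's problem with chunked requests doesn't matter here):
>
>    # Sketch: a tiny session registry for live streams.
>    from flask import Flask, jsonify, request
>
>    app = Flask(__name__)
>    sessions = {}   # session id -> splitter address and port
>
>    @app.route('/sessions', methods=['POST'])
>    def create_session():
>        # The streaming source registers its splitter here.
>        info = request.get_json()
>        session_id = str(len(sessions) + 1)
>        sessions[session_id] = {'host': info['host'],
>                                'splitter_port': info['splitter_port']}
>        return jsonify({'id': session_id})
>
>    @app.route('/sessions/<session_id>', methods=['GET'])
>    def get_session(session_id):
>        # A peer asks where to connect for a given live session.
>        return jsonify(sessions[session_id])
>
>    if __name__ == '__main__':
>        app.run(port=5000)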
>
> *Sources:*
>
> You can find the work done so far in this repo:
> https://github.com/jp-garcia-ortiz/sprinkler
>
> Although the H263, no-audio version is in the 'app' folder, the current
> one with the KickFlip SDK will be in 'sprinkler' once I upload it - sorry,
> until now it was just me working on this, so I didn't really feel the urge
> to upload it.
>
> The webservice - which we called 'bocast' - is in the 'p2psp' folder. There
> are two files, server.py and server-flask.py:
>
>    - The first one was used to test the H263 version. It can read a chunked
>    HTTP request containing the video and pass it to a Splitter instance of
>    the P2PSP protocol, which then distributes it to the peers connected to
>    it. It needs a lot of work, but these first steps are done. And yes, the
>    request-handling API is terrible.
>    - The Flask API is way better, but we're stuck because Flask can't read
>    chunked requests. If anything similar could be done in Python or some
>    other language - yeah, we even accept JavaScript as a language ;) - it
>    would be great (there is a standard-library sketch right below).
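>
> For example, something along these lines with the Python standard library -
> the splitter address, port and chunk handling are only a sketch of the
> idea, not the real bocast code:
>
>    # Sketch: read a chunked HTTP request and relay the body to the
>    # splitter's socket (addresses and ports are made up).
>    import socket
>    from http.server import BaseHTTPRequestHandler, HTTPServer
>
>    SPLITTER_ADDR = ('127.0.0.1', 4552)   # hypothetical splitter channel
>
>    class ChunkedVideoHandler(BaseHTTPRequestHandler):
>        def do_POST(self):
>            splitter = socket.create_connection(SPLITTER_ADDR)
>            # Decode the chunked body by hand: "<hex size>\r\n<data>\r\n".
>            while True:
>                size_line = self.rfile.readline().split(b';')[0].strip()
>                size = int(size_line, 16)
>                if size == 0:
>                    break
>                splitter.sendall(self.rfile.read(size))
>                self.rfile.read(2)       # the CRLF after each chunk
>            splitter.close()
>            self.send_response(200)
>            self.end_headers()
>
>    HTTPServer(('', 8080), ChunkedVideoHandler).serve_forever()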
>
> The P2PSP files have been edited while trying to simplify the connection
> between bocast and the splitter, but they should be restored to their
> previous state, so that the communication is done through another socket,
> as it was meant to be when it was designed.
>
> That said, I will upload my H264 KickFlip-based version ASAP so you can
> take a look at it, and I will send another email with the work that must be
> done on both the app and the webservice.
>
> A completely native C++ and Qt implementation could also be awesome - I
> think I read something about it in previous emails - but I'll have to trust
> your work there, as I haven't written more than a couple of C++ classes.
> Also, it could get tricky, as Android 5.0 turned SELinux fully enforcing
> and I'm not sure it would let you use Linux sockets and pipes out of the
> box.
>
> Also, I hope you'll forgive me for the longest mail ever :).
>
> Jorge.
>
