Merge pull request #174 from FDH2/master

UxPlay-1.71 with support for HLS (HTTP Live Streaming) video from YouTube
This commit is contained in:
antimof
2025-01-20 10:29:46 +03:00
committed by GitHub
33 changed files with 4425 additions and 1556 deletions


@@ -1,7 +1,7 @@
if ( APPLE )
cmake_minimum_required( VERSION 3.13 )
else ()
cmake_minimum_required( VERSION 3.5 )
cmake_minimum_required( VERSION 3.10 )
endif ()
project( uxplay )
@@ -32,7 +32,9 @@ if ( ( UNIX AND NOT APPLE ) OR USE_X11 )
endif()
if( UNIX AND NOT APPLE )
add_definitions( -DSUPPRESS_AVAHI_COMPAT_WARNING )
# convert AirPlay colormap 1:3:7:1 to sRGB (1:1:7:1), needed on Linux and BSD
add_definitions( -DFULL_RANGE_RGB_FIX )
else()
set( CMAKE_INSTALL_RPATH_USE_LINK_PATH TRUE )
endif()
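The hunk above raises the CMake floor to 3.13 on Apple and 3.10 elsewhere. As a quick pre-flight sketch (only the version numbers come from the diff; the `version_ge` helper name is ours, and it relies on GNU `sort -V`):

```shell
# version_ge A B: succeeds when dotted version A >= B (GNU sort -V does the comparison).
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

version_ge 3.16 3.10 && echo "cmake 3.16 satisfies the new 3.10 minimum"
version_ge 3.5 3.10  || echo "cmake 3.5 no longer satisfies it"
```

Run it against `cmake --version | head -n1 | awk '{print $3}'` to check an installed CMake.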


@@ -1,31 +1,27 @@
<h1
id="uxplay-1.70-airplay-mirror-and-airplay-audio-server-for-linux-macos-and-unix-now-also-runs-on-windows.">UxPlay
1.70: AirPlay-Mirror and AirPlay-Audio server for Linux, macOS, and Unix
id="uxplay-1.71-airplay-mirror-and-airplay-audio-server-for-linux-macos-and-unix-now-also-runs-on-windows.">UxPlay
1.71: AirPlay-Mirror and AirPlay-Audio server for Linux, macOS, and Unix
(now also runs on Windows).</h1>
<h3
id="now-developed-at-the-github-site-httpsgithub.comfdh2uxplay-where-all-user-issues-should-be-posted-and-latest-versions-can-be-found."><strong>Now
developed at the GitHub site <a
href="https://github.com/FDH2/UxPlay">https://github.com/FDH2/UxPlay</a>
(where ALL user issues should be posted, and latest versions can be
found).</strong></h3>
developed at the GitHub site <a href="https://github.com/FDH2/UxPlay"
class="uri">https://github.com/FDH2/UxPlay</a> (where ALL user issues
should be posted, and latest versions can be found).</strong></h3>
<ul>
<li><em><strong>NEW in v1.70</strong>: Support for 4k (h265) video with
the new “-h265” option.</em> (Recent Apple devices will send HEVC (h265)
video in AirPlay mirror mode if larger resolutions (<em>h</em> &gt;
1080) are requested with UxPlay’s “-s wxh” option; wired ethernet
connection is preferred to wireless in this mode, and may also be
required by the client; the “-h265” option changes the default
resolution from 1920x1080 to 3840x2160, but leaves default maximum
framerate (“-fps” option) at 30fps.)</li>
<li><em><strong>NEW in v1.71</strong>: Support for (YouTube) HLS (HTTP
Live Streaming) video with the new “-hls” option.</em> Click on the
airplay icon in the YouTube app to stream video. (You may need to wait
until advertisements have finished or been skipped before clicking the
YouTube airplay icon.) <strong>Please report any issues with this new
feature of UxPlay</strong>.</li>
</ul>
<h2 id="highlights">Highlights:</h2>
<ul>
<li>GPLv3, open source.</li>
<li>Originally supported only AirPlay Mirror protocol, now has added
support for AirPlay Audio-only (Apple Lossless ALAC) streaming from
current iOS/iPadOS clients. <strong>There is no current support for
Airplay HLS video-streaming (e.g., YouTube video) but this is in
development.</strong></li>
current iOS/iPadOS clients. <strong>Now with support for Airplay HLS
video-streaming (currently only YouTube video).</strong></li>
<li>macOS computers (2011 or later, both Intel and “Apple Silicon” M1/M2
systems) can act either as AirPlay clients, or as the server running
UxPlay. Using AirPlay, UxPlay can emulate a second display for macOS
@@ -92,12 +88,14 @@ you may wish to add “as pipewiresink” or “vs waylandsink” as defaults to
the file. <em>(Output from terminal commands “ps waux | grep pulse” or
“pactl info” will contain “pipewire” if your Linux/BSD system uses
it).</em></p></li>
<li><p>On Raspberry Pi: If you use Ubuntu 22.10 or earlier, GStreamer
must be <a
<li><p>On Raspberry Pi: models using hardware h264 video decoding by the
Broadcom GPU (models 4B and earlier) may require the uxplay option
-bt709. If you use Ubuntu 22.10 or earlier, GStreamer must be <a
href="https://github.com/FDH2/UxPlay/wiki/Gstreamer-Video4Linux2-plugin-patches">patched</a>
to use hardware video decoding by the Broadcom GPU (also recommended but
optional for Raspberry Pi OS (Bullseye): use option
“<code>uxplay -bt709</code>” if you do not use the patch).</p></li>
optional for Raspberry Pi OS (Bullseye): the patched GStreamer does not
need option “-bt709”. The need for -bt709 when hardware video decoding
is used seems to have reappeared starting with GStreamer-1.22.</p></li>
</ul>
<p>To (easily) compile the latest UxPlay from source, see the section <a
href="#getting-uxplay">Getting UxPlay</a>.</p>
@@ -169,16 +167,15 @@ stops/restarts as you leave/re-enter</em> <strong>Audio</strong>
<li><p><strong>Note that Apple video-DRM (as found in “Apple TV app”
content on the client) cannot be decrypted by UxPlay, and the Apple TV
app cannot be watched using UxPlay’s AirPlay Mirror mode (only the
unprotected audio will be streamed, in AAC format), but both video and
audio content from DRM-free apps like “YouTube app” will be streamed by
UxPlay in Mirror mode.</strong></p></li>
<li><p><strong>As UxPlay does not currently support non-Mirror AirPlay
video streaming (where the client controls a web server on the AirPlay
server that directly receives HLS content to avoid it being decoded and
re-encoded by the client), using the icon for AirPlay video in apps such
as the YouTube app will only send audio (in lossless ALAC format)
without the accompanying video (there are plans to support HLS video in
future releases of UxPlay)</strong></p></li>
unprotected audio will be streamed, in AAC format).</strong></p></li>
<li><p><strong>With the new “-hls” option, UxPlay now also supports
non-Mirror AirPlay video streaming (where the client controls a web
server on the AirPlay server that directly receives HLS content to avoid
it being decoded and re-encoded by the client). This currently only
supports streaming of YouTube videos. Without the -hls option, using the
icon for AirPlay video in apps such as the YouTube app will only send
audio (in lossless ALAC format) without the accompanying
video.</strong></p></li>
</ul>
<h3
id="possibility-for-using-hardware-accelerated-h264h265-video-decoding-if-available.">Possibility
@@ -218,13 +215,18 @@ Raspberry Pi, so far only included in Raspberry Pi OS, and two other
distributions (Ubuntu, Manjaro) available with Raspberry Pi Imager.
<em>(For GStreamer &lt; 1.22, see the <a
href="https://github.com/FDH2/UxPlay/wiki/Gstreamer-Video4Linux2-plugin-patches">UxPlay
Wiki</a>)</em>.</p></li>
<li><p><strong>(New): Support for h265 (HEVC) hardware decoding on
Raspberry Pi (Pi 4 model B and Pi 5)</strong></p>
<p>Support is present, but so far satisfactory results have not been
obtained. Pi model 5 only provides hardware-accelerated (GPU) decoding
for h265 video, but not H264, as its CPU is powerful enough for
satisfactory software H264 decoding</p></li>
Wiki</a>)</em>. Pi model 5 has no support for hardware H264 decoding, as
its CPU is powerful enough for satisfactory software H264
decoding</p></li>
<li><p><strong>Support for h265 (HEVC) hardware decoding on Raspberry Pi
(Pi 4 model B and Pi 5)</strong></p>
<p>These Raspberry Pi models have a dedicated HEVC decoding block (not
the GPU), with a driver “rpivid” which is not yet in the mainline Linux
kernel (but is planned to be there in future). Unfortunately it produces
decoded video in a non-standard pixel format (NC30 or “SAND”) which will
not be supported by GStreamer until the driver is in the mainline
kernel; without this support, UxPlay support for HEVC hardware decoding
on Raspberry Pi will not work.</p></li>
</ul>
<h3 id="note-to-packagers">Note to packagers:</h3>
<p>UxPlay’s GPLv3 license does not have an added “GPL exception”
@@ -260,7 +262,7 @@ libraries installed. Debian-based systems provide a package
“build-essential” for use in compiling software. You also need
pkg-config: if it is not found by “<code>which pkg-config</code>”,
install pkg-config or its work-alike replacement pkgconf. Also make sure
that cmake&gt;=3.5 is installed: “<code>sudo apt install cmake</code>
that cmake&gt;=3.10 is installed: “<code>sudo apt install cmake</code>”
(add <code>build-essential</code> and <code>pkg-config</code> (or
<code>pkgconf</code>) to this if needed).</p>
<p>Make sure that your distribution provides OpenSSL 1.1.1 or later, and
@@ -583,6 +585,12 @@ GStreamer-1.24).</p></li>
<li><p>If the server is “headless” (no attached monitor, renders audio
only) use <code>-vs 0</code>.</p></li>
</ul>
<p>Note that videosink options can be set using quoted arguments to
-vs: <em>e.g.</em>, <code>-vs "xvimagesink display=:0"</code>: ximagesink and
xvimagesink allow an X11 display name to be specified, and waylandsink
has a similar option. Videosink options (“properties”) can be found on
their GStreamer description pages, such as
https://gstreamer.freedesktop.org/documentation/xvimagesink .</p>
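<p>As a concrete (hypothetical) invocation combining the quoted-videosink and audiosink overrides described here; the display name and sink choices are examples, not defaults:</p>

```shell
# Send video to an xvimagesink on X11 display :0 and audio to PulseAudio.
uxplay -vs "xvimagesink display=:0" -as pulsesink
```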
<p>GStreamer also searches for the best “audiosink”; override its choice
with <code>-as &lt;audiosink&gt;</code>. Choices on Linux include
pulsesink, alsasink, pipewiresink, oss4sink; see what is available with
@@ -626,6 +634,9 @@ Wiki</a>. Legacy Raspberry Pi OS (Bullseye) has a partially-patched
GStreamer-1.18.4 which needs the uxplay option -bt709 (and don’t use
-v4l2); it is still better to apply the full patch from the UxPlay Wiki
in this case.</p></li>
<li><p><strong>It appears that when hardware h264 video decoding is
used, the option -bt709 became needed again in GStreamer-1.22 and
later.</strong></p></li>
<li><p>For “double-legacy” Raspberry Pi OS (Buster), there is no patch
for GStreamer-1.14. Instead, first build a complete newer
GStreamer-1.18.6 from source using <a
@@ -636,14 +647,17 @@ GPU with the GStreamer OMX plugin (use option
<code>-vd omxh264dec</code>”), but this is broken by Pi 4 Model B
firmware. OMX support was removed from Raspberry Pi OS (Bullseye), but
is present in Buster.</p></li>
<li><p><strong>H265 (4K)</strong> video is supported with hardware
decoding by the Broadcom GPU on Raspberry Pi 5 models, as well as on
Raspberry Pi 4 model B. <strong>While GStreamer seems to make use of
this hardware decoding, satisfactory rendering speed of 4K video by
UxPlay on these Raspberry Pi models has not yet been
achieved.</strong> The option
“-h265” is required for activating h265 support. A wired ethernet
connection is preferred in this mode (and may be required by the
client).</p></li>
<li><p><strong>H265 (4K)</strong> video is potentially supported by
hardware decoding on Raspberry Pi 5 models, as well as on Raspberry Pi 4
model B, using a dedicated HEVC decoding block, but the “rpivid” kernel
driver for this is not yet supported by GStreamer (this driver decodes
video into a non-standard format that cannot be supported by GStreamer
until the driver is in the mainline Linux kernel). Raspberry Pi provides
a version of ffmpeg that can use that format, but at present UxPlay
cannot use this. The best solution would be for the driver to be
“upstreamed” to the kernel, allowing GStreamer support. (Software HEVC
decoding works, but does not seem to give satisfactory results on the
Pi).</p></li>
</ul>
<p>Even with GPU video decoding, some frames may be dropped by the
lower-power models to keep audio and video synchronized using
@@ -665,6 +679,7 @@ framebuffer video, use <code>&lt;videosink&gt;</code> =
<li>Tip: to start UxPlay on a remote host (such as a Raspberry Pi) using
ssh:</li>
</ul>
<!-- -->
<pre><code> ssh user@remote_host
export DISPLAY=:0
nohup uxplay [options] &gt; FILE &amp;</code></pre>
@@ -687,9 +702,9 @@ done with package managers <a
href="http://www.macports.org">MacPorts</a>
(<code>sudo port install cmake</code>), <a
href="http://brew.sh">Homebrew</a> (<code>brew install cmake</code>), or
by a download from <a
href="https://cmake.org/download/">https://cmake.org/download/</a>. Also
install <code>git</code> if you will use it to fetch UxPlay.</p>
by a download from <a href="https://cmake.org/download/"
class="uri">https://cmake.org/download/</a>. Also install
<code>git</code> if you will use it to fetch UxPlay.</p>
<p>Next install libplist and openssl-3.x. Note that static versions of
these libraries will be used in the macOS builds, so they can be
uninstalled after building uxplay, if you wish.</p>
@@ -705,11 +720,11 @@ automake, libtool, <em>etc.</em>) to be installed.</p>
<p>Next get the latest macOS release of GStreamer-1.0.</p>
<p><strong>Using “Official” GStreamer (Recommended for both MacPorts and
Homebrew users)</strong>: install the GStreamer release for macOS from
<a
href="https://gstreamer.freedesktop.org/download/">https://gstreamer.freedesktop.org/download/</a>.
(This release contains its own pkg-config, so you don’t have to install
one.) Install both the gstreamer-1.0 and gstreamer-1.0-devel packages.
After downloading, Shift-Click on them to install (they install to
<a href="https://gstreamer.freedesktop.org/download/"
class="uri">https://gstreamer.freedesktop.org/download/</a>. (This
release contains its own pkg-config, so you don’t have to install one.)
Install both the gstreamer-1.0 and gstreamer-1.0-devel packages. After
downloading, Shift-Click on them to install (they install to
/Library/FrameWorks/GStreamer.framework). Homebrew or MacPorts users
should <strong>not</strong> install (or should uninstall) the GStreamer
supplied by their package manager, if they use the “official”
@@ -729,13 +744,15 @@ href="https://formulae.brew.sh/formula/gstreamer#default">Homebrew
gstreamer installation</a> has recently been reworked into a single
“formula” named <code>gstreamer</code>, which now works without needing
GST_PLUGIN_PATH to be set in the environment. Homebrew installs gstreamer
to <code>(HOMEBREW)/lib/gstreamer-1.0</code> where
<code>(HOMEBREW)/*</code> is <code>/opt/homebrew/*</code> on Apple
to <code>HOMEBREW_PREFIX/lib/gstreamer-1.0</code> where by default
<code>HOMEBREW_PREFIX/*</code> is <code>/opt/homebrew/*</code> on Apple
Silicon Macs, and <code>/usr/local/*</code> on Intel Macs; do not put
any extra non-Homebrew plugins (that you build yourself) there, and
instead set GST_PLUGIN_PATH to point to their location (Homebrew does
not supply a complete GStreamer, but seems to have everything needed for
UxPlay).</p>
UxPlay). <strong>New: the UxPlay build script will now also detect
Homebrew installations in non-standard locations indicated by the
environment variable <code>$HOMEBREW_PREFIX</code>.</strong></p>
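<p>A sketch of the plugin-path setup described above, for self-built plugins kept outside the Homebrew tree (the directory name is an example, not a convention UxPlay prescribes):</p>

```shell
# Keep home-built GStreamer plugins out of HOMEBREW_PREFIX/lib/gstreamer-1.0;
# point GStreamer at their location explicitly instead.
export GST_PLUGIN_PATH="$HOME/gst-plugins-custom"
gst-inspect-1.0 --version   # confirms which GStreamer the shell now sees
```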
<p><strong>Using GStreamer installed from MacPorts</strong>: this is
<strong>not</strong> recommended, as currently the MacPorts GStreamer is
old (v1.16.2), unmaintained, and built to use X11:</p>
@@ -932,10 +949,14 @@ GStreamer plugins in the pipeline are specific for h264 or h265, the
correct version will be used in each pipeline. A wired Client-Server
ethernet connection is preferred over Wifi for 4K video, and might be
required by the client. Only recent Apple devices (M1/M2 Macs or iPads,
and some iPhones) can send h265 video if a resolut “-s wxh” with h &gt;
1080 is requested. The “-h265” option changes the default resolution
(“-s” option) from 1920x1080 to 3840x2160, and leaves default maximum
framerate (“-fps” option) at 30fps.</p>
and some iPhones) can send h265 video if a resolution “-s wxh” with h
&gt; 1080 is requested. The “-h265” option changes the default
resolution (“-s” option) from 1920x1080 to 3840x2160, and leaves default
maximum framerate (“-fps” option) at 30fps.</p>
<p><strong>-hls</strong> Activate HTTP Live Streaming support. With this
option YouTube videos can be streamed directly from YouTube servers to
UxPlay (without passing through the client) by clicking on the AirPlay
icon in the YouTube app.</p>
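<p>A minimal (hypothetical) invocation using the new option, per the description above:</p>

```shell
# Start UxPlay with HLS support; then tap the AirPlay icon in the YouTube app.
uxplay -hls
```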
<p><strong>-pin [nnnn]</strong>: (since v1.67) use Apple-style
(one-time) “pin” authentication when a new client connects for the first
time: a four-digit pin code is displayed on the terminal, and the client
@@ -1086,8 +1107,18 @@ decoding in the GPU by Video4Linux2. Equivalent to
<p><strong>-bt709</strong> A workaround for the failure of the older
Video4Linux2 plugin to recognize Apple’s use of an uncommon (but
permitted) “full-range color” variant of the bt709 color standard for
digital TV. This is no longer needed by GStreamer-1.20.4 and backports
from it.</p>
digital TV. This was no longer needed by GStreamer-1.20.4 and backports
from it, but appears to again be required in GStreamer-1.22 and
later.</p>
<p><strong>-srgb</strong> A workaround for a failure to display
full-range 8-bit color [0-255], and instead restrict to limited range
[16-235] “legal BT709” HDTV format. The workaround works on x86_64
desktop systems, but does not yet work on Raspberry Pi. The issue may be
fixed in a future GStreamer release: it only occurs in Linux and
*BSD.</p>
<p><strong>-srgb no</strong>. Disables the -srgb option, which is
enabled by default in Linux and *BSD, but may be useless on Raspberry
Pi, and may be unwanted, as it adds extra processing load.</p>
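<p>For a Raspberry Pi 4 doing hardware h264 decoding on GStreamer 1.22 or later, the notes above suggest a combination like this sketch (both options are documented here; whether -bt709 is actually needed depends on your GStreamer version):</p>

```shell
# Hardware h264 decoding via Video4Linux2, plus the bt709 color workaround
# that recent GStreamer releases appear to need again:
uxplay -v4l2 -bt709
```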
<p><strong>-rpi</strong> Equivalent to “-v4l2” (Not valid for Raspberry
Pi model 5, and removed in UxPlay 1.67)</p>
<p><strong>-rpigl</strong> Equivalent to “-rpi -vs glimagesink”.
@@ -1340,8 +1371,7 @@ like</p>
+ lo IPv4 UxPlay AirPlay Remote Video local
+ eno1 IPv6 863EA27598FE@UxPlay AirTunes Remote Audio local
+ eno1 IPv4 863EA27598FE@UxPlay AirTunes Remote Audio local
+ lo IPv4 863EA27598FE@UxPlay AirTunes Remote Audio local
</code></pre>
+ lo IPv4 863EA27598FE@UxPlay AirTunes Remote Audio local</code></pre>
<p>If only the loopback (“lo”) entries are shown, a firewall on the
UxPlay host is probably blocking full DNS-SD service, and you need to
open the default UDP port 5353 for mDNS requests, as loopback-based
@@ -1572,6 +1602,8 @@ an AppleTV6,2 with sourceVersion 380.20.1 (an AppleTV 4K 1st gen,
introduced 2017, running tvOS 12.2.1), so it does not seem to matter
what version UxPlay claims to be.</p>
<h1 id="changelog">Changelog</h1>
<p>1.71 2024-12-13 Add support for HTTP Live Streaming (HLS), initially
only for YouTube movies. Fix issue with NTP timeout on Windows.</p>
<p>1.70 2024-10-04 Add support for 4K (h265) video (resolution 3840 x
2160). Fix issue with GStreamer &gt;= 1.24 when client sleeps, then
wakes.</p>
@@ -1686,7 +1718,7 @@ and added -rpigl (OpenGL) and -rpiwl (Wayland) options for RPi Desktop
systems. Also modified timestamps from “DTS” to “PTS” for latency
improvement, plus internal cleanups.</p>
<p>1.49 2022-03-28 Added options for dumping video and/or audio to
file, for debugging, etc. h264 PPS/SPS NALUs are shown with -d. Fixed
video-not-working for M1 Mac clients.</p>
<p>1.48 2022-03-11 Made the GStreamer video pipeline fully configurable,
for use with hardware h264 decoding. Support for Raspberry Pi.</p>
@@ -1750,13 +1782,13 @@ closed, with uxplay still running. Corrected in v. 1.34</p>
<p>If you need to do this, note that you may be able to use a newer
version (OpenSSL-3.0.1 is known to work). You will need the standard
development toolset (autoconf, automake, libtool). Download the source
code from <a
href="https://www.openssl.org/source/">https://www.openssl.org/source/</a>.
Install the downloaded openssl by opening a terminal in your Downloads
directory, and unpacking the source distribution: (“tar -xvzf
openssl-3.0.1.tar.gz ; cd openssl-3.0.1”). Then build/install with
“./config ; make ; sudo make install_dev”. This will typically install
the needed library <code>libcrypto.*</code>, either in /usr/local/lib or
code from <a href="https://www.openssl.org/source/"
class="uri">https://www.openssl.org/source/</a>. Install the downloaded
openssl by opening a terminal in your Downloads directory, and unpacking
the source distribution: (“tar -xvzf openssl-3.0.1.tar.gz ; cd
openssl-3.0.1”). Then build/install with “./config ; make ; sudo make
install_dev”. This will typically install the needed library
<code>libcrypto.*</code>, either in /usr/local/lib or
/usr/local/lib64.</p>
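<p>The quoted commands, assembled into a single sketch (version 3.0.1 as in the text above; substitute the version you downloaded):</p>

```shell
cd ~/Downloads
tar -xvzf openssl-3.0.1.tar.gz
cd openssl-3.0.1
./config
make
sudo make install_dev   # installs libcrypto.* to /usr/local/lib or lib64
```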
<p><em>(Ignore the following for builds on MacOS:)</em> On some systems
like Debian or Ubuntu, you may also need to add a missing entry
@@ -1770,8 +1802,9 @@ can avoid this step by installing libplist-dev and libplist3 from Debian
10 or Ubuntu 18.04.)</em> As well as the usual build tools (autoconf,
automake, libtool), you may need to also install some libpython*-dev
package. Download the latest source with git from <a
href="https://github.com/libimobiledevice/libplist">https://github.com/libimobiledevice/libplist</a>,
or get the source from the Releases section (use the *.tar.bz2 release,
href="https://github.com/libimobiledevice/libplist"
class="uri">https://github.com/libimobiledevice/libplist</a>, or get the
source from the Releases section (use the *.tar.bz2 release,
<strong>not</strong> the *.zip or *.tar.gz versions): download <a
href="https://github.com/libimobiledevice/libplist/releases/download/2.3.0/libplist-2.3.0.tar.bz2">libplist-2.3.0</a>,
then unpack it (“tar -xvjf libplist-2.3.0.tar.bz2 ; cd libplist-2.3.0”),

README.md: 2758 changes (file diff suppressed because it is too large)


@@ -1,24 +1,21 @@
# UxPlay 1.70: AirPlay-Mirror and AirPlay-Audio server for Linux, macOS, and Unix (now also runs on Windows).
# UxPlay 1.71: AirPlay-Mirror and AirPlay-Audio server for Linux, macOS, and Unix (now also runs on Windows).
### **Now developed at the GitHub site <https://github.com/FDH2/UxPlay> (where ALL user issues should be posted, and latest versions can be found).**
- ***NEW in v1.70**: Support for 4k (h265) video with the new "-h265"
option.* (Recent Apple devices will send HEVC (h265) video in
AirPlay mirror mode if larger resolutions (*h* \> 1080) are
requested with UxPlay's "-s wxh" option; wired ethernet connection
is preferred to wireless in this mode, and may also be required by
the client; the "-h265" option changes the default resolution from
1920x1080 to 3840x2160, but leaves default maximum framerate ("-fps"
option) at 30fps.)
- ***NEW in v1.71**: Support for (YouTube) HLS (HTTP Live Streaming)
video with the new "-hls" option.* Click on the airplay icon in the
YouTube app to stream video. (You may need to wait until
advertisements have finished or been skipped before clicking the
YouTube airplay icon.) **Please report any issues with this new
feature of UxPlay**.
## Highlights:
- GPLv3, open source.
- Originally supported only AirPlay Mirror protocol, now has added
support for AirPlay Audio-only (Apple Lossless ALAC) streaming from
current iOS/iPadOS clients. **There is no current support for
Airplay HLS video-streaming (e.g., YouTube video) but this is in
development.**
current iOS/iPadOS clients. **Now with support for Airplay HLS
video-streaming (currently only YouTube video).**
- macOS computers (2011 or later, both Intel and "Apple Silicon" M1/M2
systems) can act either as AirPlay clients, or as the server running
UxPlay. Using AirPlay, UxPlay can emulate a second display for macOS
@@ -84,12 +81,15 @@ After installation:
from terminal commands "ps waux \| grep pulse" or "pactl info" will
contain "pipewire" if your Linux/BSD system uses it).*
- On Raspberry Pi: If you use Ubuntu 22.10 or earlier, GStreamer must
be
- On Raspberry Pi: models using hardware h264 video decoding by the
Broadcom GPU (models 4B and earlier) may require the uxplay option
-bt709. If you use Ubuntu 22.10 or earlier, GStreamer must be
[patched](https://github.com/FDH2/UxPlay/wiki/Gstreamer-Video4Linux2-plugin-patches)
to use hardware video decoding by the Broadcom GPU (also recommended
but optional for Raspberry Pi OS (Bullseye): use option
"`uxplay -bt709`" if you do not use the patch).
but optional for Raspberry Pi OS (Bullseye): the patched GStreamer
does not need option "-bt709". The need for -bt709 when hardware
video decoding is used seems to have reappeared starting with
GStreamer-1.22.
To (easily) compile the latest UxPlay from source, see the section
[Getting UxPlay](#getting-uxplay).
@@ -159,17 +159,16 @@ stops/restarts as you leave/re-enter* **Audio** *mode.*
- **Note that Apple video-DRM (as found in "Apple TV app" content on
the client) cannot be decrypted by UxPlay, and the Apple TV app
cannot be watched using UxPlay's AirPlay Mirror mode (only the
unprotected audio will be streamed, in AAC format), but both video
and audio content from DRM-free apps like "YouTube app" will be
streamed by UxPlay in Mirror mode.**
unprotected audio will be streamed, in AAC format).**
- **As UxPlay does not currently support non-Mirror AirPlay video
streaming (where the client controls a web server on the AirPlay
server that directly receives HLS content to avoid it being decoded
and re-encoded by the client), using the icon for AirPlay video in
apps such as the YouTube app will only send audio (in lossless ALAC
format) without the accompanying video (there are plans to support
HLS video in future releases of UxPlay)**
- **With the new "-hls" option, UxPlay now also supports non-Mirror
AirPlay video streaming (where the client controls a web server on
the AirPlay server that directly receives HLS content to avoid it
being decoded and re-encoded by the client). This currently only
supports streaming of YouTube videos. Without the -hls option, using
the icon for AirPlay video in apps such as the YouTube app will only
send audio (in lossless ALAC format) without the accompanying
video.**
### Possibility for using hardware-accelerated h264/h265 video-decoding, if available.
@@ -212,14 +211,19 @@ used.
available with Raspberry Pi Imager. *(For GStreamer \< 1.22, see the
[UxPlay
Wiki](https://github.com/FDH2/UxPlay/wiki/Gstreamer-Video4Linux2-plugin-patches))*.
Pi model 5 has no support for hardware H264 decoding, as its CPU is
powerful enough for satisfactory software H264 decoding
- **(New): Support for h265 (HEVC) hardware decoding on Raspberry Pi
(Pi 4 model B and Pi 5)**
- **Support for h265 (HEVC) hardware decoding on Raspberry Pi (Pi 4
model B and Pi 5)**
Support is present, but so far satisfactory results have not been
obtained. Pi model 5 only provides hardware-accelerated (GPU)
decoding for h265 video, but not H264, as its CPU is powerful enough
for satisfactory software H264 decoding
These Raspberry Pi models have a dedicated HEVC decoding block (not
the GPU), with a driver "rpivid" which is not yet in the mainline
Linux kernel (but is planned to be there in future). Unfortunately
it produces decoded video in a non-standard pixel format (NC30 or
"SAND") which will not be supported by GStreamer until the driver is
in the mainline kernel; without this support, UxPlay support for
HEVC hardware decoding on Raspberry Pi will not work.
### Note to packagers:
@@ -259,7 +263,7 @@ libraries installed. Debian-based systems provide a package
"build-essential" for use in compiling software. You also need
pkg-config: if it is not found by "`which pkg-config`", install
pkg-config or its work-alike replacement pkgconf. Also make sure that
cmake\>=3.5 is installed: "`sudo apt install cmake`" (add
cmake\>=3.10 is installed: "`sudo apt install cmake`" (add
`build-essential` and `pkg-config` (or `pkgconf`) to this if needed).
Make sure that your distribution provides OpenSSL 1.1.1 or later, and
@@ -573,6 +577,13 @@ what is available. Some possibilities on Linux/\*BSD are:
- If the server is "headless" (no attached monitor, renders audio
only) use `-vs 0`.
Note that videosink options can be set using quoted arguments to -vs:
*e.g.*, `-vs "xvimagesink display=:0"`: ximagesink and xvimagesink allow
an X11 display name to be specified, and waylandsink has a similar
option. Videosink options ("properties") can be found on their GStreamer
description pages, such as
https://gstreamer.freedesktop.org/documentation/xvimagesink .
GStreamer also searches for the best "audiosink"; override its choice
with `-as <audiosink>`. Choices on Linux include pulsesink, alsasink,
pipewiresink, oss4sink; see what is available with
@@ -621,6 +632,9 @@ See [Usage](#usage) for more run-time options.
-v4l2); it is still better to apply the full patch from the UxPlay
Wiki in this case.
- **It appears that when hardware h264 video decoding is used, the
option -bt709 became needed again in GStreamer-1.22 and later.**
- For "double-legacy" Raspberry Pi OS (Buster), there is no patch for
GStreamer-1.14. Instead, first build a complete newer
GStreamer-1.18.6 from source using [these
@@ -632,14 +646,17 @@ See [Usage](#usage) for more run-time options.
this is broken by Pi 4 Model B firmware. OMX support was removed
from Raspberry Pi OS (Bullseye), but is present in Buster.
- **H265 (4K)** video is supported with hardware decoding by the
Broadcom GPU on Raspberry Pi 5 models, as well as on Raspberry Pi 4
model B. **While GStreamer seems to make use of this hardware
decoding, satisfactory rendering speed of 4K video by UxPlay on
these Raspberry Pi models has not yet been achieved.** The option
"-h265" is required for activating h265 support. A wired ethernet
connection is preferred in this mode (and may be required by the
client).
- **H265 (4K)** video is potentially supported by hardware decoding on
Raspberry Pi 5 models, as well as on Raspberry Pi 4 model B, using a
dedicated HEVC decoding block, but the "rpivid" kernel driver for
this is not yet supported by GStreamer (this driver decodes video
into a non-standard format that cannot be supported by GStreamer
until the driver is in the mainline Linux kernel). Raspberry Pi
provides a version of ffmpeg that can use that format, but at
present UxPlay cannot use this. The best solution would be for the
driver to be "upstreamed" to the kernel, allowing GStreamer support.
(Software HEVC decoding works, but does not seem to give
satisfactory results on the Pi).
Even with GPU video decoding, some frames may be dropped by the
lower-power models to keep audio and video synchronized using
@@ -726,12 +743,15 @@ be installed by Homebrew as dependencies. The [Homebrew gstreamer
installation](https://formulae.brew.sh/formula/gstreamer#default) has
recently been reworked into a single "formula" named `gstreamer`, which
now works without needing GST_PLUGIN_PATH to be set in the environment.
Homebrew installs gstreamer to `(HOMEBREW)/lib/gstreamer-1.0` where
`(HOMEBREW)/*` is `/opt/homebrew/*` on Apple Silicon Macs, and
`/usr/local/*` on Intel Macs; do not put any extra non-Homebrew plugins
(that you build yourself) there, and instead set GST_PLUGIN_PATH to
point to their location (Homebrew does not supply a complete GStreamer,
but seems to have everything needed for UxPlay).
Homebrew installs gstreamer to `HOMEBREW_PREFIX/lib/gstreamer-1.0` where
by default `HOMEBREW_PREFIX/*` is `/opt/homebrew/*` on Apple Silicon
Macs, and `/usr/local/*` on Intel Macs; do not put any extra
non-Homebrew plugins (that you build yourself) there, and instead set
GST_PLUGIN_PATH to point to their location (Homebrew does not supply a
complete GStreamer, but seems to have everything needed for UxPlay).
**New: the UxPlay build script will now also detect Homebrew
installations in non-standard locations indicated by the environment
variable `$HOMEBREW_PREFIX`.**
**Using GStreamer installed from MacPorts**: this is **not**
recommended, as currently the MacPorts GStreamer is old (v1.16.2),
@@ -934,11 +954,16 @@ the pipeline are specific for h264 or h265, the correct version will be
used in each pipeline. A wired Client-Server ethernet connection is
preferred over Wifi for 4K video, and might be required by the client.
Only recent Apple devices (M1/M2 Macs or iPads, and some iPhones) can
send h265 video if a resolut "-s wxh" with h \> 1080 is requested. The
"-h265" option changes the default resolution ("-s" option) from
send h265 video if a resolution "-s wxh" with h \> 1080 is requested.
The "-h265" option changes the default resolution ("-s" option) from
1920x1080 to 3840x2160, and leaves default maximum framerate ("-fps"
option) at 30fps.
**-hls** Activate HTTP Live Streaming support. With this option YouTube
videos can be streamed directly from YouTube servers to UxPlay (without
passing through the client) by clicking on the AirPlay icon in the
YouTube app.
**-pin \[nnnn\]**: (since v1.67) use Apple-style (one-time) "pin"
authentication when a new client connects for the first time: a
four-digit pin code is displayed on the terminal, and the client screen
@@ -1097,8 +1122,19 @@ Video4Linux2. Equivalent to `-vd v4l2h264dec -vc v4l2convert`.
**-bt709** A workaround for the failure of the older Video4Linux2 plugin
to recognize Apple's use of an uncommon (but permitted) "full-range
color" variant of the bt709 color standard for digital TV. This was no
longer needed by GStreamer-1.20.4 and backports from it, but appears to
again be required in GStreamer-1.22 and later.
**-srgb** A workaround for a failure to display full-range 8-bit color
\[0-255\], and instead restrict to limited range \[16-235\] "legal
BT709" HDTV format. The workaround works on x86_64 desktop systems, but
does not yet work on Raspberry Pi. The issue may be fixed in a future
GStreamer release: it only occurs in Linux and \*BSD.
**-srgb no**. Disables the -srgb option, which is enabled by default in
Linux and \*BSD, but may be useless on Raspberry Pi, and may be
unwanted, as it adds extra processing load.
**-rpi** Equivalent to "-v4l2" (Not valid for Raspberry Pi model 5, and
removed in UxPlay 1.67)
@@ -1611,6 +1647,9 @@ what version UxPlay claims to be.
# Changelog
1.71 2024-12-13 Add support for HTTP Live Streaming (HLS), initially
only for YouTube movies. Fix issue with NTP timeout on Windows.
1.70 2024-10-04 Add support for 4K (h265) video (resolution 3840 x
2160). Fix issue with GStreamer \>= 1.24 when client sleeps, then wakes.
@@ -1743,7 +1782,7 @@ systems. Also modified timestamps from "DTS" to "PTS" for latency
improvement, plus internal cleanups.
1.49 2022-03-28 Added options for dumping video and/or audio to file,
for debugging, etc. h264 PPS/SPS NALU's are shown with -d. Fixed
video-not-working for M1 Mac clients.
1.48 2022-03-11 Made the GStreamer video pipeline fully configurable,
@@ -43,9 +43,11 @@ endif()
if( APPLE )
set( ENV{PKG_CONFIG_PATH} "/usr/local/lib/pkgconfig" ) # standard location, and Brew
set( ENV{PKG_CONFIG_PATH} "$ENV{PKG_CONFIG_PATH}:/opt/homebrew/lib/pkgconfig" ) # Brew for M1 macs
set( ENV{PKG_CONFIG_PATH} "$ENV{PKG_CONFIG_PATH}:$ENV{HOMEBREW_PREFIX}/lib/pkgconfig" ) # Brew using prefix
set( ENV{PKG_CONFIG_PATH} "$ENV{PKG_CONFIG_PATH}:/opt/local/lib/pkgconfig/" ) # MacPorts
set( ENV{PKG_CONFIG_PATH} "$ENV{PKG_CONFIG_PATH}:/usr/local/opt/openssl@3/lib/pkgconfig" ) # Brew openssl
set( ENV{PKG_CONFIG_PATH} "$ENV{PKG_CONFIG_PATH}:/opt/homebrew/opt/openssl@3/lib/pkgconfig" ) # Brew M1 openssl
set( ENV{PKG_CONFIG_PATH} "$ENV{PKG_CONFIG_PATH}:$ENV{HOMEBREW_PREFIX}/opt/openssl@3/lib/pkgconfig" ) # Brew using prefix openssl
message( "PKG_CONFIG_PATH (Apple, lib) = " $ENV{PKG_CONFIG_PATH} )
find_program( PKG_CONFIG_EXECUTABLE pkg-config PATHS /Library/FrameWorks/GStreamer.framework/Commands )
message( "PKG_CONFIG_EXECUTABLE " ${PKG_CONFIG_EXECUTABLE} )
@@ -80,22 +82,28 @@ else()
endif()
# libplist
pkg_search_module(PLIST REQUIRED libplist-2.0)
if ( PLIST_FOUND )
message( STATUS "found libplist-${PLIST_VERSION}" )
endif()
if( APPLE )
# use static linking
pkg_search_module(PLIST REQUIRED libplist-2.0)
find_library( LIBPLIST libplist-2.0.a REQUIRED )
message( STATUS "(Static linking) LIBPLIST " ${LIBPLIST} )
target_link_libraries ( airplay ${LIBPLIST} )
elseif( WIN32)
pkg_search_module(PLIST REQUIRED libplist-2.0)
find_library( LIBPLIST ${PLIST_LIBRARIES} PATH ${PLIST_LIBDIR} )
target_link_libraries ( airplay ${LIBPLIST} )
else ()
pkg_search_module(PLIST libplist>=2.0)
if(NOT PLIST_FOUND)
pkg_search_module(PLIST REQUIRED libplist-2.0)
endif()
find_library( LIBPLIST ${PLIST_LIBRARIES} PATH ${PLIST_LIBDIR} )
target_link_libraries ( airplay PUBLIC ${LIBPLIST} )
endif()
if ( PLIST_FOUND )
message( STATUS "found libplist-${PLIST_VERSION}" )
endif()
target_include_directories( airplay PRIVATE ${PLIST_INCLUDE_DIRS} )
#libcrypto
lib/airplay_video.c Normal file
@@ -0,0 +1,322 @@
/**
* Copyright (c) 2024 fduncanh
* All Rights Reserved.
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Lesser General Public
* License as published by the Free Software Foundation; either
* version 2.1 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Lesser General Public License for more details.
*
*/
// it should only start and stop the media_data_store that handles all HLS transactions, without
// otherwise participating in them.
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#include <stdbool.h>
#include <assert.h>
#include "raop.h"
#include "airplay_video.h"
struct media_item_s {
char *uri;
char *playlist;
int access;
};
struct airplay_video_s {
raop_t *raop;
char apple_session_id[37];
char playback_uuid[37];
char *uri_prefix;
char local_uri_prefix[23];
int next_uri;
int FCUP_RequestID;
float start_position_seconds;
playback_info_t *playback_info;
// The local port of the airplay server on the AirPlay server
unsigned short airplay_port;
char *master_uri;
char *master_playlist;
media_item_t *media_data_store;
int num_uri;
};
// initialize airplay_video service.
int airplay_video_service_init(raop_t *raop, unsigned short http_port,
const char *session_id) {
char uri[] = "http://localhost:xxxxx";
assert(raop);
airplay_video_t *airplay_video = deregister_airplay_video(raop);
if (airplay_video) {
airplay_video_service_destroy(airplay_video);
}
airplay_video = (airplay_video_t *) calloc(1, sizeof(airplay_video_t));
if (!airplay_video) {
return -1;
}
/* create local_uri_prefix string */
strncpy(airplay_video->local_uri_prefix, uri, sizeof(airplay_video->local_uri_prefix));
char *ptr = strstr(airplay_video->local_uri_prefix, "xxxxx");
snprintf(ptr, 6, "%-5u", http_port);
ptr = strstr(airplay_video->local_uri_prefix, " ");
if (ptr) {
*ptr = '\0';
}
if (!register_airplay_video(raop, airplay_video)) {
return -2;
}
//printf(" %p %p\n", airplay_video, get_airplay_video(raop));
airplay_video->raop = raop;
airplay_video->FCUP_RequestID = 0;
size_t len = strlen(session_id);
assert(len == 36);
strncpy(airplay_video->apple_session_id, session_id, len);
(airplay_video->apple_session_id)[len] = '\0';
airplay_video->start_position_seconds = 0.0f;
airplay_video->master_uri = NULL;
airplay_video->media_data_store = NULL;
airplay_video->master_playlist = NULL;
airplay_video->num_uri = 0;
airplay_video->next_uri = 0;
return 0;
}
// destroy the airplay_video service
void
airplay_video_service_destroy(airplay_video_t *airplay_video)
{
if (airplay_video->uri_prefix) {
free(airplay_video->uri_prefix);
}
if (airplay_video->master_uri) {
free (airplay_video->master_uri);
}
if (airplay_video->media_data_store) {
destroy_media_data_store(airplay_video);
}
if (airplay_video->master_playlist) {
free (airplay_video->master_playlist);
}
free (airplay_video);
}
const char *get_apple_session_id(airplay_video_t *airplay_video) {
return airplay_video->apple_session_id;
}
float get_start_position_seconds(airplay_video_t *airplay_video) {
return airplay_video->start_position_seconds;
}
void set_start_position_seconds(airplay_video_t *airplay_video, float start_position_seconds) {
airplay_video->start_position_seconds = start_position_seconds;
}
void set_playback_uuid(airplay_video_t *airplay_video, const char *playback_uuid) {
size_t len = strlen(playback_uuid);
assert(len == 36);
memcpy(airplay_video->playback_uuid, playback_uuid, len);
(airplay_video->playback_uuid)[len] = '\0';
}
void set_uri_prefix(airplay_video_t *airplay_video, char *uri_prefix, int uri_prefix_len) {
if (airplay_video->uri_prefix) {
free (airplay_video->uri_prefix);
}
airplay_video->uri_prefix = (char *) calloc(uri_prefix_len + 1, sizeof(char));
memcpy(airplay_video->uri_prefix, uri_prefix, uri_prefix_len);
}
char *get_uri_prefix(airplay_video_t *airplay_video) {
return airplay_video->uri_prefix;
}
char *get_uri_local_prefix(airplay_video_t *airplay_video) {
return airplay_video->local_uri_prefix;
}
char *get_master_uri(airplay_video_t *airplay_video) {
return airplay_video->master_uri;
}
int get_next_FCUP_RequestID(airplay_video_t *airplay_video) {
return ++(airplay_video->FCUP_RequestID);
}
void set_next_media_uri_id(airplay_video_t *airplay_video, int num) {
airplay_video->next_uri = num;
}
int get_next_media_uri_id(airplay_video_t *airplay_video) {
return airplay_video->next_uri;
}
/* master playlist */
void store_master_playlist(airplay_video_t *airplay_video, char *master_playlist) {
if (airplay_video->master_playlist) {
free (airplay_video->master_playlist);
}
airplay_video->master_playlist = master_playlist;
}
char *get_master_playlist(airplay_video_t *airplay_video) {
return airplay_video->master_playlist;
}
/* media_data_store */
int get_num_media_uri(airplay_video_t *airplay_video) {
return airplay_video->num_uri;
}
void destroy_media_data_store(airplay_video_t *airplay_video) {
media_item_t *media_data_store = airplay_video->media_data_store;
if (media_data_store) {
for (int i = 0; i < airplay_video->num_uri ; i ++ ) {
if (media_data_store[i].uri) {
free (media_data_store[i].uri);
}
if (media_data_store[i].playlist) {
free (media_data_store[i].playlist);
}
}
}
free (media_data_store);
airplay_video->media_data_store = NULL;
airplay_video->num_uri = 0;
}
void create_media_data_store(airplay_video_t * airplay_video, char ** uri_list, int num_uri) {
destroy_media_data_store(airplay_video);
media_item_t *media_data_store = calloc(num_uri, sizeof(media_item_t));
for (int i = 0; i < num_uri; i++) {
media_data_store[i].uri = uri_list[i];
media_data_store[i].playlist = NULL;
media_data_store[i].access = 0;
}
airplay_video->media_data_store = media_data_store;
airplay_video->num_uri = num_uri;
}
int store_media_data_playlist_by_num(airplay_video_t *airplay_video, char * media_playlist, int num) {
media_item_t *media_data_store = airplay_video->media_data_store;
if ( num < 0 || num >= airplay_video->num_uri) {
return -1;
} else if (media_data_store[num].playlist) {
return -2;
}
media_data_store[num].playlist = media_playlist;
return 0;
}
char * get_media_playlist_by_num(airplay_video_t *airplay_video, int num) {
media_item_t *media_data_store = airplay_video->media_data_store;
if (media_data_store == NULL) {
return NULL;
}
if (num >= 0 && num < airplay_video->num_uri) {
return media_data_store[num].playlist;
}
return NULL;
}
int get_media_playlist_by_uri(airplay_video_t *airplay_video, const char *uri) {
/* Problem: there can be more than one StreamInf playlist with the same uri:
* they differ by choice of partner Media (audio, subtitles) playlists
* If the same uri is requested again, one of the other ones will be returned
* (the least-previously-requested one will be served up)
*/
// modified to return the position of the media playlist in the master playlist
media_item_t *media_data_store = airplay_video->media_data_store;
if (media_data_store == NULL) {
return -2;
}
int found = 0;
int num = -1;
int access = -1;
for (int i = 0; i < airplay_video->num_uri; i++) {
if (strstr(media_data_store[i].uri, uri)) {
if (!found) {
found = 1;
num = i;
access = media_data_store[i].access;
} else {
/* change > below to >= to reverse the order of choice */
if (access > media_data_store[i].access) {
access = media_data_store[i].access;
num = i;
}
}
}
}
if (found) {
//printf("found %s\n", media_data_store[num].uri);
++media_data_store[num].access;
return num;
}
return -1;
}
char * get_media_uri_by_num(airplay_video_t *airplay_video, int num) {
media_item_t * media_data_store = airplay_video->media_data_store;
if (media_data_store == NULL) {
return NULL;
}
if (num >= 0 && num < airplay_video->num_uri) {
return media_data_store[num].uri;
}
return NULL;
}
int get_media_uri_num(airplay_video_t *airplay_video, char * uri) {
media_item_t *media_data_store = airplay_video->media_data_store;
for (int i = 0; i < airplay_video->num_uri ; i++) {
if (strstr(media_data_store[i].uri, uri)) {
return i;
}
}
return -1;
}
int analyze_media_playlist(char *playlist, float *duration) {
float next;
int count = 0;
char *ptr = strstr(playlist, "#EXTINF:");
*duration = 0.0f;
while (ptr != NULL) {
char *end;
ptr += strlen("#EXTINF:");
next = strtof(ptr, &end);
*duration += next;
count++;
ptr = strstr(end, "#EXTINF:");
}
return count;
}
lib/airplay_video.h Normal file
@@ -0,0 +1,74 @@
/*
* Copyright (c) 2024 fduncanh, All Rights Reserved.
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Lesser General Public
* License as published by the Free Software Foundation; either
* version 2.1 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Lesser General Public License for more details.
*
*=================================================================
*/
#ifndef AIRPLAY_VIDEO_H
#define AIRPLAY_VIDEO_H
#include <stdint.h>
#include <stdbool.h>
#include "raop.h"
#include "logger.h"
typedef struct airplay_video_s airplay_video_t;
typedef struct media_item_s media_item_t;
const char *get_apple_session_id(airplay_video_t *airplay_video);
void set_start_position_seconds(airplay_video_t *airplay_video, float start_position_seconds);
float get_start_position_seconds(airplay_video_t *airplay_video);
void set_playback_uuid(airplay_video_t *airplay_video, const char *playback_uuid);
void set_uri_prefix(airplay_video_t *airplay_video, char *uri_prefix, int uri_prefix_len);
char *get_uri_prefix(airplay_video_t *airplay_video);
char *get_uri_local_prefix(airplay_video_t *airplay_video);
int get_next_FCUP_RequestID(airplay_video_t *airplay_video);
void set_next_media_uri_id(airplay_video_t *airplay_video, int id);
int get_next_media_uri_id(airplay_video_t *airplay_video);
int get_media_playlist_by_uri(airplay_video_t *airplay_video, const char *uri);
void store_master_playlist(airplay_video_t *airplay_video, char *master_playlist);
char *get_master_playlist(airplay_video_t *airplay_video);
int get_num_media_uri(airplay_video_t *airplay_video);
void destroy_media_data_store(airplay_video_t *airplay_video);
void create_media_data_store(airplay_video_t * airplay_video, char ** media_data_store, int num_uri);
int store_media_data_playlist_by_num(airplay_video_t *airplay_video, char * media_playlist, int num);
char *get_media_playlist_by_num(airplay_video_t *airplay_video, int num);
char *get_media_uri_by_num(airplay_video_t *airplay_video, int num);
int get_media_uri_num(airplay_video_t *airplay_video, char * uri);
int analyze_media_playlist(char *playlist, float *duration);
void airplay_video_service_destroy(airplay_video_t *airplay_video);
// C wrappers for c++ class MediaDataStore
//create the media_data_store, return a pointer to it.
void* media_data_store_create(void *conn_opaque, uint16_t port);
//delete the media_data_store
void media_data_store_destroy(void *media_data_store);
// called by the POST /action handler:
char *process_media_data(void *media_data_store, const char *url, const char *data, int datalen);
//called by the POST /play handler
bool request_media_data(void *media_data_store, const char *primary_url, const char * session_id);
//called by airplay_video_media_http_connection::get_handler: &path = req.uri)
char *query_media_data(void *media_data_store, const char *url, int *len);
//called by the post_stop_handler:
void media_data_store_reset(void *media_data_store);
const char *adjust_primary_uri(void *media_data_store, const char *url);
#endif //AIRPLAY_VIDEO_H
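`get_media_playlist_by_uri()`, declared above, resolves duplicate StreamInf URIs by serving the least-previously-requested entry. Here is a minimal standalone model of that selection rule (`entry_t` and `pick_entry` are illustrative names, not part of the API):

```c
#include <assert.h>
#include <string.h>

/* Illustrative model of the rule in get_media_playlist_by_uri(): among
 * entries whose uri matches, choose the one with the lowest access count
 * (first match wins ties), then bump that count, so repeated requests for
 * the same uri rotate through the duplicate entries. */
typedef struct { const char *uri; int access; } entry_t;

static int pick_entry(entry_t *items, int n, const char *uri) {
    int num = -1, access = -1;
    for (int i = 0; i < n; i++) {
        if (!strstr(items[i].uri, uri)) continue;
        if (num < 0 || items[i].access < access) {
            num = i;
            access = items[i].access;
        }
    }
    if (num >= 0) items[num].access++;
    return num;
}
```

With two StreamInf entries sharing a uri, consecutive lookups alternate between them, matching the comment in the implementation.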
lib/compat.c Normal file
@@ -0,0 +1,36 @@
/*
* Copyright (c) 2024 F. Duncanh, All Rights Reserved.
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Lesser General Public
* License as published by the Free Software Foundation; either
* version 2.1 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Lesser General Public License for more details.
*
*==================================================================
*/
#ifdef _WIN32
#include <stdlib.h>
#include <string.h>
#include "compat.h"
#define MAX_SOCKET_ERROR_MESSAGE_LENGTH 256
/* Windows (winsock2) socket error message text */
char *wsa_strerror(int errnum) {
static char message[MAX_SOCKET_ERROR_MESSAGE_LENGTH] = { 0 };
FormatMessage(FORMAT_MESSAGE_FROM_SYSTEM|FORMAT_MESSAGE_IGNORE_INSERTS,
0, errnum, 0, message, sizeof(message), 0);
char *nl = strchr(message, '\n');
if (nl) {
*nl = 0; /* remove any trailing newline, or truncate to one line */
}
return message;
}
#endif
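FormatMessage() terminates its text with "\r\n"; the strchr() cut above removes the '\n' but leaves the preceding '\r' in place. A one-liner that trims both, using strcspn (trim_message is a hypothetical helper shown for illustration, not part of compat.c):

```c
#include <assert.h>
#include <string.h>

/* Cut the message at the first '\r' or '\n', whichever comes first;
 * strcspn returns the length of the prefix containing neither character. */
static void trim_message(char *message) {
    message[strcspn(message, "\r\n")] = '\0';
}
```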
@@ -343,7 +343,6 @@ int gcm_decrypt(unsigned char *ciphertext, int ciphertext_len, unsigned char *pl
return plaintext_len;
} else {
/* Verify failed */
printf("failed\n");
return -1;
}
}
lib/fcup_request.h Normal file
@@ -0,0 +1,112 @@
/*
* Copyright (c) 2022 fduncanh
* All Rights Reserved.
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Lesser General Public
* License as published by the Free Software Foundation; either
* version 2.1 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Lesser General Public License for more details.
*
*/
/* this file is part of raop.c via http_handlers.h and should not be included in any other file */
//produces the fcup request plist in xml format as a null-terminated string
char *create_fcup_request(const char *url, int request_id, const char *client_session_id, int *datalen) {
char *plist_xml = NULL;
/* values taken from apsdk-public; */
/* these seem to be arbitrary choices */
const int sessionID = 1;
const int FCUP_Response_ClientInfo = 1;
const int FCUP_Response_ClientRef = 40030004;
/* taken from a working AppleTV? */
const char User_Agent[] = "AppleCoreMedia/1.0.0.11B554a (Apple TV; U; CPU OS 7_0_4 like Mac OS X; en_us";
plist_t req_root_node = plist_new_dict();
plist_t session_id_node = plist_new_uint((int64_t) sessionID);
plist_dict_set_item(req_root_node, "sessionID", session_id_node);
plist_t type_node = plist_new_string("unhandledURLRequest");
plist_dict_set_item(req_root_node, "type", type_node);
plist_t fcup_request_node = plist_new_dict();
plist_t client_info_node = plist_new_uint(FCUP_Response_ClientInfo);
plist_dict_set_item(fcup_request_node, "FCUP_Response_ClientInfo", client_info_node);
plist_t client_ref_node = plist_new_uint((int64_t) FCUP_Response_ClientRef);
plist_dict_set_item(fcup_request_node, "FCUP_Response_ClientRef", client_ref_node);
plist_t request_id_node = plist_new_uint((int64_t) request_id);
plist_dict_set_item(fcup_request_node, "FCUP_Response_RequestID", request_id_node);
plist_t url_node = plist_new_string(url);
plist_dict_set_item(fcup_request_node, "FCUP_Response_URL", url_node);
plist_t session_id1_node = plist_new_uint((int64_t) sessionID);
plist_dict_set_item(fcup_request_node, "sessionID", session_id1_node);
plist_t fcup_response_header_node = plist_new_dict();
plist_t playback_session_id_node = plist_new_string(client_session_id);
plist_dict_set_item(fcup_response_header_node, "X-Playback-Session-Id", playback_session_id_node);
plist_t user_agent_node = plist_new_string(User_Agent);
plist_dict_set_item(fcup_response_header_node, "User-Agent", user_agent_node);
plist_dict_set_item(fcup_request_node, "FCUP_Response_Headers", fcup_response_header_node);
plist_dict_set_item(req_root_node, "request", fcup_request_node);
uint32_t uint_val;
plist_to_xml(req_root_node, &plist_xml, &uint_val);
*datalen = (int) uint_val;
plist_free(req_root_node);
assert(plist_xml[*datalen] == '\0');
return plist_xml; //needs to be freed after use
}
int fcup_request(void *conn_opaque, const char *media_url, const char *client_session_id, int request_id) {
raop_conn_t *conn = (raop_conn_t *) conn_opaque;
int datalen = 0;
int requestlen;
int socket_fd = httpd_get_connection_socket_by_type(conn->raop->httpd, CONNECTION_TYPE_PTTH, 1);
logger_log(conn->raop->logger, LOGGER_DEBUG, "fcup_request send socket = %d", socket_fd);
/* create xml plist request data */
char *plist_xml = create_fcup_request(media_url, request_id, client_session_id, &datalen);
/* use http_response tools for creating the reverse http request */
http_response_t *request = http_response_create();
http_response_reverse_request_init(request, "POST", "/event", "HTTP/1.1");
http_response_add_header(request, "X-Apple-Session-ID", client_session_id);
http_response_add_header(request, "Content-Type", "text/x-apple-plist+xml");
http_response_finish(request, plist_xml, datalen);
free(plist_xml);
const char *http_request = http_response_get_data(request, &requestlen);
int send_len = send(socket_fd, http_request, requestlen, 0);
if (send_len < 0) {
int sock_err = SOCKET_GET_ERROR();
logger_log(conn->raop->logger, LOGGER_ERR, "fcup_request: send error %d:%s\n",
sock_err, strerror(sock_err));
http_response_destroy(request);
/* shut down connection? */
return -1;
}
if (logger_get_level(conn->raop->logger) >= LOGGER_DEBUG) {
char *request_str = utils_data_to_text(http_request, requestlen);
logger_log(conn->raop->logger, LOGGER_DEBUG, "\n%s", request_str);
free (request_str);
}
http_response_destroy(request);
logger_log(conn->raop->logger, LOGGER_DEBUG,"fcup_request: sent request of %d bytes on socket %d\n",
send_len, socket_fd);
return 0;
}
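For reference, create_fcup_request() serializes a plist along these lines (the integer values are the constants defined above; the URL and session id are placeholders, and the exact XML layout depends on libplist's serializer):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>sessionID</key><integer>1</integer>
  <key>type</key><string>unhandledURLRequest</string>
  <key>request</key>
  <dict>
    <key>FCUP_Response_ClientInfo</key><integer>1</integer>
    <key>FCUP_Response_ClientRef</key><integer>40030004</integer>
    <key>FCUP_Response_RequestID</key><integer>1</integer>
    <key>FCUP_Response_URL</key><string>(media URL)</string>
    <key>sessionID</key><integer>1</integer>
    <key>FCUP_Response_Headers</key>
    <dict>
      <key>X-Playback-Session-Id</key><string>(client session id)</string>
      <key>User-Agent</key><string>AppleCoreMedia/1.0.0.11B554a ...</string>
    </dict>
  </dict>
</dict>
</plist>
```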
lib/http_handlers.h Normal file
@@ -0,0 +1,995 @@
/**
* Copyright (c) 2024 fduncanh
* All Rights Reserved.
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Lesser General Public
* License as published by the Free Software Foundation; either
* version 2.1 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Lesser General Public License for more details.
*
*/
/* this file is part of raop.c and should not be included in any other file */
#include "airplay_video.h"
#include "fcup_request.h"
static void
http_handler_server_info(raop_conn_t *conn, http_request_t *request, http_response_t *response,
char **response_data, int *response_datalen) {
assert(conn->raop->dnssd);
int hw_addr_raw_len = 0;
const char *hw_addr_raw = dnssd_get_hw_addr(conn->raop->dnssd, &hw_addr_raw_len);
char *hw_addr = calloc(1, 3 * hw_addr_raw_len);
//int hw_addr_len =
utils_hwaddr_airplay(hw_addr, 3 * hw_addr_raw_len, hw_addr_raw, hw_addr_raw_len);
plist_t r_node = plist_new_dict();
/* first 12 AirPlay features bits (R to L): 0x27F = 0010 0111 1111
* Only bits 0-6 and bit 9 are set:
* 0. video supported
* 1. photo supported
* 2. video protected with FairPlay DRM
* 3. volume control supported for video
* 4. HLS supported
* 5. slideshow supported
* 6. (unknown)
* 9. audio supported.
*/
plist_t features_node = plist_new_uint(0x27F);
plist_dict_set_item(r_node, "features", features_node);
plist_t mac_address_node = plist_new_string(hw_addr);
plist_dict_set_item(r_node, "macAddress", mac_address_node);
plist_t model_node = plist_new_string(GLOBAL_MODEL);
plist_dict_set_item(r_node, "model", model_node);
plist_t os_build_node = plist_new_string("12B435");
plist_dict_set_item(r_node, "osBuildVersion", os_build_node);
plist_t protovers_node = plist_new_string("1.0");
plist_dict_set_item(r_node, "protovers", protovers_node);
plist_t source_version_node = plist_new_string(GLOBAL_VERSION);
plist_dict_set_item(r_node, "srcvers", source_version_node);
plist_t vv_node = plist_new_uint(strtol(AIRPLAY_VV, NULL, 10));
plist_dict_set_item(r_node, "vv", vv_node);
plist_t device_id_node = plist_new_string(hw_addr);
plist_dict_set_item(r_node, "deviceid", device_id_node);
plist_to_xml(r_node, response_data, (uint32_t *) response_datalen);
//assert(*response_datalen == strlen(*response_data));
/* last character (at *response_data[response_datalen - 1]) is 0x0a = '\n'
* (*response_data[response_datalen] is '\0').
* apsdk removes the last "\n" by overwriting it with '\0', and reducing response_datalen by 1.
* TODO: check if this is necessary */
plist_free(r_node);
http_response_add_header(response, "Content-Type", "text/x-apple-plist+xml");
free(hw_addr);
/* initialize the airplay video service */
const char *session_id = http_request_get_header(request, "X-Apple-Session-ID");
airplay_video_service_init(conn->raop, conn->raop->port, session_id);
}
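The feature-bit breakdown in the comment inside http_handler_server_info() can be written out as named flags. A sketch (the flag names are illustrative, not taken from the AirPlay protocol):

```c
#include <assert.h>

/* 0x27F = 0b01001111111: bits 0-6 and bit 9, per the comment in
 * http_handler_server_info(). Flag names here are illustrative. */
enum {
    FEAT_VIDEO          = 1u << 0,  /* video supported */
    FEAT_PHOTO          = 1u << 1,  /* photo supported */
    FEAT_FAIRPLAY_VIDEO = 1u << 2,  /* video protected with FairPlay DRM */
    FEAT_VIDEO_VOLUME   = 1u << 3,  /* volume control supported for video */
    FEAT_HLS            = 1u << 4,  /* HLS supported */
    FEAT_SLIDESHOW      = 1u << 5,  /* slideshow supported */
    FEAT_UNKNOWN_6      = 1u << 6,  /* (unknown) */
    FEAT_AUDIO          = 1u << 9,  /* audio supported */
};

#define SERVER_FEATURES (FEAT_VIDEO | FEAT_PHOTO | FEAT_FAIRPLAY_VIDEO | \
                         FEAT_VIDEO_VOLUME | FEAT_HLS | FEAT_SLIDESHOW | \
                         FEAT_UNKNOWN_6 | FEAT_AUDIO)
```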
static void
http_handler_scrub(raop_conn_t *conn, http_request_t *request, http_response_t *response,
char **response_data, int *response_datalen) {
const char *url = http_request_get_url(request);
const char *data = strstr(url, "?");
float scrub_position = 0.0f;
if (data) {
data++;
const char *position = strstr(data, "=") + 1;
char *end;
double value = strtod(position, &end);
if (end && end != position) {
scrub_position = (float) value;
logger_log(conn->raop->logger, LOGGER_DEBUG, "http_handler_scrub: got position = %.6f",
scrub_position);
}
}
logger_log(conn->raop->logger, LOGGER_DEBUG, "**********************SCRUB %f ***********************",scrub_position);
conn->raop->callbacks.on_video_scrub(conn->raop->callbacks.cls, scrub_position);
}
static void
http_handler_rate(raop_conn_t *conn, http_request_t *request, http_response_t *response,
char **response_data, int *response_datalen) {
const char *url = http_request_get_url(request);
const char *data = strstr(url, "?");
float rate_value = 0.0f;
if (data) {
data++;
const char *rate = strstr(data, "=") + 1;
char *end;
float value = strtof(rate, &end);
if (end && end != rate) {
rate_value = value;
logger_log(conn->raop->logger, LOGGER_DEBUG, "http_handler_rate: got rate = %.6f", rate_value);
}
}
conn->raop->callbacks.on_video_rate(conn->raop->callbacks.cls, rate_value);
}
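The /scrub and /rate handlers above both pull one float out of a query string such as "/scrub?position=13.25". A standalone sketch of that extraction, with an added guard for a missing "=" (parse_query_value is a hypothetical helper, not part of the handlers):

```c
#include <assert.h>
#include <math.h>
#include <stdlib.h>
#include <string.h>

/* Extract the float after "=" in a query string; return fallback if the
 * url has no "?...=" part or the value does not parse. */
static float parse_query_value(const char *url, float fallback) {
    const char *q = strstr(url, "?");
    if (!q) return fallback;
    const char *eq = strstr(q, "=");
    if (!eq) return fallback;
    char *end;
    float v = strtof(eq + 1, &end);
    return (end != eq + 1) ? v : fallback;
}
```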
static void
http_handler_stop(raop_conn_t *conn, http_request_t *request, http_response_t *response,
char **response_data, int *response_datalen) {
logger_log(conn->raop->logger, LOGGER_INFO, "client HTTP request POST stop");
conn->raop->callbacks.on_video_stop(conn->raop->callbacks.cls);
}
/* handles PUT /setProperty http requests from Client to Server */
static void
http_handler_set_property(raop_conn_t *conn,
http_request_t *request, http_response_t *response,
char **response_data, int *response_datalen) {
const char *url = http_request_get_url(request);
const char *property = url + strlen("/setProperty?");
logger_log(conn->raop->logger, LOGGER_DEBUG, "http_handler_set_property: %s", property);
/* actionAtItemEnd: values:
0: advance (advance to next item, if there is one)
1: pause (pause playing)
2: none (do nothing)
reverseEndTime (only used when rate < 0) time at which reverse playback ends
forwardEndTime (only used when rate > 0) time at which forward playback ends
*/
if (!strcmp(property, "reverseEndTime") ||
!strcmp(property, "forwardEndTime") ||
!strcmp(property, "actionAtItemEnd")) {
logger_log(conn->raop->logger, LOGGER_DEBUG, "property %s is known but unhandled", property);
plist_t errResponse = plist_new_dict();
plist_t errCode = plist_new_uint(0);
plist_dict_set_item(errResponse, "errorCode", errCode);
plist_to_xml(errResponse, response_data, (uint32_t *) response_datalen);
plist_free(errResponse);
http_response_add_header(response, "Content-Type", "text/x-apple-plist+xml");
} else {
logger_log(conn->raop->logger, LOGGER_DEBUG, "property %s is unknown, unhandled", property);
http_response_add_header(response, "Content-Length", "0");
}
}
/* handles GET /getProperty http requests from Client to Server. (not implemented) */
static void
http_handler_get_property(raop_conn_t *conn, http_request_t *request, http_response_t *response,
char **response_data, int *response_datalen) {
const char *url = http_request_get_url(request);
const char *property = url + strlen("getProperty?");
logger_log(conn->raop->logger, LOGGER_DEBUG, "http_handler_get_property: %s (unhandled)", property);
}
/* this request (for a variant FairPlay decryption) cannot be handled by UxPlay */
static void
http_handler_fpsetup2(raop_conn_t *conn, http_request_t *request, http_response_t *response,
char **response_data, int *response_datalen) {
logger_log(conn->raop->logger, LOGGER_WARNING, "client HTTP request POST fp-setup2 is unhandled");
http_response_add_header(response, "Content-Type", "application/x-apple-binary-plist");
int req_datalen;
const unsigned char *req_data = (unsigned char *) http_request_get_data(request, &req_datalen);
logger_log(conn->raop->logger, LOGGER_ERR, "only FairPlay version 0x03 is implemented, version is 0x%2.2x",
req_data[4]);
http_response_init(response, "HTTP/1.1", 421, "Misdirected Request");
}
// called by http_handler_playback_info while preparing response to a GET /playback_info request from the client.
typedef struct time_range_s {
double start;
double duration;
} time_range_t;
void time_range_to_plist(void *time_ranges, const int n_time_ranges,
plist_t time_ranges_node) {
time_range_t *tr = (time_range_t *) time_ranges;
for (int i = 0 ; i < n_time_ranges; i++) {
plist_t time_range_node = plist_new_dict();
plist_t duration_node = plist_new_real(tr[i].duration);
plist_dict_set_item(time_range_node, "duration", duration_node);
plist_t start_node = plist_new_real(tr[i].start);
plist_dict_set_item(time_range_node, "start", start_node);
plist_array_append_item(time_ranges_node, time_range_node);
}
}
// called by http_handler_playback_info while preparing response to a GET /playback_info request from the client.
int create_playback_info_plist_xml(playback_info_t *playback_info, char **plist_xml) {
plist_t res_root_node = plist_new_dict();
plist_t duration_node = plist_new_real(playback_info->duration);
plist_dict_set_item(res_root_node, "duration", duration_node);
plist_t position_node = plist_new_real(playback_info->position);
plist_dict_set_item(res_root_node, "position", position_node);
plist_t rate_node = plist_new_real(playback_info->rate);
plist_dict_set_item(res_root_node, "rate", rate_node);
/* should these be int or bool? */
plist_t ready_to_play_node = plist_new_uint(playback_info->ready_to_play);
plist_dict_set_item(res_root_node, "readyToPlay", ready_to_play_node);
plist_t playback_buffer_empty_node = plist_new_uint(playback_info->playback_buffer_empty);
plist_dict_set_item(res_root_node, "playbackBufferEmpty", playback_buffer_empty_node);
plist_t playback_buffer_full_node = plist_new_uint(playback_info->playback_buffer_full);
plist_dict_set_item(res_root_node, "playbackBufferFull", playback_buffer_full_node);
plist_t playback_likely_to_keep_up_node = plist_new_uint(playback_info->playback_likely_to_keep_up);
plist_dict_set_item(res_root_node, "playbackLikelyToKeepUp", playback_likely_to_keep_up_node);
plist_t loaded_time_ranges_node = plist_new_array();
time_range_to_plist(playback_info->loadedTimeRanges, playback_info->num_loaded_time_ranges,
loaded_time_ranges_node);
plist_dict_set_item(res_root_node, "loadedTimeRanges", loaded_time_ranges_node);
plist_t seekable_time_ranges_node = plist_new_array();
time_range_to_plist(playback_info->seekableTimeRanges, playback_info->num_seekable_time_ranges,
seekable_time_ranges_node);
plist_dict_set_item(res_root_node, "seekableTimeRanges", seekable_time_ranges_node);
int len;
plist_to_xml(res_root_node, plist_xml, (uint32_t *) &len);
/* plist_xml is null-terminated, last character is '\n' */
plist_free(res_root_node);
return len;
}
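For reference, the XML this function produces has the usual libplist layout; the values below are illustrative and some keys are omitted:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>duration</key><real>300.0</real>
    <key>position</key><real>42.5</real>
    <key>rate</key><real>1.0</real>
    <key>readyToPlay</key><integer>1</integer>
    <key>playbackBufferEmpty</key><integer>0</integer>
    <key>loadedTimeRanges</key>
    <array>
        <dict>
            <key>duration</key><real>257.5</real>
            <key>start</key><real>42.5</real>
        </dict>
    </array>
</dict>
</plist>
```

Note that the boolean-like fields are written with plist_new_uint, so they appear as &lt;integer&gt; entries rather than &lt;true/&gt;/&lt;false/&gt;.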
/* this handles requests from the Client for "Playback information" while the Media is playing on the
   Media Player. (The Server gets this information by monitoring the Media Player.) The Client can use
   the information e.g. to update the progress slider it shows for the player (0%-100%).
   It does not affect playing of the Media. */
static void
http_handler_playback_info(raop_conn_t *conn, http_request_t *request, http_response_t *response,
char **response_data, int *response_datalen)
{
logger_log(conn->raop->logger, LOGGER_DEBUG, "http_handler_playback_info");
//const char *session_id = http_request_get_header(request, "X-Apple-Session-ID");
playback_info_t playback_info;
playback_info.stallcount = 0;
playback_info.ready_to_play = true; // ???;
playback_info.playback_buffer_empty = false; // maybe need to get this from playbin
playback_info.playback_buffer_full = true;
playback_info.playback_likely_to_keep_up = true;
conn->raop->callbacks.on_video_acquire_playback_info(conn->raop->callbacks.cls, &playback_info);
if (playback_info.duration == -1.0) {
/* video has finished, reset */
logger_log(conn->raop->logger, LOGGER_DEBUG, "playback_info not available (finishing)");
//httpd_remove_known_connections(conn->raop->httpd);
http_response_set_disconnect(response,1);
conn->raop->callbacks.video_reset(conn->raop->callbacks.cls);
return;
} else if (playback_info.position == -1.0) {
logger_log(conn->raop->logger, LOGGER_DEBUG, "playback_info not available");
return;
}
playback_info.num_loaded_time_ranges = 1;
time_range_t time_ranges_loaded[1];
time_ranges_loaded[0].start = playback_info.position;
time_ranges_loaded[0].duration = playback_info.duration - playback_info.position;
playback_info.loadedTimeRanges = (void *) &time_ranges_loaded;
playback_info.num_seekable_time_ranges = 1;
time_range_t time_ranges_seekable[1];
time_ranges_seekable[0].start = 0.0;
time_ranges_seekable[0].duration = playback_info.position;
playback_info.seekableTimeRanges = (void *) &time_ranges_seekable;
*response_datalen = create_playback_info_plist_xml(&playback_info, response_data);
http_response_add_header(response, "Content-Type", "text/x-apple-plist+xml");
}
/* this handles the POST /reverse request from Client to Server on an AirPlay http channel to "Upgrade"
to "PTTH/1.0" Reverse HTTP protocol proposed in 2009 Internet-Draft
https://datatracker.ietf.org/doc/id/draft-lentczner-rhttp-00.txt .
After the Upgrade the channel becomes a reverse http "AirPlay (reversed)" channel for
http requests from Server to Client.
*/
static void
http_handler_reverse(raop_conn_t *conn, http_request_t *request, http_response_t *response,
char **response_data, int *response_datalen) {
/* get http socket for send */
int socket_fd = httpd_get_connection_socket (conn->raop->httpd, (void *) conn);
if (socket_fd < 0) {
logger_log(conn->raop->logger, LOGGER_ERR, "fcup_request failed to retrieve socket_fd from httpd");
/* shut down connection? */
}
const char *purpose = http_request_get_header(request, "X-Apple-Purpose");
const char *connection = http_request_get_header(request, "Connection");
const char *upgrade = http_request_get_header(request, "Upgrade");
logger_log(conn->raop->logger, LOGGER_INFO, "client requested reverse connection: %s; purpose: %s \"%s\"",
connection, upgrade, purpose);
httpd_set_connection_type(conn->raop->httpd, (void *) conn, CONNECTION_TYPE_PTTH);
int type_PTTH = httpd_count_connection_type(conn->raop->httpd, CONNECTION_TYPE_PTTH);
if (type_PTTH == 1) {
logger_log(conn->raop->logger, LOGGER_DEBUG, "will use socket %d for %s connections", socket_fd, purpose);
http_response_init(response, "HTTP/1.1", 101, "Switching Protocols");
http_response_add_header(response, "Connection", "Upgrade");
http_response_add_header(response, "Upgrade", "PTTH/1.0");
} else {
logger_log(conn->raop->logger, LOGGER_ERR, "multiple PTTH connections (%d) are forbidden", type_PTTH );
}
}
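Schematically, the upgrade exchange this handler implements looks as follows (headers abbreviated; the "event" purpose value is illustrative of what AirPlay clients typically send):

```text
POST /reverse HTTP/1.1
Connection: Upgrade
Upgrade: PTTH/1.0
X-Apple-Purpose: event

HTTP/1.1 101 Switching Protocols
Connection: Upgrade
Upgrade: PTTH/1.0
```

After the 101 response, the roles on this TCP connection are reversed: the Server sends HTTP requests (e.g. POST /event) and the Client sends the responses.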
/* this copies a Media Playlist into a null-terminated string. If it has the "#YT-EXT-CONDENSED-URL"
header, it is also expanded into the full Media Playlist format */
char *adjust_yt_condensed_playlist(const char *media_playlist) {
/* expands a YT-EXT-CONDENSED-URL media playlist into a full media playlist
* returns a pointer to the expanded playlist, WHICH MUST BE FREED AFTER USE */
const char *base_uri_begin;
const char *params_begin;
const char *prefix_begin;
size_t base_uri_len;
size_t params_len;
size_t prefix_len;
const char* ptr = strstr(media_playlist, "#EXTM3U\n");
assert(ptr);
ptr += strlen("#EXTM3U\n");
if (strncmp(ptr, "#YT-EXT-CONDENSED-URL", strlen("#YT-EXT-CONDENSED-URL"))) {
size_t len = strlen(media_playlist);
char * playlist_copy = (char *) malloc(len + 1);
memcpy(playlist_copy, media_playlist, len);
playlist_copy[len] = '\0';
return playlist_copy;
}
ptr = strstr(ptr, "BASE-URI=");
base_uri_begin = strchr(ptr, '"');
base_uri_begin++;
ptr = strchr(base_uri_begin, '"');
base_uri_len = ptr - base_uri_begin;
char *base_uri = (char *) calloc(base_uri_len + 1, sizeof(char));
assert(base_uri);
memcpy(base_uri, base_uri_begin, base_uri_len); //must free
ptr = strstr(ptr, "PARAMS=");
params_begin = strchr(ptr, '"');
params_begin++;
ptr = strchr(params_begin,'"');
params_len = ptr - params_begin;
char *params = (char *) calloc(params_len + 1, sizeof(char));
assert(params);
memcpy(params, params_begin, params_len); //must free
ptr = strstr(ptr, "PREFIX=");
prefix_begin = strchr(ptr, '"');
prefix_begin++;
ptr = strchr(prefix_begin,'"');
prefix_len = ptr - prefix_begin;
char *prefix = (char *) calloc(prefix_len + 1, sizeof(char));
assert(prefix);
memcpy(prefix, prefix_begin, prefix_len); //must free
/* expand params */
int nparams = 0;
int *params_size = NULL;
const char **params_start = NULL;
if (strlen(params)) {
nparams = 1;
char * comma = strchr(params, ',');
while (comma) {
nparams++;
comma++;
comma = strchr(comma, ',');
}
params_start = (const char **) calloc(nparams, sizeof(char *)); //must free
params_size = (int *) calloc(nparams, sizeof(int)); //must free
ptr = params;
for (int i = 0; i < nparams; i++) {
comma = strchr(ptr, ',');
params_start[i] = ptr;
if (comma) {
params_size[i] = (int) (comma - ptr);
ptr = comma;
ptr++;
} else {
params_size[i] = (int) (params + params_len - ptr);
break;
}
}
}
int count = 0;
ptr = strstr(media_playlist, "#EXTINF");
while (ptr) {
count++;
ptr = strstr(++ptr, "#EXTINF");
}
size_t old_size = strlen(media_playlist);
size_t new_size = old_size;
new_size += count * (base_uri_len + params_len);
char * new_playlist = (char *) calloc( new_size + 100, sizeof(char));
const char *old_pos = media_playlist;
char *new_pos = new_playlist;
ptr = old_pos;
ptr = strstr(old_pos, "#EXTINF:");
size_t len = ptr - old_pos;
/* copy header section before chunks */
memcpy(new_pos, old_pos, len);
old_pos += len;
new_pos += len;
while (ptr) {
/* for each chunk */
const char *end = NULL;
char *start = strstr(ptr, prefix);
len = start - ptr;
/* copy first line of chunk entry */
memcpy(new_pos, old_pos, len);
old_pos += len;
new_pos += len;
/* copy base uri to replace prefix*/
memcpy(new_pos, base_uri, base_uri_len);
new_pos += base_uri_len;
old_pos += prefix_len;
ptr = strstr(old_pos, "#EXTINF:");
/* insert the PARAMS separators on the slices line */
end = old_pos;
int last = nparams - 1;
for (int i = 0; i < nparams; i++) {
if (i != last) {
end = strchr(end, '/');
} else {
end = strstr(end, "#EXT"); /* the next line starts with either #EXTINF (usually) or #EXT-X-ENDLIST (at last chunk)*/
}
*new_pos = '/';
new_pos++;
memcpy(new_pos, params_start[i], params_size[i]);
new_pos += params_size[i];
*new_pos = '/';
new_pos++;
len = end - old_pos;
end++;
memcpy (new_pos, old_pos, len);
new_pos += len;
old_pos += len;
if (i != last) {
old_pos++; /* last entry is not followed by "/" separator */
}
}
}
/* copy tail */
len = media_playlist + strlen(media_playlist) - old_pos;
memcpy(new_pos, old_pos, len);
new_pos += len;
old_pos += len;
new_playlist[new_size] = '\0';
free (prefix);
free (base_uri);
free (params);
if (params_size) {
free (params_size);
}
if (params_start) {
free (params_start);
}
return new_playlist;
}
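As a schematic illustration (all field values here are invented), a condensed Media Playlist carries a header of the form:

```text
#EXTM3U
#YT-EXT-CONDENSED-URL:BASE-URI="https://example.com/videoplayback",PARAMS="sq,lmt",PREFIX="seg"
```

Each chunk entry then lists only the PREFIX and the slash-separated parameter values; the expansion above replaces the prefix with BASE-URI and inserts each parameter name from PARAMS before its value, producing a standard absolute chunk URI shaped like `https://example.com/videoplayback/sq/<value1>/lmt/<value2>`.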
/* this adjusts the uri prefixes in the Master Playlist, for sending to the Media Player running on the Server Host */
char *adjust_master_playlist (char *fcup_response_data, int fcup_response_datalen, char *uri_prefix, char *uri_local_prefix) {
size_t uri_prefix_len = strlen(uri_prefix);
size_t uri_local_prefix_len = strlen(uri_local_prefix);
int counter = 0;
char *ptr = strstr(fcup_response_data, uri_prefix);
while (ptr != NULL) {
counter++;
ptr++;
ptr = strstr(ptr, uri_prefix);
}
size_t len = uri_local_prefix_len - uri_prefix_len;
len *= counter;
len += fcup_response_datalen;
char *new_master = (char *) malloc(len + 1);
*(new_master + len) = '\0';
char *first = fcup_response_data;
char *new = new_master;
char *last = strstr(first, uri_prefix);
counter = 0;
while (last != NULL) {
counter++;
len = last - first;
memcpy(new, first, len);
first = last + uri_prefix_len;
new += len;
memcpy(new, uri_local_prefix, uri_local_prefix_len);
new += uri_local_prefix_len;
last = strstr(last + uri_prefix_len, uri_prefix);
if (last == NULL) {
len = fcup_response_data + fcup_response_datalen - first;
memcpy(new, first, len);
break;
}
}
return new_master;
}
/* this parses the Master Playlist to make a table of the Media Playlist URIs that it lists */
int create_media_uri_table(const char *url_prefix, const char *master_playlist_data, int datalen,
char ***media_uri_table, int *num_uri) {
char *ptr = strstr(master_playlist_data, url_prefix);
char ** table = NULL;
if (ptr == NULL) {
return -1;
}
int count = 0;
while (ptr != NULL) {
char *end = strstr(ptr, "m3u8");
if (end == NULL) {
return 1;
}
end += sizeof("m3u8");
count++;
ptr = strstr(end, url_prefix);
}
table = (char **) calloc(count, sizeof(char *));
if (!table) {
return -1;
}
for (int i = 0; i < count; i++) {
table[i] = NULL;
}
ptr = strstr(master_playlist_data, url_prefix);
count = 0;
while (ptr != NULL) {
char *end = strstr(ptr, "m3u8");
char *uri;
if (end == NULL) {
return 0;
}
end += sizeof("m3u8");
size_t len = end - ptr - 1;
uri = (char *) calloc(len + 1, sizeof(char));
memcpy(uri , ptr, len);
table[count] = uri;
uri = NULL;
count ++;
ptr = strstr(end, url_prefix);
}
*num_uri = count;
*media_uri_table = table;
return 0;
}
/* the POST /action request from Client to Server on the AirPlay http channel follows a POST /event "FCUP Request"
from Server to Client on the reverse http channel, requesting a HLS playlist (first the Master Playlist, then the
Media Playlists listed in it). The POST /action request contains the playlist requested by the Server in the
preceding "FCUP Request". The FCUP Request sequence continues until the Server has obtained all Media Playlists. */
static void
http_handler_action(raop_conn_t *conn, http_request_t *request, http_response_t *response,
char **response_data, int *response_datalen) {
bool data_is_plist = false;
plist_t req_root_node = NULL;
uint64_t uint_val;
int request_id = 0;
int fcup_response_statuscode = 0;
bool logger_debug = (logger_get_level(conn->raop->logger) >= LOGGER_DEBUG);
const char* session_id = http_request_get_header(request, "X-Apple-Session-ID");
if (!session_id) {
logger_log(conn->raop->logger, LOGGER_ERR, "Play request had no X-Apple-Session-ID");
goto post_action_error;
}
const char *apple_session_id = get_apple_session_id(conn->raop->airplay_video);
if (strcmp(session_id, apple_session_id)){
logger_log(conn->raop->logger, LOGGER_ERR, "X-Apple-Session-ID has changed:\n was:\"%s\"\n now:\"%s\"",
apple_session_id, session_id);
goto post_action_error;
}
/* verify that this request contains a binary plist*/
char *header_str = NULL;
http_request_get_header_string(request, &header_str);
logger_log(conn->raop->logger, LOGGER_DEBUG, "request header: %s", header_str);
data_is_plist = (strstr(header_str,"apple-binary-plist") != NULL);
free(header_str);
if (!data_is_plist) {
logger_log(conn->raop->logger, LOGGER_INFO, "POST /action: did not receive expected plist from client");
goto post_action_error;
}
/* extract the root_node plist */
int request_datalen = 0;
const char *request_data = http_request_get_data(request, &request_datalen);
if (request_datalen == 0) {
logger_log(conn->raop->logger, LOGGER_INFO, "POST /action: did not receive expected plist from client");
goto post_action_error;
}
plist_from_bin(request_data, request_datalen, &req_root_node);
/* determine type of data */
plist_t req_type_node = plist_dict_get_item(req_root_node, "type");
if (!PLIST_IS_STRING(req_type_node)) {
goto post_action_error;
}
/* three possible types are known */
char *type = NULL;
int action_type = 0;
plist_get_string_val(req_type_node, &type);
logger_log(conn->raop->logger, LOGGER_DEBUG, "action type is %s", type);
if (strstr(type, "unhandledURLResponse")) {
action_type = 1;
} else if (strstr(type, "playlistInsert")) {
action_type = 2;
} else if (strstr(type, "playlistRemove")) {
action_type = 3;
}
free (type);
plist_t req_params_node = NULL;
switch (action_type) {
case 1:
goto unhandledURLResponse;
case 2:
logger_log(conn->raop->logger, LOGGER_INFO, "unhandled action type playlistInsert (add new playback)");
goto finish;
case 3:
logger_log(conn->raop->logger, LOGGER_INFO, "unhandled action type playlistRemove (stop playback)");
goto finish;
default:
logger_log(conn->raop->logger, LOGGER_INFO, "unknown action type (unhandled)");
goto finish;
}
unhandledURLResponse:;
req_params_node = plist_dict_get_item(req_root_node, "params");
if (!PLIST_IS_DICT (req_params_node)) {
goto post_action_error;
}
/* handling type "unhandledURLResponse" (case 1)*/
uint_val = 0;
int fcup_response_datalen = 0;
if (logger_debug) {
plist_t plist_fcup_response_statuscode_node = plist_dict_get_item(req_params_node,
"FCUP_Response_StatusCode");
if (plist_fcup_response_statuscode_node) {
plist_get_uint_val(plist_fcup_response_statuscode_node, &uint_val);
fcup_response_statuscode = (int) uint_val;
uint_val = 0;
logger_log(conn->raop->logger, LOGGER_DEBUG, "FCUP_Response_StatusCode = %d",
fcup_response_statuscode);
}
plist_t plist_fcup_response_requestid_node = plist_dict_get_item(req_params_node,
"FCUP_Response_RequestID");
if (plist_fcup_response_requestid_node) {
plist_get_uint_val(plist_fcup_response_requestid_node, &uint_val);
request_id = (int) uint_val;
uint_val = 0;
logger_log(conn->raop->logger, LOGGER_DEBUG, "FCUP_Response_RequestID = %d", request_id);
}
}
plist_t plist_fcup_response_url_node = plist_dict_get_item(req_params_node, "FCUP_Response_URL");
if (!PLIST_IS_STRING(plist_fcup_response_url_node)) {
goto post_action_error;
}
char *fcup_response_url = NULL;
plist_get_string_val(plist_fcup_response_url_node, &fcup_response_url);
if (!fcup_response_url) {
goto post_action_error;
}
logger_log(conn->raop->logger, LOGGER_DEBUG, "FCUP_Response_URL = %s", fcup_response_url);
plist_t plist_fcup_response_data_node = plist_dict_get_item(req_params_node, "FCUP_Response_Data");
if (!PLIST_IS_DATA(plist_fcup_response_data_node)){
goto post_action_error;
}
uint_val = 0;
char *fcup_response_data = NULL;
plist_get_data_val(plist_fcup_response_data_node, &fcup_response_data, &uint_val);
fcup_response_datalen = (int) uint_val;
if (!fcup_response_data) {
free (fcup_response_url);
goto post_action_error;
}
if (logger_debug) {
logger_log(conn->raop->logger, LOGGER_DEBUG, "FCUP_Response datalen = %d", fcup_response_datalen);
char *data = malloc(fcup_response_datalen + 1);
memcpy(data, fcup_response_data, fcup_response_datalen);
data[fcup_response_datalen] = '\0';
logger_log(conn->raop->logger, LOGGER_DEBUG, "begin FCUP Response data:\n%s\nend FCUP Response data",data);
free (data);
}
char *ptr = strstr(fcup_response_url, "/master.m3u8");
if (ptr) {
/* this is a master playlist */
char *uri_prefix = get_uri_prefix(conn->raop->airplay_video);
char ** media_data_store = NULL;
int num_uri = 0;
char *uri_local_prefix = get_uri_local_prefix(conn->raop->airplay_video);
char *new_master = adjust_master_playlist (fcup_response_data, fcup_response_datalen, uri_prefix, uri_local_prefix);
store_master_playlist(conn->raop->airplay_video, new_master);
create_media_uri_table(uri_prefix, fcup_response_data, fcup_response_datalen, &media_data_store, &num_uri);
create_media_data_store(conn->raop->airplay_video, media_data_store, num_uri);
num_uri = get_num_media_uri(conn->raop->airplay_video);
set_next_media_uri_id(conn->raop->airplay_video, 0);
} else {
/* this is a media playlist */
assert(fcup_response_data);
char *playlist = (char *) calloc(fcup_response_datalen + 1, sizeof(char));
memcpy(playlist, fcup_response_data, fcup_response_datalen);
int uri_num = get_next_media_uri_id(conn->raop->airplay_video);
--uri_num; // (next num is current num + 1)
store_media_data_playlist_by_num(conn->raop->airplay_video, playlist, uri_num);
float duration = 0.0f;
int count = analyze_media_playlist(playlist, &duration);
if (count) {
logger_log(conn->raop->logger, LOGGER_DEBUG,
"\n%s:\nreceived media playlist has %5d chunks, total duration %9.3f secs\n",
fcup_response_url, count, duration);
}
}
if (fcup_response_data) {
free (fcup_response_data);
}
if (fcup_response_url) {
free (fcup_response_url);
}
int num_uri = get_num_media_uri(conn->raop->airplay_video);
int uri_num = get_next_media_uri_id(conn->raop->airplay_video);
if (uri_num < num_uri) {
fcup_request((void *) conn, get_media_uri_by_num(conn->raop->airplay_video, uri_num),
apple_session_id,
get_next_FCUP_RequestID(conn->raop->airplay_video));
set_next_media_uri_id(conn->raop->airplay_video, ++uri_num);
} else {
char * uri_local_prefix = get_uri_local_prefix(conn->raop->airplay_video);
conn->raop->callbacks.on_video_play(conn->raop->callbacks.cls,
strcat(uri_local_prefix, "/master.m3u8"),
get_start_position_seconds(conn->raop->airplay_video));
}
finish:
plist_free(req_root_node);
return;
post_action_error:;
http_response_init(response, "HTTP/1.1", 400, "Bad Request");
if (req_root_node) {
plist_free(req_root_node);
}
}
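The overall FCUP sequence this handler participates in can be sketched as follows (a summary of the flow described in the comments above, not an exhaustive trace; the port in the last line is whatever the local HLS channel uses):

```text
Client                                                    Server
  | POST /play (Content-Location, Start-Position-Seconds)  -->
  |  <--  POST /event "FCUP Request" for master.m3u8  (reverse channel)
  | POST /action (FCUP_Response_Data = Master Playlist)    -->
  |  <--  POST /event "FCUP Request" for Media Playlist 1
  | POST /action (FCUP_Response_Data = Media Playlist 1)   -->
  |   ... repeats for each Media Playlist in the table ...
  | when all playlists are stored, the Server starts the local
  | media player on http://localhost:<port>/master.m3u8
```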
/* The POST /play request from the Client to Server on the AirPlay http channel contains (among other information)
the "Content Location" that specifies the HLS Playlists for the video to be streamed, as well as the video
"start position in seconds". Once this request is received by the Server, the Server sends a POST /event
"FCUP Request" request to the Client on the reverse http channel, to request the HLS Master Playlist */
static void
http_handler_play(raop_conn_t *conn, http_request_t *request, http_response_t *response,
char **response_data, int *response_datalen) {
char* playback_location = NULL;
plist_t req_root_node = NULL;
float start_position_seconds = 0.0f;
bool data_is_binary_plist = false;
bool data_is_text = false;
bool data_is_octet = false;
logger_log(conn->raop->logger, LOGGER_DEBUG, "http_handler_play");
const char* session_id = http_request_get_header(request, "X-Apple-Session-ID");
if (!session_id) {
logger_log(conn->raop->logger, LOGGER_ERR, "Play request had no X-Apple-Session-ID");
goto play_error;
}
const char *apple_session_id = get_apple_session_id(conn->raop->airplay_video);
if (strcmp(session_id, apple_session_id)){
logger_log(conn->raop->logger, LOGGER_ERR, "X-Apple-Session-ID has changed:\n was:\"%s\"\n now:\"%s\"",
apple_session_id, session_id);
goto play_error;
}
int request_datalen = -1;
const char *request_data = http_request_get_data(request, &request_datalen);
if (request_datalen > 0) {
char *header_str = NULL;
http_request_get_header_string(request, &header_str);
logger_log(conn->raop->logger, LOGGER_DEBUG, "request header:\n%s", header_str);
data_is_binary_plist = (strstr(header_str, "x-apple-binary-plist") != NULL);
data_is_text = (strstr(header_str, "text/parameters") != NULL);
data_is_octet = (strstr(header_str, "octet-stream") != NULL);
free (header_str);
}
if (!data_is_text && !data_is_octet && !data_is_binary_plist) {
goto play_error;
}
if (data_is_text) {
logger_log(conn->raop->logger, LOGGER_ERR, "Play request Content is text (unsupported)");
goto play_error;
}
if (data_is_octet) {
logger_log(conn->raop->logger, LOGGER_ERR, "Play request Content is octet-stream (unsupported)");
goto play_error;
}
if (data_is_binary_plist) {
plist_from_bin(request_data, request_datalen, &req_root_node);
plist_t req_uuid_node = plist_dict_get_item(req_root_node, "uuid");
if (!req_uuid_node) {
goto play_error;
} else {
char* playback_uuid = NULL;
plist_get_string_val(req_uuid_node, &playback_uuid);
set_playback_uuid(conn->raop->airplay_video, playback_uuid);
free (playback_uuid);
}
plist_t req_content_location_node = plist_dict_get_item(req_root_node, "Content-Location");
if (!req_content_location_node) {
goto play_error;
} else {
plist_get_string_val(req_content_location_node, &playback_location);
}
plist_t req_start_position_seconds_node = plist_dict_get_item(req_root_node, "Start-Position-Seconds");
if (!req_start_position_seconds_node) {
logger_log(conn->raop->logger, LOGGER_INFO, "No Start-Position-Seconds in Play request");
} else {
double start_position = 0.0;
plist_get_real_val(req_start_position_seconds_node, &start_position);
start_position_seconds = (float) start_position;
}
set_start_position_seconds(conn->raop->airplay_video, (float) start_position_seconds);
}
char *ptr = strstr(playback_location, "/master.m3u8");
int prefix_len = (int) (ptr - playback_location);
set_uri_prefix(conn->raop->airplay_video, playback_location, prefix_len);
set_next_media_uri_id(conn->raop->airplay_video, 0);
fcup_request((void *) conn, playback_location, apple_session_id, get_next_FCUP_RequestID(conn->raop->airplay_video));
if (playback_location) {
free (playback_location);
}
if (req_root_node) {
plist_free(req_root_node);
}
return;
play_error:;
if (req_root_node) {
plist_free(req_root_node);
}
logger_log(conn->raop->logger, LOGGER_ERR, "Could not find valid Plist Data for /play, Unhandled");
http_response_init(response, "HTTP/1.1", 400, "Bad Request");
}
/* the HLS handler handles http requests GET /[uri] on the HLS channel from the media player to the Server, asking for
(adjusted) copies of Playlists: first the Master Playlist (adjusted to change the uri prefix to
"http://localhost:[port]/.......m3u8"), then the Media Playlists that the media player wishes to use.
If the client supplied Media playlists with the "YT-EXT-CONDENSED-URI" header, these must be adjusted into
the standard uncondensed form before sending with the response. The uri in the request is the uri for the
Media Playlist, taken from the Master Playlist, with the uri prefix removed.
*/
static void
http_handler_hls(raop_conn_t *conn, http_request_t *request, http_response_t *response,
char **response_data, int *response_datalen) {
const char *method = http_request_get_method(request);
assert (!strcmp(method, "GET"));
const char *url = http_request_get_url(request);
const char* upgrade = http_request_get_header(request, "Upgrade");
if (upgrade) {
//don't accept Upgrade: h2c request ?
return;
}
if (!strcmp(url, "/master.m3u8")){
char * master_playlist = get_master_playlist(conn->raop->airplay_video);
size_t len = strlen(master_playlist);
char * data = (char *) malloc(len + 1);
memcpy(data, master_playlist, len);
data[len] = '\0';
*response_data = data;
*response_datalen = (int ) len;
} else {
int num = get_media_playlist_by_uri(conn->raop->airplay_video, url);
if (num < 0) {
logger_log(conn->raop->logger, LOGGER_ERR,"Requested playlist %s not found", url);
assert(0);
} else {
char *media_playlist = get_media_playlist_by_num(conn->raop->airplay_video, num);
assert(media_playlist);
char *data = adjust_yt_condensed_playlist(media_playlist);
*response_data = data;
*response_datalen = strlen(data);
float duration = 0.0f;
int chunks = analyze_media_playlist(data, &duration);
logger_log(conn->raop->logger, LOGGER_INFO,
"Requested media_playlist %s has %5d chunks, total duration %9.3f secs", url, chunks, duration);
}
}
http_response_add_header(response, "Access-Control-Allow-Headers", "Content-type");
http_response_add_header(response, "Access-Control-Allow-Origin", "*");
const char *date;
date = gmt_time_string();
http_response_add_header(response, "Date", date);
if (*response_datalen > 0) {
http_response_add_header(response, "Content-Type", "application/x-mpegURL; charset=utf-8");
} else if (*response_datalen == 0) {
http_response_init(response, "HTTP/1.1", 404, "Not Found");
}
}
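A request on the HLS channel is then an ordinary local HTTP fetch; the port and date below are illustrative:

```text
GET /master.m3u8 HTTP/1.1
Host: localhost:7100

HTTP/1.1 200 OK
Content-Type: application/x-mpegURL; charset=utf-8
Access-Control-Allow-Headers: Content-type
Access-Control-Allow-Origin: *
Date: Mon, 20 Jan 2025 07:29:46 GMT
```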

View File

@@ -27,6 +27,7 @@ struct http_request_s {
llhttp_t parser;
llhttp_settings_t parser_settings;
bool is_reverse; // if true, this is a reverse-response from client
const char *method;
char *url;
char protocol[9];
@@ -160,7 +161,7 @@ http_request_init(void)
llhttp_init(&request->parser, HTTP_REQUEST, &request->parser_settings);
request->parser.data = request;
request->is_reverse = false;
return request;
}
@@ -206,6 +207,9 @@ int
http_request_has_error(http_request_t *request)
{
assert(request);
if (request->is_reverse) {
return 0;
}
return (llhttp_get_errno(&request->parser) != HPE_OK);
}
@@ -213,6 +217,9 @@ const char *
http_request_get_error_name(http_request_t *request)
{
assert(request);
if (request->is_reverse) {
return NULL;
}
return llhttp_errno_name(llhttp_get_errno(&request->parser));
}
@@ -220,6 +227,9 @@ const char *
http_request_get_error_description(http_request_t *request)
{
assert(request);
if (request->is_reverse) {
return NULL;
}
return llhttp_get_error_reason(&request->parser);
}
@@ -227,6 +237,9 @@ const char *
http_request_get_method(http_request_t *request)
{
assert(request);
if (request->is_reverse) {
return NULL;
}
return request->method;
}
@@ -234,6 +247,9 @@ const char *
http_request_get_url(http_request_t *request)
{
assert(request);
if (request->is_reverse) {
return NULL;
}
return request->url;
}
@@ -241,6 +257,9 @@ const char *
http_request_get_protocol(http_request_t *request)
{
assert(request);
if (request->is_reverse) {
return NULL;
}
return request->protocol;
}
@@ -250,6 +269,9 @@ http_request_get_header(http_request_t *request, const char *name)
int i;
assert(request);
if (request->is_reverse) {
return NULL;
}
for (i=0; i<request->headers_size; i+=2) {
if (!strcmp(request->headers[i], name)) {
@@ -263,7 +285,6 @@ const char *
http_request_get_data(http_request_t *request, int *datalen)
{
assert(request);
if (datalen) {
*datalen = request->datalen;
}
@@ -277,6 +298,10 @@ http_request_get_header_string(http_request_t *request, char **header_str)
*header_str = NULL;
return 0;
}
if (request->is_reverse) {
*header_str = NULL;
return 0;
}
int len = 0;
for (int i = 0; i < request->headers_size; i++) {
len += strlen(request->headers[i]);
@@ -309,3 +334,11 @@ http_request_get_header_string(http_request_t *request, char **header_str)
assert(p == &(str[len]));
return len;
}
bool http_request_is_reverse(http_request_t *request) {
return request->is_reverse;
}
void http_request_set_reverse(http_request_t *request) {
request->is_reverse = true;
}

View File

@@ -15,8 +15,9 @@
#ifndef HTTP_REQUEST_H
#define HTTP_REQUEST_H
typedef struct http_request_s http_request_t;
#include <stdbool.h>
typedef struct http_request_s http_request_t;
http_request_t *http_request_init(void);
@@ -32,6 +33,8 @@ const char *http_request_get_protocol(http_request_t *request);
const char *http_request_get_header(http_request_t *request, const char *name);
const char *http_request_get_data(http_request_t *request, int *datalen);
int http_request_get_header_string(http_request_t *request, char **header_str);
bool http_request_is_reverse(http_request_t *request);
void http_request_set_reverse(http_request_t *request);
void http_request_destroy(http_request_t *request);

View File

@@ -91,6 +91,21 @@ http_response_init(http_response_t *response, const char *protocol, int code, co
http_response_add_data(response, "\r\n", 2);
}
void
http_response_reverse_request_init(http_response_t *request, const char *method, const char *url, const char *protocol)
{
assert(request);
request->data_length = 0; /* reinitialize a previously-initialized response as a reverse-HTTP (PTTH/1.0) request */
/* Add first line of response to the data array */
http_response_add_data(request, method, strlen(method));
http_response_add_data(request, " ", 1);
http_response_add_data(request, url, strlen(url));
http_response_add_data(request, " ", 1);
http_response_add_data(request, protocol, strlen(protocol));
http_response_add_data(request, "\r\n", 2);
}
void
http_response_destroy(http_response_t *response)
{

View File

@@ -22,6 +22,8 @@ typedef struct http_response_s http_response_t;
http_response_t *http_response_create();
void http_response_init(http_response_t *response, const char *protocol, int code, const char *message);
void http_response_reverse_request_init(http_response_t *request, const char *method, const char *url,
const char *protocol);
void http_response_add_header(http_response_t *response, const char *name, const char *value);
void http_response_finish(http_response_t *response, const char *data, int datalen);

View File

@@ -20,12 +20,22 @@
#include <stdio.h>
#include <assert.h>
#include <stdbool.h>
#include <errno.h>
#include "httpd.h"
#include "netutils.h"
#include "http_request.h"
#include "compat.h"
#include "logger.h"
#include "utils.h"
static const char *typename[] = {
[CONNECTION_TYPE_UNKNOWN] = "Unknown",
[CONNECTION_TYPE_RAOP] = "RAOP",
[CONNECTION_TYPE_AIRPLAY] = "AirPlay",
[CONNECTION_TYPE_PTTH] = "AirPlay (reversed)",
[CONNECTION_TYPE_HLS] = "HLS"
};
struct http_connection_s {
int connected;
@@ -57,6 +67,25 @@ struct httpd_s {
int server_fd6;
};
const char *
httpd_get_connection_typename (connection_type_t type) {
return typename[type];
}
int
httpd_get_connection_socket (httpd_t *httpd, void *user_data) {
for (int i = 0; i < httpd->max_connections; i++) {
http_connection_t *connection = &httpd->connections[i];
if (!connection->connected) {
continue;
}
if (connection->user_data == user_data) {
return connection->socket_fd;
}
}
return -1;
}
int
httpd_set_connection_type (httpd_t *httpd, void *user_data, connection_type_t type) {
for (int i = 0; i < httpd->max_connections; i++) {
@@ -87,6 +116,42 @@ httpd_count_connection_type (httpd_t *httpd, connection_type_t type) {
return count;
}
int
httpd_get_connection_socket_by_type (httpd_t *httpd, connection_type_t type, int instance){
int count = 0;
for (int i = 0; i < httpd->max_connections; i++) {
http_connection_t *connection = &httpd->connections[i];
if (!connection->connected) {
continue;
}
if (connection->type == type) {
count++;
if (count == instance) {
return connection->socket_fd;
}
}
}
return 0;
}
void *
httpd_get_connection_by_type (httpd_t *httpd, connection_type_t type, int instance){
int count = 0;
for (int i = 0; i < httpd->max_connections; i++) {
http_connection_t *connection = &httpd->connections[i];
if (!connection->connected) {
continue;
}
if (connection->type == type) {
count++;
if (count == instance) {
return connection->user_data;
}
}
}
return NULL;
}
#define MAX_CONNECTIONS 12 /* value used in AppleTV 3*/
httpd_t *
httpd_init(logger_t *logger, httpd_callbacks_t *callbacks, int nohold)
@@ -101,7 +166,6 @@ httpd_init(logger_t *logger, httpd_callbacks_t *callbacks, int nohold)
return NULL;
}
httpd->nohold = (nohold ? 1 : 0);
httpd->max_connections = MAX_CONNECTIONS;
httpd->connections = calloc(httpd->max_connections, sizeof(http_connection_t));
@@ -213,7 +277,7 @@ httpd_accept_connection(httpd_t *httpd, int server_fd, int is_ipv6)
local = netutils_get_address(&local_saddr, &local_len, &local_zone_id);
remote = netutils_get_address(&remote_saddr, &remote_len, &remote_zone_id);
assert (local_zone_id == remote_zone_id);
ret = httpd_add_connection(httpd, fd, local, local_len, remote, remote_len, local_zone_id);
if (ret == -1) {
shutdown(fd, SHUT_RDWR);
@@ -235,7 +299,7 @@ httpd_remove_known_connections(httpd_t *httpd) {
if (!connection->connected || connection->type == CONNECTION_TYPE_UNKNOWN) {
continue;
}
httpd_remove_connection(httpd, connection);
httpd_remove_connection(httpd, connection);
}
}
@@ -243,10 +307,11 @@ static THREAD_RETVAL
httpd_thread(void *arg)
{
httpd_t *httpd = arg;
char http[] = "HTTP/1.1";
char buffer[1024];
int i;
bool logger_debug = (logger_get_level(httpd->logger) >= LOGGER_DEBUG);
assert(httpd);
while (1) {
@@ -254,6 +319,7 @@ httpd_thread(void *arg)
struct timeval tv;
int nfds=0;
int ret;
int new_request;
MUTEX_LOCK(httpd->run_mutex);
if (!httpd->running) {
@@ -299,7 +365,7 @@ httpd_thread(void *arg)
/* Timeout happened */
continue;
} else if (ret == -1) {
logger_log(httpd->logger, LOGGER_ERR, "httpd error in select");
logger_log(httpd->logger, LOGGER_ERR, "httpd error in select: %d %s", errno, strerror(errno));
break;
}
@@ -337,20 +403,93 @@ httpd_thread(void *arg)
if (!connection->request) {
connection->request = http_request_init();
assert(connection->request);
}
new_request = 1;
if (connection->type == CONNECTION_TYPE_PTTH) {
    http_request_set_reverse(connection->request);
}
logger_log(httpd->logger, LOGGER_DEBUG, "new request, connection %d, socket %d type %s",
i, connection->socket_fd, typename [connection->type]);
} else {
new_request = 0;
}
logger_log(httpd->logger, LOGGER_DEBUG, "httpd receiving on socket %d, connection %d", connection->socket_fd, i);
ret = recv(connection->socket_fd, buffer, sizeof(buffer), 0);
if (ret == 0) {
logger_log(httpd->logger, LOGGER_INFO, "Connection closed for socket %d", connection->socket_fd);
httpd_remove_connection(httpd, connection);
logger_log(httpd->logger, LOGGER_DEBUG, "httpd receiving on socket %d, connection %d",
connection->socket_fd, i);
if (logger_debug) {
logger_log(httpd->logger, LOGGER_DEBUG,"\nhttpd: current connections:");
for (int i = 0; i < httpd->max_connections; i++) {
http_connection_t *connection = &httpd->connections[i];
if(!connection->connected) {
continue;
}
if (!FD_ISSET(connection->socket_fd, &rfds)) {
logger_log(httpd->logger, LOGGER_DEBUG, "connection %d type %d socket %d conn %p %s", i,
connection->type, connection->socket_fd,
connection->user_data, typename [connection->type]);
} else {
logger_log(httpd->logger, LOGGER_DEBUG, "connection %d type %d socket %d conn %p %s ACTIVE CONNECTION",
i, connection->type, connection->socket_fd, connection->user_data, typename [connection->type]);
}
}
logger_log(httpd->logger, LOGGER_DEBUG, " ");
}
/* reverse-http responses from the client must not be sent to the llhttp parser:
* such messages start with "HTTP/1.1" */
if (new_request) {
int readstart = 0;
new_request = 0;
while (readstart < 8) {
ret = recv(connection->socket_fd, buffer + readstart, sizeof(buffer) - 1 - readstart, 0);
if (ret == 0) {
logger_log(httpd->logger, LOGGER_INFO, "Connection closed for socket %d",
connection->socket_fd);
break;
} else if (ret == -1) {
if (errno == EAGAIN) {
continue;
} else {
int sock_err = SOCKET_GET_ERROR();
logger_log(httpd->logger, LOGGER_ERR, "httpd: recv socket error %d:%s",
sock_err, strerror(sock_err));
break;
}
} else {
readstart += ret;
ret = readstart;
}
}
if (!memcmp(buffer, http, 8)) {
http_request_set_reverse(connection->request);
}
} else {
ret = recv(connection->socket_fd, buffer, sizeof(buffer) - 1, 0);
if (ret == 0) {
logger_log(httpd->logger, LOGGER_INFO, "Connection closed for socket %d",
connection->socket_fd);
httpd_remove_connection(httpd, connection);
continue;
}
}
if (http_request_is_reverse(connection->request)) {
/* this is a response from the client to a
* GET /event reverse HTTP request from the server */
if (ret && logger_debug) {
buffer[ret] = '\0';
logger_log(httpd->logger, LOGGER_INFO, "<<<< received response from client"
" (reversed HTTP = \"PTTH/1.0\") connection"
" on socket %d:\n%s\n", connection->socket_fd, buffer);
}
if (ret == 0) {
httpd_remove_connection(httpd, connection);
}
continue;
}
/* Parse HTTP request from data read from connection */
http_request_add_data(connection->request, buffer, ret);
if (http_request_has_error(connection->request)) {
logger_log(httpd->logger, LOGGER_ERR, "httpd error in parsing: %s", http_request_get_error_name(connection->request));
logger_log(httpd->logger, LOGGER_ERR, "httpd error in parsing: %s",
http_request_get_error_name(connection->request));
httpd_remove_connection(httpd, connection);
continue;
}
@@ -359,12 +498,13 @@ httpd_thread(void *arg)
if (http_request_is_complete(connection->request)) {
http_response_t *response = NULL;
// Callback the received data to raop
if (logger_debug) {
if (logger_debug) {
const char *method = http_request_get_method(connection->request);
const char *url = http_request_get_url(connection->request);
const char *protocol = http_request_get_protocol(connection->request);
logger_log(httpd->logger, LOGGER_INFO, "httpd request received on socket %d, connection %d, "
"method = %s, url = %s, protocol = %s", connection->socket_fd, i, method, url, protocol);
logger_log(httpd->logger, LOGGER_INFO, "httpd request received on socket %d, "
"connection %d, method = %s, url = %s, protocol = %s",
connection->socket_fd, i, method, url, protocol);
}
httpd->callbacks.conn_request(connection->user_data, connection->request, &response);
http_request_destroy(connection->request);

View File

@@ -23,7 +23,10 @@ typedef struct httpd_s httpd_t;
typedef enum connectype_type_e {
CONNECTION_TYPE_UNKNOWN,
CONNECTION_TYPE_RAOP
CONNECTION_TYPE_RAOP,
CONNECTION_TYPE_AIRPLAY,
CONNECTION_TYPE_PTTH,
CONNECTION_TYPE_HLS
} connection_type_t;
struct httpd_callbacks_s {
@@ -39,7 +42,10 @@ void httpd_remove_known_connections(httpd_t *httpd);
int httpd_set_connection_type (httpd_t *http, void *user_data, connection_type_t type);
int httpd_count_connection_type (httpd_t *http, connection_type_t type);
int httpd_get_connection_socket (httpd_t *httpd, void *user_data);
int httpd_get_connection_socket_by_type (httpd_t *httpd, connection_type_t type, int instance);
const char *httpd_get_connection_typename (connection_type_t type);
void *httpd_get_connection_by_type (httpd_t *httpd, connection_type_t type, int instance);
httpd_t *httpd_init(logger_t *logger, httpd_callbacks_t *callbacks, int nohold);
int httpd_is_running(httpd_t *httpd);

View File

@@ -72,6 +72,12 @@ struct raop_s {
/* public key as string */
char pk_str[2*ED25519_KEY_SIZE + 1];
/* place to store media_data_store */
airplay_video_t *airplay_video;
/* activate support for HLS live streaming */
bool hls_support;
};
struct raop_conn_s {
@@ -81,7 +87,8 @@ struct raop_conn_s {
raop_rtp_mirror_t *raop_rtp_mirror;
fairplay_t *fairplay;
pairing_session_t *session;
airplay_video_t *airplay_video;
unsigned char *local;
int locallen;
@@ -92,11 +99,14 @@ struct raop_conn_s {
connection_type_t connection_type;
char *client_session_id;
bool have_active_remote;
};
typedef struct raop_conn_s raop_conn_t;
#include "raop_handlers.h"
#include "http_handlers.h"
static void *
conn_init(void *opaque, unsigned char *local, int locallen, unsigned char *remote, int remotelen, unsigned int zone_id) {
@@ -147,6 +157,9 @@ conn_init(void *opaque, unsigned char *local, int locallen, unsigned char *remot
conn->remotelen = remotelen;
conn->connection_type = CONNECTION_TYPE_UNKNOWN;
conn->client_session_id = NULL;
conn->airplay_video = NULL;
conn->have_active_remote = false;
@@ -162,35 +175,110 @@ conn_request(void *ptr, http_request_t *request, http_response_t **response) {
char *response_data = NULL;
int response_datalen = 0;
raop_conn_t *conn = ptr;
bool hls_request = false;
logger_log(conn->raop->logger, LOGGER_DEBUG, "conn_request");
bool logger_debug = (logger_get_level(conn->raop->logger) >= LOGGER_DEBUG);
/*
All requests arriving here have been parsed by llhttp to obtain
method | url | protocol (RTSP/1.0 or HTTP/1.1)
There are three types of connections supplying these requests:
Connections from the AirPlay client:
(1) type RAOP connections with a CSeq sequence header, and no X-Apple-Session-ID header
(2) type AIRPLAY connections with an X-Apple-Session-ID header and no CSeq header
Connections from localhost:
(3) type HLS internal connections from the local HLS server (gstreamer) at localhost with neither
of these headers, but a Host: localhost:[port] header. method = GET.
*/
const char *method = http_request_get_method(request);
const char *url = http_request_get_url(request);
const char *protocol = http_request_get_protocol(request);
if (!method) {
return;
}
/* this rejects messages from _airplay._tcp for the video streaming protocol unless bool raop->hls_support is true */
const char *cseq = http_request_get_header(request, "CSeq");
const char *protocol = http_request_get_protocol(request);
if (!cseq && !conn->raop->hls_support) {
logger_log(conn->raop->logger, LOGGER_INFO, "ignoring AirPlay video streaming request (use option -hls to activate HLS support)");
return;
}
const char *url = http_request_get_url(request);
const char *client_session_id = http_request_get_header(request, "X-Apple-Session-ID");
const char *host = http_request_get_header(request, "Host");
hls_request = (host && !cseq && !client_session_id);
if (conn->connection_type == CONNECTION_TYPE_UNKNOWN) {
if (httpd_count_connection_type(conn->raop->httpd, CONNECTION_TYPE_RAOP)) {
char ipaddr[40];
utils_ipaddress_to_string(conn->remotelen, conn->remote, conn->zone_id, ipaddr, (int) (sizeof(ipaddr)));
if (httpd_nohold(conn->raop->httpd)) {
logger_log(conn->raop->logger, LOGGER_INFO, "\"nohold\" feature: switch to new connection request from %s", ipaddr);
if (conn->raop->callbacks.video_reset) {
printf("**************************video_reset*************************\n");
conn->raop->callbacks.video_reset(conn->raop->callbacks.cls);
}
httpd_remove_known_connections(conn->raop->httpd);
if (cseq) {
if (httpd_count_connection_type(conn->raop->httpd, CONNECTION_TYPE_RAOP)) {
char ipaddr[40];
utils_ipaddress_to_string(conn->remotelen, conn->remote, conn->zone_id, ipaddr, (int) (sizeof(ipaddr)));
if (httpd_nohold(conn->raop->httpd)) {
logger_log(conn->raop->logger, LOGGER_INFO, "\"nohold\" feature: switch to new connection request from %s", ipaddr);
if (conn->raop->callbacks.video_reset) {
conn->raop->callbacks.video_reset(conn->raop->callbacks.cls);
}
httpd_remove_known_connections(conn->raop->httpd);
} else {
logger_log(conn->raop->logger, LOGGER_WARNING, "rejecting new connection request from %s", ipaddr);
*response = http_response_create();
http_response_init(*response, protocol, 409, "Conflict: Server is connected to another client");
goto finish;
}
}
logger_log(conn->raop->logger, LOGGER_DEBUG, "New connection %p identified as Connection type RAOP", ptr);
httpd_set_connection_type(conn->raop->httpd, ptr, CONNECTION_TYPE_RAOP);
conn->connection_type = CONNECTION_TYPE_RAOP;
} else if (client_session_id) {
logger_log(conn->raop->logger, LOGGER_DEBUG, "New connection %p identified as Connection type AirPlay", ptr);
httpd_set_connection_type(conn->raop->httpd, ptr, CONNECTION_TYPE_AIRPLAY);
conn->connection_type = CONNECTION_TYPE_AIRPLAY;
size_t len = strlen(client_session_id) + 1;
conn->client_session_id = (char *) malloc(len);
strncpy(conn->client_session_id, client_session_id, len);
/* airplay video has been requested: shut down any running RAOP udp services */
raop_conn_t *raop_conn = (raop_conn_t *) httpd_get_connection_by_type(conn->raop->httpd, CONNECTION_TYPE_RAOP, 1);
if (raop_conn) {
raop_rtp_mirror_t *raop_rtp_mirror = raop_conn->raop_rtp_mirror;
if (raop_rtp_mirror) {
logger_log(conn->raop->logger, LOGGER_DEBUG, "New AirPlay connection: stopping RAOP mirror"
" service on RAOP connection %p", raop_conn);
raop_rtp_mirror_stop(raop_rtp_mirror);
}
} else {
logger_log(conn->raop->logger, LOGGER_WARNING, "rejecting new connection request from %s", ipaddr);
*response = http_response_create();
http_response_init(*response, protocol, 409, "Conflict: Server is connected to another client");
goto finish;
}
}
httpd_set_connection_type(conn->raop->httpd, ptr, CONNECTION_TYPE_RAOP);
conn->connection_type = CONNECTION_TYPE_RAOP;
raop_rtp_t *raop_rtp = raop_conn->raop_rtp;
if (raop_rtp) {
logger_log(conn->raop->logger, LOGGER_DEBUG, "New AirPlay connection: stopping RAOP audio"
" service on RAOP connection %p", raop_conn);
raop_rtp_stop(raop_rtp);
}
raop_ntp_t *raop_ntp = raop_conn->raop_ntp;
                if (raop_ntp) {
logger_log(conn->raop->logger, LOGGER_DEBUG, "New AirPlay connection: stopping NTP time"
" service on RAOP connection %p", raop_conn);
raop_ntp_stop(raop_ntp);
}
}
} else if (host) {
logger_log(conn->raop->logger, LOGGER_DEBUG, "New connection %p identified as Connection type HLS", ptr);
httpd_set_connection_type(conn->raop->httpd, ptr, CONNECTION_TYPE_HLS);
conn->connection_type = CONNECTION_TYPE_HLS;
} else {
logger_log(conn->raop->logger, LOGGER_WARNING, "connection from unknown connection type");
}
}
/* this response code and message will be modified by the handler if necessary */
*response = http_response_create();
http_response_init(*response, protocol, 200, "OK");
/* is this really necessary? or is it obsolete? (added for all RTSP requests EXCEPT "RECORD") */
if (cseq && strcmp(method, "RECORD")) {
http_response_add_header(*response, "Audio-Jack-Status", "connected; type=digital");
}
if (!conn->have_active_remote) {
@@ -204,15 +292,6 @@ conn_request(void *ptr, http_request_t *request, http_response_t **response) {
}
}
if (!method) {
return;
}
/* this rejects unsupported messages from _airplay._tcp for video streaming protocol*/
if (!cseq) {
return;
}
logger_log(conn->raop->logger, LOGGER_DEBUG, "\n%s %s %s", method, url, protocol);
char *header_str= NULL;
http_request_get_header_string(request, &header_str);
@@ -225,8 +304,9 @@ conn_request(void *ptr, http_request_t *request, http_response_t **response) {
const char *request_data = http_request_get_data(request, &request_datalen);
if (request_data && logger_debug) {
if (request_datalen > 0) {
/* logger has a buffer limit of 4096 */
if (data_is_plist) {
plist_t req_root_node = NULL;
plist_t req_root_node = NULL;
plist_from_bin(request_data, request_datalen, &req_root_node);
char * plist_xml;
uint32_t plist_len;
@@ -247,52 +327,99 @@ conn_request(void *ptr, http_request_t *request, http_response_t **response) {
}
}
*response = http_response_create();
http_response_init(*response, protocol, 200, "OK");
//http_response_add_header(*response, "Apple-Jack-Status", "connected; type=analog");
if (client_session_id) {
assert(!strcmp(client_session_id, conn->client_session_id));
}
logger_log(conn->raop->logger, LOGGER_DEBUG, "Handling request %s with URL %s", method, url);
raop_handler_t handler = NULL;
if (!strcmp(method, "GET") && !strcmp(url, "/info")) {
handler = &raop_handler_info;
} else if (!strcmp(method, "POST") && !strcmp(url, "/pair-pin-start")) {
handler = &raop_handler_pairpinstart;
} else if (!strcmp(method, "POST") && !strcmp(url, "/pair-setup-pin")) {
handler = &raop_handler_pairsetup_pin;
} else if (!strcmp(method, "POST") && !strcmp(url, "/pair-setup")) {
handler = &raop_handler_pairsetup;
} else if (!strcmp(method, "POST") && !strcmp(url, "/pair-verify")) {
handler = &raop_handler_pairverify;
} else if (!strcmp(method, "POST") && !strcmp(url, "/fp-setup")) {
handler = &raop_handler_fpsetup;
} else if (!strcmp(method, "OPTIONS")) {
handler = &raop_handler_options;
} else if (!strcmp(method, "SETUP")) {
handler = &raop_handler_setup;
} else if (!strcmp(method, "GET_PARAMETER")) {
handler = &raop_handler_get_parameter;
} else if (!strcmp(method, "SET_PARAMETER")) {
handler = &raop_handler_set_parameter;
} else if (!strcmp(method, "POST") && !strcmp(url, "/feedback")) {
handler = &raop_handler_feedback;
} else if (!strcmp(method, "RECORD")) {
handler = &raop_handler_record;
} else if (!strcmp(method, "FLUSH")) {
handler = &raop_handler_flush;
} else if (!strcmp(method, "TEARDOWN")) {
handler = &raop_handler_teardown;
} else {
logger_log(conn->raop->logger, LOGGER_INFO, "Unhandled Client Request: %s %s", method, url);
if (!hls_request && !strcmp(protocol, "RTSP/1.0")) {
if (!strcmp(method, "POST")) {
if (!strcmp(url, "/feedback")) {
handler = &raop_handler_feedback;
} else if (!strcmp(url, "/pair-pin-start")) {
handler = &raop_handler_pairpinstart;
} else if (!strcmp(url, "/pair-setup-pin")) {
handler = &raop_handler_pairsetup_pin;
} else if (!strcmp(url, "/pair-setup")) {
handler = &raop_handler_pairsetup;
} else if (!strcmp(url, "/pair-verify")) {
handler = &raop_handler_pairverify;
} else if (!strcmp(url, "/fp-setup")) {
handler = &raop_handler_fpsetup;
} else if (!strcmp(url, "/getProperty")) {
handler = &http_handler_get_property;
} else if (!strcmp(url, "/audioMode")) {
//handler = &http_handler_audioMode;
}
} else if (!strcmp(method, "GET")) {
if (!strcmp(url, "/info")) {
handler = &raop_handler_info;
}
} else if (!strcmp(method, "OPTIONS")) {
handler = &raop_handler_options;
} else if (!strcmp(method, "SETUP")) {
handler = &raop_handler_setup;
} else if (!strcmp(method, "GET_PARAMETER")) {
handler = &raop_handler_get_parameter;
} else if (!strcmp(method, "SET_PARAMETER")) {
handler = &raop_handler_set_parameter;
} else if (!strcmp(method, "RECORD")) {
handler = &raop_handler_record;
} else if (!strcmp(method, "FLUSH")) {
handler = &raop_handler_flush;
} else if (!strcmp(method, "TEARDOWN")) {
handler = &raop_handler_teardown;
}
} else if (!hls_request && !strcmp(protocol, "HTTP/1.1")) {
if (!strcmp(method, "POST")) {
if (!strcmp(url, "/reverse")) {
handler = &http_handler_reverse;
} else if (!strcmp(url, "/play")) {
handler = &http_handler_play;
} else if (!strncmp (url, "/getProperty?", strlen("/getProperty?"))) {
handler = &http_handler_get_property;
} else if (!strncmp(url, "/scrub?", strlen("/scrub?"))) {
handler = &http_handler_scrub;
} else if (!strncmp(url, "/rate?", strlen("/rate?"))) {
handler = &http_handler_rate;
} else if (!strcmp(url, "/stop")) {
handler = &http_handler_stop;
} else if (!strcmp(url, "/action")) {
handler = &http_handler_action;
} else if (!strcmp(url, "/fp-setup2")) {
handler = &http_handler_fpsetup2;
}
} else if (!strcmp(method, "GET")) {
if (!strcmp(url, "/server-info")) {
handler = &http_handler_server_info;
} else if (!strcmp(url, "/playback-info")) {
handler = &http_handler_playback_info;
}
} else if (!strcmp(method, "PUT")) {
if (!strncmp (url, "/setProperty?", strlen("/setProperty?"))) {
handler = &http_handler_set_property;
} else {
}
}
} else if (hls_request) {
handler = &http_handler_hls;
}
if (handler != NULL) {
handler(conn, request, *response, &response_data, &response_datalen);
} else {
logger_log(conn->raop->logger, LOGGER_INFO,
"Unhandled Client Request: %s %s %s", method, url, protocol);
}
finish:;
http_response_add_header(*response, "Server", "AirTunes/"GLOBAL_VERSION);
http_response_add_header(*response, "CSeq", cseq);
if (!hls_request) {
http_response_add_header(*response, "Server", "AirTunes/"GLOBAL_VERSION);
if (cseq) {
http_response_add_header(*response, "CSeq", cseq);
}
}
http_response_finish(*response, response_data, response_datalen);
int len;
@@ -304,11 +431,14 @@ conn_request(void *ptr, http_request_t *request, http_response_t **response) {
}
header_str = utils_data_to_text(data, len);
logger_log(conn->raop->logger, LOGGER_DEBUG, "\n%s", header_str);
bool data_is_plist = (strstr(header_str,"apple-binary-plist") != NULL);
bool data_is_text = (strstr(header_str,"text/parameters") != NULL);
bool data_is_text = (strstr(header_str,"text/") != NULL ||
strstr(header_str, "x-mpegURL") != NULL);
free(header_str);
if (response_data) {
if (response_datalen > 0 && logger_debug) {
/* logger has a buffer limit of 4096 */
if (data_is_plist) {
plist_t res_root_node = NULL;
plist_from_bin(response_data, response_datalen, &res_root_node);
@@ -328,9 +458,9 @@ conn_request(void *ptr, http_request_t *request, http_response_t **response) {
free(data_str);
}
}
free(response_data);
response_data = NULL;
response_datalen = 0;
if (response_data) {
free(response_data);
}
}
}
@@ -364,6 +494,13 @@ conn_destroy(void *ptr) {
free(conn->remote);
pairing_session_destroy(conn->session);
fairplay_destroy(conn->fairplay);
if (conn->client_session_id) {
free(conn->client_session_id);
}
if (conn->airplay_video) {
airplay_video_service_destroy(conn->airplay_video);
}
free(conn);
}
@@ -420,6 +557,8 @@ raop_init(raop_callbacks_t *callbacks) {
raop->max_ntp_timeouts = 0;
raop->audio_delay_micros = 250000;
raop->hls_support = false;
return raop;
}
@@ -474,6 +613,7 @@ raop_init2(raop_t *raop, int nohold, const char *device_id, const char *keyfile)
void
raop_destroy(raop_t *raop) {
if (raop) {
raop_destroy_airplay_video(raop);
raop_stop(raop);
pairing_destroy(raop->pairing);
httpd_destroy(raop->httpd);
@@ -533,6 +673,8 @@ int raop_set_plist(raop_t *raop, const char *plist_item, const int value) {
} else if (strcmp(plist_item, "pin") == 0) {
raop->pin = value;
raop->use_pin = true;
} else if (strcmp(plist_item, "hls") == 0) {
raop->hls_support = (value > 0 ? true : false);
} else {
retval = -1;
}
@@ -604,3 +746,27 @@ void raop_remove_known_connections(raop_t * raop) {
httpd_remove_known_connections(raop->httpd);
}
airplay_video_t *deregister_airplay_video(raop_t *raop) {
airplay_video_t *airplay_video = raop->airplay_video;
raop->airplay_video = NULL;
return airplay_video;
}
bool register_airplay_video(raop_t *raop, airplay_video_t *airplay_video) {
if (raop->airplay_video) {
return false;
}
raop->airplay_video = airplay_video;
return true;
}
airplay_video_t * get_airplay_video(raop_t *raop) {
return raop->airplay_video;
}
void raop_destroy_airplay_video(raop_t *raop) {
if (raop->airplay_video) {
airplay_video_service_destroy(raop->airplay_video);
raop->airplay_video = NULL;
}
}

View File

@@ -21,6 +21,7 @@
#include "dnssd.h"
#include "stream.h"
#include "raop_ntp.h"
#include "airplay_video.h"
#if defined (WIN32) && defined(DLL_EXPORT)
# define RAOP_API __declspec(dllexport)
@@ -36,12 +37,29 @@ typedef struct raop_s raop_t;
typedef void (*raop_log_callback_t)(void *cls, int level, const char *msg);
typedef struct playback_info_s {
//char * uuid;
uint32_t stallcount;
double duration;
double position;
float rate;
bool ready_to_play;
bool playback_buffer_empty;
bool playback_buffer_full;
bool playback_likely_to_keep_up;
int num_loaded_time_ranges;
int num_seekable_time_ranges;
void *loadedTimeRanges;
void *seekableTimeRanges;
} playback_info_t;
typedef enum video_codec_e {
VIDEO_CODEC_UNKNOWN,
VIDEO_CODEC_H264,
VIDEO_CODEC_H265
} video_codec_t;
struct raop_callbacks_s {
void* cls;
@@ -49,8 +67,7 @@ struct raop_callbacks_s {
void (*video_process)(void *cls, raop_ntp_t *ntp, video_decode_struct *data);
void (*video_pause)(void *cls);
void (*video_resume)(void *cls);
void (*video_codec) (void *cls, video_codec_t video_codec);
/* Optional but recommended callback functions */
void (*conn_init)(void *cls);
void (*conn_destroy)(void *cls);
@@ -72,11 +89,25 @@ struct raop_callbacks_s {
void (*export_dacp) (void *cls, const char *active_remote, const char *dacp_id);
void (*video_reset) (void *cls);
void (*video_set_codec)(void *cls, video_codec_t codec);
/* for HLS video player controls */
void (*on_video_play) (void *cls, const char *location, const float start_position);
void (*on_video_scrub) (void *cls, const float position);
void (*on_video_rate) (void *cls, const float rate);
void (*on_video_stop) (void *cls);
void (*on_video_acquire_playback_info) (void *cls, playback_info_t *playback_video);
};
typedef struct raop_callbacks_s raop_callbacks_t;
raop_ntp_t *raop_ntp_init(logger_t *logger, raop_callbacks_t *callbacks, const char *remote,
int remote_addr_len, unsigned short timing_rport, timing_protocol_t *time_protocol);
int remote_addr_len, unsigned short timing_rport,
timing_protocol_t *time_protocol);
int airplay_video_service_init(raop_t *raop, unsigned short port, const char *session_id);
bool register_airplay_video(raop_t *raop, airplay_video_t *airplay_video);
airplay_video_t *get_airplay_video(raop_t *raop);
airplay_video_t *deregister_airplay_video(raop_t *raop);
RAOP_API raop_t *raop_init(raop_callbacks_t *callbacks);
RAOP_API int raop_init2(raop_t *raop, int nohold, const char *device_id, const char *keyfile);
RAOP_API void raop_set_log_level(raop_t *raop, int level);
@@ -93,6 +124,7 @@ RAOP_API void raop_stop(raop_t *raop);
RAOP_API void raop_set_dnssd(raop_t *raop, dnssd_t *dnssd);
RAOP_API void raop_destroy(raop_t *raop);
RAOP_API void raop_remove_known_connections(raop_t * raop);
RAOP_API void raop_destroy_airplay_video(raop_t *raop);
#ifdef __cplusplus
}

View File

@@ -275,7 +275,7 @@ void raop_buffer_handle_resends(raop_buffer_t *raop_buffer, raop_resend_cb_t res
assert(resend_cb);
if (seqnum_cmp(raop_buffer->first_seqnum, raop_buffer->last_seqnum) < 0) {
int seqnum, count;
unsigned short seqnum, count = 0;
logger_log(raop_buffer->logger, LOGGER_DEBUG, "raop_buffer_handle_resends first_seqnum=%u last seqnum=%u",
raop_buffer->first_seqnum, raop_buffer->last_seqnum);
for (seqnum = raop_buffer->first_seqnum; seqnum_cmp(seqnum, raop_buffer->last_seqnum) < 0; seqnum++) {
@@ -283,12 +283,11 @@ void raop_buffer_handle_resends(raop_buffer_t *raop_buffer, raop_resend_cb_t res
if (entry->filled) {
break;
}
count++;
}
if (seqnum_cmp(seqnum, raop_buffer->first_seqnum) == 0) {
return;
if (count){
resend_cb(opaque, raop_buffer->first_seqnum, count);
}
count = seqnum_cmp(seqnum, raop_buffer->first_seqnum);
resend_cb(opaque, raop_buffer->first_seqnum, count);
}
}

View File

@@ -197,7 +197,6 @@ raop_handler_pairpinstart(raop_conn_t *conn,
logger_log(conn->raop->logger, LOGGER_INFO, "*** CLIENT MUST NOW ENTER PIN = \"%s\" AS AIRPLAY PASSWORD", pin);
*response_data = NULL;
response_datalen = 0;
return;
}
static void
@@ -749,13 +748,14 @@ raop_handler_setup(raop_conn_t *conn,
conn->raop_rtp_mirror = raop_rtp_mirror_init(conn->raop->logger, &conn->raop->callbacks,
conn->raop_ntp, remote, conn->remotelen, aeskey);
// plist_t res_event_port_node = plist_new_uint(conn->raop->port);
plist_t res_event_port_node = plist_new_uint(0);
/* the event port is not used in mirror mode or audio mode */
unsigned short event_port = 0;
plist_t res_event_port_node = plist_new_uint(event_port);
plist_t res_timing_port_node = plist_new_uint(timing_lport);
plist_dict_set_item(res_root_node, "timingPort", res_timing_port_node);
plist_dict_set_item(res_root_node, "eventPort", res_event_port_node);
logger_log(conn->raop->logger, LOGGER_DEBUG, "eport = %d, tport = %d", 0, timing_lport);
logger_log(conn->raop->logger, LOGGER_DEBUG, "eport = %d, tport = %d", event_port, timing_lport);
}
// Process stream setup requests

View File

@@ -214,10 +214,14 @@ raop_ntp_init_socket(raop_ntp_t *raop_ntp, int use_ipv6)
}
// We're calling recvfrom without knowing whether there is any data, so we need a timeout
uint32_t recv_timeout_msec = 300;
#ifdef _WIN32
DWORD tv = recv_timeout_msec;
#else
struct timeval tv;
tv.tv_sec = 0;
tv.tv_usec = 300000;
tv.tv_sec = recv_timeout_msec / (uint32_t) 1000;
tv.tv_usec = ((uint32_t) 1000) * (recv_timeout_msec % (uint32_t) 1000);
#endif
if (setsockopt(tsock, SOL_SOCKET, SO_RCVTIMEO, CAST &tv, sizeof(tv)) < 0) {
goto sockets_cleanup;
}
@@ -299,7 +303,7 @@ raop_ntp_thread(void *arg)
if (send_len < 0) {
int sock_err = SOCKET_GET_ERROR();
logger_log(raop_ntp->logger, LOGGER_ERR, "raop_ntp error sending request. Error %d:%s",
sock_err, strerror(sock_err));
sock_err, SOCKET_ERROR_STRING(sock_err));
} else {
// Read response
response_len = recvfrom(raop_ntp->tsock, (char *)response, sizeof(response), 0, NULL, NULL);

View File

@@ -81,7 +81,7 @@ struct raop_rtp_mirror_s {
raop_callbacks_t callbacks;
raop_ntp_t *ntp;
/* Buffer to handle all resends */
/* mirror buffer for decryption */
mirror_buffer_t *buffer;
/* Remote address as sockaddr */
@@ -245,8 +245,9 @@ raop_rtp_mirror_thread(void *arg)
saddrlen = sizeof(saddr);
stream_fd = accept(raop_rtp_mirror->mirror_data_sock, (struct sockaddr *)&saddr, &saddrlen);
if (stream_fd == -1) {
int sock_err = SOCKET_GET_ERROR();
logger_log(raop_rtp_mirror->logger, LOGGER_ERR,
"raop_rtp_mirror error in accept %d %s", errno, strerror(errno));
"raop_rtp_mirror error in accept %d %s", sock_err, SOCKET_ERROR_STRING(sock_err));
break;
}
@@ -255,31 +256,36 @@ raop_rtp_mirror_thread(void *arg)
tv.tv_sec = 0;
tv.tv_usec = 5000;
if (setsockopt(stream_fd, SOL_SOCKET, SO_RCVTIMEO, CAST &tv, sizeof(tv)) < 0) {
int sock_err = SOCKET_GET_ERROR();
logger_log(raop_rtp_mirror->logger, LOGGER_ERR,
"raop_rtp_mirror could not set stream socket timeout %d %s", errno, strerror(errno));
"raop_rtp_mirror could not set stream socket timeout %d %s", sock_err, SOCKET_ERROR_STRING(sock_err));
break;
}
int option;
option = 1;
if (setsockopt(stream_fd, SOL_SOCKET, SO_KEEPALIVE, CAST &option, sizeof(option)) < 0) {
int sock_err = SOCKET_GET_ERROR();
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING,
"raop_rtp_mirror could not set stream socket keepalive %d %s", errno, strerror(errno));
"raop_rtp_mirror could not set stream socket keepalive %d %s", sock_err, SOCKET_ERROR_STRING(sock_err));
}
option = 60;
if (setsockopt(stream_fd, SOL_TCP, TCP_KEEPIDLE, CAST &option, sizeof(option)) < 0) {
int sock_err = SOCKET_GET_ERROR();
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING,
"raop_rtp_mirror could not set stream socket keepalive time %d %s", errno, strerror(errno));
"raop_rtp_mirror could not set stream socket keepalive time %d %s", sock_err, SOCKET_ERROR_STRING(sock_err));
}
option = 10;
if (setsockopt(stream_fd, SOL_TCP, TCP_KEEPINTVL, CAST &option, sizeof(option)) < 0) {
int sock_err = SOCKET_GET_ERROR();
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING,
"raop_rtp_mirror could not set stream socket keepalive interval %d %s", errno, strerror(errno));
"raop_rtp_mirror could not set stream socket keepalive interval %d %s", sock_err, SOCKET_ERROR_STRING(sock_err));
}
option = 6;
if (setsockopt(stream_fd, SOL_TCP, TCP_KEEPCNT, CAST &option, sizeof(option)) < 0) {
int sock_err = SOCKET_GET_ERROR();
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING,
"raop_rtp_mirror could not set stream socket keepalive probes %d %s", errno, strerror(errno));
"raop_rtp_mirror could not set stream socket keepalive probes %d %s", sock_err, SOCKET_ERROR_STRING(sock_err));
}
readstart = 0;
}
@@ -301,10 +307,11 @@ raop_rtp_mirror_thread(void *arg)
stream_fd = -1;
continue;
} else if (payload == NULL && ret == -1) {
if (errno == EAGAIN || errno == EWOULDBLOCK) continue; // Timeouts can happen even if the connection is fine
int sock_err = SOCKET_GET_ERROR();
if (sock_err == SOCKET_ERRORNAME(EAGAIN) || sock_err == SOCKET_ERRORNAME(EWOULDBLOCK)) continue; // Timeouts can happen even if the connection is fine
logger_log(raop_rtp_mirror->logger, LOGGER_ERR,
"raop_rtp_mirror error in header recv: %d %s", errno, strerror(errno));
if (errno == ECONNRESET) conn_reset = true;;
"raop_rtp_mirror error in header recv: %d %s", sock_err, SOCKET_ERROR_STRING(sock_err));
if (sock_err == SOCKET_ERRORNAME(ECONNRESET)) conn_reset = true;
break;
}
@@ -364,9 +371,10 @@ raop_rtp_mirror_thread(void *arg)
logger_log(raop_rtp_mirror->logger, LOGGER_ERR, "raop_rtp_mirror tcp socket was closed by client (recv returned 0)");
break;
} else if (ret == -1) {
if (errno == EAGAIN || errno == EWOULDBLOCK) continue; // Timeouts can happen even if the connection is fine
logger_log(raop_rtp_mirror->logger, LOGGER_ERR, "raop_rtp_mirror error in recv: %d %s", errno, strerror(errno));
if (errno == ECONNRESET) conn_reset = true;
int sock_err = SOCKET_GET_ERROR();
if (sock_err == SOCKET_ERRORNAME(EAGAIN) || sock_err == SOCKET_ERRORNAME(EWOULDBLOCK)) continue; // Timeouts can happen even if the connection is fine
logger_log(raop_rtp_mirror->logger, LOGGER_ERR, "raop_rtp_mirror error in recv: %d %s", sock_err, SOCKET_ERROR_STRING(sock_err));
if (sock_err == SOCKET_ERRORNAME(ECONNRESET)) conn_reset = true;
break;
}
@@ -533,9 +541,6 @@ raop_rtp_mirror_thread(void *arg)
raop_rtp_mirror->callbacks.video_process(raop_rtp_mirror->callbacks.cls, raop_rtp_mirror->ntp, &video_data);
free(payload_out);
break;
//char *str3 = utils_data_to_string(payload_out, video_data.data_len, 16);
//printf("%s\n", str3);
//free (str3);
case 0x01:
/* 128-byte observed packet header structure
bytes 0-15: length + timestamp
@@ -601,13 +606,12 @@ raop_rtp_mirror_thread(void *arg)
free(sps_pps);
sps_pps = NULL;
}
/* test for a H265 VPS/SPs/PPS */
/* test for a H265 VPS/SPS/PPS */
unsigned char hvc1[] = { 0x68, 0x76, 0x63, 0x31 };
if (!memcmp(payload + 4, hvc1, 4)) {
/* hvc1 HEVC detected */
codec = VIDEO_CODEC_H265;
printf("h265 detected\n");
h265_video = true;
raop_rtp_mirror->callbacks.video_set_codec(raop_rtp_mirror->callbacks.cls, codec);
unsigned char vps_start_code[] = { 0xa0, 0x00, 0x01, 0x00 };
@@ -679,10 +683,6 @@ raop_rtp_mirror_thread(void *arg)
memcpy(ptr, nal_start_code, 4);
ptr += 4;
memcpy(ptr, pps, pps_size);
// printf (" HEVC (hvc1) vps + sps + pps NALU\n");
//char *str = utils_data_to_string(sps_pps, sps_pps_len, 16);
//printf("%s\n", str);
//free (str);
} else {
codec = VIDEO_CODEC_H264;
h265_video = false;

View File

@@ -16,6 +16,9 @@
#define SOCKETS_H
#if defined(WIN32)
char *wsa_strerror(int errnum);
typedef int socklen_t;
#ifndef SHUT_RD
@@ -31,6 +34,7 @@ typedef int socklen_t;
#define SOCKET_GET_ERROR() WSAGetLastError()
#define SOCKET_SET_ERROR(value) WSASetLastError(value)
#define SOCKET_ERRORNAME(name) WSA##name
#define SOCKET_ERROR_STRING(errnum) wsa_strerror(errnum)
#define WSAEAGAIN WSAEWOULDBLOCK
#define WSAENOMEM WSA_NOT_ENOUGH_MEMORY
@@ -43,7 +47,7 @@ typedef int socklen_t;
#define SOCKET_GET_ERROR() (errno)
#define SOCKET_SET_ERROR(value) (errno = (value))
#define SOCKET_ERRORNAME(name) name
#define SOCKET_ERROR_STRING(errnum) strerror(errnum)
#endif
#endif

View File

@@ -282,3 +282,14 @@ int utils_ipaddress_to_string(int addresslen, const unsigned char *address, unsi
}
return ret;
}
const char *gmt_time_string() {
static char date_buf[64];
memset(date_buf, 0, 64);
time_t now = time(0);
if (strftime(date_buf, 63, "%c GMT", gmtime(&now)))
return date_buf;
else
return "";
}

View File

@@ -30,6 +30,7 @@ char *utils_data_to_string(const unsigned char *data, int datalen, int chars_per
char *utils_data_to_text(const char *data, int datalen);
void ntp_timestamp_to_time(uint64_t ntp_timestamp, char *timestamp, size_t maxsize);
void ntp_timestamp_to_seconds(uint64_t ntp_timestamp, char *timestamp, size_t maxsize);
const char *gmt_time_string();
int utils_ipaddress_to_string(int addresslen, const unsigned char *address,
unsigned int zone_id, char *string, int len);
#endif

View File

@@ -4,6 +4,7 @@ if (APPLE )
set( ENV{PKG_CONFIG_PATH} "/Library/FrameWorks/GStreamer.framework/Libraries/pkgconfig" ) # GStreamer.framework, preferred
set( ENV{PKG_CONFIG_PATH} "$ENV{PKG_CONFIG_PATH}:/usr/local/lib/pkgconfig" ) # Brew or self-installed gstreamer
set( ENV{PKG_CONFIG_PATH} "$ENV{PKG_CONFIG_PATH}:/opt/homebrew/lib/pkgconfig" ) # Brew, M1/M2 macs
set( ENV{PKG_CONFIG_PATH} "$ENV{PKG_CONFIG_PATH}:$ENV{HOMEBREW_PREFIX}/lib/pkgconfig" ) # Brew, using prefix
set( ENV{PKG_CONFIG_PATH} "$ENV{PKG_CONFIG_PATH}:/opt/local/lib/pkgconfig/" ) # MacPorts
message( "PKG_CONFIG_PATH (Apple, renderers) = " $ENV{PKG_CONFIG_PATH} )
find_program( PKG_CONFIG_EXECUTABLE pkg-config PATHS /Library/FrameWorks/GStreamer.framework/Commands )

View File

@@ -20,7 +20,6 @@
* Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
*/
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>
#include "video_renderer.h"
@@ -31,7 +30,7 @@
#include "x_display_fix.h"
static bool fullscreen = false;
static bool alt_keypress = false;
static unsigned char X11_search_attempts;
static unsigned char X11_search_attempts;
#endif
static GstClockTime gst_video_pipeline_base_time = GST_CLOCK_TIME_NONE;
@@ -40,9 +39,12 @@ static unsigned short width, height, width_source, height_source; /* not curren
static bool first_packet = false;
static bool sync = false;
static bool auto_videosink = true;
static bool hls_video = false;
#ifdef X_DISPLAY_FIX
static bool use_x11 = false;
#endif
static bool logger_debug = false;
static bool video_terminate = false;
static user_data_t user_data;
#define NCODECS 2 /* renderers for h264 and h265 */
@@ -52,6 +54,9 @@ struct video_renderer_s {
const char *codec;
bool autovideo, state_pending;
int id;
gboolean terminate;
gint64 duration;
gint buffering_level;
#ifdef X_DISPLAY_FIX
bool use_x11;
const char * server_name;
@@ -64,6 +69,7 @@ static video_renderer_t *renderer_type[NCODECS] = {0};
static int n_renderers = NCODECS;
static char h264[] = "h264";
static char h265[] = "h265";
static char hls[] = "hls";
static void append_videoflip (GString *launch, const videoflip_t *flip, const videoflip_t *rot) {
/* videoflip image transform */
@@ -86,7 +92,7 @@ static void append_videoflip (GString *launch, const videoflip_t *flip, const vi
case LEFT:
g_string_append(launch, "videoflip video-direction=GST_VIDEO_ORIENTATION_UL_LR ! ");
break;
case RIGHT:
case RIGHT:
g_string_append(launch, "videoflip video-direction=GST_VIDEO_ORIENTATION_UR_LL ! ");
break;
default:
@@ -122,14 +128,14 @@ static void append_videoflip (GString *launch, const videoflip_t *flip, const vi
}
}
/* apple uses colorimetry=1:3:5:1 *
/* apple uses colorimetry that is detected as 1:3:7:1 * //previously 1:3:5:1 was seen
* (not recognized by v4l2 plugin in Gstreamer < 1.20.4) *
* See .../gst-libs/gst/video/video-color.h in gst-plugins-base *
* range = 1 -> GST_VIDEO_COLOR_RANGE_0_255 ("full RGB") *
* matrix = 3 -> GST_VIDEO_COLOR_MATRIX_BT709 *
* transfer = 5 -> GST_VIDEO_TRANSFER_BT709 *
* transfer = 7 -> GST_VIDEO_TRANSFER_SRGB * // previously GST_VIDEO_TRANSFER_BT709
* primaries = 1 -> GST_VIDEO_COLOR_PRIMARIES_BT709 *
* closest used by GStreamer < 1.20.4 is BT709, 2:3:5:1 with * *
* closest used by GStreamer < 1.20.4 is BT709, 2:3:5:1 with * // now use sRGB = 1:1:7:1
* range = 2 -> GST_VIDEO_COLOR_RANGE_16_235 ("limited RGB") */
static const char h264_caps[]="video/x-h264,stream-format=(string)byte-stream,alignment=(string)au";
@@ -143,26 +149,78 @@ void video_renderer_size(float *f_width_source, float *f_height_source, float *f
logger_log(logger, LOGGER_DEBUG, "begin video stream wxh = %dx%d; source %dx%d", width, height, width_source, height_source);
}
void video_renderer_init(logger_t *render_logger, const char *server_name, videoflip_t videoflip[2], const char *parser,
GstElement *make_video_sink(const char *videosink, const char *videosink_options) {
/* used to build a videosink for playbin, using the user-specified string "videosink" */
GstElement *video_sink = gst_element_factory_make(videosink, "videosink");
if (!video_sink) {
return NULL;
}
/* process the video_sink_options */
size_t len = strlen(videosink_options);
if (!len) {
return video_sink;
}
char *options = (char *) malloc(len + 1);
strncpy(options, videosink_options, len + 1);
/* remove any extension beginning with "!" */
char *end = strchr(options, '!');
if (end) {
*end = '\0';
}
/* add any fullscreen options "property=pval" included in string videosink_options */
/* OK to use strtok_r in Windows with MSYS2 (POSIX); use strtok_s for MSVC */
char *token;
char *text = options;
while((token = strtok_r(text, " ", &text))) {
char *pval = strchr(token, '=');
if (pval) {
*pval = '\0';
pval++;
const gchar *property_name = (const gchar *) token;
const gchar *value = (const gchar *) pval;
g_print("playbin_videosink property: \"%s\" \"%s\"\n", property_name, value);
gst_util_set_object_arg(G_OBJECT (video_sink), property_name, value);
}
}
free(options);
return video_sink;
}
void video_renderer_init(logger_t *render_logger, const char *server_name, videoflip_t videoflip[2], const char *parser,
const char *decoder, const char *converter, const char *videosink, const char *videosink_options,
bool initial_fullscreen, bool video_sync, bool h265_support) {
bool initial_fullscreen, bool video_sync, bool h265_support, const char *uri) {
GError *error = NULL;
GstCaps *caps = NULL;
hls_video = (uri != NULL);
/* videosink choices that are auto */
auto_videosink = (strstr(videosink, "autovideosink") || strstr(videosink, "fpsdisplaysink"));
logger = render_logger;
logger_debug = (logger_get_level(logger) >= LOGGER_DEBUG);
video_terminate = false;
/* this call to g_set_application_name makes server_name appear in the X11 display window title bar, */
/* (instead of the program name uxplay taken from (argv[0]). It is only set one time. */
const gchar *appname = g_get_application_name();
if (!appname || strcmp(appname,server_name)) g_set_application_name(server_name);
appname = NULL;
n_renderers = h265_support ? 2 : 1;
/* the renderer for hls video will only be built if an HLS uri is provided in
* the call to video_renderer_init, in which case the h264 and h265 mirror-mode
* renderers will not be built. This is because it appears that we cannot
* put playbin into GST_STATE_READY before knowing the uri (?), so cannot use a
* unified renderer structure with h264, h265 and hls */
if (hls_video) {
n_renderers = 1;
} else {
n_renderers = h265_support ? 2 : 1;
}
g_assert (n_renderers <= NCODECS);
for (int i = 0; i < n_renderers; i++) {
g_assert (i < 2);
@@ -171,69 +229,97 @@ void video_renderer_init(logger_t *render_logger, const char *server_name, video
renderer_type[i]->autovideo = auto_videosink;
renderer_type[i]->id = i;
renderer_type[i]->bus = NULL;
switch (i) {
case 0:
renderer_type[i]->codec = h264;
caps = gst_caps_from_string(h264_caps);
break;
case 1:
renderer_type[i]->codec = h265;
caps = gst_caps_from_string(h265_caps);
break;
default:
g_assert(0);
}
GString *launch = g_string_new("appsrc name=video_source ! ");
g_string_append(launch, "queue ! ");
g_string_append(launch, parser);
g_string_append(launch, " ! ");
g_string_append(launch, decoder);
g_string_append(launch, " ! ");
append_videoflip(launch, &videoflip[0], &videoflip[1]);
g_string_append(launch, converter);
g_string_append(launch, " ! ");
g_string_append(launch, "videoscale ! ");
g_string_append(launch, videosink);
g_string_append(launch, " name=");
g_string_append(launch, videosink);
g_string_append(launch, "_");
g_string_append(launch, renderer_type[i]->codec);
g_string_append(launch, videosink_options);
if (video_sync) {
g_string_append(launch, " sync=true");
sync = true;
if (hls_video) {
/* use playbin3 to play HLS video: replace "playbin3" by "playbin" to use playbin2 */
renderer_type[i]->pipeline = gst_element_factory_make("playbin3", "hls-playbin3");
g_assert(renderer_type[i]->pipeline);
renderer_type[i]->appsrc = NULL;
renderer_type[i]->codec = hls;
/* if we are not using autovideosink, build a videosink based on the string "videosink" */
if(strcmp(videosink, "autovideosink")) {
GstElement *playbin_videosink = make_video_sink(videosink, videosink_options);
if (!playbin_videosink) {
logger_log(logger, LOGGER_ERR, "video_renderer_init: failed to create playbin_videosink");
} else {
logger_log(logger, LOGGER_DEBUG, "video_renderer_init: create playbin_videosink at %p", playbin_videosink);
g_object_set(G_OBJECT (renderer_type[i]->pipeline), "video-sink", playbin_videosink, NULL);
}
}
g_object_set (G_OBJECT (renderer_type[i]->pipeline), "uri", uri, NULL);
} else {
g_string_append(launch, " sync=false");
sync = false;
}
switch (i) {
case 0:
renderer_type[i]->codec = h264;
caps = gst_caps_from_string(h264_caps);
break;
case 1:
renderer_type[i]->codec = h265;
caps = gst_caps_from_string(h265_caps);
break;
default:
g_assert(0);
}
GString *launch = g_string_new("appsrc name=video_source ! ");
g_string_append(launch, "queue ! ");
g_string_append(launch, parser);
g_string_append(launch, " ! ");
g_string_append(launch, decoder);
g_string_append(launch, " ! ");
append_videoflip(launch, &videoflip[0], &videoflip[1]);
g_string_append(launch, converter);
g_string_append(launch, " ! ");
g_string_append(launch, "videoscale ! ");
g_string_append(launch, videosink);
g_string_append(launch, " name=");
g_string_append(launch, videosink);
g_string_append(launch, "_");
g_string_append(launch, renderer_type[i]->codec);
g_string_append(launch, videosink_options);
if (video_sync) {
g_string_append(launch, " sync=true");
sync = true;
} else {
g_string_append(launch, " sync=false");
sync = false;
}
if (!strcmp(renderer_type[i]->codec, h265)) {
g_string_replace (launch, (const gchar *) h264, (const gchar *) h265, 0);
} else {
g_string_replace (launch, (const gchar *) h265, (const gchar *) h264, 0);
}
if (!strcmp(renderer_type[i]->codec, h264)) {
char *pos = launch->str;
while ((pos = strstr(pos,h265))){
pos +=3;
*pos = '4';
}
} else if (!strcmp(renderer_type[i]->codec, h265)) {
char *pos = launch->str;
while ((pos = strstr(pos,h264))){
pos +=3;
*pos = '5';
}
}
logger_log(logger, LOGGER_DEBUG, "GStreamer video pipeline %d:\n\"%s\"", i + 1, launch->str);
renderer_type[i]->pipeline = gst_parse_launch(launch->str, &error);
if (error) {
g_error ("gst_parse_launch error (video) :\n %s\n",error->message);
g_clear_error (&error);
}
g_assert (renderer_type[i]->pipeline);
logger_log(logger, LOGGER_DEBUG, "GStreamer video pipeline %d:\n\"%s\"", i + 1, launch->str);
renderer_type[i]->pipeline = gst_parse_launch(launch->str, &error);
if (error) {
g_error ("gst_parse_launch error (video) :\n %s\n",error->message);
g_clear_error (&error);
}
g_assert (renderer_type[i]->pipeline);
GstClock *clock = gst_system_clock_obtain();
g_object_set(clock, "clock-type", GST_CLOCK_TYPE_REALTIME, NULL);
gst_pipeline_use_clock(GST_PIPELINE_CAST(renderer_type[i]->pipeline), clock);
renderer_type[i]->appsrc = gst_bin_get_by_name (GST_BIN (renderer_type[i]->pipeline), "video_source");
g_assert(renderer_type[i]->appsrc);
GstClock *clock = gst_system_clock_obtain();
g_object_set(clock, "clock-type", GST_CLOCK_TYPE_REALTIME, NULL);
gst_pipeline_use_clock(GST_PIPELINE_CAST(renderer_type[i]->pipeline), clock);
renderer_type[i]->appsrc = gst_bin_get_by_name (GST_BIN (renderer_type[i]->pipeline), "video_source");
g_assert(renderer_type[i]->appsrc);
g_object_set(renderer_type[i]->appsrc, "caps", caps, "stream-type", 0, "is-live", TRUE, "format", GST_FORMAT_TIME, NULL);
g_string_free(launch, TRUE);
gst_caps_unref(caps);
gst_object_unref(clock);
g_object_set(renderer_type[i]->appsrc, "caps", caps, "stream-type", 0, "is-live", TRUE, "format", GST_FORMAT_TIME, NULL);
g_string_free(launch, TRUE);
gst_caps_unref(caps);
gst_object_unref(clock);
}
#ifdef X_DISPLAY_FIX
bool use_x11 = (strstr(videosink, "xvimagesink") || strstr(videosink, "ximagesink") || auto_videosink);
use_x11 = (strstr(videosink, "xvimagesink") || strstr(videosink, "ximagesink") || auto_videosink);
fullscreen = initial_fullscreen;
renderer_type[i]->server_name = server_name;
renderer_type[i]->gst_window = NULL;
@@ -249,7 +335,8 @@ void video_renderer_init(logger_t *render_logger, const char *server_name, video
} else {
free(renderer_type[0]->gst_window);
renderer_type[0]->gst_window = NULL;
} } else if (renderer_type[0]->use_x11) {
}
} else if (renderer_type[0]->use_x11) {
renderer_type[i]->gst_window = (X11_Window_t *) calloc(1, sizeof(X11_Window_t));
g_assert(renderer_type[i]->gst_window);
memcpy(renderer_type[i]->gst_window, renderer_type[0]->gst_window, sizeof(X11_Window_t));
@@ -259,38 +346,56 @@ void video_renderer_init(logger_t *render_logger, const char *server_name, video
#endif
gst_element_set_state (renderer_type[i]->pipeline, GST_STATE_READY);
GstState state;
if (gst_element_get_state (renderer_type[i]->pipeline, &state, NULL, 0)) {
if (gst_element_get_state (renderer_type[i]->pipeline, &state, NULL, 100 * GST_MSECOND)) {
if (state == GST_STATE_READY) {
logger_log(logger, LOGGER_DEBUG, "Initialized GStreamer video renderer %d", i + 1);
logger_log(logger, LOGGER_DEBUG, "Initialized GStreamer video renderer %d", i + 1);
if (hls_video && i == 0) {
renderer = renderer_type[i];
}
} else {
logger_log(logger, LOGGER_ERR, "Failed to initialize GStreamer video renderer %d", i + 1);
logger_log(logger, LOGGER_ERR, "Failed to initialize GStreamer video renderer %d", i + 1);
}
} else {
logger_log(logger, LOGGER_ERR, "Failed to initialize GStreamer video renderer %d", i + 1);
}
logger_log(logger, LOGGER_ERR, "Failed to initialize GStreamer video renderer %d", i + 1);
}
}
}
void video_renderer_pause() {
if (!renderer) {
return;
}
logger_log(logger, LOGGER_DEBUG, "video renderer paused");
gst_element_set_state(renderer->pipeline, GST_STATE_PAUSED);
}
void video_renderer_resume() {
if (!renderer) {
return;
}
gst_element_set_state (renderer->pipeline, GST_STATE_PLAYING);
GstState state;
/* wait with timeout 100 msec for pipeline to change state from PAUSED to PLAYING */
gst_element_get_state(renderer->pipeline, &state, NULL, 100 * GST_MSECOND);
const gchar *state_name = gst_element_state_get_name(state);
logger_log(logger, LOGGER_DEBUG, "video renderer resumed: state %s", state_name);
gst_video_pipeline_base_time = gst_element_get_base_time(renderer->appsrc);
if (renderer->appsrc) {
gst_video_pipeline_base_time = gst_element_get_base_time(renderer->appsrc);
}
}
void video_renderer_start() {
/* start both h264 and h265 pipelines; will shut down the "wrong" one when we know the codec */
if (hls_video) {
renderer->bus = gst_element_get_bus(renderer->pipeline);
gst_element_set_state (renderer->pipeline, GST_STATE_PLAYING);
return;
}
/* when not hls, start both h264 and h265 pipelines; will shut down the "wrong" one when we know the codec */
for (int i = 0; i < n_renderers; i++) {
gst_element_set_state (renderer_type[i]->pipeline, GST_STATE_PLAYING);
gst_video_pipeline_base_time = gst_element_get_base_time(renderer_type[i]->appsrc);
if (renderer_type[i]->appsrc) {
gst_video_pipeline_base_time = gst_element_get_base_time(renderer_type[i]->appsrc);
}
renderer_type[i]->bus = gst_element_get_bus(renderer_type[i]->pipeline);
}
renderer = NULL;
@@ -300,6 +405,23 @@ void video_renderer_start() {
#endif
}
/* used to find any X11 Window used by the playbin (HLS) pipeline after it starts playing.
 * if use_x11 is true, called every 100 ms after playbin state is READY until the x11 window is found */
bool waiting_for_x11_window() {
if (!hls_video) {
return false;
}
#ifdef X_DISPLAY_FIX
if (use_x11 && renderer->gst_window) {
get_x_window(renderer->gst_window, renderer->server_name);
if (!renderer->gst_window->window) {
return true; /* window still not found */
}
}
#endif
return false;
}
void video_renderer_render_buffer(unsigned char* data, int *data_len, int *nal_count, uint64_t *ntp_time) {
GstBuffer *buffer;
GstClockTime pts = (GstClockTime) *ntp_time; /*now in nsecs */
@@ -354,21 +476,28 @@ void video_renderer_flush() {
void video_renderer_stop() {
if (renderer) {
gst_app_src_end_of_stream (GST_APP_SRC(renderer->appsrc));
if (renderer->appsrc) {
gst_app_src_end_of_stream (GST_APP_SRC(renderer->appsrc));
}
gst_element_set_state (renderer->pipeline, GST_STATE_NULL);
}
//gst_element_set_state (renderer->playbin, GST_STATE_NULL);
}
}
static void video_renderer_destroy_h26x(video_renderer_t *renderer) {
if (renderer) {
GstState state;
gst_element_get_state(renderer->pipeline, &state, NULL, 0);
gst_element_get_state(renderer->pipeline, &state, NULL, 100 * GST_MSECOND);
if (state != GST_STATE_NULL) {
gst_app_src_end_of_stream (GST_APP_SRC(renderer->appsrc));
if (!hls_video) {
gst_app_src_end_of_stream (GST_APP_SRC(renderer->appsrc));
}
gst_element_set_state (renderer->pipeline, GST_STATE_NULL);
}
gst_object_unref(renderer->bus);
gst_object_unref (renderer->appsrc);
if (renderer->appsrc) {
gst_object_unref (renderer->appsrc);
}
gst_object_unref (renderer->pipeline);
#ifdef X_DISPLAY_FIX
if (renderer->gst_window) {
@@ -381,7 +510,6 @@ static void video_renderer_destroy_h26x(video_renderer_t *renderer) {
}
}
void video_renderer_destroy() {
for (int i = 0; i < n_renderers; i++) {
if (renderer_type[i]) {
@@ -390,11 +518,7 @@ void video_renderer_destroy() {
}
}
/* not implemented for gstreamer */
void video_renderer_update_background(int type) {
}
gboolean gstreamer_pipeline_bus_callback(GstBus *bus, GstMessage *message, void * loop) {
gboolean gstreamer_pipeline_bus_callback(GstBus *bus, GstMessage *message, void *loop) {
/* identify which pipeline sent the message */
int type = -1;
@@ -405,18 +529,49 @@ gboolean gstreamer_pipeline_bus_callback(GstBus *bus, GstMessage *message, void
}
}
g_assert(type != -1);
if (logger_debug) {
g_print("GStreamer %s bus message: %s %s\n", renderer_type[type]->codec, GST_MESSAGE_SRC_NAME(message), GST_MESSAGE_TYPE_NAME(message));
}
if (logger_debug && hls_video) {
gint64 pos;
gst_element_query_position (renderer_type[type]->pipeline, GST_FORMAT_TIME, &pos);
if (GST_CLOCK_TIME_IS_VALID(pos)) {
g_print("GStreamer bus message %s %s; position: %" GST_TIME_FORMAT "\n", GST_MESSAGE_SRC_NAME(message),
GST_MESSAGE_TYPE_NAME(message), GST_TIME_ARGS(pos));
} else {
g_print("GStreamer bus message %s %s; position: none\n", GST_MESSAGE_SRC_NAME(message),
GST_MESSAGE_TYPE_NAME(message));
}
}
switch (GST_MESSAGE_TYPE (message)) {
case GST_MESSAGE_DURATION:
renderer_type[type]->duration = GST_CLOCK_TIME_NONE;
break;
case GST_MESSAGE_BUFFERING:
if (hls_video) {
gint percent = -1;
gst_message_parse_buffering(message, &percent);
if (percent >= 0) {
renderer_type[type]->buffering_level = percent;
logger_log(logger, LOGGER_DEBUG, "Buffering: %d percent done", percent);
if (percent < 100) {
gst_element_set_state (renderer_type[type]->pipeline, GST_STATE_PAUSED);
} else {
gst_element_set_state (renderer_type[type]->pipeline, GST_STATE_PLAYING);
}
}
}
break;
case GST_MESSAGE_ERROR: {
GError *err;
gchar *debug;
gboolean flushing;
gst_message_parse_error (message, &err, &debug);
logger_log(logger, LOGGER_INFO, "GStreamer error: %s", err->message);
if (strstr(err->message,"Internal data stream error")) {
logger_log(logger, LOGGER_INFO, "GStreamer error: %s %s", GST_MESSAGE_SRC_NAME(message),err->message);
if (!hls_video && strstr(err->message,"Internal data stream error")) {
logger_log(logger, LOGGER_INFO,
"*** This is a generic GStreamer error that usually means that GStreamer\n"
"*** was unable to construct a working video pipeline.\n\n"
@@ -424,26 +579,33 @@ gboolean gstreamer_pipeline_bus_callback(GstBus *bus, GstMessage *message, void
"*** GStreamer may be trying to use non-functional hardware h264 video decoding.\n"
"*** Try using option -avdec to force software decoding or use -vs <videosink>\n"
"*** to select a videosink of your choice (see \"man uxplay\").\n\n"
"*** Raspberry Pi OS with (unpatched) GStreamer-1.18.4 needs \"-bt709\" uxplay option");
"*** Raspberry Pi models 4B and earlier using Video4Linux2 may need \"-bt709\" uxplay option");
}
g_error_free (err);
g_free (debug);
gst_app_src_end_of_stream (GST_APP_SRC(renderer_type[type]->appsrc));
flushing = TRUE;
gst_bus_set_flushing(bus, flushing);
gst_element_set_state (renderer_type[type]->pipeline, GST_STATE_NULL);
g_main_loop_quit( (GMainLoop *) loop);
if (renderer_type[type]->appsrc) {
gst_app_src_end_of_stream (GST_APP_SRC(renderer_type[type]->appsrc));
}
gst_bus_set_flushing(bus, TRUE);
gst_element_set_state (renderer_type[type]->pipeline, GST_STATE_READY);
renderer_type[type]->terminate = TRUE;
g_main_loop_quit( (GMainLoop *) loop);
break;
}
case GST_MESSAGE_EOS:
/* end-of-stream */
logger_log(logger, LOGGER_INFO, "GStreamer: End-Of-Stream");
// g_main_loop_quit( (GMainLoop *) loop);
logger_log(logger, LOGGER_INFO, "GStreamer: End-Of-Stream");
if (hls_video) {
gst_bus_set_flushing(bus, TRUE);
gst_element_set_state (renderer_type[type]->pipeline, GST_STATE_READY);
renderer_type[type]->terminate = TRUE;
g_main_loop_quit( (GMainLoop *) loop);
}
break;
case GST_MESSAGE_STATE_CHANGED:
if (renderer_type[type]->state_pending && strstr(GST_MESSAGE_SRC_NAME(message), "pipeline")) {
GstState state;
gst_element_get_state(renderer_type[type]->pipeline, &state, NULL,0);
gst_element_get_state(renderer_type[type]->pipeline, &state, NULL, 100 * GST_MSECOND);
if (state == GST_STATE_NULL) {
gst_element_set_state(renderer_type[type]->pipeline, GST_STATE_PLAYING);
} else if (state == GST_STATE_PLAYING) {
@@ -511,6 +673,7 @@ gboolean gstreamer_pipeline_bus_callback(GstBus *bus, GstMessage *message, void
}
void video_renderer_choose_codec (bool video_is_h265) {
g_assert(!hls_video);
/* set renderer to h264 or h265, depending on pps/sps received by raop_rtp_mirror */
video_renderer_t *renderer_new = video_is_h265 ? renderer_type[1] : renderer_type[0];
if (renderer == renderer_new) {
@@ -535,7 +698,9 @@ void video_renderer_choose_codec (bool video_is_h265) {
unsigned int video_reset_callback(void * loop) {
if (video_terminate) {
video_terminate = false;
gst_app_src_end_of_stream (GST_APP_SRC(renderer->appsrc));
if (renderer->appsrc) {
gst_app_src_end_of_stream (GST_APP_SRC(renderer->appsrc));
}
gboolean flushing = TRUE;
gst_bus_set_flushing(renderer->bus, flushing);
gst_element_set_state (renderer->pipeline, GST_STATE_NULL);
@@ -544,6 +709,63 @@ unsigned int video_reset_callback(void * loop) {
return (unsigned int) TRUE;
}
bool video_get_playback_info(double *duration, double *position, float *rate) {
gint64 pos = 0;
GstState state;
*duration = 0.0;
*position = -1.0;
*rate = 0.0f;
if (!renderer) {
return true;
}
gst_element_get_state(renderer->pipeline, &state, NULL, 0);
*rate = 0.0f;
switch (state) {
case GST_STATE_PLAYING:
*rate = 1.0f;
default:
break;
}
if (!GST_CLOCK_TIME_IS_VALID(renderer->duration)) {
if (!gst_element_query_duration (renderer->pipeline, GST_FORMAT_TIME, &renderer->duration)) {
return true;
}
}
*duration = ((double) renderer->duration) / GST_SECOND;
if (*duration) {
if (gst_element_query_position (renderer->pipeline, GST_FORMAT_TIME, &pos) &&
GST_CLOCK_TIME_IS_VALID(pos)) {
*position = ((double) pos) / GST_SECOND;
}
}
logger_log(logger, LOGGER_DEBUG, "********* video_get_playback_info: position %" GST_TIME_FORMAT " duration %" GST_TIME_FORMAT " %s *********",
GST_TIME_ARGS (pos), GST_TIME_ARGS (renderer->duration), gst_element_state_get_name(state));
return true;
}
void video_renderer_seek(float position) {
double pos = (double) position;
pos *= GST_SECOND;
gint64 seek_position = (gint64) pos;
seek_position = seek_position < 1000 ? 1000 : seek_position;
seek_position = seek_position > renderer->duration - 1000 ? renderer->duration - 1000: seek_position;
g_print("SCRUB: seek to %f secs = %" GST_TIME_FORMAT ", duration = %" GST_TIME_FORMAT "\n", position,
GST_TIME_ARGS(seek_position), GST_TIME_ARGS(renderer->duration));
gboolean result = gst_element_seek_simple(renderer->pipeline, GST_FORMAT_TIME,
(GstSeekFlags)(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT),
seek_position);
if (result) {
g_print("seek succeeded\n");
gst_element_set_state (renderer->pipeline, GST_STATE_PLAYING);
} else {
g_print("seek failed\n");
}
}
unsigned int video_renderer_listen(void *loop, int id) {
g_assert(id >= 0 && id < n_renderers);
return (unsigned int) gst_bus_add_watch(renderer_type[id]->bus,(GstBusFunc)

View File

@@ -47,27 +47,25 @@ typedef enum videoflip_e {
typedef struct video_renderer_s video_renderer_t;
typedef struct user_data_s {
int type;
GMainLoop *loop;
} user_data_t;
void video_renderer_init(logger_t *render_logger, const char *server_name, videoflip_t videoflip[2], const char *parser,
const char *decoder, const char *converter, const char *videosink, const char *videosin_options,
bool initial_fullscreen, bool video_sync, bool h265_support);
void video_renderer_init (logger_t *logger, const char *server_name, videoflip_t videoflip[2], const char *parser,
const char *decoder, const char *converter, const char *videosink, const char *videosink_options,
bool initial_fullscreen, bool video_sync, bool h265_support, const char *uri);
void video_renderer_start ();
void video_renderer_stop ();
void video_renderer_pause ();
void video_renderer_seek(float position);
void video_renderer_resume ();
bool video_renderer_is_paused();
void video_renderer_render_buffer (unsigned char* data, int *data_len, int *nal_count, uint64_t *ntp_time);
void video_renderer_flush ();
unsigned int video_renderer_listen(void *loop, int id);
void video_renderer_destroy ();
void video_renderer_size(float *width_source, float *height_source, float *width, float *height);
bool waiting_for_x11_window();
bool video_get_playback_info(double *duration, double *position, float *rate);
void video_renderer_choose_codec(bool is_h265);
unsigned int video_renderer_listen(void *loop, int id);
unsigned int video_reset_callback(void *loop);
#ifdef __cplusplus
}
#endif

View File

@@ -23,6 +23,9 @@
/* based on code from David Ventura https://github.com/DavidVentura/UxPlay */
/* This file should be only included from video_renderer.c as it defines static
* functions and depends on video_renderer internals */
#ifndef X_DISPLAY_FIX_H
#define X_DISPLAY_FIX_H
@@ -40,12 +43,12 @@ struct X11_Window_s {
Window window;
} typedef X11_Window_t;
void get_X11_Display(X11_Window_t * X11) {
static void get_X11_Display(X11_Window_t * X11) {
X11->display = XOpenDisplay(NULL);
X11->window = (Window) NULL;
}
Window enum_windows(const char * str, Display * display, Window window, int depth) {
static Window enum_windows(const char * str, Display * display, Window window, int depth) {
int i;
XTextProperty text;
XGetWMName(display, window, &text);
@@ -73,7 +76,7 @@ int X11_error_catcher( Display *disp, XErrorEvent *xe ) {
return 0;
}
void get_x_window(X11_Window_t * X11, const char * name) {
static void get_x_window(X11_Window_t * X11, const char * name) {
Window root = XDefaultRootWindow(X11->display);
XSetErrorHandler(X11_error_catcher);
X11->window = enum_windows(name, X11->display, root, 0);
@@ -89,7 +92,7 @@ void get_x_window(X11_Window_t * X11, const char * name) {
#endif
}
void set_fullscreen(X11_Window_t * X11, bool * fullscreen) {
static void set_fullscreen(X11_Window_t * X11, bool * fullscreen) {
XClientMessageEvent msg = {
.type = ClientMessage,
.display = X11->display,

View File

@@ -1,11 +1,11 @@
.TH UXPLAY "1" "September 2024" "1.70" "User Commands"
.TH UXPLAY "1" "December 2024" "1.71" "User Commands"
.SH NAME
uxplay \- start AirPlay server
.SH SYNOPSIS
.B uxplay
[\fI\,-n name\/\fR] [\fI\,-s wxh\/\fR] [\fI\,-p \/\fR[\fI\,n\/\fR]] [more \fI OPTIONS \/\fR ...]
.SH DESCRIPTION
UxPlay 1.70: An open\-source AirPlay mirroring (+ audio streaming) server:
UxPlay 1.71: An open\-source AirPlay mirroring (+ audio streaming) server:
.SH OPTIONS
.TP
.B
@@ -15,6 +15,8 @@ UxPlay 1.70: An open\-source AirPlay mirroring (+ audio streaming) server:
.TP
\fB\-h265\fR Support h265 (4K) video (with h265 versions of h264 plugins)
.TP
\fB\-hls\fR Support HTTP Live Streaming (currently YouTube video only)
.TP
\fB\-pin\fI[xxxx]\fR Use a 4-digit pin code to control client access (default: no)
.IP
without option, pin is random: optionally use fixed pin xxxx.
@@ -83,7 +85,13 @@ UxPlay 1.70: An open\-source AirPlay mirroring (+ audio streaming) server:
.TP
\fB\-v4l2\fR Use Video4Linux2 for GPU hardware h264 video decoding.
.TP
\fB\-bt709\fR Sometimes needed for Raspberry Pi with GStreamer < 1.22
\fB\-bt709\fR Sometimes needed for Raspberry Pi models using Video4Linux2.
.TP
\fB\-srgb\fR Display "Full range" [0-255] color, not "Limited Range"[16-235]
.IP
This is a workaround for a GStreamer problem, until it is fixed.
.PP
\fB\-srgb\fR no Disable srgb option (use when enabled by default: Linux, *BSD)
.TP
\fB\-as\fI sink\fR Choose the GStreamer audiosink; default "autoaudiosink"
.IP

View File

@@ -62,7 +62,7 @@
#include "renderers/video_renderer.h"
#include "renderers/audio_renderer.h"
#define VERSION "1.70"
#define VERSION "1.71"
#define SECOND_IN_USECS 1000000
#define SECOND_IN_NSECS 1000000000UL
@@ -72,6 +72,12 @@
#define HIGHEST_PORT 65535
#define NTP_TIMEOUT_LIMIT 5
#define BT709_FIX "capssetter caps=\"video/x-h264, colorimetry=bt709\""
#define SRGB_FIX " ! video/x-raw,colorimetry=sRGB,format=RGB ! "
#ifdef FULL_RANGE_RGB_FIX
#define DEFAULT_SRGB_FIX true
#else
#define DEFAULT_SRGB_FIX false
#endif
static std::string server_name = DEFAULT_NAME;
static dnssd_t *dnssd = NULL;
@@ -122,6 +128,7 @@ static unsigned short display[5] = {0}, tcp[3] = {0}, udp[3] = {0};
static bool debug_log = DEFAULT_DEBUG_LOG;
static int log_level = LOGGER_INFO;
static bool bt709_fix = false;
static bool srgb_fix = DEFAULT_SRGB_FIX;
static int nohold = 0;
static bool nofreeze = false;
static unsigned short raop_port;
@@ -144,6 +151,11 @@ static double db_high = 0.0;
static bool taper_volume = false;
static bool h265_support = false;
static int n_renderers = 0;
static bool hls_support = false;
static std::string url = "";
static guint gst_x11_window_id = 0;
static guint gst_hls_position_id = 0;
static bool preserve_connections = false;
/* logging */
@@ -360,6 +372,16 @@ static gboolean reset_callback(gpointer loop) {
return TRUE;
}
static gboolean x11_window_callback(gpointer loop) {
/* called while trying to find an x11 window used by playbin (HLS mode) */
if (waiting_for_x11_window()) {
return TRUE;
}
g_source_remove(gst_x11_window_id);
gst_x11_window_id = 0;
return FALSE;
}
static gboolean sigint_callback(gpointer loop) {
relaunch_video = false;
g_main_loop_quit((GMainLoop *) loop);
@@ -400,6 +422,15 @@ static void main_loop() {
relaunch_video = false;
if (use_video) {
relaunch_video = true;
if (url.empty()) {
n_renderers = h265_support ? 2 : 1;
gst_x11_window_id = 0;
} else {
/* hls video will be rendered */
n_renderers = 1;
url.erase();
gst_x11_window_id = g_timeout_add(100, (GSourceFunc) x11_window_callback, (gpointer) loop);
}
for (int i = 0; i < n_renderers; i++) {
gst_bus_watch_id[i] = (guint) video_renderer_listen((void *)loop, i);
}
@@ -408,12 +439,12 @@ static void main_loop() {
guint video_reset_watch_id = g_timeout_add(100, (GSourceFunc) video_reset_callback, (gpointer) loop);
guint sigterm_watch_id = g_unix_signal_add(SIGTERM, (GSourceFunc) sigterm_callback, (gpointer) loop);
guint sigint_watch_id = g_unix_signal_add(SIGINT, (GSourceFunc) sigint_callback, (gpointer) loop);
//printf("********** main_loop_run *******************\n");
g_main_loop_run(loop);
//printf("********** main_loop_exit *******************\n");
for (int i = 0; i < n_renderers; i++) {
if (gst_bus_watch_id[i] > 0) g_source_remove(gst_bus_watch_id[i]);
}
if (gst_x11_window_id > 0) g_source_remove(gst_x11_window_id);
if (sigint_watch_id > 0) g_source_remove(sigint_watch_id);
if (sigterm_watch_id > 0) g_source_remove(sigterm_watch_id);
if (reset_watch_id > 0) g_source_remove(reset_watch_id);
@@ -582,6 +613,7 @@ static void print_info (char *name) {
printf("-n name Specify the network name of the AirPlay server\n");
printf("-nh Do not add \"@hostname\" at the end of AirPlay server name\n");
printf("-h265 Support h265 (4K) video (with h265 versions of h264 plugins)\n");
printf("-hls      Support HTTP Live Streaming (currently YouTube video only) \n");
printf("-pin[xxxx]Use a 4-digit pin code to control client access (default: no)\n");
printf(" default pin is random: optionally use fixed pin xxxx\n");
printf("-reg [fn] Keep a register in $HOME/.uxplay.register to verify returning\n");
@@ -593,7 +625,7 @@ static void print_info (char *name) {
printf("-async no Switch off audio/(client)video timestamp synchronization\n");
printf("-db l[:h] Set minimum volume attenuation to l dB (decibels, negative);\n");
printf(" optional: set maximum to h dB (+ or -) default: -30.0:0.0 dB\n");
printf("-taper Use a \"tapered\" AirPlay volume-control profile\n");
printf("-taper Use a \"tapered\" AirPlay volume-control profile\n");
printf("-s wxh[@r]Request to client for video display resolution [refresh_rate]\n");
printf(" default 1920x1080[@60] (or 3840x2160[@60] with -h265 option)\n");
printf("-o Set display \"overscanned\" mode on (not usually needed)\n");
@@ -607,6 +639,7 @@ static void print_info (char *name) {
printf("-vd ... Choose the GStreamer h264 decoder; default \"decodebin\"\n");
printf(" choices: (software) avdec_h264; (hardware) v4l2h264dec,\n");
printf("          nvdec, nvh264dec, vaapih264dec, vtdec, etc.\n");
printf(" choices: avdec_h264,vaapih264dec,nvdec,nvh264dec,v4l2h264dec\n");
printf("-vc ... Choose the GStreamer videoconverter; default \"videoconvert\"\n");
printf(" another choice when using v4l2h264dec: v4l2convert\n");
printf("-vs ... Choose the GStreamer videosink; default \"autovideosink\"\n");
@@ -614,7 +647,10 @@ static void print_info (char *name) {
printf(" gtksink,waylandsink,osxvideosink,kmssink,d3d11videosink etc.\n");
printf("-vs 0 Streamed audio only, with no video display window\n");
printf("-v4l2 Use Video4Linux2 for GPU hardware h264 decoding\n");
printf("-bt709 Sometimes needed for Raspberry Pi with GStreamer < 1.22 \n");
printf("-bt709 Sometimes needed for Raspberry Pi models using Video4Linux2 \n");
printf("-srgb Display \"Full range\" [0-255] color, not \"Limited Range\"[16-235]\n");
printf(" This is a workaround for a GStreamer problem, until it is fixed\n");
printf("-srgb no Disable srgb option (use when enabled by default: Linux, *BSD)\n");
printf("-as ... Choose the GStreamer audiosink; default \"autoaudiosink\"\n");
printf(" some choices:pulsesink,alsasink,pipewiresink,jackaudiosink,\n");
printf(" osssink,oss4sink,osxaudiosink,wasapisink,directsoundsink.\n");
@@ -977,7 +1013,7 @@ static void parse_arguments (int argc, char *argv[]) {
fprintf(stderr," -rpifb was equivalent to \"-v4l2 -vs kmssink\"\n");
fprintf(stderr," -rpigl was equivalent to \"-v4l2 -vs glimagesink\"\n");
fprintf(stderr," -rpiwl was equivalent to \"-v4l2 -vs waylandsink\"\n");
fprintf(stderr," for GStreamer < 1.22, \"-bt709\" may also be needed\n");
fprintf(stderr," Option \"-bt709\" may also be needed for R Pi model 4B and earlier\n");
exit(1);
} else if (arg == "-fs" ) {
fullscreen = true;
@@ -1052,6 +1088,15 @@ static void parse_arguments (int argc, char *argv[]) {
}
} else if (arg == "-bt709") {
bt709_fix = true;
} else if (arg == "-srgb") {
srgb_fix = true;
if (i < argc - 1) {
if (strlen(argv[i+1]) == 2 && strncmp(argv[i+1], "no", 2) == 0) {
srgb_fix = false;
i++;
continue;
}
}
} else if (arg == "-nohold") {
nohold = 1;
} else if (arg == "-al") {
@@ -1145,6 +1190,8 @@ static void parse_arguments (int argc, char *argv[]) {
db_low = db1;
db_high = db2;
printf("db range %f:%f\n", db_low, db_high);
} else if (arg == "-hls") {
hls_support = true;
} else if (arg == "-h265") {
h265_support = true;
} else if (arg == "-nofreeze") {
@@ -1356,7 +1403,7 @@ static int start_dnssd(std::vector<char> hw_addr, std::string name) {
}
/* after dnssd starts, reset the default feature set here
* (overwrites features set in dnssdint.h).
* (overwrites features set in dnssdint.h)
* default: FEATURES_1 = 0x5A7FFEE6, FEATURES_2 = 0 */
dnssd_set_airplay_features(dnssd, 0, 0); // AirPlay video supported
@@ -1399,7 +1446,8 @@ static int start_dnssd(std::vector<char> hw_addr, std::string name) {
dnssd_set_airplay_features(dnssd, 30, 1); // RAOP support: with this bit set, the AirTunes service is not required.
dnssd_set_airplay_features(dnssd, 31, 0); //
/* bits 32-63 see https://emanualcozzi.net/docs/airplay2/features
/* bits 32-63: see https://emanualcozzi.net/docs/airplay2/features
dnssd_set_airplay_features(dnssd, 32, 0); // isCarPlay when ON; Supports InitialVolume when OFF
dnssd_set_airplay_features(dnssd, 33, 0); // Supports Air Play Video Play Queue
dnssd_set_airplay_features(dnssd, 34, 0); // Supports Air Play from cloud (requires that bit 6 is ON)
@@ -1412,8 +1460,7 @@ static int start_dnssd(std::vector<char> hw_addr, std::string name) {
dnssd_set_airplay_features(dnssd, 40, 0); // Supports Buffered Audio
dnssd_set_airplay_features(dnssd, 41, 0); // Supports PTP
dnssd_set_airplay_features(dnssd, 42, 0); // Supports Screen Multi Codec (allows h265 video)
dnssd_set_airplay_features(dnssd, 42, 0); // Supports Screen Multi Codec (allows h265 video)
dnssd_set_airplay_features(dnssd, 43, 0); // Supports System Pairing
dnssd_set_airplay_features(dnssd, 44, 0); // is AP Valeria Screen Sender
@@ -1440,9 +1487,15 @@ static int start_dnssd(std::vector<char> hw_addr, std::string name) {
dnssd_set_airplay_features(dnssd, 61, 0); // Supports RFC2198 redundancy
*/
/* needed for HLS video support */
dnssd_set_airplay_features(dnssd, 0, (int) hls_support);
dnssd_set_airplay_features(dnssd, 4, (int) hls_support);
// not sure about this one (bit 8, screen rotation supported):
//dnssd_set_airplay_features(dnssd, 8, (int) hls_support);
/* needed for h265 video support */
dnssd_set_airplay_features(dnssd, 42, (int) h265_support);
/* bit 27 of Features determines whether the AirPlay2 client-pairing protocol will be used (1) or not (0) */
dnssd_set_airplay_features(dnssd, 27, (int) setup_legacy_pairing);
return 0;
@@ -1475,6 +1528,8 @@ static bool check_blocked_client(char *deviceid) {
// Server callbacks
extern "C" void video_reset(void *cls) {
LOGD("video_reset");
url.erase();
reset_loop = true;
remote_clock_offset = 0;
relaunch_video = true;
@@ -1801,6 +1856,51 @@ extern "C" bool check_register(void *cls, const char *client_pk) {
return false;
}
}
/* control callbacks for video player (unimplemented) */
extern "C" void on_video_play(void *cls, const char* location, const float start_position) {
/* start_position needs to be implemented */
url.erase();
url.append(location);
reset_loop = true;
relaunch_video = true;
preserve_connections = true;
LOGD("********************on_video_play: location = %s***********************", url.c_str());
}
extern "C" void on_video_scrub(void *cls, const float position) {
LOGI("on_video_scrub: position = %7.5f\n", position);
video_renderer_seek(position);
}
extern "C" void on_video_rate(void *cls, const float rate) {
LOGI("on_video_rate = %7.5f\n", rate);
if (rate == 1.0f) {
video_renderer_resume();
} else if (rate == 0.0f) {
video_renderer_pause();
} else {
LOGI("on_video_rate: ignoring unexpected value rate = %f\n", rate);
}
}
extern "C" void on_video_stop(void *cls) {
LOGI("on_video_stop\n");
}
extern "C" void on_video_acquire_playback_info (void *cls, playback_info_t *playback_info) {
int buffering_level;
LOGD("on_video_acquire_playback info\n");
bool still_playing = video_get_playback_info(&playback_info->duration, &playback_info->position,
&playback_info->rate);
LOGD("on_video_acquire_playback info done\n");
if (!still_playing) {
LOGI(" video has finished, %f", playback_info->position);
playback_info->position = -1.0;
playback_info->duration = -1.0;
video_renderer_stop();
}
}
extern "C" void log_callback (void *cls, int level, const char *msg) {
switch (level) {
@@ -1851,6 +1951,11 @@ static int start_raop_server (unsigned short display[5], unsigned short tcp[3],
raop_cbs.export_dacp = export_dacp;
raop_cbs.video_reset = video_reset;
raop_cbs.video_set_codec = video_set_codec;
raop_cbs.on_video_play = on_video_play;
raop_cbs.on_video_scrub = on_video_scrub;
raop_cbs.on_video_rate = on_video_rate;
raop_cbs.on_video_stop = on_video_stop;
raop_cbs.on_video_acquire_playback_info = on_video_acquire_playback_info;
raop = raop_init(&raop_cbs);
if (raop == NULL) {
@@ -1879,6 +1984,7 @@ static int start_raop_server (unsigned short display[5], unsigned short tcp[3],
raop_set_plist(raop, "max_ntp_timeouts", max_ntp_timeouts);
if (audiodelay >= 0) raop_set_plist(raop, "audio_delay_micros", audiodelay);
if (require_password) raop_set_plist(raop, "pin", (int) pin);
if (hls_support) raop_set_plist(raop, "hls", 1);
/* network port selection (ports listed as "0" will be dynamically assigned) */
raop_set_tcp_ports(raop, tcp);
@@ -1993,8 +2099,8 @@ static void read_config_file(const char * filename, const char * uxplay_name) {
void real_main (int argc, char *argv[]);
int main (int argc, char *argv[]) {
printf("*=== Using gst_macos_main wrapper for GStreamer >= 1.22 on macOS ===*\n");
return gst_macos_main ((GstMainFunc) real_main, argc, argv , NULL);
LOGI("*=== Using gst_macos_main wrapper for GStreamer >= 1.22 on macOS ===*");
return gst_macos_main ((GstMainFunc) real_main, argc, argv , NULL);
}
void real_main (int argc, char *argv[]) {
@@ -2034,22 +2140,22 @@ int main (int argc, char *argv[]) {
}
if (dump_video) {
if (video_dump_limit > 0) {
printf("dump video using \"-vdmp %d %s\"\n", video_dump_limit, video_dumpfile_name.c_str());
LOGI("dump video using \"-vdmp %d %s\"", video_dump_limit, video_dumpfile_name.c_str());
} else {
printf("dump video using \"-vdmp %s\"\n", video_dumpfile_name.c_str());
LOGI("dump video using \"-vdmp %s\"", video_dumpfile_name.c_str());
}
}
if (dump_audio) {
if (audio_dump_limit > 0) {
printf("dump audio using \"-admp %d %s\"\n", audio_dump_limit, audio_dumpfile_name.c_str());
LOGI("dump audio using \"-admp %d %s\"", audio_dump_limit, audio_dumpfile_name.c_str());
} else {
printf("dump audio using \"-admp %s\"\n", audio_dumpfile_name.c_str());
LOGI("dump audio using \"-admp %s\"", audio_dumpfile_name.c_str());
}
}
#if __APPLE__
/* force use of -nc option on macOS */
LOGI("macOS detected: use -nc option as workaround for GStreamer problem");
LOGI("macOS detected: using -nc option as workaround for GStreamer problem");
new_window_closing_behavior = false;
#endif
@@ -2070,9 +2176,9 @@ int main (int argc, char *argv[]) {
if (videosink == "d3d11videosink" && videosink_options.empty() && use_video) {
if (fullscreen) {
videosink_options.append(" fullscreen-toggle-mode=GST_D3D11_WINDOW_FULLSCREEN_TOGGLE_MODE_PROPERTY fullscreen=true ");
videosink_options.append(" fullscreen-toggle-mode=GST_D3D11_WINDOW_FULLSCREEN_TOGGLE_MODE_PROPERTY fullscreen=true ");
} else {
videosink_options.append(" fullscreen-toggle-mode=GST_D3D11_WINDOW_FULLSCREEN_TOGGLE_MODE_ALT_ENTER ");
videosink_options.append(" fullscreen-toggle-mode=GST_D3D11_WINDOW_FULLSCREEN_TOGGLE_MODE_ALT_ENTER ");
}
LOGI("d3d11videosink is being used with option fullscreen-toggle-mode=alt-enter\n"
"Use Alt-Enter key combination to toggle into/out of full-screen mode");
@@ -2083,6 +2189,12 @@ int main (int argc, char *argv[]) {
video_parser.append(BT709_FIX);
}
if (srgb_fix && use_video) {
std::string option = video_converter;
video_converter.append(SRGB_FIX);
video_converter.append(option);
}
if (require_password && registration_list) {
if (pairing_register == "") {
const char * homedir = get_homedir();
@@ -2148,12 +2260,10 @@ int main (int argc, char *argv[]) {
} else {
LOGI("audio_disabled");
}
if (use_video) {
n_renderers = h265_support ? 2 : 1;
video_renderer_init(render_logger, server_name.c_str(), videoflip, video_parser.c_str(),
video_decoder.c_str(), video_converter.c_str(), videosink.c_str(),
videosink_options.c_str(),fullscreen, video_sync, h265_support);
videosink_options.c_str(), fullscreen, video_sync, h265_support, NULL);
video_renderer_start();
}
@@ -2196,7 +2306,6 @@ int main (int argc, char *argv[]) {
if (start_dnssd(server_hw_addr, server_name)) {
goto cleanup;
}
if (start_raop_server(display, tcp, udp, debug_log)) {
stop_dnssd();
goto cleanup;
@@ -2209,7 +2318,7 @@ int main (int argc, char *argv[]) {
reconnect:
compression_type = 0;
close_window = new_window_closing_behavior;
main_loop();
if (relaunch_video || reset_loop) {
if(reset_loop) {
@@ -2218,12 +2327,18 @@ int main (int argc, char *argv[]) {
raop_stop(raop);
}
if (use_audio) audio_renderer_stop();
if (use_video && close_window) {
if (use_video && (close_window || preserve_connections)) {
video_renderer_destroy();
raop_remove_known_connections(raop);
if (!preserve_connections) {
raop_destroy_airplay_video(raop);
url.erase();
raop_remove_known_connections(raop);
}
preserve_connections = false;
const char *uri = (url.empty() ? NULL : url.c_str());
video_renderer_init(render_logger, server_name.c_str(), videoflip, video_parser.c_str(),
video_decoder.c_str(), video_converter.c_str(), videosink.c_str(),
videosink_options.c_str(), fullscreen, video_sync, h265_support);
video_decoder.c_str(), video_converter.c_str(), videosink.c_str(),
videosink_options.c_str(), fullscreen, video_sync, h265_support, uri);
video_renderer_start();
}
if (relaunch_video) {
View File
@@ -1,5 +1,5 @@
Name: uxplay
Version: 1.70
Version: 1.71.1
Release: 1%{?dist}
%global gittag v%{version}
@@ -135,7 +135,7 @@ cd build
%{_docdir}/%{name}/llhttp/LICENSE-MIT
%changelog
* Tue Sep 17 2024 UxPlay maintainer <https://github.com/FDH2/UxPlay>
* Fri Nov 15 2024 UxPlay maintainer <https://github.com/FDH2/UxPlay>
Initial uxplay.spec: tested on Fedora 38, Rocky Linux 9.2, OpenSUSE
Leap 15.5, Mageia 9, OpenMandriva ROME, PCLinuxOS
-