Merge pull request #188 from FDH2/master

UxPlay-1.72
This commit is contained in:
antimof
2025-07-29 16:21:23 +03:00
committed by GitHub
45 changed files with 4277 additions and 2836 deletions

@@ -58,7 +58,8 @@ target_link_libraries( uxplay
install( TARGETS uxplay RUNTIME DESTINATION bin )
install( FILES uxplay.1 DESTINATION ${CMAKE_INSTALL_MANDIR}/man1 )
install( FILES README.md README.txt README.html LICENSE DESTINATION ${CMAKE_INSTALL_DOCDIR} )
install( FILES lib/llhttp/LICENSE-MIT DESTINATION ${CMAKE_INSTALL_DOCDIR}/llhttp )
install( FILES uxplay.service DESTINATION ${CMAKE_INSTALL_DOCDIR}/systemd )
# uninstall target
if(NOT TARGET uninstall)

@@ -1,19 +1,46 @@
<h1
id="uxplay-1.72-airplay-mirror-and-airplay-audio-server-for-linux-macos-and-unix-also-runs-on-windows.">UxPlay
1.72: AirPlay-Mirror and AirPlay-Audio server for Linux, macOS, and Unix
(also runs on Windows).</h1>
<h3
id="now-developed-at-the-github-site-httpsgithub.comfdh2uxplay-where-all-user-issues-should-be-posted-and-latest-versions-can-be-found."><strong>Now
developed at the GitHub site <a href="https://github.com/FDH2/UxPlay"
class="uri">https://github.com/FDH2/UxPlay</a> (where ALL user issues
should be posted, and latest versions can be found).</strong></h3>
<ul>
<li><p><strong>NEW in v1.72</strong>: <em>Improved Support for (YouTube)
HLS (HTTP Live Streaming) video with the new “-hls” option (introduced
in 1.71).</em> <strong>Only streaming from the YouTube iOS app (in
“m3u8” protocol) is currently supported</strong> (streaming using the
AirPlay icon in a browser window is <strong>not</strong> yet
supported). Click on the airplay icon in the YouTube app to stream
video. <strong>Please
report any issues with this new feature of UxPlay</strong>.</p>
<p><em>The default video player for HLS is GStreamer playbin v3: use
“-hls 2” to revert to playbin v2 if some videos fail to play</em>.</p>
<ul>
<li>user-requested features: added support for setting a password (as an
alternative to on-screen pin codes) to control client access (-pw
option, see “man pw” or this README for details); added support for
setting initial client audio-streaming volume (-vol option), and output
of audio-mode metadata to file (for display by some external process,
-md option).</li>
</ul>
<p><strong>ISSUES</strong> <strong><em>(Please help to solve if you have
expertise)</em></strong></p>
<ul>
<li>in HLS video streaming from the YouTube app (-hls option), rendered
using GStreamer’s media player “playbin3” (or playbin2, with option -hls
2), we don’t understand how to correctly deal with “interstitials” (= 15
sec commercials) when “skip” is pressed on the client. (HLS is handled
by handlers in lib/http_handlers.h.) Should the response to HTTP
requests POST /action (playlistRemove) and POST /Stop be modified?
<em>Wireshark data from HLS on an AppleTV model 3 with UN-upgraded
original OS (unencrypted communications) could be useful!</em></li>
</ul></li>
</ul>
<h2 id="highlights">Highlights:</h2>
<ul>
@@ -51,9 +78,10 @@ alt="Current Packaging status" /></a>.</p>
<ul>
<li><p>Install uxplay on Debian-based Linux systems with
“<code>sudo apt install uxplay</code>”; on FreeBSD with
“<code>sudo pkg install uxplay</code>”; on OpenBSD with
“<code>doas pkg_add uxplay</code>”. Also available on Arch-based systems
through AUR. Since v. 1.66, uxplay is now also packaged in RPM format by
Fedora 38 (“<code>sudo dnf install uxplay</code>”).</p></li>
<li><p>For other RPM-based distributions which have not yet packaged
UxPlay, a RPM “specfile” <strong>uxplay.spec</strong> is now provided
with recent <a
@@ -62,8 +90,12 @@ href="https://github.com/FDH2/UxPlay/releases">releases</a> (see their
the section on using this specfile for <a
href="#building-an-installable-rpm-package">building an installable RPM
package</a>.</p></li>
<li><p>If your distribution does not supply UxPlay, or you want the
latest version, it is very easy to build it yourself: see the very <a
href="#building-uxplay-from-source">detailed instructions for building
UxPlay from source</a> later in this document.</p></li>
</ul>
<h2 id="after-installation">After installation:</h2>
<ul>
<li><p>(On Linux and *BSD): if a firewall is active on the server
hosting UxPlay, make sure the default network port (UDP 5353) for
@@ -79,15 +111,40 @@ distribution’s <strong>GStreamer plugin packages</strong> you should
also install.</p></li>
<li><p>For Audio-only mode (Apple Music, etc.) best quality is obtained
with the option “uxplay -async”, but there is then a 2 second latency
imposed by iOS. Use option “uxplay -ca” to display any “Cover Art” that
accompanies the audio.</p></li>
<li><p>If you are using UxPlay just to mirror the client’s screen
(without showing videos that need audio synchronized with video), it is
best to use the option “uxplay -vsync no”.</p></li>
<li><p>Add any UxPlay options you want to use as defaults to a startup
file <code>~/.uxplayrc</code> (see “<code>man uxplay</code>” or
<code>uxplay -h</code>” for format and other possible locations; the
location can also be set with “uxplay -rc <em>location</em>”). In
particular, if your system uses PipeWire audio or Wayland video systems,
you may wish to add “as pipewiresink” or “vs waylandsink” as defaults to
the file. <em>(Output from terminal commands “ps waux | grep pulse” or
“pactl info” will contain “pipewire” if your Linux/BSD system uses
it).</em></p></li>
<li><p>For Linux systems using systemd, there is a
<strong>systemd</strong> service file <strong>uxplay.service</strong>
found in the UxPlay top directory of the distribution, and also
installed in <code>&lt;DOCDIR&gt;/uxplay/systemd/</code> (where DOCDIR
is usually <code>/usr/local/share/doc</code>), that allows users to
start their own instance of UxPlay as a rootless daemon: it should
either be added to the directory /etc/systemd/user, or the user can just
create their own systemd directory <code>~/.config/systemd/user/</code>
and then copy uxplay.service into it. To save uxplay terminal output to
a file ~/uxplay.log, uncomment the StandardOutput entry in
uxplay.service. Then</p>
<p><code>systemctl --user [start/stop/enable/disable/status] uxplay</code></p>
<p>can be used to control the daemon. If it is enabled, the daemon will
start at the user’s first login and stop when they no longer have any
open sessions. See
https://www.baeldung.com/linux/systemd-create-user-services for more
about systemd user services. If more than one user might simultaneously
run uxplay this way, they should specify distinct -p and -m options
(ports and deviceID) in their startup files. <strong>Note: it is NOT
recommended to run UxPlay as a root service.</strong></p></li>
<li><p>On Raspberry Pi: models using hardware h264 video decoding by the
Broadcom GPU (models 4B and earlier) may require the uxplay option
-bt709. If you use Ubuntu 22.10 or earlier, GStreamer must be <a
@@ -96,9 +153,12 @@ to use hardware video decoding by the Broadcom GPU (also recommended but
optional for Raspberry Pi OS (Bullseye): the patched GStreamer does not
need option “-bt709”. The need for -bt709 when hardware video decoding
is used seems to have reappeared starting with GStreamer-1.22.</p></li>
<li><p>If UxPlay is used in a public space, there are security options
for requiring an AppleTV-style one-time pin (displayed on the terminal)
to be entered, or a password, and for barring/permitting client access
by their device ID. See options -pin, -reg, -pw, -restrict, -allow,
-block.</p></li>
</ul>
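<p>As a concrete sketch of the systemd user-service setup described
above (the paths shown assume DOCDIR is
<code>/usr/local/share/doc</code>; adjust them for your system):</p>
<pre><code>mkdir -p ~/.config/systemd/user
cp /usr/local/share/doc/uxplay/systemd/uxplay.service ~/.config/systemd/user/
systemctl --user daemon-reload
systemctl --user enable uxplay
systemctl --user start uxplay
systemctl --user status uxplay</code></pre>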
<p>To (easily) compile the latest UxPlay from source, see the section <a
href="#getting-uxplay">Getting UxPlay</a>.</p>
<h1 id="detailed-description-of-uxplay">Detailed description of
UxPlay</h1>
<p>This project is a GPLv3 open source unix AirPlay2 Mirror server for
@@ -237,7 +297,7 @@ can be regarded as a “System Library”, which it is in *BSD). Many Linux
distributions treat OpenSSL as a “System Library”, but some
(e.g. Debian) do not: in this case, the issue is solved by linking with
OpenSSL-3.0.0 or later.</p>
<h1 id="building-uxplay-from-source">Building UxPlay from source</h1>
<p>Either download and unzip <a
href="https://github.com/FDH2/UxPlay/archive/refs/heads/master.zip">UxPlay-master.zip</a>,
or (if git is installed): “git clone https://github.com/FDH2/UxPlay”.
@@ -368,6 +428,10 @@ gst-plugins-base.</p></li>
Either avahi-libdns or mDNSResponder must also be installed to provide
the dns_sd library. OpenSSL is already installed as a System
Library.</p></li>
<li><p><strong>OpenBSD:</strong> (doas pkg_add) libplist
gstreamer1-plugins-base. avahi-libs must also be installed to provide
the dns_sd library; (avahi-main must also be installed). OpenSSL is
already installed as a System Library.</p></li>
</ul>
<h4 id="building-an-installable-rpm-package">Building an installable RPM
package</h4>
@@ -456,6 +520,8 @@ graphics).</p></li>
gstreamer1-plugins, gstreamer1-plugins-* (* = core, good, bad, x, gtk,
gl, vulkan, pulse, v4l2, …), (+ gstreamer1-vaapi for Intel/AMD
graphics).</p></li>
<li><p><strong>OpenBSD:</strong> Install gstreamer1-libav,
gstreamer1-plugins-* (* = core, bad, base, good).</p></li>
</ul>
<h3 id="starting-and-running-uxplay">Starting and running UxPlay</h3>
<p>Since UxPlay-1.64, UxPlay can be started with options read from a
@@ -494,8 +560,12 @@ option: see “<code>-pin</code>” and “<code>-reg</code>” in <a
href="#usage">Usage</a> for details, if you wish to use it. <em>Some
clients with MDM (Mobile Device Management, often present on
employer-owned devices) are required to use pin-authentication: UxPlay
will provide this even when running without the pin option.</em>
Password authentication (-pw <em>pwd</em>) is also offered as an
alternative solution to pin codes: users need to know the password
<em>pwd</em> and enter it on their iOS/macOS device to access UxPlay,
when prompted (if <em>pwd</em> is not set, a displayed random pin code
must be entered at <strong>each</strong> new connection.)</p></li>
<li><p>By default, UxPlay is locked to its current client until that
client drops the connection; since UxPlay-1.58, the option
<code>-nohold</code> modifies this behavior so that when a new client
@@ -556,12 +626,14 @@ some video to be played at 60 frames per second. (You can see what
framerate is actually streaming by using -vs fpsdisplaysink, and/or
-FPSdata.) When using this, you should use the default timestamp-based
synchronization option <code>-vsync</code>.</p></li>
<li><p>You can now display (inside UxPlay) the accompanying “Cover Art”
from sources like Apple Music in Audio-Only (ALAC) mode with the option
<code>uxplay -ca</code>. <em>The older method of exporting cover art to
an external viewer remains available: run
<code>uxplay -ca &lt;name&gt; &amp;</code>” in the background, then run
a image viewer with an autoreload feature: an example is “feh”: run
<code>feh -R 1 &lt;name&gt;</code>” in the foreground; terminate feh
and then Uxplay with “<code>ctrl-C fg ctrl-C</code></em>.</p></li>
</ul>
<p>By default, GStreamer uses an algorithm to search for the best
“videosink” (GStreamer’s term for a graphics driver to display images)
@@ -679,10 +751,9 @@ framebuffer video, use <code>&lt;videosink&gt;</code> =
<li>Tip: to start UxPlay on a remote host (such as a Raspberry Pi) using
ssh:</li>
</ul>
<pre><code>ssh user@remote_host
export DISPLAY=:0
nohup uxplay [options] &gt; FILE &amp;</code></pre>
<p>Sound and video will play on the remote host; “nohup” will keep
uxplay running if the ssh session is closed. Terminal output is saved to
FILE (which can be /dev/null to discard it).</p>
@@ -690,10 +761,10 @@ FILE (which can be /dev/null to discard it)</p>
id="building-uxplay-on-macos-intel-x86_64-and-apple-silicon-m1m2-macs">Building
UxPlay on macOS: <strong>(Intel X86_64 and “Apple Silicon” M1/M2
Macs)</strong></h2>
<p><em>Note: A native AirPlay Server feature is included in macOS since
macOS 12 Monterey, but is restricted to recent hardware. As well as
running on the latest macOS, UxPlay can run on older macOS systems that
cannot run Monterey, or can run Monterey but not AirPlay.</em></p>
<p>These instructions for macOS assume that the Xcode command-line
developer tools are installed (if Xcode is installed, open the Terminal,
type “sudo xcode-select --install” and accept the conditions).</p>
@@ -753,23 +824,16 @@ not supply a complete GStreamer, but seems to have everything needed for
UxPlay). <strong>New: the UxPlay build script will now also detect
Homebrew installations in non-standard locations indicated by the
environment variable <code>$HOMEBREW_PREFIX</code>.</strong></p>
<p><strong>Using GStreamer installed from MacPorts</strong>: MacPorts is
now providing recent GStreamer releases: install pkgconf (“sudo port
install pkgconf”), then “sudo port install gstreamer1
gstreamer1-gst-plugins-base gstreamer1-gst-plugins-good
gstreamer1-gst-plugins-bad gstreamer1-gst-libav”. (The following may no
longer be relevant: <em>For X11 support on macOS, compile UxPlay using a
special cmake option <code>-DUSE_X11=ON</code>, and run it from an
XQuartz terminal with -vs ximagesink; older non-retina macs require a
lower resolution when using X11:
<code>uxplay -s 800x600</code></em>.)</p>
<p>After installing GStreamer, build and install uxplay: open a terminal
and change into the UxPlay source directory (“UxPlay-master” for zipfile
downloads, “UxPlay” for “git clone” downloads) and build/install with
@@ -780,27 +844,22 @@ with “export GST_DEBUG=2” before running UxPlay) reveals that with the
default (since UxPlay 1.64) use of timestamps for video synchronization,
many video frames are being dropped (only on macOS), perhaps due to
another error (about videometa) that shows up in the GStreamer warnings.
<strong>Recommendation: use the UxPlay “no timestamp” option
<code>-vsync no</code></strong> (you can add a line “vsync no” in the
uxplayrc configuration file).</p></li>
<li><p>On macOS with this installation of GStreamer, the only videosinks
available are glimagesink (default choice made by autovideosink) and
osxvideosink. The window title does not show the Airplay server name,
but the window can be shared on Zoom. Because of issues with
glimagesink, you may find osxvideosink works better. The only available
audiosink is osxaudiosink.</p></li>
<li><p>The option -nc is currently used by default on macOS. This is a
workaround for window-closing problems with GStreamer videosinks on
macOS. This option can be canceled with “-nc no”, if not
needed.</p></li>
<li><p>In the case of glimagesink, the resolution settings “-s wxh” may
not affect the (small) initial OpenGL mirror window size, but the window
can be expanded using the mouse or trackpad.</p></li>
</ul>
<h2
id="building-uxplay-on-microsoft-windows-using-msys2-with-the-mingw-64-compiler.">Building
@@ -825,22 +884,38 @@ href="https://www.msys2.org">https://www.msys2.org/</a>. Accept the
default installation location <code>C:\mysys64</code>.</p></li>
<li><p><a href="https://packages.msys2.org/package/">MSYS2 packages</a>
are installed with a variant of the “pacman” package manager used by
Arch Linux. Open a “MSYS2” terminal from the MSYS2 tab in the Windows
Start menu, and update the new MSYS2 installation with “pacman
-Syu”.</p>
<ul>
<li><em>NEW</em>: MSYS2 now recommends using the newer UCRT64 terminal
environment (which uses the newer Microsoft UCRT “Universal C RunTime”
library, included as part of the Windows OS since Windows 10) rather
than the MINGW64 terminal environment (which uses the older Microsoft
MSVCRT C library, which has “legacy” status, but is available on all
Windows systems). If you wish to use the legacy MSVCRT library, to
support older Windows versions, modify the instructions below as
follows:</li>
</ul>
<ol type="1">
<li>change the MSYS2 terminal type from UCRT64 to MINGW64;</li>
<li>modify mingw-w64-ucrt-x86_64-* package names to
mingw-w64-x86_64-* (just omit “-ucrt”);</li>
<li>replace <code>ucrt64</code> by <code>mingw64</code> in directory
names.</li>
</ol>
<p>Open a new MSYS2 UCRT64 terminal, and install the gcc compiler and
cmake:</p>
<p><code>pacman -S mingw-w64-ucrt-x86_64-cmake mingw-w64-ucrt-x86_64-gcc</code></p>
<p>We will simply build UxPlay from the command line in the MSYS2
environment (using “<code>ninja</code>” in place of “<code>make</code>
for the build system).</p></li>
<li><p>Download the latest UxPlay from github <strong>(to use
<code>git</code>, install it with <code>pacman -S git</code>, then
<code>git clone https://github.com/FDH2/UxPlay</code>”)</strong>, then
install UxPlay dependencies (openssl is already installed with
MSYS2):</p>
<pre><code>pacman -S mingw-w64-ucrt-x86_64-libplist mingw-w64-ucrt-x86_64-gstreamer mingw-w64-ucrt-x86_64-gst-plugins-base</code></pre>
<p>If you are trying a different Windows build system, MSVC versions of
GStreamer for Windows are available from the <a
href="https://gstreamer.freedesktop.org/download/">official GStreamer
@@ -857,19 +932,23 @@ build UxPlay with</p>
<li><p>Assuming no error in either of these, you will have built the
uxplay executable <strong>uxplay.exe</strong> in the current (“build”)
directory. The “sudo make install” and “sudo make uninstall” features
offered in the other builds are not available on Windows; instead, you
can install the uxplay.exe executable in
<code>C:/msys64/ucrt64/bin</code> (plus manpage and documentation in
<code>C:/msys64/ucrt64/share/...</code>) with</p>
<p><code>cmake --install . --prefix $HOME/../../ucrt64</code></p>
<p>You can later uninstall uxplay by returning to the build directory
and running</p>
<p><code>ninja uninstall</code></p>
<p>(This assumes that certain files in the build directory were not
deleted since building UxPlay).</p>
<p>To be able to view the manpage, you need to install the manpage
viewer with “<code>pacman -S man</code>”.</p></li>
</ol>
<p>To run <strong>uxplay.exe</strong> you need to install some gstreamer
plugin packages with
<code>pacman -S mingw-w64-ucrt-x86_64-gst-&lt;plugin&gt;</code>, where
the required ones have <code>&lt;plugin&gt;</code> given by</p>
<ol type="1">
<li><strong>libav</strong></li>
<li><strong>plugins-good</strong></li>
@@ -886,9 +965,9 @@ Settings-&gt;Update and Security-&gt;Windows Security-&gt;Firewall &amp;
network protection -&gt; allow an app through firewall</strong>. If your
virus protection flags uxplay.exe as “suspicious” (but without a true
malware signature) you may need to give it an exception.</p>
<p>Now test by running “<code>uxplay</code>” (in a MSYS2 UCRT64 terminal
window). If you need to specify the audiosink, there are two main
choices on Windows: the older DirectSound plugin
<code>-as directsoundsink</code>”, and the more modern Windows Audio
Session API (wasapi) plugin “<code>-as wasapisink</code>”, which
supports <a
@@ -904,25 +983,24 @@ If “<code>device</code>” is not specified, the default audio device is
used.</p>
<p>If you wish to specify the videosink using the
<code>-vs &lt;videosink&gt;</code> option, some choices for
<code>&lt;videosink&gt;</code> are <code>d3d12videosink</code>,
<code>d3d11videosink</code>, <code>d3dvideosink</code>,
<code>glimagesink</code>, <code>gtksink</code>,
<code>autovideosink</code>. If you do not specify the videosink, the
d3d11videosink will be used (users have reported segfaults of the newer
d3d12 videodecoder on certain older Nvidia cards when the image
resolution changes: d3d11 will be used by default until this is fixed).</p>
<ul>
<li>With Direct3D 11.0 or greater, various options can be set using
e.g. <code>-vs "d3d11videosink &lt;options&gt;"</code> (see the
gstreamer videosink documentation for these videosinks). For
convenience, if no <code>&lt;options&gt;</code> are set, the option to
toggle in and out of fullscreen mode with the Alt-Enter key combination
is added.</li>
</ul>
<p>The executable uxplay.exe can also be run without the MSYS2
environment, in the Windows Terminal, with
<code>C:\msys64\ucrt64\bin\uxplay</code>.</p>
<h1 id="usage">Usage</h1>
<p>Options:</p>
<ul>
@@ -933,6 +1011,9 @@ or <code>~/.config/uxplayrc</code>); lines beginning with
<code>#</code>” are treated as comments, and ignored. Command line
options supersede options in the startup file.</li>
</ul>
<p><strong>-rc <em>file</em></strong> can also be used to specify the
startup file location: this overrides <code>$UXPLAYRC</code>,
<code>~/.uxplayrc</code>, etc.</p>
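<p>A minimal startup file might look as follows (the option values here
are just illustrative examples; see “<code>man uxplay</code>” for the
full option list):</p>
<pre><code># ~/.uxplayrc: UxPlay options, one per line, without the initial "-"
vsync no
as pipewiresink</code></pre>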
<p><strong>-n server_name</strong> (Default: UxPlay);
server_name@<em>hostname</em> will be the name that appears offering AirPlay
services to your iPad, iPhone etc, where <em>hostname</em> is the name
@@ -953,10 +1034,14 @@ and some iPhones) can send h265 video if a resolution “-s wxh” with h
&gt; 1080 is requested. The “-h265” option changes the default
resolution (“-s” option) from 1920x1080 to 3840x2160, and leaves default
maximum framerate (“-fps” option) at 30fps.</p>
<p><strong>-hls [v]</strong> Activate HTTP Live Streaming support. With
this option YouTube videos can be streamed directly from YouTube servers
to UxPlay (without passing through the client) by clicking on the
AirPlay icon in the YouTube app. Optional [v] (allowed values 2 or 3,
default: 3) allows selection of the version of GStreamer’s “playbin”
video player to use for playing HLS video. <em>(Playbin v3 is the
recommended player, but if some videos fail to play, you can try with
version 2.)</em></p>
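<p>For example:</p>
<pre><code>uxplay -hls      # HLS support with the default playbin v3 player
uxplay -hls 2    # fall back to playbin v2 if some videos fail to play</code></pre>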
<p><strong>-pin [nnnn]</strong>: (since v1.67) use Apple-style
(one-time) “pin” authentication when a new client connects for the first
time: a four-digit pin code is displayed on the terminal, and the client
@@ -983,6 +1068,17 @@ key (base-64 format), Device ID, and Device name; commenting out (with
options -restrict, -block, -allow for more ways to control client
access). <em>(Add a line “reg” in the startup file if you wish to use
this feature.)</em></p>
<p><strong>-pw</strong> [<em>pwd</em>]. (since 1.72). As an alternative
to -pin, client access can be controlled with a password set when uxplay
starts (set it in the .uxplay startup file, where it is stored as
cleartext). All users must then know this password. This uses HTTP MD5
Digest authentication, which is now regarded as providing weak security,
but it is only used to validate the uxplay password, and no user
credentials are exposed. If <em>pwd</em> is <strong>not</strong>
specified, a random 4-digit pin code is displayed, and must be entered
on the client at <strong>each</strong> new connection. <em>Note: -pin
and -pw are alternatives: if both are specified at startup, the earlier
of these two options is discarded.</em></p>
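<p>Illustrative examples (the password shown is of course just a
placeholder):</p>
<pre><code>uxplay -pw mysecret   # clients must enter the password "mysecret"
uxplay -pw            # no password given: a random 4-digit pin is shown
                      # and must be entered at each new connection</code></pre>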
<p><strong>-vsync [x]</strong> (In Mirror mode:) this option
(<strong>now the default</strong>) uses timestamps to synchronize audio
with video on the server, with an optional audio delay in (decimal)
@@ -1030,6 +1126,9 @@ each time the length of the volume slider (or the number of steps above
mute, where 16 steps = full volume) is reduced by 50%, the perceived
volume is halved (a 10dB attenuation). (This is modified at low volumes,
to use the “untapered” volume if it is louder.)</p>
<p><strong>-vol <em>v</em></strong> Sets initial audio-streaming volume
(on client): range is [0:1], with 0.0 = mute, 1.0 = full volume
(<em>v</em> is a decimal number).</p>
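<p>For example:</p>
<pre><code>uxplay -vol 0.5   # initial volume midway between mute (0.0) and full (1.0)
uxplay -vol 0.0   # start muted</code></pre>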
<p><strong>-s wxh</strong> e.g. -s 1920x1080 (= “1080p”), the default
width and height resolutions in pixels for h264 video. (The default
becomes 3840x2160 (= “4K”) when the -h265 option is used.) This is just
@@ -1049,8 +1148,8 @@ an empty boundary frame of unused pixels (which would be lost in a
full-screen display that overscans, and is not displayed by gstreamer).
Recommendation: <strong>don’t use this option</strong> unless there is
some special reason to use it.</p>
<p><strong>-fs</strong> uses fullscreen mode, but currently only works
with X11, Wayland, VAAPI, kms and D3D11 (Windows).</p>
<p><strong>-p</strong> allows you to select the network ports used by
UxPlay (these need to be opened if the server is behind a firewall). By
itself, -p sets “legacy” ports TCP 7100, 7000, 7001, UDP 6000, 6001,
@@ -1144,6 +1243,9 @@ client. Values in the range [0.0, 10.0] seconds are allowed, and will be
converted to a whole number of microseconds. Default is 0.25 sec (250000
usec). <em>(However, the client appears to ignore this reported latency,
so this option seems non-functional.)</em></p>
<p><strong>-ca</strong> (without specifying a filename) now displays
“cover art” that accompanies Apple Music when played in “Audio-only”
(ALAC) mode.</p>
<p><strong>-ca <em>filename</em></strong> provides a file (where
<em>filename</em> can include a full path) used for output of “cover
art” (from Apple Music, <em>etc.</em>,) in audio-only ALAC mode. This
@@ -1158,17 +1260,22 @@ then run the image viewer in the foreground. Example, using
in which uxplay was put into the background). To quit, use
<code>ctrl-C fg ctrl-C</code> to terminate the image viewer, bring
<code>uxplay</code> into the foreground, and terminate it too.</p>
<p><strong>-md <em>filename</em></strong> Like the -ca option, but
exports audio metadata text (Artist, Title, Genre, etc.) to file for
possible display by a process that watches the file for changes.
Previous text is overwritten as new metadata is received, and the file
is deleted when uxplay terminates.</p>
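<p>A simple way to watch the metadata file for changes (this sketch
assumes inotify-tools is installed, and that uxplay was started with
“-md /tmp/uxplay-metadata”):</p>
<pre><code>while inotifywait -q -e modify /tmp/uxplay-metadata; do
    cat /tmp/uxplay-metadata
done</code></pre>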
<p><strong>-reset n</strong> sets a limit of <em>n</em> consecutive
failures of the client to send feedback requests (these “heartbeat
signals” are sent by the client once per second to ask for a response
showing that the server is still online). After <em>n</em> missing
signals, the client will be presumed to be offline, and the connection
will be reset to allow a new connection. The default value of <em>n</em>
is 15 (about 15 seconds); the value <em>n</em> = 0 means “no limit”.</p>
<p><strong>-nofreeze</strong> closes the video window after a reset due
to ntp timeout (default is to leave window open to allow a smoother
reconnection to the same client). This option may be useful in fullscreen
mode.</p>
to client going offline (default is to leave window open to allow a
smoother reconnection to the same client). This option may be useful in
fullscreen mode.</p>
<p><strong>-nc</strong> maintains previous UxPlay &lt; 1.45 behavior
that does <strong>not close</strong> the video window when the
client sends the “Stop Mirroring” signal. <em>This option is currently
@@ -1267,9 +1374,10 @@ to a file to <em>n</em> or less. To change the name <em>audiodump</em>,
use -admp [n] <em>filename</em>. <em>Note that (unlike dumped video) the
dumped audio is currently only useful for debugging, as it is not
containerized to make it playable with standard audio players.</em></p>
<p><strong>-d</strong> Enable debug output. Note: this does not show
GStreamer error or debug messages. To see GStreamer error and warning
messages, set the environment variable GST_DEBUG with “export
<p><strong>-d [n]</strong> Enable debug output; optional argument n=1
suppresses audio/video packet data in debug output. Note: this does not
show GStreamer error or debug messages. To see GStreamer error and
warning messages, set the environment variable GST_DEBUG with “export
GST_DEBUG=2” before running uxplay. To see GStreamer information
messages, set GST_DEBUG=4; for DEBUG messages, GST_DEBUG=5; increase
this to see even more of the GStreamer inner workings.</p>
@@ -1602,6 +1710,22 @@ an AppleTV6,2 with sourceVersion 380.20.1 (an AppleTV 4K 1st gen,
introduced 2017, running tvOS 12.2.1), so it does not seem to matter
what version UxPlay claims to be.</p>
<h1 id="changelog">Changelog</h1>
<p>xxxx 2025-07-07 Render Audio cover-art inside UxPlay with -ca option
(no file specified).</p>
<p>1.72.2 2025-07-07 Fix bug (typo) in DNS_SD advertisement introduced
with -pw option. Update llhttp to v 9.3.0</p>
<p>1.72.1 2025-06-06 minor update: fix regression in -reg option; add
option -rc &lt;rcfile&gt; to specify initialization file; add “-nc no” to
unset “-nc” option (for macOS users, where -nc is default); add
user-installable systemd script for running UxPlay as an
always-available “rootless daemon”</p>
<p>1.72 2025-05-07. Improved HLS Live Streaming (YouTube) support,
including “scrub”. Add requested options -md &lt;filename&gt; to output
audio metadata text to a file for possible display (complements -ca
option), and -vol &lt;v&gt; option to set initial audio-streaming volume. Add
support for password-based user access control with HTTP digest Authentication
(-pw [pwd]). If no pwd is set, a random pin is displayed for entry at
each new connection.</p>
<p>1.71 2024-12-13 Add support for HTTP Live Streaming (HLS), initially
only for YouTube movies. Fix issue with NTP timeout on Windows.</p>
<p>1.70 2024-10-04 Add support for 4K (h265) video (resolution 3840 x
README.md
@@ -1,14 +1,35 @@
# UxPlay 1.71: AirPlay-Mirror and AirPlay-Audio server for Linux, macOS, and Unix (now also runs on Windows).
# UxPlay 1.72: AirPlay-Mirror and AirPlay-Audio server for Linux, macOS, and Unix (also runs on Windows).
### **Now developed at the GitHub site <https://github.com/FDH2/UxPlay> (where ALL user issues should be posted, and latest versions can be found).**
- ***NEW in v1.71**: Support for (YouTube) HLS (HTTP Live Streaming)
video with the new "-hls" option.* Click on the airplay icon in the
YouTube app to stream video. (You may need to wait until
advertisements have finished or been skipped before clicking the
YouTube airplay icon.) **Please report any issues with this new
feature of UxPlay**.
- **NEW on github**: option -ca (with no filename given) will now render
Apple Music cover art (in audio-only mode) inside
UxPlay. (-ca `<filename>` will continue to export cover art for
display by an external viewer).
- **NEW in v1.72**: Improved support for (YouTube) HLS (HTTP Live Streaming)
video with the "-hls" option (introduced in 1.71). **Only streaming from the YouTube iOS app
(in \"m3u8\" protocol) is currently supported** (streaming using the AirPlay icon in a browser window
is **not** yet supported). Click on the airplay icon in the
YouTube app to stream video.
**Please report any issues with this new feature of UxPlay**.
_The default video player for HLS is
GStreamer playbin v3: use "-hls 2" to revert to playbin v2 if
some videos fail to play_.
* user-requested features: added support for setting a password (as an alternative to on-screen
pin codes) to control client access (-pw option, see "man uxplay" or this README for details); added support for
setting initial client audio-streaming volume (-vol option), and output of audio-mode
metadata to file (for display by some external process, -md option).
**ISSUES** ***(Please help to solve if you have expertise)***
* in HLS video streaming from the YouTube app (-hls option), rendered using GStreamer's media player "playbin3" (or playbin2, with option -hls 2),
we don't understand how to correctly deal with "interstitials" (= 15 sec commercials) when "skip" is pressed on the client.
(HLS is handled by handlers in lib/http_handlers.h). (Should response to HTTP requests POST /action (playlistRemove) and POST
/Stop be modified? _Wireshark data from HLS on an AppleTV model 3 with UN-upgraded original OS (unencrypted communications) could be useful!_)
## Highlights:
- GPLv3, open source.
@@ -44,7 +65,8 @@ status](https://repology.org/badge/vertical-allrepos/uxplay.svg)](https://repolo
- Install uxplay on Debian-based Linux systems with
"`sudo apt install uxplay`"; on FreeBSD with
"`sudo pkg install uxplay`". Also available on Arch-based systems
"`sudo pkg install uxplay`"; on OpenBSD with
"`doas pkg_add uxplay`". Also available on Arch-based systems
through AUR. Since v. 1.66, uxplay is now also packaged in RPM
format by Fedora 38 ("`sudo dnf install uxplay`").
@@ -55,7 +77,12 @@ status](https://repology.org/badge/vertical-allrepos/uxplay.svg)](https://repolo
See the section on using this specfile for [building an installable
RPM package](#building-an-installable-rpm-package).
After installation:
- If your distribution does not supply UxPlay, or you want the latest version,
it is very easy to build it yourself: see the very
[detailed instructions for building UxPlay from source](#building-uxplay-from-source)
later in this document.
## After installation:
- (On Linux and \*BSD): if a firewall is active on the server hosting
UxPlay, make sure the default network port (UDP 5353) for
@@ -71,16 +98,40 @@ After installation:
- For Audio-only mode (Apple Music, etc.) best quality is obtained
with the option "uxplay -async", but there is then a 2 second
latency imposed by iOS.
latency imposed by iOS. Use option "uxplay -ca" to display any "Cover Art" that
accompanies the audio.
- If you are using UxPlay just to mirror the client's screen (without
showing videos that need audio synchronized with video), it is best to
use the option "uxplay -vsync no".
- Add any UxPlay options you want to use as defaults to a startup file
`~/.uxplayrc` (see "`man uxplay`" or "`uxplay -h`" for format and
other possible locations). In particular, if your system uses
other possible locations; the location can also be set with "uxplay -rc _location_").
In particular, if your system uses
PipeWire audio or Wayland video systems, you may wish to add "as
pipewiresink" or "vs waylandsink" as defaults to the file. *(Output
from terminal commands "ps waux \| grep pulse" or "pactl info" will
contain "pipewire" if your Linux/BSD system uses it).*
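As an illustration, a hypothetical `~/.uxplayrc` startup file for a PipeWire/Wayland system might contain (one option per line, written without the leading "-"; the values below are examples, not defaults):

```
as pipewiresink
vs waylandsink
vsync no
```

Command-line options still supersede anything set in this file.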
- For Linux systems using systemd, there is a **systemd** service file **uxplay.service**
found in the UxPlay top directory of the distribution, and also installed
in `<DOCDIR>/uxplay/systemd/` (where DOCDIR is usually ``/usr/local/share/doc``), that allows users to start
their own instance of UxPlay as a rootless daemon: it should either be added to the
directory /etc/systemd/user, or the user can just create their own
systemd directory `~/.config/systemd/user/` and then copy uxplay.service into it. To save
uxplay terminal output to a file ~/uxplay.log, uncomment the StandardOutput entry in
uxplay.service. Then
`systemctl --user [start/stop/enable/disable/status] uxplay`
can be used to control the daemon. If it is enabled, the daemon will start
at the user's first login and stop when they no longer have any open sessions. See
https://www.baeldung.com/linux/systemd-create-user-services for more about
systemd user services. If more than one user might simultaneously run uxplay this way, they should
specify distinct -p and -m options (ports and deviceID) in their startup files.
**Note: it is NOT recommended to run UxPlay as a root service.**
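The copy-and-enable steps above can be sketched as a small shell function. This is a sketch, not part of UxPlay itself: the default source path assumes the documented `<DOCDIR>` location (`/usr/local/share/doc`), so pass a different path as the first argument if your distribution installs elsewhere.

```shell
# Sketch: install the packaged uxplay.service as a per-user ("rootless")
# systemd unit. The default source path is an assumption based on the
# documented install location; override it with $1 if needed.
install_uxplay_unit() {
    src=${1:-/usr/local/share/doc/uxplay/systemd/uxplay.service}
    unit_dir="${XDG_CONFIG_HOME:-$HOME/.config}/systemd/user"
    # create the per-user unit directory and copy the service file into it
    mkdir -p "$unit_dir" && cp "$src" "$unit_dir/" &&
        echo "installed $unit_dir/uxplay.service"
}
# Afterwards, in a login session:
#   systemctl --user daemon-reload
#   systemctl --user enable --now uxplay
```

The `systemctl` commands are left as comments because they require a running user session; run them manually after the copy succeeds.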
- On Raspberry Pi: models using hardware h264 video decoding by the
Broadcom GPU (models 4B and earlier) may require the uxplay option -bt709.
If you use Ubuntu 22.10 or earlier, GStreamer must
@@ -91,9 +142,10 @@ After installation:
decoding is used seems to have reappeared
starting with GStreamer-1.22.
To (easily) compile the latest UxPlay from source, see the section
[Getting UxPlay](#getting-uxplay).
- If UxPlay is used in a public space, there are security options for requiring an AppleTV-style
one-time pin (displayed on the terminal) to be entered, or a password, and for barring/permitting
client access by their device ID. See options -pin, -reg, -pw, -restrict, -allow, -block.
# Detailed description of UxPlay
This project is a GPLv3 open source unix AirPlay2 Mirror server for
@@ -235,7 +287,7 @@ clause incompatible with the GPL unless OpenSSL can be regarded as a
OpenSSL as a "System Library", but some (e.g. Debian) do not: in this
case, the issue is solved by linking with OpenSSL-3.0.0 or later.
# Getting UxPlay
# Building UxPlay from source
Either download and unzip
[UxPlay-master.zip](https://github.com/FDH2/UxPlay/archive/refs/heads/master.zip),
@@ -369,6 +421,11 @@ package](#building-an-installable-rpm-package).
avahi-libdns or mDNSResponder must also be installed to provide the
dns_sd library. OpenSSL is already installed as a System Library.
- **OpenBSD:** (doas pkg_add) libplist gstreamer1-plugins-base.
avahi-libs must also be installed to provide the dns_sd library;
(avahi-main must also be installed).
OpenSSL is already installed as a System Library.
#### Building an installable RPM package
First-time RPM builders should first install the rpm-build and
@@ -450,6 +507,9 @@ repositories for those distributions.
gstreamer1-plugins-\* (\* = core, good, bad, x, gtk, gl, vulkan,
pulse, v4l2, ...), (+ gstreamer1-vaapi for Intel/AMD graphics).
- **OpenBSD:** Install gstreamer1-libav, gstreamer-plugins-\*
(\* = core, bad, base, good).
### Starting and running UxPlay
Since UxPlay-1.64, UxPlay can be started with options read from a
@@ -486,7 +546,11 @@ below for help with this or other problems.
[Usage](#usage) for details, if you wish to use it. *Some clients
with MDM (Mobile Device Management, often present on employer-owned
devices) are required to use pin-authentication: UxPlay will provide
this even when running without the pin option.*
this even when running without the pin option.* Password authentication
(-pw _pwd_) is also offered as an alternative solution to pin codes:
users need to know the password _pwd_ and enter it on their iOS/macOS device
to access UxPlay, when prompted (if _pwd_ is not set, a displayed random
pin code must be entered at **each** new connection.)
- By default, UxPlay is locked to its current client until that client
drops the connection; since UxPlay-1.58, the option `-nohold`
@@ -548,12 +612,14 @@ value advances it.)
-FPSdata.) When using this, you should use the default
timestamp-based synchronization option `-vsync`.
- Since UxPlay-1.54, you can display the accompanying "Cover Art" from
sources like Apple Music in Audio-Only (ALAC) mode: run
- You can now display (inside UxPlay) the accompanying "Cover Art" from
sources like Apple Music in Audio-Only (ALAC) mode with the option
`uxplay -ca`. _The older method of exporting cover art to an external
viewer remains available: run
"`uxplay -ca <name> &`" in the background, then run an image viewer
with an autoreload feature: an example is "feh": run
"`feh -R 1 <name>`" in the foreground; terminate feh and then Uxplay
with "`ctrl-C fg ctrl-C`".
with "`ctrl-C fg ctrl-C`"_.
By default, GStreamer uses an algorithm to search for the best
"videosink" (GStreamer's term for a graphics driver to display images)
@@ -672,15 +738,14 @@ choice `<videosink>` = `glimagesink` is sometimes useful. With the
Wayland video compositor, use `<videosink>` = `waylandsink`. With
framebuffer video, use `<videosink>` = `kmssink`.
- Tip: to start UxPlay on a remote host (such as a Raspberry Pi) using
* Tip: to start UxPlay on a remote host (such as a Raspberry Pi) using
ssh:
```{=html}
<!-- -->
```
ssh user@remote_host
ssh user@remote_host
export DISPLAY=:0
nohup uxplay [options] > FILE &
```
Sound and video will play on the remote host; "nohup" will keep uxplay
running if the ssh session is closed. Terminal output is saved to FILE
@@ -688,9 +753,10 @@ running if the ssh session is closed. Terminal output is saved to FILE
## Building UxPlay on macOS: **(Intel X86_64 and "Apple Silicon" M1/M2 Macs)**
*Note: A native AirPlay Server feature is included in macOS 12 Monterey,
but is restricted to recent hardware. UxPlay can run on older macOS
systems that will not be able to run Monterey, or can run Monterey but
*Note: A native AirPlay Server feature has been included in macOS since macOS 12 Monterey,
but is restricted to recent hardware. As well as running on the latest macOS,
UxPlay can run on older macOS
systems that cannot run Monterey, or can run Monterey but
not AirPlay.*
These instructions for macOS assume that the Xcode command-line
@@ -752,19 +818,12 @@ complete GStreamer, but seems to have everything needed for UxPlay).
installations in non-standard locations indicated by the environment
variable `$HOMEBREW_PREFIX`.**
**Using GStreamer installed from MacPorts**: this is **not**
recommended, as currently the MacPorts GStreamer is old (v1.16.2),
unmaintained, and built to use X11:
- Instead [build gstreamer
yourself](https://github.com/FDH2/UxPlay/wiki/Building-GStreamer-from-Source-on-macOS-with-MacPorts)
if you use MacPorts and do not want to use the "Official" Gstreamer
binaries.
*(If you really wish to use the MacPorts GStreamer-1.16.2, install
pkgconf ("sudo port install pkgconf"), then "sudo port install
**Using GStreamer installed from MacPorts**: MacPorts is now providing
recent GStreamer releases: install
pkgconf ("sudo port install pkgconf"), then "sudo port install gstreamer1
gstreamer1-gst-plugins-base gstreamer1-gst-plugins-good
gstreamer1-gst-plugins-bad gstreamer1-gst-libav". For X11 support on
gstreamer1-gst-plugins-bad gstreamer1-gst-libav".
(The following may no longer be relevant: *For X11 support on
macOS, compile UxPlay using a special cmake option `-DUSE_X11=ON`, and
run it from an XQuartz terminal with -vs ximagesink; older non-retina
macs require a lower resolution when using X11: `uxplay -s 800x600`.)*
@@ -779,30 +838,26 @@ downloads, "UxPlay" for "git clone" downloads) and build/install with
default (since UxPlay 1.64) use of timestamps for video
synchonization, many video frames are being dropped (only on macOS),
perhaps due to another error (about videometa) that shows up in the
GStreamer warnings. **Recommendation: use the new UxPlay "no
GStreamer warnings. **Recommendation: use the UxPlay "no
timestamp" option "`-vsync no`"** (you can add a line "vsync no" in
the uxplayrc configuration file).
- On macOS with this installation of GStreamer, the only videosinks
available seem to be glimagesink (default choice made by
autovideosink) and osxvideosink. The window title does not show the
Airplay server name, but the window is visible to screen-sharing
apps (e.g., Zoom). The only available audiosink seems to be
available are glimagesink (default choice made by
autovideosink) and osxvideosink.
The window title does not show the
Airplay server name, but the window can be shared on Zoom.
Because of issues with glimagesink, you may find
osxvideosink works better. The only available audiosink is
osxaudiosink.
- The option -nc is always used, whether or not it is selected. This
is a workaround for a problem with GStreamer videosinks on macOS: if
the GStreamer pipeline is destroyed while the mirror window is still
open, a segfault occurs.
- The option -nc is currently used by default on macOS. This
is a workaround for window-closing problems with GStreamer videosinks on macOS.
This option can be canceled with "-nc no", if not needed.
- In the case of glimagesink, the resolution settings "-s wxh" do not
- In the case of glimagesink, the resolution settings "-s wxh" may not
affect the (small) initial OpenGL mirror window size, but the window
can be expanded using the mouse or trackpad. In contrast, a window
created with "-vs osxvideosink" is initially big, but has the wrong
aspect ratio (stretched image); in this case the aspect ratio
changes when the window width is changed by dragging its side; the
option `-vs "osxvideosink force-aspect-ratio=true"` can be used to
make the window have the correct aspect ratio when it first opens.
can be expanded using the mouse or trackpad.
## Building UxPlay on Microsoft Windows, using MSYS2 with the MinGW-64 compiler.
@@ -825,16 +880,25 @@ downloads, "UxPlay" for "git clone" downloads) and build/install with
3. [MSYS2 packages](https://packages.msys2.org/package/) are installed
with a variant of the "pacman" package manager used by Arch Linux.
Open a "MSYS2 MINGW64" terminal from the MSYS2 tab in the Windows
Open a "MSYS2" terminal from the MSYS2 tab in the Windows
Start menu, and update the new MSYS2 installation with "pacman
-Syu". Then install the **MinGW-64** compiler and **cmake**
-Syu".
pacman -S mingw-w64-x86_64-cmake mingw-w64-x86_64-gcc
* _NEW: MSYS2 now recommends using the newer UCRT64 terminal environment (which uses the newer Microsoft
UCRT "Universal C RunTime Library", included as part of the Windows OS since Windows 10)
rather than the MINGW64 terminal environment
(which uses the older Microsoft MSVCRT C library, which has "legacy" status, but is available on all Windows systems).
If you wish to use the legacy MSVCRT library, to support older Windows versions, modify the instructions below as follows:
(1) change the MSYS2 terminal type from UCRT64 to MINGW64; (2) modify mingw-w64-ucrt-x86_64-* package names to mingw-w64-x86_64-* (just omit "-ucrt");
(3) replace `ucrt64` by `mingw64` in directory names._
The compiler with all required dependencies will be installed in the
msys64 directory, with default path `C:/msys64/mingw64`. Here we
will simply build UxPlay from the command line in the MSYS2
environment (this uses "`ninja`" in place of "`make`" for the build
Open a new MSYS2 UCRT64 terminal, and install the gcc compiler and cmake:
`pacman -S mingw-w64-ucrt-x86_64-cmake mingw-w64-ucrt-x86_64-gcc`
We will simply build UxPlay from the command line in the MSYS2
environment (using "`ninja`" in place of "`make`" for the build
system).
4. Download the latest UxPlay from github **(to use `git`, install it
@@ -842,7 +906,7 @@ downloads, "UxPlay" for "git clone" downloads) and build/install with
"`git clone https://github.com/FDH2/UxPlay`")**, then install UxPlay
dependencies (openssl is already installed with MSYS2):
`pacman -S mingw-w64-x86_64-libplist mingw-w64-x86_64-gstreamer mingw-w64-x86_64-gst-plugins-base`
`pacman -S mingw-w64-ucrt-x86_64-libplist mingw-w64-ucrt-x86_64-gstreamer mingw-w64-ucrt-x86_64-gst-plugins-base`
If you are trying a different Windows build system, MSVC versions of
GStreamer for Windows are available from the [official GStreamer
@@ -862,18 +926,23 @@ downloads, "UxPlay" for "git clone" downloads) and build/install with
6. Assuming no error in either of these, you will have built the uxplay
executable **uxplay.exe** in the current ("build") directory. The
"sudo make install" and "sudo make uninstall" features offered in
the other builds are not available on Windows; instead, the MSYS2
environment has `/mingw64/...` available, and you can install the
uxplay.exe executable in `C:/msys64/mingw64/bin` (plus manpage and
documentation in `C:/msys64/mingw64/share/...`) with
the other builds are not available on Windows; instead, you can install the
uxplay.exe executable in `C:/msys64/ucrt64/bin` (plus manpage and
documentation in `C:/msys64/ucrt64/share/...`) with
`cmake --install . --prefix /mingw64`
`cmake --install . --prefix $HOME/../../ucrt64`
You can later uninstall uxplay by returning to the build directory and running
`ninja uninstall`
(This assumes that certain files in the build directory were not deleted since building UxPlay).
To be able to view the manpage, you need to install the manpage
viewer with "`pacman -S man`".
To run **uxplay.exe** you need to install some gstreamer plugin packages
with `pacman -S mingw-w64-x86_64-gst-<plugin>`, where the required ones
with `pacman -S mingw-w64-ucrt-x86_64-gst-<plugin>`, where the required ones
have `<plugin>` given by
1. **libav**
@@ -892,7 +961,7 @@ app through firewall**. If your virus protection flags uxplay.exe as
"suspicious" (but without a true malware signature) you may need to give
it an exception.
Now test by running "`uxplay`" (in a MSYS2 terminal window). If you need
Now test by running "`uxplay`" (in a MSYS2 UCRT64 terminal window). If you need
to specify the audiosink, there are two main choices on Windows: the
older DirectSound plugin "`-as directsoundsink`", and the more modern
Windows Audio Session API (wasapi) plugin "`-as wasapisink`", which
@@ -908,23 +977,20 @@ like `\{0.0.0.00000000\}.\{98e35b2b-8eba-412e-b840-fd2c2492cf44\}`. If
"`device`" is not specified, the default audio device is used.
If you wish to specify the videosink using the `-vs <videosink>` option,
some choices for `<videosink>` are `d3d11videosink`, `d3dvideosink`,
`glimagesink`, `gtksink`.
some choices for `<videosink>` are `d3d12videosink`, `d3d11videosink`, `d3dvideosink`,
`glimagesink`, `gtksink`, `autovideosink`. If you do not specify the videosink,
`d3d11videosink` will be used (users have reported segfaults of the newer d3d12 video decoder
on certain older Nvidia cards when the image resolution changes:
d3d11 will be used by default until this is fixed).
- With Direct3D 11.0 or greater, you can either always be in
fullscreen mode using option
`-vs "d3d11videosink fullscreen-toggle-mode=property fullscreen=true"`,
or get the ability to toggle into and out of fullscreen mode using
the Alt-Enter key combination with option
`-vs "d3d11videosink fullscreen-toggle-mode=alt-enter"`. For
convenience, these options will be added if just
`-vs d3d11videosink` with or without the fullscreen option "-fs" is
used. *(Windows users may wish to add "`vs d3d11videosink`" (no
initial "`-`") to the UxPlay startup options file; see "man uxplay"
or "uxplay -h".)*
- With Direct3D 11.0 or greater, various options can be set
using e.g. `-vs "d3d11videosink <options>"` (see the gstreamer videosink
documentation for these videosinks).
For convenience, if no `<options>` are set, the option to
toggle in and out of fullscreen mode with the Alt-Enter key combination is added.
The executable uxplay.exe can also be run without the MSYS2 environment,
in the Windows Terminal, with `C:\msys64\mingw64\bin\uxplay`.
in the Windows Terminal, with `C:\msys64\ucrt64\bin\uxplay`.
# Usage
@@ -937,6 +1003,9 @@ Options:
comments, and ignored. Command line options supersede options in the
startup file.
**-rc _file_** can also be used to specify the startup file location: this
overrides `$UXPLAYRC`, `~/.uxplayrc`, etc.
**-n server_name** (Default: UxPlay); server_name@\_hostname\_ will be
the name that appears offering AirPlay services to your iPad, iPhone
etc, where *hostname* is the name of the server running uxplay. This
@@ -958,10 +1027,14 @@ The "-h265" option changes the default resolution ("-s" option) from
1920x1080 to 3840x2160, and leaves default maximum framerate ("-fps"
option) at 30fps.
**-hls** Activate HTTP Live Streaming support. With this option YouTube
**-hls \[v\]** Activate HTTP Live Streaming support. With this option YouTube
videos can be streamed directly from YouTube servers to UxPlay (without
passing through the client) by clicking on the AirPlay icon in the
YouTube app.
YouTube app. Optional \[v\] (allowed values 2 or 3, default: 3)
allows selection of the version of GStreamer's
\"playbin\" video player to use for playing HLS video. _(Playbin v3
is the recommended player, but if some videos fail to play, you can try
with version 2.)_
**-pin \[nnnn\]**: (since v1.67) use Apple-style (one-time) "pin"
authentication when a new client connects for the first time: a
@@ -990,6 +1063,17 @@ deregisters the corresponding client (see options -restrict, -block,
-allow for more ways to control client access). *(Add a line "reg" in
the startup file if you wish to use this feature.)*
**-pw** [*pwd*]. (since 1.72). As an alternative to -pin, client access
can be controlled with a password set when uxplay starts (set it in
the .uxplayrc startup file, where it is stored as cleartext). All users must
then know this password. This uses HTTP md5 Digest authentication,
which is now regarded as providing weak security, but it is only used to
validate the uxplay password, and no user credentials are exposed.
If *pwd* is **not** specified, a random 4-digit pin code is displayed, and must
be entered on the client at **each** new connection.
_Note: -pin and -pw are alternatives: if both are specified at startup, the
earlier of these two options is discarded._
**-vsync \[x\]** (In Mirror mode:) this option (**now the default**)
uses timestamps to synchronize audio with video on the server, with an
optional audio delay in (decimal) milliseconds (*x* = "20.5" means
@@ -1038,6 +1122,9 @@ where 16 steps = full volume) is reduced by 50%, the perceived volume is
halved (a 10dB attenuation). (This is modified at low volumes, to use
the "untapered" volume if it is louder.)
**-vol *v*** Sets initial audio-streaming volume (on client): range is [0:1],
with 0.0 = mute, 1.0 = full volume (*v* is a decimal number).
**-s wxh** e.g. -s 1920x1080 (= "1080p"), the default width and height
resolutions in pixels for h264 video. (The default becomes 3840x2160 (=
"4K") when the -h265 option is used.) This is just a request made to the
@@ -1060,8 +1147,8 @@ display that overscans, and is not displayed by gstreamer).
Recommendation: **don't use this option** unless there is some special
reason to use it.
**-fs** uses fullscreen mode, but only works with X11, Wayland, VAAPI,
and D3D11 (Windows).
**-fs** uses fullscreen mode, but currently only works with X11, Wayland, VAAPI,
kms and D3D11 (Windows).
**-p** allows you to select the network ports used by UxPlay (these need
to be opened if the server is behind a firewall). By itself, -p sets
@@ -1166,6 +1253,9 @@ number of microseconds. Default is 0.25 sec (250000 usec). *(However,
the client appears to ignore this reported latency, so this option seems
non-functional.)*
**-ca** (without specifying a filename) now displays "cover art"
that accompanies Apple Music when played in "Audio-only" (ALAC) mode.
**-ca *filename*** provides a file (where *filename* can include a full
path) used for output of "cover art" (from Apple Music, *etc.*,) in
audio-only ALAC mode. This file is overwritten with the latest cover art
@@ -1179,14 +1269,19 @@ uxplay was put into the background). To quit, use `ctrl-C fg ctrl-C` to
terminate the image viewer, bring `uxplay` into the foreground, and
terminate it too.
**-reset n** sets a limit of *n* consecutive timeout failures of the
client to respond to ntp requests from the server (these are sent every
3 seconds to check if the client is still present, and synchronize with
it). After *n* failures, the client will be presumed to be offline, and
the connection will be reset to allow a new connection. The default
value of *n* is 5; the value *n* = 0 means "no limit" on timeouts.
**-md *filename*** Like the -ca option, but exports audio metadata text
(Artist, Title, Genre, etc.) to file for possible display by a process that watches
the file for changes. Previous text is overwritten as new metadata is received,
and the file is deleted when uxplay terminates.
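A minimal sketch of such a watching process, written as a POSIX-shell polling loop (the metadata file path is an example; use whatever filename was given to -md):

```shell
# Sketch: print a line whenever the file written by "uxplay -md <file>"
# changes. Polls once per second and exits when the file is deleted
# (which uxplay does when it terminates).
watch_metadata() {
    file=$1
    last=""
    while [ -f "$file" ]; do
        current=$(cat "$file" 2>/dev/null)
        if [ -n "$current" ] && [ "$current" != "$last" ]; then
            printf 'Now playing: %s\n' "$current"
            last=$current
        fi
        sleep 1
    done
}
# example usage (path is illustrative):
#   uxplay -md /tmp/uxplay.metadata &
#   watch_metadata /tmp/uxplay.metadata
```

A dedicated file-watching tool such as inotifywait could replace the polling loop on Linux; the loop above is kept portable.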
**-nofreeze** closes the video window after a reset due to ntp timeout
**-reset n** sets a limit of *n* consecutive failures of the
client to send feedback requests (these "heartbeat signals" are sent by the client
once per second to ask for a response showing that the server is still online).
After *n* missing signals, the client will be presumed to be offline, and
the connection will be reset to allow a new connection. The default
value of *n* is 15 (about 15 seconds); the value *n* = 0 means "no limit".
**-nofreeze** closes the video window after a reset due to client going offline
(default is to leave window open to allow a smoother reconnection to the
same client). This option may be useful in fullscreen mode.
@@ -1297,7 +1392,9 @@ that (unlike dumped video) the dumped audio is currently only useful for
debugging, as it is not containerized to make it playable with standard
audio players.*
**-d** Enable debug output. Note: this does not show GStreamer error or
**-d \[n\]** Enable debug output; optional argument n=1 suppresses audio/video
packet data in debug output.
Note: this does not show GStreamer error or
debug messages. To see GStreamer error and warning messages, set the
environment variable GST_DEBUG with "export GST_DEBUG=2" before running
uxplay. To see GStreamer information messages, set GST_DEBUG=4; for
@@ -1645,6 +1742,24 @@ introduced 2017, running tvOS 12.2.1), so it does not seem to matter
what version UxPlay claims to be.
# Changelog
xxxx 2025-07-07 Render Audio cover-art inside UxPlay with -ca option (no file
specified).
1.72.2 2025-07-07 Fix bug (typo) in DNS_SD advertisement introduced with -pw
option. Update llhttp to v 9.3.0
1.72.1 2025-06-06 minor update: fix regression in -reg option; add option
-rc <rcfile> to specify initialization file; add "-nc no" to unset "-nc"
option (for macOS users, where -nc is default); add user-installable
systemd script for running UxPlay as an always-available "rootless daemon"
1.72 2025-05-07. Improved HLS Live Streaming (YouTube) support, including
"scrub".
Add requested options -md \<filename\> to output audio
metadata text to a file for possible display (complements -ca option),
and -vol <v> option to set initial audio-streaming volume. Add support for
password-based user access control with HTTP digest Authentication (-pw [pwd]).
If no pwd is set, a random pin is displayed for entry at each new connection.
1.71 2024-12-13 Add support for HTTP Live Streaming (HLS), initially
only for YouTube movies. Fix issue with NTP timeout on Windows.
@@ -1,13 +1,41 @@
# UxPlay 1.71: AirPlay-Mirror and AirPlay-Audio server for Linux, macOS, and Unix (now also runs on Windows).
# UxPlay 1.72: AirPlay-Mirror and AirPlay-Audio server for Linux, macOS, and Unix (also runs on Windows).
### **Now developed at the GitHub site <https://github.com/FDH2/UxPlay> (where ALL user issues should be posted, and latest versions can be found).**
- **NEW on github**: option -ca (with no filename given) will now
render Apple Music cover art (in audio-only mode) inside UxPlay.
(-ca `<filename>` will continue to export cover art for display by
an external viewer).
- ***NEW in v1.72**: Improved support for (YouTube) HLS (HTTP Live
  Streaming) video with the "-hls" option (introduced in 1.71).*
  **Only streaming from the YouTube iOS app (in \"m3u8\" protocol) is
  currently supported**: streaming using the AirPlay icon in a
  browser window is **not** yet supported. Click on the airplay icon
  in the YouTube app to stream video. **Please report any issues with
  this new feature of UxPlay**.
  *The default video player for HLS is GStreamer playbin v3: use "-hls
  2" to revert to playbin v2 if some videos fail to play*.
- user-requested features: added support for setting a password
  (as an alternative to on-screen pin codes) to control client
  access (-pw option, see "man uxplay" or this README for details);
  added support for setting initial client audio-streaming volume
  (-vol option), and output of audio-mode metadata to file (for
  display by some external process, -md option).
**ISSUES** ***(Please help to solve if you have expertise)***
- in HLS video streaming from the YouTube app (-hls option),
  rendered using GStreamer's media player "playbin3" (or playbin2,
  with option -hls 2), we don't understand how to correctly deal
  with "interstitials" (= 15 sec commercials) when "skip" is
  pressed on the client. (HLS is handled by handlers in
  lib/http_handlers.h.) Should the responses to the HTTP requests POST
  /action (playlistRemove) and POST /Stop be modified? *Wireshark
  data from HLS on an AppleTV model 3 with an un-upgraded original OS
  (unencrypted communications) could be useful!*
## Highlights:
@@ -44,7 +72,8 @@ status](https://repology.org/badge/vertical-allrepos/uxplay.svg)](https://repolo
- Install uxplay on Debian-based Linux systems with
"`sudo apt install uxplay`"; on FreeBSD with
"`sudo pkg install uxplay`"; on OpenBSD with
"`doas pkg_add uxplay`". Also available on Arch-based systems
through AUR. Since v. 1.66, uxplay is now also packaged in RPM
format by Fedora 38 ("`sudo dnf install uxplay`").
@@ -55,7 +84,12 @@ status](https://repology.org/badge/vertical-allrepos/uxplay.svg)](https://repolo
See the section on using this specfile for [building an installable
RPM package](#building-an-installable-rpm-package).
- If your distribution does not supply UxPlay, or you want the latest
  version, it is very easy to build it yourself: see the
  [detailed instructions for building UxPlay from
  source](#building-uxplay-from-source) later in this document.
## After installation:
- (On Linux and \*BSD): if a firewall is active on the server hosting
UxPlay, make sure the default network port (UDP 5353) for
@@ -71,15 +105,43 @@ After installation:
- For Audio-only mode (Apple Music, etc.) best quality is obtained
with the option "uxplay -async", but there is then a 2 second
latency imposed by iOS. Use option "uxplay -ca" to display any
"Cover Art" that accompanies the audio.
- If you are using UxPlay just to mirror the client's screen (without
showing videos that need audio synchronized with video), it is best
to use the option "uxplay -vsync no".
- Add any UxPlay options you want to use as defaults to a startup file
`~/.uxplayrc` (see "`man uxplay`" or "`uxplay -h`" for format and
other possible locations; the location can also be set with "uxplay
-rc *location*"). In particular, if your system uses PipeWire audio
or Wayland video systems, you may wish to add "as pipewiresink" or
"vs waylandsink" as defaults to the file. *(Output from terminal
commands "ps waux \| grep pulse" or "pactl info" will contain
"pipewire" if your Linux/BSD system uses it).*
- For Linux systems using systemd, there is a **systemd** service file
**uxplay.service** found in the UxPlay top directory of the
distribution, and also installed in `<DOCDIR>/uxplay/systemd/`
(where DOCDIR is usually `/usr/local/share/doc`), that allows users
to start their own instance of UxPlay as a rootless daemon: it
should either be added to the directory /etc/systemd/user, or the
user can just create their own systemd directory
`~/.config/systemd/user/` and then copy uxplay.service into it. To
save uxplay terminal output to a file \~/uxplay.log, uncomment the
StandardOutput entry in uxplay.service. Then
`systemctl --user [start/stop/enable/disable/status] uxplay`
can be used to control the daemon. If it is enabled, the daemon will
start at the user's first login and stop when they no longer have
any open sessions. See
https://www.baeldung.com/linux/systemd-create-user-services for more
about systemd user services. If more than one user might
simultaneously run uxplay this way, they should specify distinct -p
and -m options (ports and deviceID) in their startup files. **Note:
it is NOT recommended to run UxPlay as a root service.**
- On Raspberry Pi: models using hardware h264 video decoding by the
Broadcom GPU (models 4B and earlier) may require the uxplay option
@@ -91,8 +153,11 @@ After installation:
video decoding is used seems to have reappeared starting with
GStreamer-1.22.
- If UxPlay is used in a public space, there are security options for
requiring an AppleTV-style one-time pin (displayed on the terminal)
to be entered, or a password, and for barring/permitting client
access by their device ID. See options -pin, -reg, -pw, -restrict,
-allow, -block.
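The startup-file mechanism described above can be illustrated with a short sketch (a hypothetical `~/.uxplayrc`; the PipeWire and Wayland sinks are only appropriate if your system actually uses those subsystems):

```
# ~/.uxplayrc -- one option per line, written without the leading "-";
# command-line options override these defaults
as pipewiresink
vs waylandsink
vsync no
```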
# Detailed description of UxPlay
@@ -235,7 +300,7 @@ clause incompatible with the GPL unless OpenSSL can be regarded as a
OpenSSL as a "System Library", but some (e.g. Debian) do not: in this
case, the issue is solved by linking with OpenSSL-3.0.0 or later.
# Building UxPlay from source
Either download and unzip
[UxPlay-master.zip](https://github.com/FDH2/UxPlay/archive/refs/heads/master.zip),
@@ -369,6 +434,11 @@ package](#building-an-installable-rpm-package).
avahi-libdns or mDNSResponder must also be installed to provide the
dns_sd library. OpenSSL is already installed as a System Library.
- **OpenBSD:** (doas pkg_add) libplist gstreamer1-plugins-base.
  avahi-libs must also be installed to provide the dns_sd library
  (avahi-main must also be installed). OpenSSL is already installed
  as a System Library.
#### Building an installable RPM package
First-time RPM builders should first install the rpm-build and
@@ -450,6 +520,9 @@ repositories for those distributions.
gstreamer1-plugins-\* (\* = core, good, bad, x, gtk, gl, vulkan,
pulse, v4l2, ...), (+ gstreamer1-vaapi for Intel/AMD graphics).
- **OpenBSD:** Install gstreamer1-libav, gstreamer-plugins-\* (\* =
core, bad, base, good).
### Starting and running UxPlay
Since UxPlay-1.64, UxPlay can be started with options read from a
@@ -486,7 +559,12 @@ below for help with this or other problems.
[Usage](#usage) for details, if you wish to use it. *Some clients
with MDM (Mobile Device Management, often present on employer-owned
devices) are required to use pin-authentication: UxPlay will provide
this even when running without the pin option.* Password
authentication (-pw *pwd*) is also offered as an alternative
solution to pin codes: users need to know the password *pwd* and
enter it on their iOS/macOS device to access UxPlay, when prompted
(if *pwd* is not set, a displayed random pin code must be entered at
**each** new connection.)
- By default, UxPlay is locked to its current client until that client
drops the connection; since UxPlay-1.58, the option `-nohold`
@@ -548,12 +626,13 @@ value advances it.)
-FPSdata.) When using this, you should use the default
timestamp-based synchronization option `-vsync`.
- You can now display (inside UxPlay) the accompanying "Cover Art"
  from sources like Apple Music in Audio-Only (ALAC) mode with the
  option `uxplay -ca`. *The older method of exporting cover art to an
  external viewer remains available: run "`uxplay -ca <name> &`" in
  the background, then run an image viewer with an autoreload feature:
  an example is "feh": run "`feh -R 1 <name>`" in the foreground;
  terminate feh and then Uxplay with "`ctrl-C fg ctrl-C`"*.
By default, GStreamer uses an algorithm to search for the best
"videosink" (GStreamer's term for a graphics driver to display images)
@@ -679,9 +758,9 @@ framebuffer video, use `<videosink>` = `kmssink`.
ssh user@remote_host
export DISPLAY=:0
nohup uxplay [options] > FILE &
Sound and video will play on the remote host; "nohup" will keep uxplay
running if the ssh session is closed. Terminal output is saved to FILE
@@ -689,10 +768,10 @@ running if the ssh session is closed. Terminal output is saved to FILE
## Building UxPlay on macOS: **(Intel X86_64 and "Apple Silicon" M1/M2 Macs)**
*Note: A native AirPlay Server feature is included in macOS since macOS
12 Monterey, but is restricted to recent hardware. As well as running on
the latest macOS, UxPlay can run on older macOS systems that cannot run
Monterey, or can run Monterey but not AirPlay.*
These instructions for macOS assume that the Xcode command-line
developer tools are installed (if Xcode is installed, open the Terminal,
@@ -753,22 +832,15 @@ complete GStreamer, but seems to have everything needed for UxPlay).
installations in non-standard locations indicated by the environment
variable `$HOMEBREW_PREFIX`.**
**Using GStreamer installed from MacPorts**: MacPorts is now providing
recent GStreamer releases: install pkgconf ("sudo port install
pkgconf"), then "sudo port install gstreamer1
gstreamer1-gst-plugins-base gstreamer1-gst-plugins-good
gstreamer1-gst-plugins-bad gstreamer1-gst-libav". (The following may no
longer be relevant: *For X11 support on macOS, compile UxPlay using a
special cmake option `-DUSE_X11=ON`, and run it from an XQuartz terminal
with -vs ximagesink; older non-retina macs require a lower resolution
when using X11: `uxplay -s 800x600`.)*
After installing GStreamer, build and install uxplay: open a terminal
and change into the UxPlay source directory ("UxPlay-master" for zipfile
@@ -780,30 +852,24 @@ downloads, "UxPlay" for "git clone" downloads) and build/install with
default (since UxPlay 1.64) use of timestamps for video
synchonization, many video frames are being dropped (only on macOS),
perhaps due to another error (about videometa) that shows up in the
GStreamer warnings. **Recommendation: use the UxPlay "no timestamp"
option "`-vsync no`"** (you can add a line "vsync no" in the
uxplayrc configuration file).
- On macOS with this installation of GStreamer, the only videosinks
available are glimagesink (default choice made by autovideosink) and
osxvideosink. The window title does not show the Airplay server
name, but the window can be shared on Zoom. Because of issues with
glimagesink, you may find osxvideosink works better. The only
available audiosink is osxaudiosink.
- The option -nc is currently used by default on macOS. This is a
  workaround for window-closing problems with GStreamer videosinks on
  macOS. This option can be canceled with "-nc no", if not needed.
- In the case of glimagesink, the resolution settings "-s wxh" may not
affect the (small) initial OpenGL mirror window size, but the window
can be expanded using the mouse or trackpad.
## Building UxPlay on Microsoft Windows, using MSYS2 with the MinGW-64 compiler.
@@ -826,16 +892,30 @@ downloads, "UxPlay" for "git clone" downloads) and build/install with
3. [MSYS2 packages](https://packages.msys2.org/package/) are installed
with a variant of the "pacman" package manager used by Arch Linux.
Open a "MSYS2" terminal from the MSYS2 tab in the Windows Start
menu, and update the new MSYS2 installation with "pacman -Syu".
   - NEW: MSYS2 now recommends using the newer UCRT64 terminal
     environment (which uses the newer Microsoft UCRT "Universal C
     RunTime Library", included as part of the Windows OS since
     Windows 10) rather than the MINGW64 terminal environment (which
     uses the older Microsoft MSVCRT C library, which has "legacy"
     status, but is available on all Windows systems). If you wish to
     use the legacy MSVCRT library, to support older Windows
     versions, modify the instructions below as follows:
     (1) change the MSYS2 terminal type from UCRT64 to MINGW64; (2)
     modify mingw-w64-ucrt-x86_64-\* package names to
     mingw-w64-x86_64-\* (just omit "-ucrt"); (3) replace `ucrt64`
     by `mingw64` in directory names.
Open a new MSYS2 UCRT64 terminal, and install the gcc compiler and
cmake:
`pacman -S mingw-w64-ucrt-x86_64-cmake mingw-w64-ucrt-x86_64-gcc`
We will simply build UxPlay from the command line in the MSYS2
environment (using "`ninja`" in place of "`make`" for the build
system).
4. Download the latest UxPlay from github **(to use `git`, install it
@@ -843,7 +923,7 @@ downloads, "UxPlay" for "git clone" downloads) and build/install with
"`git clone https://github.com/FDH2/UxPlay`")**, then install UxPlay
dependencies (openssl is already installed with MSYS2):
`pacman -S mingw-w64-ucrt-x86_64-libplist mingw-w64-ucrt-x86_64-gstreamer mingw-w64-ucrt-x86_64-gst-plugins-base`
If you are trying a different Windows build system, MSVC versions of
GStreamer for Windows are available from the [official GStreamer
@@ -863,19 +943,26 @@ downloads, "UxPlay" for "git clone" downloads) and build/install with
6. Assuming no error in either of these, you will have built the uxplay
executable **uxplay.exe** in the current ("build") directory. The
"sudo make install" and "sudo make uninstall" features offered in
the other builds are not available on Windows; instead, you can
install the uxplay.exe executable in `C:/msys64/ucrt64/bin` (plus
manpage and documentation in `C:/msys64/ucrt64/share/...`) with
`cmake --install . --prefix $HOME/../../ucrt64`
You can later uninstall uxplay by returning to the build directory
and running
`ninja uninstall`
(This assumes that certain files in the build directory were not
deleted since building UxPlay).
To be able to view the manpage, you need to install the manpage
viewer with "`pacman -S man`".
To run **uxplay.exe** you need to install some gstreamer plugin packages
with `pacman -S mingw-w64-ucrt-x86_64-gst-<plugin>`, where the required
ones have `<plugin>` given by
1. **libav**
2. **plugins-good**
@@ -893,11 +980,11 @@ app through firewall**. If your virus protection flags uxplay.exe as
"suspicious" (but without a true malware signature) you may need to give
it an exception.
Now test by running "`uxplay`" (in a MSYS2 UCRT64 terminal window). If
you need to specify the audiosink, there are two main choices on
Windows: the older DirectSound plugin "`-as directsoundsink`", and the
more modern Windows Audio Session API (wasapi) plugin
"`-as wasapisink`", which supports [additional
options](https://gstreamer.freedesktop.org/documentation/wasapi/wasapisink.html)
such as
@@ -909,23 +996,21 @@ like `\{0.0.0.00000000\}.\{98e35b2b-8eba-412e-b840-fd2c2492cf44\}`. If
"`device`" is not specified, the default audio device is used.
If you wish to specify the videosink using the `-vs <videosink>` option,
some choices for `<videosink>` are `d3d12videosink`, `d3d11videosink`,
`d3dvideosink`, `glimagesink`, `gtksink`, `autovideosink`. If you do not
specify the videosink, d3d11videosink will be used (users have
reported segfaults of the newer d3d12 videodecoder on certain older
Nvidia cards when the image resolution changes: d3d11 will be used by
default until this is fixed).
- With Direct3D 11.0 or greater, various options can be set using
e.g. `-vs "d3d11videosink <options>"` (see the gstreamer videosink
documentation for these videosinks). For convenience, if no
`<options>` are set, the option to toggle in and out of fullscreen
mode with the Alt-Enter key combination is added.
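  For example, invocations along these lines (a sketch; `fullscreen-toggle-mode` and `fullscreen` are properties of GStreamer's d3d11videosink):

  ```shell
  # always start in fullscreen mode
  uxplay -vs "d3d11videosink fullscreen-toggle-mode=property fullscreen=true"

  # bare d3d11videosink: the Alt-Enter fullscreen toggle is added for you
  uxplay -vs d3d11videosink
  ```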
The executable uxplay.exe can also be run without the MSYS2 environment,
in the Windows Terminal, with `C:\msys64\ucrt64\bin\uxplay`.
# Usage
@@ -938,6 +1023,9 @@ Options:
comments, and ignored. Command line options supersede options in the
startup file.
**-rc *file*** can also be used to specify the startup file location:
this overrides `$UXPLAYRC`, `~/.uxplayrc`, etc.
**-n server_name** (Default: UxPlay); server_name@*hostname* will be
the name that appears offering AirPlay services to your iPad, iPhone
etc, where *hostname* is the name of the server running uxplay. This
@@ -959,10 +1047,13 @@ The "-h265" option changes the default resolution ("-s" option) from
1920x1080 to 3840x2160, and leaves default maximum framerate ("-fps"
option) at 30fps.
**-hls \[v\]** Activate HTTP Live Streaming support. With this option
YouTube videos can be streamed directly from YouTube servers to UxPlay
(without passing through the client) by clicking on the AirPlay icon in
the YouTube app. Optional \[v\] (allowed values 2 or 3, default: 3)
allows selection of the version of GStreamer's \"playbin\" video player
to use for playing HLS video. *(Playbin v3 is the recommended player,
but if some videos fail to play, you can try with version 2.)*
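For instance, selecting the HLS player version might look like this (a sketch of the option described above):

```shell
uxplay -hls      # HLS streaming with the default playbin v3
uxplay -hls 2    # fall back to playbin v2 if some videos fail to play
```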
**-pin \[nnnn\]**: (since v1.67) use Apple-style (one-time) "pin"
authentication when a new client connects for the first time: a
@@ -991,6 +1082,17 @@ deregisters the corresponding client (see options -restrict, -block,
-allow for more ways to control client access). *(Add a line "reg" in
the startup file if you wish to use this feature.)*
**-pw** \[*pwd*\]. (since 1.72). As an alternative to -pin, client
access can be controlled with a password set when uxplay starts (set it
in the uxplayrc startup file, where it is stored as cleartext). All
users must then know this password. This uses HTTP md5 Digest
authentication, which is now regarded as providing weak security, but it
is only used to validate the uxplay password, and no user credentials
are exposed. If *pwd* is **not** specified, a random 4-digit pin code is
displayed, and must be entered on the client at **each** new connection.
*Note: -pin and -pw are alternatives: if both are specified at startup,
the earlier of these two options is discarded.*
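Usage might look like the following sketch (the password shown is a placeholder):

```shell
uxplay -pw mysecret   # clients are prompted once for "mysecret"
uxplay -pw            # no fixed password: a random 4-digit pin is
                      # displayed and must be entered at each new connection
```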
**-vsync \[x\]** (In Mirror mode:) this option (**now the default**)
uses timestamps to synchronize audio with video on the server, with an
optional audio delay in (decimal) milliseconds (*x* = "20.5" means
@@ -1039,6 +1141,9 @@ where 16 steps = full volume) is reduced by 50%, the perceived volume is
halved (a 10dB attenuation). (This is modified at low volumes, to use
the "untapered" volume if it is louder.)
**-vol *v*** Sets initial audio-streaming volume (on client): range is
\[0:1\], with 0.0 = mute, 1.0 = full volume (*v* is a decimal number).
**-s wxh** e.g. -s 1920x1080 (= "1080p"), the default width and height
resolutions in pixels for h264 video. (The default becomes 3840x2160 (=
"4K") when the -h265 option is used.) This is just a request made to the
@@ -1061,8 +1166,8 @@ display that overscans, and is not displayed by gstreamer).
Recommendation: **don't use this option** unless there is some special
reason to use it.
**-fs** uses fullscreen mode, but currently only works with X11,
Wayland, VAAPI, kms and D3D11 (Windows).
**-p** allows you to select the network ports used by UxPlay (these need
to be opened if the server is behind a firewall). By itself, -p sets
@@ -1167,6 +1272,9 @@ number of microseconds. Default is 0.25 sec (250000 usec). *(However,
the client appears to ignore this reported latency, so this option seems
non-functional.)*
**-ca** (without specifying a filename) now displays "cover art" that
accompanies Apple Music when played in "Audio-only" (ALAC) mode.
**-ca *filename*** provides a file (where *filename* can include a full
path) used for output of "cover art" (from Apple Music, *etc.*,) in
audio-only ALAC mode. This file is overwritten with the latest cover art
@@ -1180,16 +1288,21 @@ uxplay was put into the background). To quit, use `ctrl-C fg ctrl-C` to
terminate the image viewer, bring `uxplay` into the foreground, and
terminate it too.
**-md *filename*** Like the -ca option, but exports audio metadata text
(Artist, Title, Genre, etc.) to file for possible display by a process
that watches the file for changes. Previous text is overwritten as new
metadata is received, and the file is deleted when uxplay terminates.
**-reset n** sets a limit of *n* consecutive failures of the client to
send feedback requests (these "heartbeat signals" are sent by the client
once per second to ask for a response showing that the server is still
online). After *n* missing signals, the client will be presumed to be
offline, and the connection will be reset to allow a new connection. The
default value of *n* is 15 (i.e., 15 seconds); *n* = 0 means "no limit".
**-nofreeze** closes the video window after a reset due to the client
going offline (default is to leave the window open to allow a smoother
reconnection to the same client). This option may be useful in
fullscreen mode.
**-nc** maintains previous UxPlay \< 1.45 behavior that does **not
close** the video window when the client sends the "Stop Mirroring"
@@ -1298,12 +1411,13 @@ that (unlike dumped video) the dumped audio is currently only useful for
debugging, as it is not containerized to make it playable with standard
audio players.*
**-d \[n\]** Enable debug output; optional argument n=1 suppresses
audio/video packet data in debug output. Note: this does not show
GStreamer error or debug messages. To see GStreamer error and warning
messages, set the environment variable GST_DEBUG with "export
GST_DEBUG=2" before running uxplay. To see GStreamer information
messages, set GST_DEBUG=4; for DEBUG messages, GST_DEBUG=5; increase
this to see even more of the GStreamer inner workings.
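Combining the two debug channels might look like this (a sketch, using the GST_DEBUG levels described above):

```shell
export GST_DEBUG=2   # GStreamer errors and warnings (4=INFO, 5=DEBUG)
uxplay -d 1          # uxplay debug output, with packet data suppressed
```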
# Troubleshooting
@@ -1647,6 +1761,26 @@ what version UxPlay claims to be.
# Changelog
xxxx 2025-07-07 Render Audio cover-art inside UxPlay with -ca option (no
file specified).
1.72.2 2025-07-07 Fix bug (typo) in DNS_SD advertisement introduced with
-pw option. Update llhttp to v 9.3.0
1.72.1 2025-06-06 minor update: fix regression in -reg option; add
option -rc `<rcfile>` to specify initialization file; add "-nc
no" to unset "-nc" option (for macOS users, where -nc is default); add
user-installable systemd script for running UxPlay as an
always-available "rootless daemon"
1.72 2025-05-07. Improved HLS Live Streaming (YouTube) support,
including "scrub". Add requested options -md \<filename\> to output
audio metadata text to a file for possible display (complements -ca
option), and -vol `<v>` option to set initial audio-streaming
volume. Add support for password-based user access control with HTTP
digest Authentication (-pw \[pwd\]). If no pwd is set, a random pin is
displayed for entry at each new connection.
1.71 2024-12-13 Add support for HTTP Live Streaming (HLS), initially
only for YouTube movies. Fix issue with NTP timeout on Windows.
View File
@@ -1,4 +1,3 @@
cmake_minimum_required(VERSION 3.5)
include_directories( playfair llhttp )
message( STATUS "*** CFLAGS \"" ${CMAKE_C_FLAGS} "\" from build environment will be postpended to CMAKE_CFLAGS" )
@@ -102,7 +101,11 @@ else ()
target_link_libraries ( airplay PUBLIC ${LIBPLIST} )
endif()
if ( PLIST_FOUND )
message( STATUS "found libplist-${PLIST_VERSION}")
pkg_check_modules ( PLIST_23 libplist-2.0>=2.3.0 )
if ( PLIST_23_FOUND )
add_definitions( -DPLIST_230 )
endif()
endif()
target_include_directories( airplay PRIVATE ${PLIST_INCLUDE_DIRS} )
View File
@@ -29,7 +29,7 @@
struct media_item_s {
char *uri;
char *playlist;
int num;
};
struct airplay_video_s {
@@ -61,6 +61,8 @@ int airplay_video_service_init(raop_t *raop, unsigned short http_port,
airplay_video_service_destroy(airplay_video);
}
/* calloc guarantees that the 36-character strings apple_session_id and
playback_uuid are null-terminated */
airplay_video = (airplay_video_t *) calloc(1, sizeof(airplay_video_t));
if (!airplay_video) {
return -1;
@@ -82,16 +84,12 @@ int airplay_video_service_init(raop_t *raop, unsigned short http_port,
//printf(" %p %p\n", airplay_video, get_airplay_video(raop));
airplay_video->raop = raop;
airplay_video->FCUP_RequestID = 0;
size_t len = strlen(session_id);
assert(len == 36);
strncpy(airplay_video->apple_session_id, session_id, len);
(airplay_video->apple_session_id)[len] = '\0';
airplay_video->start_position_seconds = 0.0f;
airplay_video->master_uri = NULL;
@@ -143,6 +141,10 @@ void set_playback_uuid(airplay_video_t *airplay_video, const char *playback_uuid
(airplay_video->playback_uuid)[len] = '\0';
}
const char *get_playback_uuid(airplay_video_t *airplay_video) {
return (const char *) airplay_video->playback_uuid;
}
void set_uri_prefix(airplay_video_t *airplay_video, char *uri_prefix, int uri_prefix_len) {
if (airplay_video->uri_prefix) {
free (airplay_video->uri_prefix);
@@ -159,12 +161,10 @@ char *get_uri_local_prefix(airplay_video_t *airplay_video) {
return airplay_video->local_uri_prefix;
}
char *get_master_uri(airplay_video_t *airplay_video) {
return airplay_video->master_uri;
}
int get_next_FCUP_RequestID(airplay_video_t *airplay_video) {
return ++(airplay_video->FCUP_RequestID);
}
@@ -177,9 +177,6 @@ int get_next_media_uri_id(airplay_video_t *airplay_video) {
return airplay_video->next_uri;
}
/* master playlist */
void store_master_playlist(airplay_video_t *airplay_video, char *master_playlist) {
if (airplay_video->master_playlist) {
free (airplay_video->master_playlist);
@@ -219,90 +216,50 @@ void create_media_data_store(airplay_video_t * airplay_video, char ** uri_list,
for (int i = 0; i < num_uri; i++) {
media_data_store[i].uri = uri_list[i];
media_data_store[i].playlist = NULL;
media_data_store[i].access = 0;
media_data_store[i].num = i;
}
airplay_video->media_data_store = media_data_store;
airplay_video->num_uri = num_uri;
}
int store_media_data_playlist_by_num(airplay_video_t *airplay_video, char * media_playlist, int num) {
int store_media_playlist(airplay_video_t *airplay_video, char * media_playlist, int num) {
media_item_t *media_data_store = airplay_video->media_data_store;
if ( num < 0 || num >= airplay_video->num_uri) {
return -1;
return -1;
} else if (media_data_store[num].playlist) {
return -2;
return -2;
}
for (int i = 0; i < num ; i++) {
if (strcmp(media_data_store[i].uri, media_data_store[num].uri) == 0) {
assert(strcmp(media_data_store[i].playlist, media_playlist) == 0);
media_data_store[num].num = i;
free (media_playlist);
return 1;
}
}
media_data_store[num].playlist = media_playlist;
return 0;
}
char * get_media_playlist_by_num(airplay_video_t *airplay_video, int num) {
char * get_media_playlist(airplay_video_t *airplay_video, const char *uri) {
media_item_t *media_data_store = airplay_video->media_data_store;
if (media_data_store == NULL) {
return NULL;
}
if (num >= 0 && num <airplay_video->num_uri) {
return media_data_store[num].playlist;
}
return NULL;
}
int get_media_playlist_by_uri(airplay_video_t *airplay_video, const char *uri) {
/* Problem: there can be more than one StreamInf playlist with the same uri:
* they differ by choice of partner Media (audio, subtitles) playlists
* If the same uri is requested again, one of the other ones will be returned
* (the least-previously-requested one will be served up)
*/
// modified to return the position of the media playlist in the master playlist
media_item_t *media_data_store = airplay_video->media_data_store;
if (media_data_store == NULL) {
return -2;
}
int found = 0;
int num = -1;
int access = -1;
for (int i = 0; i < airplay_video->num_uri; i++) {
if (strstr(media_data_store[i].uri, uri)) {
if (!found) {
found = 1;
num = i;
access = media_data_store[i].access;
} else {
/* change > below to >= to reverse the order of choice */
if (access > media_data_store[i].access) {
access = media_data_store[i].access;
num = i;
}
}
return media_data_store[media_data_store[i].num].playlist;
}
}
if (found) {
//printf("found %s\n", media_data_store[num].uri);
++media_data_store[num].access;
return num;
}
return -1;
return NULL;
}
char * get_media_uri_by_num(airplay_video_t *airplay_video, int num) {
media_item_t * media_data_store = airplay_video->media_data_store;
if (media_data_store == NULL) {
return NULL;
}
if (num >= 0 && num < airplay_video->num_uri) {
return media_data_store[num].uri;
}
return NULL;
}
int get_media_uri_num(airplay_video_t *airplay_video, char * uri) {
media_item_t *media_data_store = airplay_video->media_data_store;
for (int i = 0; i < airplay_video->num_uri ; i++) {
if (strstr(media_data_store[i].uri, uri)) {
return i;
}
}
return -1;
return NULL;
}
int analyze_media_playlist(char *playlist, float *duration) {
@@ -320,3 +277,258 @@ int analyze_media_playlist(char *playlist, float *duration) {
}
return count;
}
/* parse Master Playlist, make table of Media Playlist uri's that it lists */
int create_media_uri_table(const char *url_prefix, const char *master_playlist_data,
int datalen, char ***media_uri_table, int *num_uri) {
char *ptr = strstr(master_playlist_data, url_prefix);
char ** table = NULL;
if (ptr == NULL) {
return -1;
}
int count = 0;
while (ptr != NULL) {
char *end = strstr(ptr, "m3u8");
if (end == NULL) {
return 1;
}
end += sizeof("m3u8");
count++;
ptr = strstr(end, url_prefix);
}
table = (char **) calloc(count, sizeof(char *));
if (!table) {
return -1;
}
for (int i = 0; i < count; i++) {
table[i] = NULL;
}
ptr = strstr(master_playlist_data, url_prefix);
count = 0;
while (ptr != NULL) {
char *end = strstr(ptr, "m3u8");
char *uri;
if (end == NULL) {
return 0;
}
end += sizeof("m3u8");
size_t len = end - ptr - 1;
uri = (char *) calloc(len + 1, sizeof(char));
memcpy(uri , ptr, len);
table[count] = uri;
uri = NULL;
count ++;
ptr = strstr(end, url_prefix);
}
*num_uri = count;
*media_uri_table = table;
return 0;
}
/* Adjust uri prefixes in the Master Playlist, for sending to the Media Player */
char *adjust_master_playlist (char *fcup_response_data, int fcup_response_datalen,
char *uri_prefix, char *uri_local_prefix) {
size_t uri_prefix_len = strlen(uri_prefix);
size_t uri_local_prefix_len = strlen(uri_local_prefix);
int counter = 0;
char *ptr = strstr(fcup_response_data, uri_prefix);
while (ptr != NULL) {
counter++;
ptr++;
ptr = strstr(ptr, uri_prefix);
}
size_t len = uri_local_prefix_len - uri_prefix_len;
len *= counter;
len += fcup_response_datalen;
char *new_master = (char *) malloc(len + 1);
*(new_master + len) = '\0';
char *first = fcup_response_data;
char *new = new_master;
char *last = strstr(first, uri_prefix);
counter = 0;
while (last != NULL) {
counter++;
len = last - first;
memcpy(new, first, len);
first = last + uri_prefix_len;
new += len;
memcpy(new, uri_local_prefix, uri_local_prefix_len);
new += uri_local_prefix_len;
last = strstr(last + uri_prefix_len, uri_prefix);
if (last == NULL) {
len = fcup_response_data + fcup_response_datalen - first;
memcpy(new, first, len);
break;
}
}
return new_master;
}
char *adjust_yt_condensed_playlist(const char *media_playlist) {
/* this copies a Media Playlist into a null-terminated string.
If it has the "#YT-EXT-CONDENSED-URL" header, it is also expanded into
the full Media Playlist format.
It returns a pointer to the expanded playlist, WHICH MUST BE FREED AFTER USE */
const char *base_uri_begin;
const char *params_begin;
const char *prefix_begin;
size_t base_uri_len;
size_t params_len;
size_t prefix_len;
const char* ptr = strstr(media_playlist, "#EXTM3U\n");
assert(ptr);
ptr += strlen("#EXTM3U\n");
if (strncmp(ptr, "#YT-EXT-CONDENSED-URL", strlen("#YT-EXT-CONDENSED-URL"))) {
size_t len = strlen(media_playlist);
char * playlist_copy = (char *) malloc(len + 1);
memcpy(playlist_copy, media_playlist, len);
playlist_copy[len] = '\0';
return playlist_copy;
}
ptr = strstr(ptr, "BASE-URI=");
base_uri_begin = strchr(ptr, '"');
base_uri_begin++;
ptr = strchr(base_uri_begin, '"');
base_uri_len = ptr - base_uri_begin;
char *base_uri = (char *) calloc(base_uri_len + 1, sizeof(char));
assert(base_uri);
memcpy(base_uri, base_uri_begin, base_uri_len); //must free
ptr = strstr(ptr, "PARAMS=");
params_begin = strchr(ptr, '"');
params_begin++;
ptr = strchr(params_begin,'"');
params_len = ptr - params_begin;
char *params = (char *) calloc(params_len + 1, sizeof(char));
assert(params);
memcpy(params, params_begin, params_len); //must free
ptr = strstr(ptr, "PREFIX=");
prefix_begin = strchr(ptr, '"');
prefix_begin++;
ptr = strchr(prefix_begin,'"');
prefix_len = ptr - prefix_begin;
char *prefix = (char *) calloc(prefix_len + 1, sizeof(char));
assert(prefix);
memcpy(prefix, prefix_begin, prefix_len); //must free
/* expand params */
int nparams = 0;
int *params_size = NULL;
const char **params_start = NULL;
if (strlen(params)) {
nparams = 1;
char * comma = strchr(params, ',');
while (comma) {
nparams++;
comma++;
comma = strchr(comma, ',');
}
params_start = (const char **) calloc(nparams, sizeof(char *)); //must free
params_size = (int *) calloc(nparams, sizeof(int)); //must free
ptr = params;
for (int i = 0; i < nparams; i++) {
comma = strchr(ptr, ',');
params_start[i] = ptr;
if (comma) {
params_size[i] = (int) (comma - ptr);
ptr = comma;
ptr++;
} else {
params_size[i] = (int) (params + params_len - ptr);
break;
}
}
}
int count = 0;
ptr = strstr(media_playlist, "#EXTINF");
while (ptr) {
count++;
ptr = strstr(++ptr, "#EXTINF");
}
size_t old_size = strlen(media_playlist);
size_t new_size = old_size;
new_size += count * (base_uri_len + params_len);
char * new_playlist = (char *) calloc( new_size + 100, sizeof(char));
const char *old_pos = media_playlist;
char *new_pos = new_playlist;
ptr = old_pos;
ptr = strstr(old_pos, "#EXTINF:");
size_t len = ptr - old_pos;
/* copy header section before chunks */
memcpy(new_pos, old_pos, len);
old_pos += len;
new_pos += len;
while (ptr) {
/* for each chunk */
const char *end = NULL;
char *start = strstr(ptr, prefix);
len = start - ptr;
/* copy first line of chunk entry */
memcpy(new_pos, old_pos, len);
old_pos += len;
new_pos += len;
/* copy base uri to replace prefix*/
memcpy(new_pos, base_uri, base_uri_len);
new_pos += base_uri_len;
old_pos += prefix_len;
ptr = strstr(old_pos, "#EXTINF:");
/* insert the PARAMS separators on the slices line */
end = old_pos;
int last = nparams - 1;
for (int i = 0; i < nparams; i++) {
if (i != last) {
end = strchr(end, '/');
} else {
/* the next line starts with either #EXTINF (usually)
or #EXT-X-ENDLIST (at last chunk)*/
end = strstr(end, "#EXT");
}
*new_pos = '/';
new_pos++;
memcpy(new_pos, params_start[i], params_size[i]);
new_pos += params_size[i];
*new_pos = '/';
new_pos++;
len = end - old_pos;
end++;
memcpy (new_pos, old_pos, len);
new_pos += len;
old_pos += len;
if (i != last) {
old_pos++; /* last entry is not followed by "/" separator */
}
}
}
/* copy tail */
len = media_playlist + strlen(media_playlist) - old_pos;
memcpy(new_pos, old_pos, len);
new_pos += len;
old_pos += len;
new_playlist[new_size] = '\0';
free (prefix);
free (base_uri);
free (params);
if (params_size) {
free (params_size);
}
if (params_start) {
free (params_start);
}
return new_playlist;
}

View File

@@ -30,23 +30,27 @@ const char *get_apple_session_id(airplay_video_t *airplay_video);
void set_start_position_seconds(airplay_video_t *airplay_video, float start_position_seconds);
float get_start_position_seconds(airplay_video_t *airplay_video);
void set_playback_uuid(airplay_video_t *airplay_video, const char *playback_uuid);
const char *get_playback_uuid(airplay_video_t *airplay_video);
void set_uri_prefix(airplay_video_t *airplay_video, char *uri_prefix, int uri_prefix_len);
char *get_uri_prefix(airplay_video_t *airplay_video);
char *get_uri_local_prefix(airplay_video_t *airplay_video);
int get_next_FCUP_RequestID(airplay_video_t *airplay_video);
void set_next_media_uri_id(airplay_video_t *airplay_video, int id);
int get_next_media_uri_id(airplay_video_t *airplay_video);
int get_media_playlist_by_uri(airplay_video_t *airplay_video, const char *uri);
void store_master_playlist(airplay_video_t *airplay_video, char *master_playlist);
char *get_master_playlist(airplay_video_t *airplay_video);
int get_num_media_uri(airplay_video_t *airplay_video);
char *get_media_uri_by_num(airplay_video_t *airplay_video, int num);
int analyze_media_playlist(char *playlist, float *duration);
int create_media_uri_table(const char *url_prefix, const char *master_playlist_data,
int datalen, char ***media_uri_table, int *num_uri);
void store_master_playlist(airplay_video_t *airplay_video, char *master_playlist);
int store_media_playlist(airplay_video_t *airplay_video, char *media_playlist, int num);
char *get_master_playlist(airplay_video_t *airplay_video);
char *get_media_playlist(airplay_video_t *airplay_video, const char *uri);
void destroy_media_data_store(airplay_video_t *airplay_video);
void create_media_data_store(airplay_video_t * airplay_video, char ** media_data_store, int num_uri);
int store_media_data_playlist_by_num(airplay_video_t *airplay_video, char * media_playlist, int num);
char *get_media_playlist_by_num(airplay_video_t *airplay_video, int num);
char *get_media_uri_by_num(airplay_video_t *airplay_video, int num);
int get_media_uri_num(airplay_video_t *airplay_video, char * uri);
int analyze_media_playlist(char *playlist, float *duration);
void airplay_video_service_destroy(airplay_video_t *airplay_video);
@@ -59,6 +63,9 @@ void media_data_store_destroy(void *media_data_store);
// called by the POST /action handler:
char *process_media_data(void *media_data_store, const char *url, const char *data, int datalen);
char *adjust_master_playlist (char *fcup_response_data, int fcup_response_datalen,
char *uri_prefix, char *uri_local_prefix);
char *adjust_yt_condensed_playlist(const char *media_playlist);
//called by the POST /play handler
bool request_media_data(void *media_data_store, const char *primary_url, const char * session_id);
@@ -70,5 +77,5 @@ char *query_media_data(void *media_data_store, const char *url, int *len);
void media_data_store_reset(void *media_data_store);
const char *adjust_primary_uri(void *media_data_store, const char *url);
#endif //AIRPLAY_VIDEO_H

View File

@@ -31,6 +31,9 @@
# ifndef ntohll
# define ntohll(x) ((1==ntohl(1)) ? (x) : (((uint64_t)ntohl((x) & 0xFFFFFFFFUL)) << 32) | ntohl((uint32_t)((x) >> 32)))
# endif
#ifndef htonll
# define htonll(x) ((1==htonl(1)) ? (x) : (((uint64_t)htonl((x) & 0xFFFFFFFFUL)) << 32) | htonl((uint32_t)((x) >> 32)))
#endif
#else
# ifndef htonll
# ifdef SYS_ENDIAN_H
@@ -86,6 +89,12 @@ uint32_t byteutils_get_int_be(unsigned char* b, int offset) {
uint64_t byteutils_get_long_be(unsigned char* b, int offset) {
return ntohll(byteutils_get_long(b, offset));
}
/**
* Writes a big endian unsigned 64 bit integer to the buffer at position offset
*/
void byteutils_put_long_be(unsigned char* b, int offset, uint64_t value) {
*((uint64_t*)(b + offset)) = htonll(value);
}
/**
* Reads a float from the buffer at position offset

View File

@@ -28,5 +28,6 @@ float byteutils_get_float(unsigned char* b, int offset);
uint64_t byteutils_get_ntp_timestamp(unsigned char *b, int offset);
void byteutils_put_ntp_timestamp(unsigned char *b, int offset, uint64_t us_since_1970);
void byteutils_put_long_be(unsigned char* b, int offset, uint64_t value);
#endif //AIRPLAYSERVER_BYTEUTILS_H

View File

@@ -507,6 +507,8 @@ void ed25519_key_destroy(ed25519_key_t *key) {
}
}
// SHA 512
struct sha_ctx_s {
@@ -540,7 +542,6 @@ void sha_final(sha_ctx_t *ctx, uint8_t *out, unsigned int *len) {
void sha_reset(sha_ctx_t *ctx) {
if (!EVP_MD_CTX_reset(ctx->digest_ctx) ||
!EVP_DigestInit_ex(ctx->digest_ctx, EVP_sha512(), NULL)) {
handle_error(__func__);
}
}
@@ -552,6 +553,63 @@ void sha_destroy(sha_ctx_t *ctx) {
}
}
//MD5
struct md5_ctx_s {
EVP_MD_CTX *digest_ctx;
};
md5_ctx_t *md5_init() {
md5_ctx_t *ctx = malloc(sizeof(md5_ctx_t));
assert(ctx != NULL);
ctx->digest_ctx = EVP_MD_CTX_new();
assert(ctx->digest_ctx != NULL);
if (!EVP_DigestInit_ex(ctx->digest_ctx, EVP_md5(), NULL)) {
handle_error(__func__);
}
return ctx;
}
void md5_update(md5_ctx_t *ctx, const uint8_t *in, int len) {
if (!EVP_DigestUpdate(ctx->digest_ctx, in, len)) {
handle_error(__func__);
}
}
void md5_final(md5_ctx_t *ctx, uint8_t *out, unsigned int *len) {
if (!EVP_DigestFinal_ex(ctx->digest_ctx, out, len)) {
handle_error(__func__);
}
}
void md5_reset(md5_ctx_t *ctx) {
if (!EVP_MD_CTX_reset(ctx->digest_ctx) ||
!EVP_DigestInit_ex(ctx->digest_ctx, EVP_md5(), NULL)) {
handle_error(__func__);
}
}
void md5_destroy(md5_ctx_t *ctx) {
if (ctx) {
EVP_MD_CTX_free(ctx->digest_ctx);
free(ctx);
}
}
#define MD5_DIGEST_LENGTH 16
char *get_md5(char *string) {
unsigned char hash[MD5_DIGEST_LENGTH];
md5_ctx_t *ctx = NULL;
ctx = md5_init();
md5_update(ctx, (const unsigned char *) string, strlen(string));
md5_final(ctx, hash, NULL);
md5_destroy(ctx);
ctx = NULL;
char *result_str = utils_hex_to_string(hash, MD5_DIGEST_LENGTH);
return result_str; //must free result_str after use
}
int get_random_bytes(unsigned char *buf, int num) {
return RAND_bytes(buf, num);
}

View File

@@ -109,6 +109,15 @@ void sha_final(sha_ctx_t *ctx, uint8_t *out, unsigned int *len);
void sha_reset(sha_ctx_t *ctx);
void sha_destroy(sha_ctx_t *ctx);
//MD5
typedef struct md5_ctx_s md5_ctx_t;
md5_ctx_t *md5_init();
void md5_update(md5_ctx_t *ctx, const uint8_t *in, int len);
void md5_final(md5_ctx_t *ctx, uint8_t *out, unsigned int *len);
void md5_reset(md5_ctx_t *ctx);
void md5_destroy(md5_ctx_t *ctx);
char *get_md5(char *string);
#ifdef __cplusplus
}
#endif

View File

@@ -151,17 +151,21 @@ struct dnssd_s {
uint32_t features1;
uint32_t features2;
unsigned char require_pw;
unsigned char pin_pw;
};
dnssd_t *
dnssd_init(const char* name, int name_len, const char* hw_addr, int hw_addr_len, int *error, int require_pw)
dnssd_init(const char* name, int name_len, const char* hw_addr, int hw_addr_len, int *error, unsigned char pin_pw)
{
dnssd_t *dnssd;
char *end;
unsigned long features;
/* pin_pw = 0: no pin or password
1: use onscreen pin for client access control
2 or 3: require password for client access control
*/
if (error) *error = DNSSD_ERROR_NOERROR;
@@ -171,7 +175,7 @@ dnssd_init(const char* name, int name_len, const char* hw_addr, int hw_addr_len,
return NULL;
}
dnssd->require_pw = (unsigned char) require_pw;
dnssd->pin_pw = pin_pw;
features = strtoul(FEATURES_1, &end, 16);
if (!end || (features & 0xFFFFFFFF) != features) {
@@ -302,10 +306,20 @@ dnssd_register_raop(dnssd_t *dnssd, unsigned short port)
dnssd->TXTRecordSetValue(&dnssd->raop_record, "am", strlen(GLOBAL_MODEL), GLOBAL_MODEL);
dnssd->TXTRecordSetValue(&dnssd->raop_record, "md", strlen(RAOP_MD), RAOP_MD);
dnssd->TXTRecordSetValue(&dnssd->raop_record, "rhd", strlen(RAOP_RHD), RAOP_RHD);
if (dnssd->require_pw) {
switch (dnssd->pin_pw) {
case 2:
case 3:
dnssd->TXTRecordSetValue(&dnssd->raop_record, "pw", strlen("true"), "true");
} else {
dnssd->TXTRecordSetValue(&dnssd->raop_record, "sf", 4, "0x84");
break;
case 1:
dnssd->TXTRecordSetValue(&dnssd->raop_record, "pw", strlen("true"), "true");
dnssd->TXTRecordSetValue(&dnssd->raop_record, "sf", 3, "0x8c");
break;
default:
dnssd->TXTRecordSetValue(&dnssd->raop_record, "pw", strlen("false"), "false");
dnssd->TXTRecordSetValue(&dnssd->raop_record, "sf", strlen(RAOP_SF), RAOP_SF);
break;
}
dnssd->TXTRecordSetValue(&dnssd->raop_record, "sr", strlen(RAOP_SR), RAOP_SR);
dnssd->TXTRecordSetValue(&dnssd->raop_record, "ss", strlen(RAOP_SS), RAOP_SS);
@@ -361,18 +375,27 @@ dnssd_register_airplay(dnssd_t *dnssd, unsigned short port)
return -1;
}
// flags is a string representing a 20-bit flag (up to 3 hex digits)
dnssd->TXTRecordCreate(&dnssd->airplay_record, 0, NULL);
dnssd->TXTRecordSetValue(&dnssd->airplay_record, "deviceid", strlen(device_id), device_id);
dnssd->TXTRecordSetValue(&dnssd->airplay_record, "features", strlen(features), features);
dnssd->TXTRecordSetValue(&dnssd->airplay_record, "flags", strlen(AIRPLAY_FLAGS), AIRPLAY_FLAGS);
switch (dnssd->pin_pw) {
case 1: // display onscreen pin
dnssd->TXTRecordSetValue(&dnssd->airplay_record, "pw", strlen("true"), "true");
dnssd->TXTRecordSetValue(&dnssd->airplay_record, "flags", 3, "0x4");
break;
case 2: // require password
case 3:
dnssd->TXTRecordSetValue(&dnssd->airplay_record, "pw", strlen("true"), "true");
dnssd->TXTRecordSetValue(&dnssd->airplay_record, "flags", 3, "0x4");
break;
default:
dnssd->TXTRecordSetValue(&dnssd->airplay_record, "pw", strlen("false"), "false");
dnssd->TXTRecordSetValue(&dnssd->airplay_record, "flags", 3, "0x4");
break;
}
dnssd->TXTRecordSetValue(&dnssd->airplay_record, "model", strlen(GLOBAL_MODEL), GLOBAL_MODEL);
dnssd->TXTRecordSetValue(&dnssd->airplay_record, "pk", strlen(dnssd->pk), dnssd->pk);
if (dnssd->require_pw) {
dnssd->TXTRecordSetValue(&dnssd->airplay_record, "pw", strlen("true"), "true");
} else {
dnssd->TXTRecordSetValue(&dnssd->airplay_record, "pw", strlen("false"), "false");
}
dnssd->TXTRecordSetValue(&dnssd->airplay_record, "pi", strlen(AIRPLAY_PI), AIRPLAY_PI);
dnssd->TXTRecordSetValue(&dnssd->airplay_record, "srcvers", strlen(AIRPLAY_SRCVERS), AIRPLAY_SRCVERS);
dnssd->TXTRecordSetValue(&dnssd->airplay_record, "vv", strlen(AIRPLAY_VV), AIRPLAY_VV);

View File

@@ -35,7 +35,7 @@ extern "C" {
typedef struct dnssd_s dnssd_t;
DNSSD_API dnssd_t *dnssd_init(const char *name, int name_len, const char *hw_addr, int hw_addr_len, int *error, int require_pw);
DNSSD_API dnssd_t *dnssd_init(const char *name, int name_len, const char *hw_addr, int hw_addr_len, int *error, unsigned char pin_pw);
DNSSD_API int dnssd_register_raop(dnssd_t *dnssd, unsigned short port);
DNSSD_API int dnssd_register_airplay(dnssd_t *dnssd, unsigned short port);

View File

@@ -44,7 +44,7 @@
#define RAOP_VN "65537"
#define AIRPLAY_SRCVERS GLOBAL_VERSION /*defined in global.h */
#define AIRPLAY_FLAGS "0x4"
#define AIRPLAY_FLAGS "0x84"
#define AIRPLAY_VV "2"
#define AIRPLAY_PI "2e388006-13ba-4041-9a67-25dd4a43d536"

View File

@@ -94,7 +94,7 @@ int fcup_request(void *conn_opaque, const char *media_url, const char *client_se
if (send_len < 0) {
int sock_err = SOCKET_GET_ERROR();
logger_log(conn->raop->logger, LOGGER_ERR, "fcup_request: send error %d:%s\n",
sock_err, strerror(sock_err));
sock_err, SOCKET_ERROR_STRING(sock_err));
http_response_destroy(request);
/* shut down connection? */
return -1;

View File

@@ -279,10 +279,10 @@ http_handler_playback_info(raop_conn_t *conn, http_request_t *request, http_resp
playback_info_t playback_info;
playback_info.stallcount = 0;
playback_info.ready_to_play = true; // ???;
playback_info.playback_buffer_empty = false; // maybe need to get this from playbin
playback_info.playback_buffer_full = true;
playback_info.playback_likely_to_keep_up = true;
//playback_info.playback_buffer_empty = false; // maybe need to get this from playbin
//playback_info.playback_buffer_full = true;
//playback_info.ready_to_play = true; // ???;
//playback_info.playback_likely_to_keep_up = true;
conn->raop->callbacks.on_video_acquire_playback_info(conn->raop->callbacks.cls, &playback_info);
if (playback_info.duration == -1.0) {
@@ -353,262 +353,6 @@ http_handler_reverse(raop_conn_t *conn, http_request_t *request, http_response_t
}
}
/* this copies a Media Playlist into a null-terminated string. If it has the "#YT-EXT-CONDENSED-URI"
header, it is also expanded into the full Media Playlist format */
char *adjust_yt_condensed_playlist(const char *media_playlist) {
/* expands a YT-EXT_CONDENSED-URL media playlist into a full media playlist
* returns a pointer to the expanded playlist, WHICH MUST BE FREED AFTER USE */
const char *base_uri_begin;
const char *params_begin;
const char *prefix_begin;
size_t base_uri_len;
size_t params_len;
size_t prefix_len;
const char* ptr = strstr(media_playlist, "#EXTM3U\n");
ptr += strlen("#EXTM3U\n");
assert(ptr);
if (strncmp(ptr, "#YT-EXT-CONDENSED-URL", strlen("#YT-EXT-CONDENSED-URL"))) {
size_t len = strlen(media_playlist);
char * playlist_copy = (char *) malloc(len + 1);
memcpy(playlist_copy, media_playlist, len);
playlist_copy[len] = '\0';
return playlist_copy;
}
ptr = strstr(ptr, "BASE-URI=");
base_uri_begin = strchr(ptr, '"');
base_uri_begin++;
ptr = strchr(base_uri_begin, '"');
base_uri_len = ptr - base_uri_begin;
char *base_uri = (char *) calloc(base_uri_len + 1, sizeof(char));
assert(base_uri);
memcpy(base_uri, base_uri_begin, base_uri_len); //must free
ptr = strstr(ptr, "PARAMS=");
params_begin = strchr(ptr, '"');
params_begin++;
ptr = strchr(params_begin,'"');
params_len = ptr - params_begin;
char *params = (char *) calloc(params_len + 1, sizeof(char));
assert(params);
memcpy(params, params_begin, params_len); //must free
ptr = strstr(ptr, "PREFIX=");
prefix_begin = strchr(ptr, '"');
prefix_begin++;
ptr = strchr(prefix_begin,'"');
prefix_len = ptr - prefix_begin;
char *prefix = (char *) calloc(prefix_len + 1, sizeof(char));
assert(prefix);
memcpy(prefix, prefix_begin, prefix_len); //must free
/* expand params */
int nparams = 0;
int *params_size = NULL;
const char **params_start = NULL;
if (strlen(params)) {
nparams = 1;
char * comma = strchr(params, ',');
while (comma) {
nparams++;
comma++;
comma = strchr(comma, ',');
}
params_start = (const char **) calloc(nparams, sizeof(char *)); //must free
params_size = (int *) calloc(nparams, sizeof(int)); //must free
ptr = params;
for (int i = 0; i < nparams; i++) {
comma = strchr(ptr, ',');
params_start[i] = ptr;
if (comma) {
params_size[i] = (int) (comma - ptr);
ptr = comma;
ptr++;
} else {
params_size[i] = (int) (params + params_len - ptr);
break;
}
}
}
int count = 0;
ptr = strstr(media_playlist, "#EXTINF");
while (ptr) {
count++;
ptr = strstr(++ptr, "#EXTINF");
}
size_t old_size = strlen(media_playlist);
size_t new_size = old_size;
new_size += count * (base_uri_len + params_len);
char * new_playlist = (char *) calloc( new_size + 100, sizeof(char));
const char *old_pos = media_playlist;
char *new_pos = new_playlist;
ptr = old_pos;
ptr = strstr(old_pos, "#EXTINF:");
size_t len = ptr - old_pos;
/* copy header section before chunks */
memcpy(new_pos, old_pos, len);
old_pos += len;
new_pos += len;
while (ptr) {
/* for each chunk */
const char *end = NULL;
char *start = strstr(ptr, prefix);
len = start - ptr;
/* copy first line of chunk entry */
memcpy(new_pos, old_pos, len);
old_pos += len;
new_pos += len;
/* copy base uri to replace prefix*/
memcpy(new_pos, base_uri, base_uri_len);
new_pos += base_uri_len;
old_pos += prefix_len;
ptr = strstr(old_pos, "#EXTINF:");
/* insert the PARAMS separators on the slices line */
end = old_pos;
int last = nparams - 1;
for (int i = 0; i < nparams; i++) {
if (i != last) {
end = strchr(end, '/');
} else {
end = strstr(end, "#EXT"); /* the next line starts with either #EXTINF (usually) or #EXT-X-ENDLIST (at last chunk)*/
}
*new_pos = '/';
new_pos++;
memcpy(new_pos, params_start[i], params_size[i]);
new_pos += params_size[i];
*new_pos = '/';
new_pos++;
len = end - old_pos;
end++;
memcpy (new_pos, old_pos, len);
new_pos += len;
old_pos += len;
if (i != last) {
old_pos++; /* last entry is not followed by "/" separator */
}
}
}
/* copy tail */
len = media_playlist + strlen(media_playlist) - old_pos;
memcpy(new_pos, old_pos, len);
new_pos += len;
old_pos += len;
new_playlist[new_size] = '\0';
free (prefix);
free (base_uri);
free (params);
if (params_size) {
free (params_size);
}
if (params_start) {
free (params_start);
}
return new_playlist;
}
/* this adjusts the uri prefixes in the Master Playlist, for sending to the Media Player running on the Server Host */
char *adjust_master_playlist (char *fcup_response_data, int fcup_response_datalen, char *uri_prefix, char *uri_local_prefix) {
size_t uri_prefix_len = strlen(uri_prefix);
size_t uri_local_prefix_len = strlen(uri_local_prefix);
int counter = 0;
char *ptr = strstr(fcup_response_data, uri_prefix);
while (ptr != NULL) {
counter++;
ptr++;
ptr = strstr(ptr, uri_prefix);
}
size_t len = uri_local_prefix_len - uri_prefix_len;
len *= counter;
len += fcup_response_datalen;
char *new_master = (char *) malloc(len + 1);
*(new_master + len) = '\0';
char *first = fcup_response_data;
char *new = new_master;
char *last = strstr(first, uri_prefix);
counter = 0;
while (last != NULL) {
counter++;
len = last - first;
memcpy(new, first, len);
first = last + uri_prefix_len;
new += len;
memcpy(new, uri_local_prefix, uri_local_prefix_len);
new += uri_local_prefix_len;
last = strstr(last + uri_prefix_len, uri_prefix);
if (last == NULL) {
len = fcup_response_data + fcup_response_datalen - first;
memcpy(new, first, len);
break;
}
}
return new_master;
}
/* this parses the Master Playlist to make a table of the Media Playlist uri's that it lists */
int create_media_uri_table(const char *url_prefix, const char *master_playlist_data, int datalen,
char ***media_uri_table, int *num_uri) {
char *ptr = strstr(master_playlist_data, url_prefix);
char ** table = NULL;
if (ptr == NULL) {
return -1;
}
int count = 0;
while (ptr != NULL) {
char *end = strstr(ptr, "m3u8");
if (end == NULL) {
return 1;
}
end += sizeof("m3u8");
count++;
ptr = strstr(end, url_prefix);
}
table = (char **) calloc(count, sizeof(char *));
if (!table) {
return -1;
}
for (int i = 0; i < count; i++) {
table[i] = NULL;
}
ptr = strstr(master_playlist_data, url_prefix);
count = 0;
while (ptr != NULL) {
char *end = strstr(ptr, "m3u8");
char *uri;
if (end == NULL) {
return 0;
}
end += sizeof("m3u8");
size_t len = end - ptr - 1;
uri = (char *) calloc(len + 1, sizeof(char));
memcpy(uri , ptr, len);
table[count] = uri;
uri = NULL;
count ++;
ptr = strstr(end, url_prefix);
}
*num_uri = count;
*media_uri_table = table;
return 0;
}
/* the POST /action request from Client to Server on the AirPlay http channel follows a POST /event "FCUP Request"
from Server to Client on the reverse http channel, for an HLS playlist (first the Master Playlist, then the Media Playlists
listed in the Master Playlist). The POST /action request contains the playlist requested by the Server in
@@ -664,31 +408,65 @@ http_handler_action(raop_conn_t *conn, http_request_t *request, http_response_t
goto post_action_error;
}
plist_t req_params_node = NULL;
/* three possible types are known */
char *type = NULL;
int action_type = 0;
plist_get_string_val(req_type_node, &type);
logger_log(conn->raop->logger, LOGGER_DEBUG, "action type is %s", type);
if (strstr(type, "unhandledURLResponse")) {
action_type = 1;
} else if (strstr(type, "playlistInsert")) {
action_type = 2;
goto unhandledURLResponse;
} else if (strstr(type, "playlistRemove")) {
action_type = 3;
}
free (type);
plist_t req_params_node = NULL;
switch (action_type) {
case 1:
goto unhandledURLResponse;
case 2:
logger_log(conn->raop->logger, LOGGER_INFO, "unhandled action type playlistInsert (add new playback)");
goto finish;
case 3:
logger_log(conn->raop->logger, LOGGER_INFO, "unhandled action type playlistRemove (stop playback)");
req_params_node = plist_dict_get_item(req_root_node, "params");
if (!req_params_node || !PLIST_IS_DICT (req_params_node)) {
goto post_action_error;
}
plist_t req_params_item_node = plist_dict_get_item(req_params_node, "item");
if (!req_params_item_node) {
goto post_action_error;
} else {
if (!PLIST_IS_DICT (req_params_item_node)) {
goto post_action_error;
}
plist_t req_params_item_uuid_node = plist_dict_get_item(req_params_item_node, "uuid");
char* remove_uuid = NULL;
plist_get_string_val(req_params_item_uuid_node, &remove_uuid);
const char *playback_uuid = get_playback_uuid(conn->raop->airplay_video);
if (strcmp(remove_uuid, playback_uuid)) {
logger_log(conn->raop->logger, LOGGER_ERR, "uuid of playlist removal action request did not match current playlist:\n"
" current: %s\n remove: %s", playback_uuid, remove_uuid);
} else {
logger_log(conn->raop->logger, LOGGER_DEBUG, "removal_uuid matches playback_uuid\n");
}
free (remove_uuid);
}
logger_log(conn->raop->logger, LOGGER_ERR, "FIXME: playlist removal not yet implemented");
goto finish;
default:
} else if (strstr(type, "playlistInsert")) {
logger_log(conn->raop->logger, LOGGER_INFO, "unhandled action type playlistInsert (add new playback)");
printf("\n***************FIXME************************\nPlaylist insertion needs more information for it to be implemented:\n"
"please report the following output as an \"Issue\" at http://github.com/FDH2/UxPlay:\n");
char *header_str = NULL;
http_request_get_header_string(request, &header_str);
printf("\n\n%s\n", header_str);
bool is_plist = (bool) strstr(header_str,"apple-binary-plist");
free(header_str);
if (is_plist) {
int request_datalen;
const char *request_data = http_request_get_data(request, &request_datalen);
plist_t req_root_node = NULL;
plist_from_bin(request_data, request_datalen, &req_root_node);
char * plist_xml;
uint32_t plist_len;
plist_to_xml(req_root_node, &plist_xml, &plist_len);
plist_xml = utils_strip_data_from_plist_xml(plist_xml);
printf("%s", plist_xml);
free(plist_xml);
plist_free(req_root_node);
}
assert(0);
} else {
logger_log(conn->raop->logger, LOGGER_INFO, "unknown action type (unhandled)");
goto finish;
}
@@ -705,45 +483,45 @@ http_handler_action(raop_conn_t *conn, http_request_t *request, http_response_t
int fcup_response_datalen = 0;
if (logger_debug) {
plist_t plist_fcup_response_statuscode_node = plist_dict_get_item(req_params_node,
plist_t req_params_fcup_response_statuscode_node = plist_dict_get_item(req_params_node,
"FCUP_Response_StatusCode");
if (plist_fcup_response_statuscode_node) {
plist_get_uint_val(plist_fcup_response_statuscode_node, &uint_val);
if (req_params_fcup_response_statuscode_node) {
plist_get_uint_val(req_params_fcup_response_statuscode_node, &uint_val);
fcup_response_statuscode = (int) uint_val;
uint_val = 0;
logger_log(conn->raop->logger, LOGGER_DEBUG, "FCUP_Response_StatusCode = %d",
fcup_response_statuscode);
}
plist_t plist_fcup_response_requestid_node = plist_dict_get_item(req_params_node,
plist_t req_params_fcup_response_requestid_node = plist_dict_get_item(req_params_node,
"FCUP_Response_RequestID");
if (plist_fcup_response_requestid_node) {
plist_get_uint_val(plist_fcup_response_requestid_node, &uint_val);
if (req_params_fcup_response_requestid_node) {
plist_get_uint_val(req_params_fcup_response_requestid_node, &uint_val);
request_id = (int) uint_val;
uint_val = 0;
logger_log(conn->raop->logger, LOGGER_DEBUG, "FCUP_Response_RequestID = %d", request_id);
}
}
plist_t plist_fcup_response_url_node = plist_dict_get_item(req_params_node, "FCUP_Response_URL");
if (!PLIST_IS_STRING(plist_fcup_response_url_node)) {
plist_t req_params_fcup_response_url_node = plist_dict_get_item(req_params_node, "FCUP_Response_URL");
if (!PLIST_IS_STRING(req_params_fcup_response_url_node)) {
goto post_action_error;
}
char *fcup_response_url = NULL;
plist_get_string_val(plist_fcup_response_url_node, &fcup_response_url);
plist_get_string_val(req_params_fcup_response_url_node, &fcup_response_url);
if (!fcup_response_url) {
goto post_action_error;
}
logger_log(conn->raop->logger, LOGGER_DEBUG, "FCUP_Response_URL = %s", fcup_response_url);
plist_t plist_fcup_response_data_node = plist_dict_get_item(req_params_node, "FCUP_Response_Data");
if (!PLIST_IS_DATA(plist_fcup_response_data_node)){
plist_t req_params_fcup_response_data_node = plist_dict_get_item(req_params_node, "FCUP_Response_Data");
if (!PLIST_IS_DATA(req_params_fcup_response_data_node)){
goto post_action_error;
}
uint_val = 0;
char *fcup_response_data = NULL;
plist_get_data_val(plist_fcup_response_data_node, &fcup_response_data, &uint_val);
plist_get_data_val(req_params_fcup_response_data_node, &fcup_response_data, &uint_val);
fcup_response_datalen = (int) uint_val;
if (!fcup_response_data) {
@@ -782,7 +560,7 @@ http_handler_action(raop_conn_t *conn, http_request_t *request, http_response_t
memcpy(playlist, fcup_response_data, fcup_response_datalen);
int uri_num = get_next_media_uri_id(conn->raop->airplay_video);
--uri_num; // (next num is current num + 1)
store_media_data_playlist_by_num(conn->raop->airplay_video, playlist, uri_num);
store_media_playlist(conn->raop->airplay_video, playlist, uri_num);
float duration = 0.0f;
int count = analyze_media_playlist(playlist, &duration);
if (count) {
@@ -836,12 +614,14 @@ http_handler_play(raop_conn_t *conn, http_request_t *request, http_response_t *r
char **response_data, int *response_datalen) {
char* playback_location = NULL;
char* client_proc_name = NULL;
plist_t req_root_node = NULL;
float start_position_seconds = 0.0f;
bool data_is_binary_plist = false;
bool data_is_text = false;
bool data_is_octet = false;
char supported_hls_proc_names[] = "YouTube;";
logger_log(conn->raop->logger, LOGGER_DEBUG, "http_handler_play");
const char* session_id = http_request_get_header(request, "X-Apple-Session-ID");
@@ -902,6 +682,17 @@ http_handler_play(raop_conn_t *conn, http_request_t *request, http_response_t *r
plist_get_string_val(req_content_location_node, &playback_location);
}
plist_t req_client_proc_name_node = plist_dict_get_item(req_root_node, "clientProcName");
if (!req_client_proc_name_node) {
goto play_error;
} else {
plist_get_string_val(req_client_proc_name_node, &client_proc_name);
if (!strstr(supported_hls_proc_names, client_proc_name)){
logger_log(conn->raop->logger, LOGGER_WARNING, "Unsupported HLS streaming format: clientProcName %s not found in supported list: %s",
client_proc_name, supported_hls_proc_names);
}
}
plist_t req_start_position_seconds_node = plist_dict_get_item(req_root_node, "Start-Position-Seconds");
if (!req_start_position_seconds_node) {
logger_log(conn->raop->logger, LOGGER_INFO, "No Start-Position-Seconds in Play request");
@@ -914,6 +705,10 @@ http_handler_play(raop_conn_t *conn, http_request_t *request, http_response_t *r
}
char *ptr = strstr(playback_location, "/master.m3u8");
if (!ptr) {
logger_log(conn->raop->logger, LOGGER_ERR, "Content-Location has unsupported form:\n%s\n", playback_location);
goto play_error;
}
int prefix_len = (int) (ptr - playback_location);
set_uri_prefix(conn->raop->airplay_video, playback_location, prefix_len);
set_next_media_uri_id(conn->raop->airplay_video, 0);
@@ -932,8 +727,10 @@ http_handler_play(raop_conn_t *conn, http_request_t *request, http_response_t *r
if (req_root_node) {
plist_free(req_root_node);
}
logger_log(conn->raop->logger, LOGGER_ERR, "Could not find valid Plist Data for /play, Unhandled");
logger_log(conn->raop->logger, LOGGER_ERR, "Could not find valid Plist Data for POST/play request, Unhandled");
http_response_init(response, "HTTP/1.1", 400, "Bad Request");
http_response_set_disconnect(response, 1);
conn->raop->callbacks.conn_reset(conn->raop->callbacks.cls, 2);
}
/* the HLS handler handles http requests GET /[uri] on the HLS channel from the media player to the Server, asking for
@@ -952,26 +749,32 @@ http_handler_hls(raop_conn_t *conn, http_request_t *request, http_response_t *r
const char *url = http_request_get_url(request);
const char* upgrade = http_request_get_header(request, "Upgrade");
if (upgrade) {
//don't accept Upgrade: h2c request ?
return;
//don't accept Upgrade: h2c request ?
char *header_str = NULL;
http_request_get_header_string(request, &header_str);
logger_log(conn->raop->logger, LOGGER_INFO,
"%s\nhls upgrade request declined", header_str);
free (header_str);
return;
}
if (!strcmp(url, "/master.m3u8")){
char * master_playlist = get_master_playlist(conn->raop->airplay_video);
size_t len = strlen(master_playlist);
char * data = (char *) malloc(len + 1);
memcpy(data, master_playlist, len);
data[len] = '\0';
*response_data = data;
*response_datalen = (int ) len;
if (master_playlist) {
size_t len = strlen(master_playlist);
char * data = (char *) malloc(len + 1);
memcpy(data, master_playlist, len);
data[len] = '\0';
*response_data = data;
*response_datalen = (int ) len;
} else {
logger_log(conn->raop->logger, LOGGER_ERR,"requested master playlist %s not found", url);
*response_datalen = 0;
}
} else {
int num = get_media_playlist_by_uri(conn->raop->airplay_video, url);
if (num < 0) {
logger_log(conn->raop->logger, LOGGER_ERR,"Requested playlist %s not found", url);
assert(0);
} else {
char *media_playlist = get_media_playlist_by_num(conn->raop->airplay_video, num);
assert(media_playlist);
char *media_playlist = get_media_playlist(conn->raop->airplay_video, url);
if (media_playlist) {
char *data = adjust_yt_condensed_playlist(media_playlist);
*response_data = data;
*response_datalen = strlen(data);
@@ -979,8 +782,11 @@ http_handler_hls(raop_conn_t *conn, http_request_t *request, http_response_t *r
int chunks = analyze_media_playlist(data, &duration);
logger_log(conn->raop->logger, LOGGER_INFO,
"Requested media_playlist %s has %5d chunks, total duration %9.3f secs", url, chunks, duration);
} else {
logger_log(conn->raop->logger, LOGGER_ERR,"requested media playlist %s not found", url);
*response_datalen = 0;
}
}
}
http_response_add_header(response, "Access-Control-Allow-Headers", "Content-type");
http_response_add_header(response, "Access-Control-Allow-Origin", "*");



@@ -205,13 +205,25 @@ httpd_remove_connection(httpd_t *httpd, http_connection_t *connection)
http_request_destroy(connection->request);
connection->request = NULL;
}
httpd->callbacks.conn_destroy(connection->user_data);
shutdown(connection->socket_fd, SHUT_WR);
closesocket(connection->socket_fd);
connection->connected = 0;
connection->user_data = NULL;
logger_log(httpd->logger, LOGGER_DEBUG, "removing connection type %s socket %d conn %p", typename[connection->type],
connection->socket_fd, connection->user_data);
if (connection->user_data) {
httpd->callbacks.conn_destroy(connection->user_data);
connection->user_data = NULL;
}
if (connection->socket_fd) {
shutdown(connection->socket_fd, SHUT_WR);
int ret = closesocket(connection->socket_fd);
if (ret == -1) {
logger_log(httpd->logger, LOGGER_ERR, "httpd error in closesocket (close): %d %s", errno, strerror(errno));
}
connection->socket_fd = 0;
}
if (connection->connected) {
connection->connected = 0;
httpd->open_connections--;
}
connection->type = CONNECTION_TYPE_UNKNOWN;
httpd->open_connections--;
}
static int
@@ -303,6 +315,17 @@ httpd_remove_known_connections(httpd_t *httpd) {
}
}
void
httpd_remove_connections_by_type(httpd_t *httpd, connection_type_t type) {
for (int i = 0; i < httpd->max_connections; i++) {
http_connection_t *connection = &httpd->connections[i];
if (!connection->connected || connection->type != type) {
continue;
}
httpd_remove_connection(httpd, connection);
}
}
static THREAD_RETVAL
httpd_thread(void *arg)
{
@@ -450,7 +473,7 @@ httpd_thread(void *arg)
} else {
int sock_err = SOCKET_GET_ERROR();
logger_log(httpd->logger, LOGGER_ERR, "httpd: recv socket error %d:%s",
sock_err, strerror(sock_err));
sock_err, SOCKET_ERROR_STRING(sock_err));
break;
}
} else {
@@ -571,7 +594,7 @@ httpd_thread(void *arg)
httpd->running = 0;
MUTEX_UNLOCK(httpd->run_mutex);
logger_log(httpd->logger, LOGGER_DEBUG, "Exiting HTTP thread");
logger_log(httpd->logger, LOGGER_DEBUG, "Exiting httpd thread");
return 0;
}


@@ -39,6 +39,7 @@ struct httpd_callbacks_s {
typedef struct httpd_callbacks_s httpd_callbacks_t;
bool httpd_nohold(httpd_t *httpd);
void httpd_remove_known_connections(httpd_t *httpd);
void httpd_remove_connections_by_type(httpd_t *httpd, connection_type_t type);
int httpd_set_connection_type (httpd_t *http, void *user_data, connection_type_t type);
int httpd_count_connection_type (httpd_t *http, connection_type_t type);


@@ -1,4 +1,3 @@
cmake_minimum_required(VERSION 3.5)
aux_source_directory(. llhttp_src)
set(DIR_SRCS ${llhttp_src})
include_directories(.)


@@ -57,29 +57,14 @@ static int wasm_on_headers_complete_wrap(llhttp_t* p) {
}
const llhttp_settings_t wasm_settings = {
wasm_on_message_begin,
wasm_on_url,
wasm_on_status,
NULL,
NULL,
wasm_on_header_field,
wasm_on_header_value,
NULL,
NULL,
wasm_on_headers_complete_wrap,
wasm_on_body,
wasm_on_message_complete,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
.on_message_begin = wasm_on_message_begin,
.on_url = wasm_on_url,
.on_status = wasm_on_status,
.on_header_field = wasm_on_header_field,
.on_header_value = wasm_on_header_value,
.on_headers_complete = wasm_on_headers_complete_wrap,
.on_body = wasm_on_body,
.on_message_complete = wasm_on_message_complete,
};
@@ -341,6 +326,20 @@ int llhttp__on_message_begin(llhttp_t* s, const char* p, const char* endp) {
}
int llhttp__on_protocol(llhttp_t* s, const char* p, const char* endp) {
int err;
SPAN_CALLBACK_MAYBE(s, on_protocol, p, endp - p);
return err;
}
int llhttp__on_protocol_complete(llhttp_t* s, const char* p, const char* endp) {
int err;
CALLBACK_MAYBE(s, on_protocol_complete);
return err;
}
int llhttp__on_url(llhttp_t* s, const char* p, const char* endp) {
int err;
SPAN_CALLBACK_MAYBE(s, on_url, p, endp - p);

File diff suppressed because it is too large


@@ -3,8 +3,8 @@
#define INCLUDE_LLHTTP_H_
#define LLHTTP_VERSION_MAJOR 9
#define LLHTTP_VERSION_MINOR 2
#define LLHTTP_VERSION_PATCH 1
#define LLHTTP_VERSION_MINOR 3
#define LLHTTP_VERSION_PATCH 0
#ifndef INCLUDE_LLHTTP_ITSELF_H_
#define INCLUDE_LLHTTP_ITSELF_H_
@@ -90,7 +90,8 @@ enum llhttp_errno {
HPE_CB_HEADER_VALUE_COMPLETE = 29,
HPE_CB_CHUNK_EXTENSION_NAME_COMPLETE = 34,
HPE_CB_CHUNK_EXTENSION_VALUE_COMPLETE = 35,
HPE_CB_RESET = 31
HPE_CB_RESET = 31,
HPE_CB_PROTOCOL_COMPLETE = 38
};
typedef enum llhttp_errno llhttp_errno_t;
@@ -326,6 +327,7 @@ typedef enum llhttp_status llhttp_status_t;
XX(34, CB_CHUNK_EXTENSION_NAME_COMPLETE, CB_CHUNK_EXTENSION_NAME_COMPLETE) \
XX(35, CB_CHUNK_EXTENSION_VALUE_COMPLETE, CB_CHUNK_EXTENSION_VALUE_COMPLETE) \
XX(31, CB_RESET, CB_RESET) \
XX(38, CB_PROTOCOL_COMPLETE, CB_PROTOCOL_COMPLETE) \
#define HTTP_METHOD_MAP(XX) \
@@ -567,6 +569,7 @@ struct llhttp_settings_s {
llhttp_cb on_message_begin;
/* Possible return values 0, -1, HPE_USER */
llhttp_data_cb on_protocol;
llhttp_data_cb on_url;
llhttp_data_cb on_status;
llhttp_data_cb on_method;
@@ -592,6 +595,7 @@ struct llhttp_settings_s {
/* Possible return values 0, -1, `HPE_PAUSED` */
llhttp_cb on_message_complete;
llhttp_cb on_protocol_complete;
llhttp_cb on_url_complete;
llhttp_cb on_status_complete;
llhttp_cb on_method_complete;


@@ -31,6 +31,7 @@ extern "C" {
#define LOGGER_NOTICE 5 /* normal but significant condition */
#define LOGGER_INFO 6 /* informational */
#define LOGGER_DEBUG 7 /* debug-level messages */
#define LOGGER_DEBUG_DATA 8 /* debug-level messages including audio/video packet data */
typedef void (*logger_callback_t)(void *cls, int level, const char *msg);


@@ -16,6 +16,7 @@
*/
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#include <assert.h>
#include <stdbool.h>
@@ -195,6 +196,184 @@ pairing_session_get_public_key(pairing_session_t *session, unsigned char ecdh_ke
return 0;
}
int
pairing_session_make_nonce(pairing_session_t *session, uint64_t *local_time, const char *client_data, unsigned char *nonce, int len) {
unsigned char hash[SHA512_DIGEST_LENGTH];
if (len > sizeof(hash)) {
return -1;
}
if (!client_data || !local_time || !session || !nonce || len <= 0) {
return -2;
}
sha_ctx_t *ctx = sha_init();
sha_update(ctx, (const unsigned char *) local_time, sizeof(uint64_t));
sha_update(ctx, (const unsigned char *) client_data, strlen(client_data));
sha_update(ctx, (const unsigned char *) session->ed_ours, ED25519_KEY_SIZE);
sha_final(ctx, hash, NULL);
sha_destroy(ctx);
memcpy(nonce, hash, len);
return 0;
}
static
char *get_token(char **cursor, char *token_name, char start_char, char end_char) {
char *ptr = *cursor;
ptr = strstr(ptr, token_name);
if (!ptr) {
return NULL;
}
ptr += strlen(token_name);
ptr = strchr(ptr, start_char);
if (!ptr) {
return NULL;
}
char *token = ++ptr;
ptr = strchr(ptr, end_char);
if (!ptr) {
return NULL;
}
*(ptr++) = '\0';
*cursor = ptr;
return token;
}
//#define test_digest
bool
pairing_digest_verify(const char *method, const char * authorization, const char *password) {
/* RFC 2617 HTTP md5 Digest password authentication */
size_t authlen = strlen(authorization);
char *sentence = (char *) malloc(authlen + 1);
memcpy(sentence, authorization, authlen);
*(sentence + authlen) = '\0';
char *username = NULL;
char *realm = NULL;
char *nonce = NULL;
char *uri = NULL;
char *qop = NULL;
char *nc = NULL;
char *cnonce = NULL;
char *response = NULL;
char *cursor = sentence;
const char *pwd = password;
const char *mthd = method;
char *raw;
int len;
bool authenticated;
#ifdef test_digest
char testauth[] = "Digest username=\"Mufasa\","
"realm=\"testrealm@host.com\","
"nonce=\"dcd98b7102dd2f0e8b11d0f600bfb0c093\","
"uri=\"/dir/index.html\","
"qop=auth,"
"nc=00000001,"
"cnonce=\"0a4f113b\","
"response=\"6629fae49393a05397450978507c4ef1\","
"opaque=\"5ccc069c403ebaf9f0171e9517f40e41\""
;
pwd = "Circle Of Life";
mthd = "GET";
cursor = testauth;
char HA1[] = "939e7578ed9e3c518a452acee763bce9";
char HA2[] = "39aff3a2bab6126f332b942af96d3366";
#endif
username = get_token(&cursor, "username", '\"', '\"');
realm = get_token(&cursor, "realm", '\"', '\"');
nonce = get_token(&cursor,"nonce", '\"', '\"');
uri = get_token(&cursor,"uri", '\"', '\"');
qop = get_token(&cursor, "qop", '=', ',');
if (qop) {
nc = get_token(&cursor, "nc", '=', ',');
cnonce = get_token(&cursor, "cnonce", '\"', '\"');
}
response = get_token(&cursor, "response", '\"', '\"');
#ifdef test_digest
printf("username: [%s] realm: [%s]\n", username, realm);
printf("nonce: [%s]\n", nonce);
printf("method: [%s]\n", mthd);
printf("uri: [%s]\n", uri);
if (qop) {
printf("qop: [%s], nc=[%s], cnonce: [%s]\n", qop, nc, cnonce);
}
printf("response: [%s]\n", response);
#endif
/* H1 = H(username : realm : password ) */
len = strlen(username) + strlen(realm) + strlen(pwd) + 3;
raw = (char *) calloc(len, sizeof(char));
strncat(raw, username, len - strlen(raw) - 1);
strncat(raw, ":", len - strlen(raw) - 1);
strncat(raw, realm, len - strlen(raw) - 1);
strncat(raw, ":", len - strlen(raw) - 1);
strncat(raw, pwd, len - strlen(raw) - 1);
char *hash1 = get_md5(raw);
free (raw);
#ifdef test_digest
printf("hash1: should be %s, was: %s\n", HA1, hash1);
#endif
/* H2 = H(method : uri) */
len = strlen(mthd) + strlen(uri) + 2;
raw = (char *) calloc(len, sizeof(char));
strncat(raw, mthd, len - strlen(raw) - 1);
strncat(raw, ":", len - strlen(raw) - 1);
strncat(raw, uri, len - strlen(raw) - 1);
char *hash2 = get_md5(raw);
free (raw);
#ifdef test_digest
printf("hash2: should be %s, was: %s\n", HA2, hash2);
#endif
/* result = H(H1 : nonce (or nonce:nc:cnonce:qop) : H2) */
len = strlen(hash1) + strlen(nonce) + strlen(hash2) + 3;
if (qop) {
len += strlen(nc) + strlen(cnonce) + strlen(qop) + 3;
}
raw = (char *) calloc(len, sizeof(char));
strncat(raw, hash1, len - strlen(raw) - 1);
strncat(raw, ":", len - strlen(raw) - 1);
strncat(raw, nonce, len - strlen(raw) - 1);
strncat(raw, ":", len - strlen(raw) - 1);
if (qop) {
strncat(raw, nc, len - strlen(raw) - 1);
strncat(raw, ":", len - strlen(raw) - 1);
strncat(raw, cnonce, len - strlen(raw) - 1);
strncat(raw, ":", len - strlen(raw) - 1);
strncat(raw, qop, len - strlen(raw) - 1);
strncat(raw, ":", len - strlen(raw) - 1);
}
strncat(raw, hash2, len - strlen(raw) - 1);
free (hash1);
free (hash2);
char *result = get_md5(raw);
free (raw);
authenticated = (strcmp(result,response) ? false : true);
#ifdef test_digest
printf("result: should be %s, was: %s, authenticated is %s\n", response, result, (authenticated ? "true" : "false"));
#endif
free (result);
free(sentence);
#ifdef test_digest
exit(0);
#endif
return authenticated;
}
int
pairing_session_get_signature(pairing_session_t *session, unsigned char signature[PAIRING_SIG_SIZE])
{
@@ -447,11 +626,10 @@ srp_confirm_pair_setup(pairing_session_t *session, pairing_t *pairing,
return epk_len;
}
void access_client_session_data(pairing_session_t *session, char **username, char **client_pk64, bool *setup) {
void get_pairing_session_client_data(pairing_session_t *session, char **username, char **client_pk64) {
int len64 = 4 * (1 + (ED25519_KEY_SIZE / 3)) + 1;
setup = &(session->pair_setup);
*username = session->username;
if (setup) {
if (session->pair_setup) {
*client_pk64 = (char *) malloc(len64);
pk_to_base64(session->client_pk, ED25519_KEY_SIZE, *client_pk64, len64);
} else {


@@ -60,6 +60,9 @@ int srp_validate_proof(pairing_session_t *session, pairing_t *pairing, const uns
int len_A, unsigned char *proof, int client_proof_len, int proof_len);
int srp_confirm_pair_setup(pairing_session_t *session, pairing_t *pairing, unsigned char *epk,
unsigned char *auth_tag);
void access_client_session_data(pairing_session_t *session, char **username, char **client_pk, bool *setup);
void get_pairing_session_client_data(pairing_session_t *session, char **username, char **client_pk);
void ed25519_pk_to_base64(const unsigned char *pk, char **pk64);
int pairing_session_make_nonce(pairing_session_t *session, uint64_t *local_time, const char *client_data, unsigned char *nonce, int len);
bool pairing_digest_verify(const char *method, const char * authorization, const char *password);
#endif


@@ -1,4 +1,3 @@
cmake_minimum_required(VERSION 3.5)
aux_source_directory(. playfair_src)
set(DIR_SRCS ${playfair_src})
include_directories(.)


@@ -64,20 +64,24 @@ struct raop_s {
uint8_t clientFPSdata;
int audio_delay_micros;
int max_ntp_timeouts;
/* for temporary storage of pin during pair-pin start */
unsigned short pin;
bool use_pin;
unsigned short pin;
bool use_pin;
/* public key as string */
char pk_str[2*ED25519_KEY_SIZE + 1];
char pk_str[2*ED25519_KEY_SIZE + 1];
/* place to store media_data_store */
airplay_video_t *airplay_video;
airplay_video_t *airplay_video;
/* activate support for HLS live streaming */
bool hls_support;
bool hls_support;
/* used in digest authentication */
char *nonce;
char *random_pw;
unsigned char auth_fail_count;
};
struct raop_conn_s {
@@ -100,7 +104,7 @@ struct raop_conn_s {
connection_type_t connection_type;
char *client_session_id;
bool authenticated;
bool have_active_remote;
};
typedef struct raop_conn_s raop_conn_t;
@@ -160,6 +164,7 @@ conn_init(void *opaque, unsigned char *local, int locallen, unsigned char *remot
conn->client_session_id = NULL;
conn->airplay_video = NULL;
conn->authenticated = false;
conn->have_active_remote = false;
@@ -193,8 +198,9 @@ conn_request(void *ptr, http_request_t *request, http_response_t **response) {
*/
const char *method = http_request_get_method(request);
const char *url = http_request_get_url(request);
if (!method) {
if (!method || !url) {
return;
}
@@ -206,7 +212,6 @@ conn_request(void *ptr, http_request_t *request, http_response_t **response) {
return;
}
const char *url = http_request_get_url(request);
const char *client_session_id = http_request_get_header(request, "X-Apple-Session-ID");
const char *host = http_request_get_header(request, "Host");
hls_request = (host && !cseq && !client_session_id);
@@ -308,11 +313,22 @@ conn_request(void *ptr, http_request_t *request, http_response_t **response) {
if (data_is_plist) {
plist_t req_root_node = NULL;
plist_from_bin(request_data, request_datalen, &req_root_node);
char * plist_xml;
char * plist_xml = NULL;
char * stripped_xml = NULL;
uint32_t plist_len;
plist_to_xml(req_root_node, &plist_xml, &plist_len);
logger_log(conn->raop->logger, LOGGER_DEBUG, "%s", plist_xml);
free(plist_xml);
stripped_xml = utils_strip_data_from_plist_xml(plist_xml);
logger_log(conn->raop->logger, LOGGER_DEBUG, "%s", (stripped_xml ? stripped_xml : plist_xml));
if (stripped_xml) {
free(stripped_xml);
}
if (plist_xml) {
#ifdef PLIST_230
plist_mem_free(plist_xml);
#else
plist_to_xml_free(plist_xml);
#endif
}
plist_free(req_root_node);
} else if (data_is_text) {
char *data_str = utils_data_to_text((char *) request_data, request_datalen);
@@ -370,7 +386,9 @@ conn_request(void *ptr, http_request_t *request, http_response_t **response) {
handler = &raop_handler_flush;
} else if (!strcmp(method, "TEARDOWN")) {
handler = &raop_handler_teardown;
}
} else {
http_response_init(*response, protocol, 501, "Not Implemented");
}
} else if (!hls_request && !strcmp(protocol, "HTTP/1.1")) {
if (!strcmp(method, "POST")) {
if (!strcmp(url, "/reverse")) {
@@ -399,7 +417,6 @@ conn_request(void *ptr, http_request_t *request, http_response_t **response) {
} else if (!strcmp(method, "PUT")) {
if (!strncmp (url, "/setProperty?", strlen("/setProperty?"))) {
handler = &http_handler_set_property;
} else {
}
}
} else if (hls_request) {
@@ -442,12 +459,23 @@ conn_request(void *ptr, http_request_t *request, http_response_t **response) {
if (data_is_plist) {
plist_t res_root_node = NULL;
plist_from_bin(response_data, response_datalen, &res_root_node);
char * plist_xml;
char * plist_xml = NULL;
char * stripped_xml = NULL;
uint32_t plist_len;
plist_to_xml(res_root_node, &plist_xml, &plist_len);
stripped_xml = utils_strip_data_from_plist_xml(plist_xml);
logger_log(conn->raop->logger, LOGGER_DEBUG, "%s", (stripped_xml ? stripped_xml : plist_xml));
if (stripped_xml) {
free(stripped_xml);
}
if (plist_xml) {
#ifdef PLIST_230
plist_mem_free(plist_xml);
#else
plist_to_xml_free(plist_xml);
#endif
}
plist_free(res_root_node);
logger_log(conn->raop->logger, LOGGER_DEBUG, "%s", plist_xml);
free(plist_xml);
} else if (data_is_text) {
char *data_str = utils_data_to_text((char*) response_data, response_datalen);
logger_log(conn->raop->logger, LOGGER_DEBUG, "%s", data_str);
@@ -554,11 +582,11 @@ raop_init(raop_callbacks_t *callbacks) {
/* initialize switch for display of client's streaming data records */
raop->clientFPSdata = 0;
raop->max_ntp_timeouts = 0;
raop->audio_delay_micros = 250000;
raop->hls_support = false;
raop->nonce = NULL;
return raop;
}
@@ -582,7 +610,7 @@ raop_init2(raop_t *raop, int nohold, const char *device_id, const char *keyfile)
#else
unsigned char public_key[ED25519_KEY_SIZE];
pairing_get_public_key(pairing, public_key);
char *pk_str = utils_pk_to_string(public_key, ED25519_KEY_SIZE);
char *pk_str = utils_hex_to_string(public_key, ED25519_KEY_SIZE);
strncpy(raop->pk_str, (const char *) pk_str, 2*ED25519_KEY_SIZE);
free(pk_str);
#endif
@@ -614,10 +642,17 @@ void
raop_destroy(raop_t *raop) {
if (raop) {
raop_destroy_airplay_video(raop);
raop_stop(raop);
raop_stop_httpd(raop);
pairing_destroy(raop->pairing);
httpd_destroy(raop->httpd);
logger_destroy(raop->logger);
if (raop->nonce) {
free(raop->nonce);
}
if (raop->random_pw) {
free(raop->random_pw);
}
free(raop);
/* Cleanup the network */
@@ -662,9 +697,6 @@ int raop_set_plist(raop_t *raop, const char *plist_item, const int value) {
} else if (strcmp(plist_item, "clientFPSdata") == 0) {
raop->clientFPSdata = (value ? 1 : 0);
if ((int) raop->clientFPSdata != value) retval = 1;
} else if (strcmp(plist_item, "max_ntp_timeouts") == 0) {
raop->max_ntp_timeouts = (value > 0 ? value : 0);
if (raop->max_ntp_timeouts != value) retval = 1;
} else if (strcmp(plist_item, "audio_delay_micros") == 0) {
if (value >= 0 && value <= 10 * SECOND_IN_USECS) {
raop->audio_delay_micros = value;
@@ -730,14 +762,14 @@ raop_set_dnssd(raop_t *raop, dnssd_t *dnssd) {
int
raop_start(raop_t *raop, unsigned short *port) {
raop_start_httpd(raop_t *raop, unsigned short *port) {
assert(raop);
assert(port);
return httpd_start(raop->httpd, port);
}
void
raop_stop(raop_t *raop) {
raop_stop_httpd(raop_t *raop) {
assert(raop);
httpd_stop(raop->httpd);
}
@@ -770,3 +802,7 @@ void raop_destroy_airplay_video(raop_t *raop) {
raop->airplay_video = NULL;
}
}
uint64_t get_local_time() {
return raop_ntp_get_local_time();
}


@@ -67,17 +67,22 @@ struct raop_callbacks_s {
void (*video_process)(void *cls, raop_ntp_t *ntp, video_decode_struct *data);
void (*video_pause)(void *cls);
void (*video_resume)(void *cls);
/* Optional but recommended callback functions */
void (*conn_feedback) (void *cls);
void (*conn_reset) (void *cls, int reason);
void (*video_reset) (void *cls);
/* Optional but recommended callback functions (probably not optional, check this)*/
void (*conn_init)(void *cls);
void (*conn_destroy)(void *cls);
void (*conn_reset) (void *cls, int timeouts, bool reset_video);
void (*conn_teardown)(void *cls, bool *teardown_96, bool *teardown_110 );
void (*audio_flush)(void *cls);
void (*video_flush)(void *cls);
double (*audio_set_client_volume)(void *cls);
void (*audio_set_volume)(void *cls, float volume);
void (*audio_set_metadata)(void *cls, const void *buffer, int buflen);
void (*audio_set_coverart)(void *cls, const void *buffer, int buflen);
void (*audio_stop_coverart_rendering) (void* cls);
void (*audio_remote_control_id)(void *cls, const char *dacp_id, const char *active_remote_header);
void (*audio_set_progress)(void *cls, unsigned int start, unsigned int curr, unsigned int end);
void (*audio_get_format)(void *cls, unsigned char *ct, unsigned short *spf, bool *usingScreen, bool *isMedia, uint64_t *audioFormat);
@@ -86,17 +91,17 @@ struct raop_callbacks_s {
void (*display_pin) (void *cls, char * pin);
void (*register_client) (void *cls, const char *device_id, const char *pk_str, const char *name);
bool (*check_register) (void *cls, const char *pk_str);
const char* (*passwd) (void *cls, int *len);
void (*export_dacp) (void *cls, const char *active_remote, const char *dacp_id);
void (*video_reset) (void *cls);
void (*video_set_codec)(void *cls, video_codec_t codec);
int (*video_set_codec)(void *cls, video_codec_t codec);
/* for HLS video player controls */
void (*on_video_play) (void *cls, const char *location, const float start_position);
void (*on_video_scrub) (void *cls, const float position);
void (*on_video_rate) (void *cls, const float rate);
void (*on_video_stop) (void *cls);
void (*on_video_acquire_playback_info) (void *cls, playback_info_t *playback_video);
};
typedef struct raop_callbacks_s raop_callbacks_t;
raop_ntp_t *raop_ntp_init(logger_t *logger, raop_callbacks_t *callbacks, const char *remote,
int remote_addr_len, unsigned short timing_rport,
@@ -107,7 +112,8 @@ int airplay_video_service_init(raop_t *raop, unsigned short port, const char *se
bool register_airplay_video(raop_t *raop, airplay_video_t *airplay_video);
airplay_video_t *get_airplay_video(raop_t *raop);
airplay_video_t *deregister_airplay_video(raop_t *raop);
uint64_t get_local_time();
RAOP_API raop_t *raop_init(raop_callbacks_t *callbacks);
RAOP_API int raop_init2(raop_t *raop, int nohold, const char *device_id, const char *keyfile);
RAOP_API void raop_set_log_level(raop_t *raop, int level);
@@ -118,9 +124,9 @@ RAOP_API void raop_set_udp_ports(raop_t *raop, unsigned short port[3]);
RAOP_API void raop_set_tcp_ports(raop_t *raop, unsigned short port[2]);
RAOP_API unsigned short raop_get_port(raop_t *raop);
RAOP_API void *raop_get_callback_cls(raop_t *raop);
RAOP_API int raop_start(raop_t *raop, unsigned short *port);
RAOP_API int raop_start_httpd(raop_t *raop, unsigned short *port);
RAOP_API int raop_is_running(raop_t *raop);
RAOP_API void raop_stop(raop_t *raop);
RAOP_API void raop_stop_httpd(raop_t *raop);
RAOP_API void raop_set_dnssd(raop_t *raop, dnssd_t *dnssd);
RAOP_API void raop_destroy(raop_t *raop);
RAOP_API void raop_remove_known_connections(raop_t * raop);


@@ -39,10 +39,11 @@ typedef struct {
/* Data available */
int filled;
uint64_t packet_arrival_time;
/* RTP header */
unsigned short seqnum;
uint64_t rtp_timestamp;
uint64_t ntp_timestamp;
uint32_t rtp_timestamp;
/* Payload data */
unsigned int payload_size;
@@ -165,7 +166,7 @@ raop_buffer_decrypt(raop_buffer_t *raop_buffer, unsigned char *data, unsigned ch
}
int
raop_buffer_enqueue(raop_buffer_t *raop_buffer, unsigned char *data, unsigned short datalen, uint64_t *ntp_timestamp, uint64_t *rtp_timestamp, int use_seqnum) {
raop_buffer_enqueue(raop_buffer_t *raop_buffer, unsigned char *data, unsigned short datalen, int use_seqnum) {
unsigned char empty_packet_marker[] = { 0x00, 0x68, 0x34, 0x00 };
assert(raop_buffer);
@@ -206,8 +207,7 @@ raop_buffer_enqueue(raop_buffer_t *raop_buffer, unsigned char *data, unsigned sh
/* Update the raop_buffer entry header */
entry->seqnum = seqnum;
entry->rtp_timestamp = *rtp_timestamp;
entry->ntp_timestamp = *ntp_timestamp;
entry->rtp_timestamp = byteutils_get_int_be(data, 4);
entry->filled = 1;
entry->payload_data = malloc(payload_size);
@@ -228,7 +228,7 @@ raop_buffer_enqueue(raop_buffer_t *raop_buffer, unsigned char *data, unsigned sh
}
void *
raop_buffer_dequeue(raop_buffer_t *raop_buffer, unsigned int *length, uint64_t *ntp_timestamp, uint64_t *rtp_timestamp, unsigned short *seqnum, int no_resend) {
raop_buffer_dequeue(raop_buffer_t *raop_buffer, unsigned int *length, uint32_t *rtp_timestamp, unsigned short *seqnum, int no_resend) {
assert(raop_buffer);
/* Calculate number of entries in the current buffer */
@@ -261,7 +261,6 @@ raop_buffer_dequeue(raop_buffer_t *raop_buffer, unsigned int *length, uint64_t *
/* Return entry payload buffer */
*rtp_timestamp = entry->rtp_timestamp;
*ntp_timestamp = entry->ntp_timestamp;
*seqnum = entry->seqnum;
*length = entry->payload_size;
entry->payload_size = 0;


@@ -28,8 +28,8 @@ typedef int (*raop_resend_cb_t)(void *opaque, unsigned short seqno, unsigned sho
raop_buffer_t *raop_buffer_init(logger_t *logger,
const unsigned char *aeskey,
const unsigned char *aesiv);
int raop_buffer_enqueue(raop_buffer_t *raop_buffer, unsigned char *data, unsigned short datalen, uint64_t *ntp_timestamp, uint64_t *rtp_timestamp, int use_seqnum);
void *raop_buffer_dequeue(raop_buffer_t *raop_buffer, unsigned int *length, uint64_t *ntp_timestamp, uint64_t *rtp_timestamp, unsigned short *seqnum, int no_resend);
int raop_buffer_enqueue(raop_buffer_t *raop_buffer, unsigned char *data, unsigned short datalen, int use_seqnum);
void *raop_buffer_dequeue(raop_buffer_t *raop_buffer, unsigned int *length, uint32_t *rtp_timestamp, unsigned short *seqnum, int no_resend);
void raop_buffer_handle_resends(raop_buffer_t *raop_buffer, raop_resend_cb_t resend_cb, void *opaque);
void raop_buffer_flush(raop_buffer_t *raop_buffer, int next_seq);


@@ -25,6 +25,8 @@
#include <plist/plist.h>
#define AUDIO_SAMPLE_RATE 44100   /* all supported AirPlay audio formats use this sample rate */
#define SECOND_IN_USECS 1000000
#define SECOND_IN_NSECS 1000000000
#define MAX_PW_ATTEMPTS 5
typedef void (*raop_handler_t)(raop_conn_t *, http_request_t *,
http_response_t *, char **, int *);
@@ -36,6 +38,33 @@ raop_handler_info(raop_conn_t *conn,
{
assert(conn->raop->dnssd);
#if 0
/* initial GET/info request sends plist with string "txtAirPlay" */
bool txtAirPlay = false;
const char* content_type = NULL;
content_type = http_request_get_header(request, "Content-Type");
if (content_type && strstr(content_type, "application/x-apple-binary-plist")) {
char *qualifier_string = NULL;
const char *data = NULL;
int data_len = 0;
data = http_request_get_data(request, &data_len);
//parsing bplist
plist_t req_root_node = NULL;
plist_from_bin(data, data_len, &req_root_node);
plist_t req_qualifier_node = plist_dict_get_item(req_root_node, "qualifier");
if (PLIST_IS_ARRAY(req_qualifier_node)) {
plist_t req_string_node = plist_array_get_item(req_qualifier_node, 0);
plist_get_string_val(req_string_node, &qualifier_string);
}
if (qualifier_string && !strcmp(qualifier_string, "txtAirPlay")) {
printf("qualifier: %s\n", qualifier_string);
txtAirPlay = true;
}
if (qualifier_string) {
free(qualifier_string);
}
}
#endif
plist_t res_node = plist_new_dict();
/* deviceID is the physical hardware address, and will not change */
@@ -509,7 +538,7 @@ raop_handler_options(raop_conn_t *conn,
http_request_t *request, http_response_t *response,
char **response_data, int *response_datalen)
{
http_response_add_header(response, "Public", "SETUP, RECORD, PAUSE, FLUSH, TEARDOWN, OPTIONS, GET_PARAMETER, SET_PARAMETER");
http_response_add_header(response, "Public", "SETUP, RECORD, FLUSH, TEARDOWN, OPTIONS, GET_PARAMETER, SET_PARAMETER");
}
static void
@@ -517,12 +546,12 @@ raop_handler_setup(raop_conn_t *conn,
http_request_t *request, http_response_t *response,
char **response_data, int *response_datalen)
{
const char *dacp_id;
const char *active_remote_header;
const char *dacp_id = NULL;
const char *active_remote_header = NULL;
bool logger_debug = (logger_get_level(conn->raop->logger) >= LOGGER_DEBUG);
const char *data;
int data_len;
const char *data = NULL;
int data_len = 0;
data = http_request_get_data(request, &data_len);
dacp_id = http_request_get_header(request, "DACP-ID");
@@ -548,22 +577,117 @@ raop_handler_setup(raop_conn_t *conn,
if (PLIST_IS_DATA(req_eiv_node) && PLIST_IS_DATA(req_ekey_node)) {
// The first SETUP call that initializes keys and timing
unsigned char aesiv[16];
unsigned char aeskey[16];
unsigned char eaeskey[72];
unsigned char aesiv[16] = { 0 };
unsigned char aeskey[16] = { 0 };
unsigned char eaeskey[72] = { 0 };
logger_log(conn->raop->logger, LOGGER_DEBUG, "SETUP 1");
// First setup
char* eiv = NULL;
uint64_t eiv_len = 0;
char *deviceID = NULL;
char *model = NULL;
char *name = NULL;
bool admit_client = true;
plist_t req_deviceid_node = plist_dict_get_item(req_root_node, "deviceID");
plist_get_string_val(req_deviceid_node, &deviceID);
/* RFC2617 Digest authentication (md5 hash) of uxplay client-access password, if set */
if (!conn->authenticated && conn->raop->callbacks.passwd) {
size_t pin_len = 4;
if (conn->raop->random_pw && strncmp(conn->raop->random_pw + pin_len + 1, deviceID, 17)) {
conn->raop->auth_fail_count = MAX_PW_ATTEMPTS;
}
int len;
const char *password = conn->raop->callbacks.passwd(conn->raop->callbacks.cls, &len);
// len = -1 means use a random password for this connection; len = 0 means no password
if (len == -1 && conn->raop->random_pw && conn->raop->auth_fail_count >= MAX_PW_ATTEMPTS) {
// change random_pw after MAX_PW_ATTEMPTS failed authentication attempts
logger_log(conn->raop->logger, LOGGER_INFO, "Too many authentication failures or new client: generate new random password");
free(conn->raop->random_pw);
conn->raop->random_pw = NULL;
}
if (len == -1 && !conn->raop->random_pw) {
// get and store 4 random digits
int pin_4 = random_pin();
if (pin_4 < 0) {
logger_log(conn->raop->logger, LOGGER_ERR, "Failed to generate random pin");
pin_4 = 1234;
}
conn->raop->random_pw = (char *) malloc(pin_len + 1 + 18);
char *pin = conn->raop->random_pw;
snprintf(pin, pin_len + 1, "%04u", pin_4 % 10000);
pin[pin_len] = '\0';
snprintf(pin + pin_len + 1, 18, "%s", deviceID);
conn->raop->auth_fail_count = 0;
if (conn->raop->callbacks.display_pin) {
conn->raop->callbacks.display_pin(conn->raop->callbacks.cls, pin);
}
logger_log(conn->raop->logger, LOGGER_INFO, "*** CLIENT MUST NOW ENTER PIN = \"%s\" AS AIRPLAY PASSWORD", pin);
}
if (len && !conn->authenticated) {
if (len == -1) {
password = (const char *) conn->raop->random_pw;
}
char nonce_string[33] = { '\0' };
//bool stale = false; //not implemented
const char *authorization = NULL;
authorization = http_request_get_header(request, "Authorization");
if (authorization) {
char *ptr = strstr(authorization, "nonce=\"") + strlen("nonce=\"");
strncpy(nonce_string, ptr, 32);
const char *method = http_request_get_method(request);
conn->authenticated = pairing_digest_verify(method, authorization, password);
if (!conn->authenticated) {
conn->raop->auth_fail_count++;
logger_log(conn->raop->logger, LOGGER_INFO, "*** authentication failure: count = %u", conn->raop->auth_fail_count);
if (conn->raop->callbacks.display_pin && conn->raop->auth_fail_count > 1) {
conn->raop->callbacks.display_pin(conn->raop->callbacks.cls, conn->raop->random_pw);
}
logger_log(conn->raop->logger, LOGGER_INFO, "*** CLIENT MUST NOW ENTER PIN = \"%s\" AS AIRPLAY PASSWORD", conn->raop->random_pw);
}
if (conn->authenticated) {
//printf("initial authentication OK\n");
conn->authenticated = conn->authenticated && !strcmp(nonce_string, conn->raop->nonce);
if (!conn->authenticated) {
logger_log(conn->raop->logger, LOGGER_INFO, "authentication rejected (nonce mismatch) %s %s",
nonce_string, conn->raop->nonce);
}
}
if (conn->authenticated && conn->raop->random_pw) {
free (conn->raop->random_pw);
conn->raop->random_pw = NULL;
}
if (conn->raop->nonce) {
free(conn->raop->nonce);
conn->raop->nonce = NULL;
}
logger_log(conn->raop->logger, LOGGER_INFO, "Client authentication %s", (conn->authenticated ? "success" : "failure"));
}
if (!conn->authenticated) {
/* create a nonce */
const char *url = http_request_get_url(request);
unsigned char nonce[16] = { '\0' };
int len = 16;
uint64_t now = raop_ntp_get_local_time();
assert (!pairing_session_make_nonce(conn->session, &now, url, nonce, len));
if (conn->raop->nonce) {
free(conn->raop->nonce);
}
conn->raop->nonce = utils_hex_to_string(nonce, len);
char response_text[80] = "Digest realm=\"raop\", nonce=\"";
strncat(response_text, conn->raop->nonce, 80 - strlen(response_text) - 1);
strncat(response_text, "\"", 80 - strlen(response_text) - 1);
http_response_init(response, "RTSP/1.0", 401, "Unauthorized");
http_response_add_header(response, "WWW-Authenticate", response_text);
return;
}
}
}
char* eiv = NULL;
uint64_t eiv_len = 0;
char *model = NULL;
char *name = NULL;
bool admit_client = true;
plist_t req_model_node = plist_dict_get_item(req_root_node, "model");
plist_get_string_val(req_model_node, &model);
plist_t req_name_node = plist_dict_get_item(req_root_node, "name");
@@ -572,16 +696,11 @@ raop_handler_setup(raop_conn_t *conn,
conn->raop->callbacks.report_client_request(conn->raop->callbacks.cls, deviceID, model, name, &admit_client);
}
if (admit_client && deviceID && name && conn->raop->callbacks.register_client) {
bool pending_registration;
char *client_device_id;
char *client_pk; /* encoded as null-terminated base64 string*/
access_client_session_data(conn->session, &client_device_id, &client_pk, &pending_registration);
if (pending_registration) {
if (client_pk && !strcmp(deviceID, client_device_id)) {
conn->raop->callbacks.register_client(conn->raop->callbacks.cls, client_device_id, client_pk, name);
}
}
if (client_pk) {
char *client_device_id = NULL;
char *client_pk = NULL;  /* encoded as null-terminated base64 string, must be freed */
get_pairing_session_client_data(conn->session, &client_device_id, &client_pk);
if (client_pk && !strcmp(deviceID, client_device_id)) {
conn->raop->callbacks.register_client(conn->raop->callbacks.cls, client_device_id, client_pk, name);
free (client_pk);
}
}
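The password check above relies on RFC 2617 digest authentication (`pairing_digest_verify`). The digest composition can be sketched as follows; note the hash here is a deliberately trivial stand-in, not the MD5 that RFC 2617 actually requires, and all names are illustrative:

```c
#include <stdio.h>
#include <string.h>

/* Stand-in hash: NOT MD5. A real implementation calls a crypto library.
 * This djb2-based stub only illustrates the RFC 2617 structure:
 * HA1 = H(user:realm:password), HA2 = H(method:uri),
 * response = H(HA1:nonce:HA2). */
static void toy_hash_hex(const char *s, char out[17]) {
    unsigned long h = 5381;
    for (; *s; s++) h = h * 33 + (unsigned char) *s;
    snprintf(out, 17, "%016lx", h);
}

/* Compute an RFC 2617-style digest response (structure only). */
void digest_response(const char *user, const char *realm, const char *password,
                     const char *method, const char *uri, const char *nonce,
                     char out[17]) {
    char a1[256], a2[256], ha1[17], ha2[17], resp[256];
    snprintf(a1, sizeof(a1), "%s:%s:%s", user, realm, password);
    snprintf(a2, sizeof(a2), "%s:%s", method, uri);
    toy_hash_hex(a1, ha1);
    toy_hash_hex(a2, ha2);
    snprintf(resp, sizeof(resp), "%s:%s:%s", ha1, nonce, ha2);
    toy_hash_hex(resp, out);
}
```

The server compares the client's `response=` field against the same computation using its stored password and the nonce it issued in the 401 reply.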
@@ -691,7 +810,7 @@ raop_handler_setup(raop_conn_t *conn,
}
}
char *timing_protocol = NULL;
timing_protocol_t time_protocol;
timing_protocol_t time_protocol = TP_NONE;
plist_t req_timing_protocol_node = plist_dict_get_item(req_root_node, "timingProtocol");
plist_get_string_val(req_timing_protocol_node, &timing_protocol);
if (timing_protocol) {
@@ -730,7 +849,7 @@ raop_handler_setup(raop_conn_t *conn,
conn->raop_ntp = NULL;
conn->raop_rtp = NULL;
conn->raop_rtp_mirror = NULL;
char remote[40];
char remote[40] = { 0 };
int len = utils_ipaddress_to_string(conn->remotelen, conn->remote, conn->zone_id, remote, (int) sizeof(remote));
if (!len || len > sizeof(remote)) {
char *str = utils_data_to_string(conn->remote, conn->remotelen, 16);
@@ -742,7 +861,7 @@ raop_handler_setup(raop_conn_t *conn,
}
conn->raop_ntp = raop_ntp_init(conn->raop->logger, &conn->raop->callbacks, remote,
conn->remotelen, (unsigned short) timing_rport, &time_protocol);
raop_ntp_start(conn->raop_ntp, &timing_lport, conn->raop->max_ntp_timeouts);
raop_ntp_start(conn->raop_ntp, &timing_lport);
conn->raop_rtp = raop_rtp_init(conn->raop->logger, &conn->raop->callbacks, conn->raop_ntp,
remote, conn->remotelen, aeskey, aesiv);
conn->raop_rtp_mirror = raop_rtp_mirror_init(conn->raop->logger, &conn->raop->callbacks,
@@ -776,7 +895,7 @@ raop_handler_setup(raop_conn_t *conn,
// Mirroring
unsigned short dport = conn->raop->mirror_data_lport;
plist_t stream_id_node = plist_dict_get_item(req_stream_node, "streamConnectionID");
uint64_t stream_connection_id;
uint64_t stream_connection_id = 0;
plist_get_uint_val(stream_id_node, &stream_connection_id);
logger_log(conn->raop->logger, LOGGER_DEBUG, "streamConnectionID (needed for AES-CTR video decryption"
" key and iv): %llu", stream_connection_id);
@@ -802,7 +921,7 @@ raop_handler_setup(raop_conn_t *conn,
// Audio
unsigned short cport = conn->raop->control_lport, dport = conn->raop->data_lport;
unsigned short remote_cport = 0;
unsigned char ct;
unsigned char ct = 0;
unsigned int sr = AUDIO_SAMPLE_RATE; /* all AirPlay audio formats supported so far have sample rate 44.1kHz */
uint64_t uint_val = 0;
@@ -816,10 +935,10 @@ raop_handler_setup(raop_conn_t *conn,
if (conn->raop->callbacks.audio_get_format) {
/* get additional audio format parameters */
uint64_t audioFormat;
unsigned short spf;
bool isMedia;
bool usingScreen;
uint64_t audioFormat = 0;
unsigned short spf = 0;
bool isMedia = false;
bool usingScreen = false;
uint8_t bool_val = 0;
plist_t req_stream_spf_node = plist_dict_get_item(req_stream_node, "spf");
@@ -894,6 +1013,11 @@ raop_handler_get_parameter(raop_conn_t *conn,
int datalen;
content_type = http_request_get_header(request, "Content-Type");
if (!content_type) {
http_response_init(response, "RTSP/1.0", 451, "Parameter not understood");
return;
}
data = http_request_get_data(request, &datalen);
if (!strcmp(content_type, "text/parameters")) {
const char *current = data;
@@ -903,8 +1027,10 @@ raop_handler_get_parameter(raop_conn_t *conn,
/* This is a bit ugly, but seems to be how airport works too */
if ((datalen - (current - data) >= 8) && !strncmp(current, "volume\r\n", 8)) {
const char volume[] = "volume: 0.0\r\n";
char volume[25] = "volume: 0.0\r\n";
if (conn->raop->callbacks.audio_set_client_volume) {
snprintf(volume, 25, "volume: %9.6f\r\n", conn->raop->callbacks.audio_set_client_volume(conn->raop->callbacks.cls));
}
http_response_add_header(response, "Content-Type", "text/parameters");
*response_data = strdup(volume);
if (*response_data) {
@@ -940,6 +1066,10 @@ raop_handler_set_parameter(raop_conn_t *conn,
int datalen;
content_type = http_request_get_header(request, "Content-Type");
if (!content_type) {
http_response_init(response, "RTSP/1.0", 451, "Parameter not understood");
return;
}
data = http_request_get_data(request, &datalen);
if (!strcmp(content_type, "text/parameters")) {
char *datastr;
@@ -983,6 +1113,8 @@ raop_handler_feedback(raop_conn_t *conn,
char **response_data, int *response_datalen)
{
logger_log(conn->raop->logger, LOGGER_DEBUG, "raop_handler_feedback");
/* register receipt of client's "heartbeat" signal */
conn->raop->callbacks.conn_feedback(conn->raop->callbacks.cls);
}
static void
@@ -1043,7 +1175,7 @@ raop_handler_teardown(raop_conn_t *conn,
uint64_t val;
int count = plist_array_get_size(req_streams_node);
for (int i = 0; i < count; i++) {
plist_t req_stream_node = plist_array_get_item(req_streams_node,0);
plist_t req_stream_node = plist_array_get_item(req_streams_node,i);
plist_t req_stream_type_node = plist_dict_get_item(req_stream_node, "type");
plist_get_uint_val(req_stream_type_node, &val);
if (val == 96) {
@@ -1065,8 +1197,13 @@ raop_handler_teardown(raop_conn_t *conn,
if (conn->raop_rtp) {
/* Stop our audio RTP session */
raop_rtp_stop(conn->raop_rtp);
/* stop any coverart rendering */
if (conn->raop->callbacks.audio_stop_coverart_rendering) {
conn->raop->callbacks.audio_stop_coverart_rendering(conn->raop->callbacks.cls);
}
}
} else if (teardown_110) {
conn->raop->callbacks.video_reset(conn->raop->callbacks.cls);
if (conn->raop_rtp_mirror) {
/* Stop our video RTP session */
raop_rtp_mirror_stop(conn->raop_rtp_mirror);
@@ -1081,5 +1218,7 @@ raop_handler_teardown(raop_conn_t *conn,
raop_rtp_mirror_destroy(conn->raop_rtp_mirror);
conn->raop_rtp_mirror = NULL;
}
/* shut down any HLS connections */
httpd_remove_connections_by_type(conn->raop->httpd, CONNECTION_TYPE_HLS);
}
}


@@ -58,8 +58,6 @@ struct raop_ntp_s {
logger_t *logger;
raop_callbacks_t callbacks;
int max_ntp_timeouts;
thread_handle_t thread;
mutex_handle_t run_mutex;
@@ -94,8 +92,19 @@ struct raop_ntp_s {
int tsock;
timing_protocol_t time_protocol;
bool client_time_received;
uint64_t video_arrival_offset;
};
/* for use in syncing audio before a first rtp_sync */
void raop_ntp_set_video_arrival_offset(raop_ntp_t* raop_ntp, const uint64_t *offset) {
raop_ntp->video_arrival_offset = *offset;
}
uint64_t raop_ntp_get_video_arrival_offset(raop_ntp_t* raop_ntp) {
return raop_ntp->video_arrival_offset;
}
/*
* Used for sorting the data array by delay
@@ -153,6 +162,9 @@ raop_ntp_t *raop_ntp_init(logger_t *logger, raop_callbacks_t *callbacks, const c
raop_ntp->logger = logger;
memcpy(&raop_ntp->callbacks, callbacks, sizeof(raop_callbacks_t));
raop_ntp->timing_rport = timing_rport;
raop_ntp->client_time_received = false;
raop_ntp->video_arrival_offset = 0;
if (raop_ntp_parse_remote(raop_ntp, remote, remote_addr_len) < 0) {
free(raop_ntp);
@@ -165,7 +177,7 @@ raop_ntp_t *raop_ntp_init(logger_t *logger, raop_callbacks_t *callbacks, const c
raop_ntp->running = 0;
raop_ntp->joined = 1;
uint64_t time = raop_ntp_get_local_time(raop_ntp);
uint64_t time = raop_ntp_get_local_time();
for (int i = 0; i < RAOP_NTP_DATA_COUNT; ++i) {
raop_ntp->data[i].offset = 0ll;
@@ -274,10 +286,9 @@ raop_ntp_thread(void *arg)
};
raop_ntp_data_t data_sorted[RAOP_NTP_DATA_COUNT];
const unsigned two_pow_n[RAOP_NTP_DATA_COUNT] = {2, 4, 8, 16, 32, 64, 128, 256};
int timeout_counter = 0;
bool conn_reset = false;
bool logger_debug = (logger_get_level(raop_ntp->logger) >= LOGGER_DEBUG);
uint64_t recv_time = 0, client_ref_time = 0;
while (1) {
MUTEX_LOCK(raop_ntp->run_mutex);
if (!raop_ntp->running) {
@@ -290,8 +301,12 @@ raop_ntp_thread(void *arg)
raop_ntp_flush_socket(raop_ntp->tsock);
// Send request
uint64_t send_time = raop_ntp_get_local_time(raop_ntp);
uint64_t send_time = raop_ntp_get_local_time();
byteutils_put_ntp_timestamp(request, 24, send_time);
if (recv_time) {
byteutils_put_long_be(request, 8, client_ref_time);
byteutils_put_ntp_timestamp(request, 16, recv_time);
}
int send_len = sendto(raop_ntp->tsock, (char *)request, sizeof(request), 0,
(struct sockaddr *) &raop_ntp->remote_saddr, raop_ntp->remote_saddr_len);
if (logger_debug) {
@@ -308,20 +323,17 @@ raop_ntp_thread(void *arg)
// Read response
response_len = recvfrom(raop_ntp->tsock, (char *)response, sizeof(response), 0, NULL, NULL);
if (response_len < 0) {
timeout_counter++;
char time[30];
int level = (timeout_counter == 1 ? LOGGER_DEBUG : LOGGER_ERR);
ntp_timestamp_to_time(send_time, time, sizeof(time));
logger_log(raop_ntp->logger, level, "raop_ntp receive timeout %d (limit %d) (request sent %s)",
timeout_counter, raop_ntp->max_ntp_timeouts, time);
if (timeout_counter == raop_ntp->max_ntp_timeouts) {
conn_reset = true; /* client is no longer responding */
break;
}
logger_log(raop_ntp->logger, LOGGER_DEBUG , "raop_ntp receive timeout (request sent %s)", time);
} else {
recv_time = raop_ntp_get_local_time();
client_ref_time = byteutils_get_long_be(response, 24);
if (!raop_ntp->client_time_received) {
raop_ntp->client_time_received = true;
}
//local time of the server when the NTP response packet returns
int64_t t3 = (int64_t) raop_ntp_get_local_time(raop_ntp);
timeout_counter = 0;
int64_t t3 = (int64_t) recv_time;
// Local time of the server when the NTP request packet leaves the server
int64_t t0 = (int64_t) byteutils_get_ntp_timestamp(response, 8);
@@ -391,15 +403,11 @@ raop_ntp_thread(void *arg)
MUTEX_UNLOCK(raop_ntp->run_mutex);
logger_log(raop_ntp->logger, LOGGER_DEBUG, "raop_ntp exiting thread");
if (conn_reset && raop_ntp->callbacks.conn_reset) {
const bool video_reset = false; /* leave "frozen video" in place */
raop_ntp->callbacks.conn_reset(raop_ntp->callbacks.cls, timeout_counter, video_reset);
}
return 0;
}
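The t0..t3 bookkeeping in the thread above is the classic NTP round-trip estimate: t0 is when the request leaves the server, t1/t2 are the client's receive/transmit times read from the response, and t3 is when the response arrives back. A minimal sketch of the standard offset and delay formulas (names illustrative):

```c
#include <stdint.h>

/* Clock offset of the remote clock relative to the local clock,
 * from one request/response round trip (all times in nanoseconds,
 * signed so the offset may be negative). */
int64_t ntp_offset(int64_t t0, int64_t t1, int64_t t2, int64_t t3) {
    return ((t1 - t0) + (t2 - t3)) / 2;
}

/* Round-trip network delay, excluding the client's processing time. */
int64_t ntp_delay(int64_t t0, int64_t t1, int64_t t2, int64_t t3) {
    return (t3 - t0) - (t2 - t1);
}
```

Sorting several such samples by delay and averaging the lowest-delay ones, as the thread does with `data_sorted`, filters out samples distorted by network jitter.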
void
raop_ntp_start(raop_ntp_t *raop_ntp, unsigned short *timing_lport, int max_ntp_timeouts)
raop_ntp_start(raop_ntp_t *raop_ntp, unsigned short *timing_lport)
{
logger_log(raop_ntp->logger, LOGGER_DEBUG, "raop_ntp starting time");
int use_ipv6 = 0;
@@ -407,7 +415,6 @@ raop_ntp_start(raop_ntp_t *raop_ntp, unsigned short *timing_lport, int max_ntp_t
assert(raop_ntp);
assert(timing_lport);
raop_ntp->max_ntp_timeouts = max_ntp_timeouts;
raop_ntp->timing_lport = *timing_lport;
MUTEX_LOCK(raop_ntp->run_mutex);
@@ -457,13 +464,13 @@ raop_ntp_stop(raop_ntp_t *raop_ntp)
COND_SIGNAL(raop_ntp->wait_cond);
MUTEX_UNLOCK(raop_ntp->wait_mutex);
THREAD_JOIN(raop_ntp->thread);
if (raop_ntp->tsock != -1) {
closesocket(raop_ntp->tsock);
raop_ntp->tsock = -1;
}
THREAD_JOIN(raop_ntp->thread);
logger_log(raop_ntp->logger, LOGGER_DEBUG, "raop_ntp stopped time thread");
/* Mark thread as joined */
@@ -495,7 +502,7 @@ uint64_t raop_remote_timestamp_to_nano_seconds(raop_ntp_t *raop_ntp, uint64_t ti
* Returns the current time in nano seconds according to the local wall clock.
* The system Unix time is used as the local wall clock.
*/
uint64_t raop_ntp_get_local_time(raop_ntp_t *raop_ntp) {
uint64_t raop_ntp_get_local_time() {
struct timespec time;
clock_gettime(CLOCK_REALTIME, &time);
return ((uint64_t) time.tv_nsec) + (uint64_t) time.tv_sec * SECOND_IN_NSECS;
@@ -505,16 +512,22 @@ uint64_t raop_ntp_get_local_time(raop_ntp_t *raop_ntp) {
* Returns the current time in nano seconds according to the remote wall clock.
*/
uint64_t raop_ntp_get_remote_time(raop_ntp_t *raop_ntp) {
if (!raop_ntp->client_time_received) {
return 0;
}
MUTEX_LOCK(raop_ntp->sync_params_mutex);
int64_t offset = raop_ntp->sync_offset;
MUTEX_UNLOCK(raop_ntp->sync_params_mutex);
return (uint64_t) ((int64_t) raop_ntp_get_local_time(raop_ntp) + offset);
return (uint64_t) ((int64_t) raop_ntp_get_local_time() + offset);
}
/**
* Returns the local wall clock time in nano seconds for the given point in remote clock time
*/
uint64_t raop_ntp_convert_remote_time(raop_ntp_t *raop_ntp, uint64_t remote_time) {
if (!raop_ntp->client_time_received) {
return 0;
}
MUTEX_LOCK(raop_ntp->sync_params_mutex);
int64_t offset = raop_ntp->sync_offset;
MUTEX_UNLOCK(raop_ntp->sync_params_mutex);
@@ -525,6 +538,9 @@ uint64_t raop_ntp_convert_remote_time(raop_ntp_t *raop_ntp, uint64_t remote_time
* Returns the remote wall clock time in nano seconds for the given point in local clock time
*/
uint64_t raop_ntp_convert_local_time(raop_ntp_t *raop_ntp, uint64_t local_time) {
if (!raop_ntp->client_time_received) {
return 0;
}
MUTEX_LOCK(raop_ntp->sync_params_mutex);
int64_t offset = raop_ntp->sync_offset;
MUTEX_UNLOCK(raop_ntp->sync_params_mutex);


@@ -27,7 +27,7 @@ typedef struct raop_ntp_s raop_ntp_t;
typedef enum timing_protocol_e { NTP, TP_NONE, TP_OTHER, TP_UNSPECIFIED } timing_protocol_t;
void raop_ntp_start(raop_ntp_t *raop_ntp, unsigned short *timing_lport, int max_ntp_timeouts);
void raop_ntp_start(raop_ntp_t *raop_ntp, unsigned short *timing_lport);
void raop_ntp_stop(raop_ntp_t *raop_ntp);
@@ -38,9 +38,12 @@ void raop_ntp_destroy(raop_ntp_t *raop_rtp);
uint64_t raop_ntp_timestamp_to_nano_seconds(uint64_t ntp_timestamp, bool account_for_epoch_diff);
uint64_t raop_remote_timestamp_to_nano_seconds(raop_ntp_t *raop_ntp, uint64_t timestamp);
uint64_t raop_ntp_get_local_time(raop_ntp_t *raop_ntp);
uint64_t raop_ntp_get_local_time();
uint64_t raop_ntp_get_remote_time(raop_ntp_t *raop_ntp);
uint64_t raop_ntp_convert_remote_time(raop_ntp_t *raop_ntp, uint64_t remote_time);
uint64_t raop_ntp_convert_local_time(raop_ntp_t *raop_ntp, uint64_t local_time);
void raop_ntp_set_video_arrival_offset(raop_ntp_t* raop_ntp, const uint64_t *offset);
uint64_t raop_ntp_get_video_arrival_offset(raop_ntp_t* raop_ntp);
#endif //RAOP_NTP_H


@@ -36,10 +36,9 @@
#define NO_FLUSH (-42)
#define SECOND_IN_NSECS 1000000000
#define RAOP_RTP_SYNC_DATA_COUNT 8
#define SEC SECOND_IN_NSECS
#define DELAY_AAC 0.275 //empirical, matches audio latency of about -0.25 sec after first clock sync event
#define DELAY_AAC 0.20 //empirical, matches audio latency of about -0.25 sec after first clock sync event
/* note: it is unclear what will happen in the unlikely event that this code is running at the time of the unix-time
* epoch event on 2038-01-19 at 3:14:08 UTC ! (but Apple will surely have removed AirPlay "legacy pairing" by then!) */
@@ -56,14 +55,16 @@ struct raop_rtp_s {
// Time and sync
raop_ntp_t *ntp;
double rtp_clock_rate;
int64_t rtp_sync_offset;
raop_rtp_sync_data_t sync_data[RAOP_RTP_SYNC_DATA_COUNT];
int sync_data_index;
uint64_t ntp_start_time;
uint64_t rtp_start_time;
uint64_t rtp_time;
bool rtp_clock_started;
uint32_t rtp_sync;
uint64_t client_ntp_sync;
bool initial_sync;
// Transmission Stats, could be used if a playout buffer is needed
// float interarrival_jitter; // As defined by RTP RFC 3550, Section 6.4.1
// unsigned int last_packet_transit_time;
@@ -161,12 +162,10 @@ raop_rtp_init(logger_t *logger, raop_callbacks_t *callbacks, raop_ntp_t *ntp, co
raop_rtp->logger = logger;
raop_rtp->ntp = ntp;
raop_rtp->rtp_sync_offset = 0;
raop_rtp->sync_data_index = 0;
for (int i = 0; i < RAOP_RTP_SYNC_DATA_COUNT; ++i) {
raop_rtp->sync_data[i].ntp_time = 0;
raop_rtp->sync_data[i].rtp_time = 0;
}
raop_rtp->rtp_sync = 0;
raop_rtp->client_ntp_sync = 0;
raop_rtp->initial_sync = false;
raop_rtp->ntp_start_time = 0;
raop_rtp->rtp_start_time = 0;
raop_rtp->rtp_clock_started = false;
@@ -386,59 +385,24 @@ raop_rtp_process_events(raop_rtp_t *raop_rtp, void *cb_data)
return 0;
}
void raop_rtp_sync_clock(raop_rtp_t *raop_rtp, uint64_t *ntp_time, uint64_t *rtp_time) {
/* ntp_time = (uint64_t)(((int64_t)(raop_rtp->rtp_clock_rate * rtp_time)) + raop_rtp->rtp_sync_offset) */
int latest, valid_data_count = 0;
uint64_t ntp_sum = 0, rtp_sum = 0;
double offset = ((double) *ntp_time) - raop_rtp->rtp_clock_rate * *rtp_time;
int64_t correction = 0;
raop_rtp->sync_data_index = (raop_rtp->sync_data_index + 1) % RAOP_RTP_SYNC_DATA_COUNT;
latest = raop_rtp->sync_data_index;
raop_rtp->sync_data[latest].rtp_time = *rtp_time;
raop_rtp->sync_data[latest].ntp_time = *ntp_time;
for (int i = 0; i < RAOP_RTP_SYNC_DATA_COUNT; i++) {
if (raop_rtp->sync_data[i].ntp_time == 0) continue;
valid_data_count++;
if (i == latest) continue;
ntp_sum += *ntp_time - raop_rtp->sync_data[i].ntp_time;
rtp_sum += *rtp_time - raop_rtp->sync_data[i].rtp_time;
static uint64_t rtp_time_to_client_ntp(raop_rtp_t *raop_rtp, uint32_t rtp32) {
if (!raop_rtp->initial_sync) {
return 0;
}
if (valid_data_count > 1) {
correction -= raop_rtp->rtp_sync_offset;
offset += (((double) ntp_sum) - raop_rtp->rtp_clock_rate * rtp_sum) / valid_data_count;
}
raop_rtp->rtp_sync_offset = (int64_t) offset;
correction += raop_rtp->rtp_sync_offset;
logger_log(raop_rtp->logger, LOGGER_DEBUG, "dataset %d raop_rtp sync correction=%lld, rtp_sync_offset = %lld ",
valid_data_count, correction, raop_rtp->rtp_sync_offset);
}
uint64_t rtp64_time (raop_rtp_t *raop_rtp, const uint32_t *rtp32) {
/* convert from 32-bit to 64-bit rtp time:
* the rtp_time 32-bit epoch at 44.1kHz has length of about 27 hours
* using 64-bit rtp time avoids any epoch issues.
* initial call sets epoch to 1; subsequent calls maintain consistent epoch.
* (assumes successive calls are close in time) */
if (raop_rtp->rtp_clock_started) {
uint32_t diff1 = *rtp32 - ((uint32_t) raop_rtp->rtp_time);
uint32_t diff2 = ((uint32_t) raop_rtp->rtp_time) - *rtp32;
if (diff1 <= diff2) {
raop_rtp->rtp_time += (uint64_t) diff1;
} else {
raop_rtp->rtp_time -= (uint64_t) diff2;
}
int32_t rtp_change;
rtp32 -= raop_rtp->rtp_sync;
if (rtp32 <= INT32_MAX) {
rtp_change = (int32_t) rtp32;
} else {
raop_rtp->rtp_time = (0x01ULL << 32 ) + (uint64_t) *rtp32;
raop_rtp->rtp_start_time = raop_rtp->rtp_time;
raop_rtp->rtp_clock_started = true;
rtp_change = -(int32_t) (-rtp32);
}
double incr = raop_rtp->rtp_clock_rate * (double) rtp_change;
incr += (double) raop_rtp->client_ntp_sync;
if (incr < 0.0) {
return 0;
} else {
return (uint64_t) incr;
}
//assert(*rtp32 == (uint32_t) raop_rtp->rtp_time);
return raop_rtp->rtp_time;
}
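The removed `rtp64_time()` above extends a 32-bit RTP timestamp into a 64-bit counter: whichever unsigned difference is smaller tells us whether the new value moved forward or backward, which stays correct across 32-bit wraparound (about every 27 hours at 44.1 kHz). A minimal sketch of that logic (names illustrative):

```c
#include <stdint.h>

/* Wraparound-safe extension of a 32-bit RTP timestamp into a 64-bit
 * counter. Assumes successive calls are close in time (much less than
 * half an epoch apart), as the original comment notes. */
uint64_t rtp_extend(uint64_t prev64, uint32_t rtp32) {
    uint32_t ahead  = rtp32 - (uint32_t) prev64;  /* forward distance  */
    uint32_t behind = (uint32_t) prev64 - rtp32;  /* backward distance */
    return (ahead <= behind) ? prev64 + ahead : prev64 - behind;
}
```

The replacement `rtp_time_to_client_ntp()` takes the related approach of forming a signed 32-bit delta from the last sync point before scaling by the clock rate.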
static THREAD_RETVAL
@@ -450,22 +414,16 @@ raop_rtp_thread_udp(void *arg)
struct sockaddr_storage saddr;
socklen_t saddrlen;
bool got_remote_control_saddr = false;
uint64_t video_arrival_offset = 0;
/* for initial rtp to ntp conversions */
bool have_synced = false;
bool no_data_yet = true;
/* initial audio stream has no data */
unsigned char no_data_marker[] = {0x00, 0x68, 0x34, 0x00 };
int rtp_count = 0;
double sync_adjustment = 0;
unsigned short seqnum1 = 0, seqnum2 = 0;
assert(raop_rtp);
bool logger_debug = (logger_get_level(raop_rtp->logger) >= LOGGER_DEBUG);
raop_rtp->ntp_start_time = raop_ntp_get_local_time(raop_rtp->ntp);
bool logger_debug_data = (logger_get_level(raop_rtp->logger) >= LOGGER_DEBUG_DATA);
raop_rtp->ntp_start_time = raop_ntp_get_local_time();
raop_rtp->rtp_clock_started = false;
for (int i = 0; i < RAOP_RTP_SYNC_DATA_COUNT; i++) {
raop_rtp->sync_data[i].ntp_time = 0;
}
int no_resend = (raop_rtp->control_rport == 0); /* true when control_rport is not set */
@@ -500,7 +458,9 @@ raop_rtp_thread_udp(void *arg)
/* Timeout happened */
continue;
} else if (ret == -1) {
logger_log(raop_rtp->logger, LOGGER_ERR, "raop_rtp error in select");
int sock_err = SOCKET_GET_ERROR();
logger_log(raop_rtp->logger, LOGGER_ERR,
"raop_rtp error in select %d %s", sock_err, SOCKET_ERROR_STRING(sock_err));
break;
}
@@ -526,14 +486,8 @@ raop_rtp_thread_udp(void *arg)
unsigned int resent_packetlen = packetlen - 4;
unsigned short seqnum = byteutils_get_short_be(resent_packet, 2);
if (resent_packetlen >= 12) {
uint32_t timestamp = byteutils_get_int_be(resent_packet, 4);
uint64_t rtp_time = rtp64_time(raop_rtp, &timestamp);
uint64_t ntp_time = 0;
if (have_synced) {
ntp_time = (uint64_t) (raop_rtp->rtp_sync_offset + (int64_t) (raop_rtp->rtp_clock_rate * rtp_time));
}
logger_log(raop_rtp->logger, LOGGER_DEBUG, "raop_rtp resent audio packet: seqnum=%u", seqnum);
int result = raop_buffer_enqueue(raop_rtp->buffer, resent_packet, resent_packetlen, &ntp_time, &rtp_time, 1);
int result = raop_buffer_enqueue(raop_rtp->buffer, resent_packet, resent_packetlen, 1);
assert(result >= 0);
} else if (logger_debug) {
/* type_c = 0x56 packets with length 8 have been reported */
@@ -553,24 +507,30 @@ raop_rtp_thread_udp(void *arg)
* next_rtp = sync_rtp + 77175 = 441 * 175 (1.75 sec) for ALAC */
// The unit for the rtp clock is 1 / sample rate = 1 / 44100
uint32_t sync_rtp = byteutils_get_int_be(packet, 4);
uint64_t sync_rtp64 = rtp64_time(raop_rtp, &sync_rtp);
if (have_synced == false) {
uint64_t client_ntp_sync_prev = 0;
uint64_t rtp_sync_prev = 0;
if (!raop_rtp->initial_sync) {
logger_log(raop_rtp->logger, LOGGER_DEBUG, "first audio rtp sync");
have_synced = true;
}
raop_rtp->initial_sync = true;
} else {
client_ntp_sync_prev = raop_rtp->client_ntp_sync;
rtp_sync_prev = raop_rtp->rtp_sync;
}
raop_rtp->rtp_sync = byteutils_get_int_be(packet, 4);
uint64_t sync_ntp_raw = byteutils_get_long_be(packet, 8);
uint64_t sync_ntp_remote = raop_remote_timestamp_to_nano_seconds(raop_rtp->ntp, sync_ntp_raw);
raop_rtp->client_ntp_sync = raop_remote_timestamp_to_nano_seconds(raop_rtp->ntp, sync_ntp_raw);
if (logger_debug) {
uint64_t sync_ntp_local = raop_ntp_convert_remote_time(raop_rtp->ntp, sync_ntp_remote);
double offset_change = ((double) raop_rtp->client_ntp_sync) - raop_rtp->rtp_clock_rate * raop_rtp->rtp_sync;
offset_change -= ((double) client_ntp_sync_prev) - raop_rtp->rtp_clock_rate * rtp_sync_prev;
uint64_t sync_ntp_local = raop_ntp_convert_remote_time(raop_rtp->ntp, raop_rtp->rtp_sync);
char *str = utils_data_to_string(packet, packetlen, 20);
logger_log(raop_rtp->logger, LOGGER_DEBUG,
"raop_rtp sync: client ntp=%8.6f, ntp = %8.6f, ntp_start_time %8.6f\nts_client = %8.6f sync_rtp=%u\n%s",
(double) sync_ntp_remote / SEC, (double) sync_ntp_local / SEC,
(double) raop_rtp->ntp_start_time / SEC, (double) sync_ntp_remote / SEC, sync_rtp, str);
"raop_rtp sync: ntp = %8.6f, ntp_start_time %8.6f\nts_client = %8.6f sync_rtp=%u offset change = %8.6f\n%s",
(double) sync_ntp_local / SEC, (double) raop_rtp->ntp_start_time / SEC,
(double) raop_rtp->client_ntp_sync / SEC, raop_rtp->rtp_sync, offset_change / SEC, str);
free(str);
}
raop_rtp_sync_clock(raop_rtp, &sync_ntp_remote, &sync_rtp64);
} else if (logger_debug) {
char *str = utils_data_to_string(packet, packetlen, 16);
logger_log(raop_rtp->logger, LOGGER_DEBUG, "raop_rtp unknown udp control packet\n%s", str);
@@ -615,6 +575,9 @@ raop_rtp_thread_udp(void *arg)
if (FD_ISSET(raop_rtp->dsock, &rfds)) {
if (!raop_rtp->initial_sync && !video_arrival_offset) {
video_arrival_offset = raop_ntp_get_video_arrival_offset(raop_rtp->ntp);
}
//logger_log(raop_rtp->logger, LOGGER_INFO, "Would have data packet in queue");
// Receiving audio data here
saddrlen = sizeof(saddr);
@@ -633,83 +596,58 @@ raop_rtp_thread_udp(void *arg)
continue;
}
uint32_t rtp_timestamp = byteutils_get_int_be(packet, 4);
uint64_t rtp_time = rtp64_time(raop_rtp, &rtp_timestamp);
uint64_t ntp_time = 0;
if (!raop_rtp->initial_sync && raop_rtp->ct == 8 && video_arrival_offset) {
/* estimate a fake initial remote timestamp for video synchronization with AAC audio before the first rtp sync */
uint64_t ts = raop_ntp_get_local_time() - video_arrival_offset;
double delay = DELAY_AAC;
ts += (uint64_t) (delay * SEC);
raop_rtp->client_ntp_sync = ts;
raop_rtp->rtp_sync = byteutils_get_int_be(packet, 4);
raop_rtp->initial_sync = true;
}
if (packetlen == 16 && memcmp(packet + 12, no_data_marker, 4) == 0) {
/* this is a "no data" packet */
/* the first such packet could be used to provide the initial rtptime and seqnum formerly given in the RECORD request */
continue;
}
if (raop_rtp->ct == 2 && packetlen == 44) continue; /* ignore the ALAC packets with format information only. */
if (have_synced) {
ntp_time = (uint64_t) (raop_rtp->rtp_sync_offset + (int64_t) (raop_rtp->rtp_clock_rate * rtp_time));
} else if (packetlen == 16 && memcmp(packet + 12, no_data_marker, 4) == 0) {
/* use the special "no_data" packet to help determine an initial offset before the first rtp sync.
* until the first rtp sync occurs, we don't know the exact client ntp timestamp that matches the client rtp timestamp */
if (no_data_yet) {
int64_t sync_ntp = ((int64_t) raop_ntp_get_local_time(raop_rtp->ntp)) - ((int64_t) raop_rtp->ntp_start_time) ;
int64_t sync_rtp = ((int64_t) rtp_time) - ((int64_t) raop_rtp->rtp_start_time);
unsigned short seqnum = byteutils_get_short_be(packet, 2);
if (rtp_count == 0) {
sync_adjustment = ((double) sync_ntp);
rtp_count = 1;
seqnum1 = seqnum;
seqnum2 = seqnum;
}
if (seqnum2 != seqnum) { /* for AAC-ELD only use copy 1 of the 3 copies of each frame */
rtp_count++;
sync_adjustment += (((double) sync_ntp) - raop_rtp->rtp_clock_rate * sync_rtp - sync_adjustment) / rtp_count;
}
seqnum2 = seqnum1;
seqnum1 = seqnum;
}
continue;
} else {
no_data_yet = false;
}
int result = raop_buffer_enqueue(raop_rtp->buffer, packet, packetlen, &ntp_time, &rtp_time, 1);
int result = raop_buffer_enqueue(raop_rtp->buffer, packet, packetlen, 1);
assert(result >= 0);
if (raop_rtp->ct == 2 && !have_synced) {
/* in ALAC Audio-only mode wait until the first sync before dequeing */
if (!raop_rtp->initial_sync) {
/* wait until the first sync before dequeing ALAC */
continue;
} else {
// Render continuous buffer entries
void *payload = NULL;
unsigned int payload_size;
unsigned short seqnum;
uint64_t rtp64_timestamp;
uint64_t ntp_timestamp;
uint32_t rtp_timestamp;
while ((payload = raop_buffer_dequeue(raop_rtp->buffer, &payload_size, &ntp_timestamp, &rtp64_timestamp, &seqnum, no_resend))) {
while ((payload = raop_buffer_dequeue(raop_rtp->buffer, &payload_size, &rtp_timestamp, &seqnum, no_resend))) {
audio_decode_struct audio_data;
audio_data.rtp_time = rtp64_timestamp;
audio_data.rtp_time = rtp_timestamp;
audio_data.seqnum = seqnum;
audio_data.data_len = payload_size;
audio_data.data = payload;
audio_data.ct = raop_rtp->ct;
if (have_synced) {
if (ntp_timestamp == 0) {
ntp_timestamp = (uint64_t) (raop_rtp->rtp_sync_offset + (int64_t) (raop_rtp->rtp_clock_rate * rtp64_timestamp));
}
audio_data.ntp_time_remote = ntp_timestamp;
audio_data.ntp_time_local = raop_ntp_convert_remote_time(raop_rtp->ntp, audio_data.ntp_time_remote);
audio_data.sync_status = 1;
} else {
double elapsed_time = raop_rtp->rtp_clock_rate * (rtp64_timestamp - raop_rtp->rtp_start_time) + sync_adjustment
+ DELAY_AAC * SECOND_IN_NSECS;
audio_data.ntp_time_local = raop_rtp->ntp_start_time + (uint64_t) elapsed_time;
audio_data.ntp_time_remote = raop_ntp_convert_local_time(raop_rtp->ntp, audio_data.ntp_time_local);
audio_data.sync_status = 0;
audio_data.ntp_time_remote = rtp_time_to_client_ntp(raop_rtp, rtp_timestamp);
audio_data.ntp_time_local = raop_ntp_convert_remote_time(raop_rtp->ntp, audio_data.ntp_time_remote);
if (logger_debug_data) {
uint64_t ntp_now = raop_ntp_get_local_time();
int64_t latency = (audio_data.ntp_time_local ? ((int64_t) ntp_now) - ((int64_t) audio_data.ntp_time_local) : 0);
logger_log(raop_rtp->logger, LOGGER_DEBUG,
"raop_rtp audio: now = %8.6f, ntp = %8.6f, latency = %9.6f, ts = %8.6f, rtp_time=%u seqnum = %u",
(double) ntp_now / SEC, (double) audio_data.ntp_time_local / SEC, (double) latency / SEC,
(double) audio_data.ntp_time_remote /SEC, rtp_timestamp, seqnum);
}
raop_rtp->callbacks.audio_process(raop_rtp->callbacks.cls, raop_rtp->ntp, &audio_data);
free(payload);
if (logger_debug) {
uint64_t ntp_now = raop_ntp_get_local_time(raop_rtp->ntp);
int64_t latency = ((int64_t) ntp_now) - ((int64_t) audio_data.ntp_time_local);
logger_log(raop_rtp->logger, LOGGER_DEBUG,
"raop_rtp audio: now = %8.6f, ntp = %8.6f, latency = %8.6f, rtp_time=%u seqnum = %u",
(double) ntp_now / SEC, (double) audio_data.ntp_time_local / SEC, (double) latency / SEC,
(uint32_t) rtp64_timestamp, seqnum);
}
}
/* Handle possible resend requests */
@@ -897,8 +835,14 @@ raop_rtp_stop(raop_rtp_t *raop_rtp)
/* Join the thread */
THREAD_JOIN(raop_rtp->thread);
if (raop_rtp->csock != -1) closesocket(raop_rtp->csock);
if (raop_rtp->dsock != -1) closesocket(raop_rtp->dsock);
if (raop_rtp->csock != -1) {
closesocket(raop_rtp->csock);
raop_rtp->csock = -1;
}
if (raop_rtp->dsock != -1) {
closesocket(raop_rtp->dsock);
raop_rtp->dsock = -1;
}
/* Flush buffer into initial state */
raop_buffer_flush(raop_rtp->buffer, -1);


@@ -62,6 +62,11 @@
#define TCP_KEEPIDLE TCP_KEEPALIVE
#endif
/* for OpenBSD, where TCP_KEEPIDLE and TCP_KEEPALIVE are not defined */
#if !defined(TCP_KEEPIDLE) && !defined(TCP_KEEPALIVE)
#define TCP_KEEPIDLE SO_KEEPALIVE
#endif
//struct h264codec_s {
// unsigned char compatibility;
// short pps_size;
@@ -195,12 +200,14 @@ raop_rtp_mirror_thread(void *arg)
uint64_t ntp_timestamp_local = 0;
unsigned char nal_start_code[4] = { 0x00, 0x00, 0x00, 0x01 };
bool logger_debug = (logger_get_level(raop_rtp_mirror->logger) >= LOGGER_DEBUG);
bool logger_debug_data = (logger_get_level(raop_rtp_mirror->logger) >= LOGGER_DEBUG_DATA);
bool h265_video = false;
video_codec_t codec;
video_codec_t codec = VIDEO_CODEC_UNKNOWN;
const char h264[] = "h264";
const char h265[] = "h265";
bool unsupported_codec = false;
bool video_stream_suspended = false;
bool first_packet = true;
while (1) {
fd_set rfds;
@@ -232,7 +239,9 @@ raop_rtp_mirror_thread(void *arg)
/* Timeout happened */
continue;
} else if (ret == -1) {
logger_log(raop_rtp_mirror->logger, LOGGER_ERR, "raop_rtp_mirror error in select");
int sock_err = SOCKET_GET_ERROR();
logger_log(raop_rtp_mirror->logger, LOGGER_ERR,
"raop_rtp_mirror error in select %d %s", sock_err, SOCKET_ERROR_STRING(sock_err));
break;
}
@@ -275,6 +284,8 @@ raop_rtp_mirror_thread(void *arg)
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING,
"raop_rtp_mirror could not set stream socket keepalive time %d %s", sock_err, SOCKET_ERROR_STRING(sock_err));
}
/* OpenBSD does not have these options. */
#ifndef __OpenBSD__
option = 10;
if (setsockopt(stream_fd, SOL_TCP, TCP_KEEPINTVL, CAST &option, sizeof(option)) < 0) {
int sock_err = SOCKET_GET_ERROR();
@@ -287,6 +298,7 @@ raop_rtp_mirror_thread(void *arg)
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING,
"raop_rtp_mirror could not set stream socket keepalive probes %d %s", sock_err, SOCKET_ERROR_STRING(sock_err));
}
#endif /* !__OpenBSD__ */
readstart = 0;
}
@@ -327,6 +339,11 @@ raop_rtp_mirror_thread(void *arg)
}
ntp_timestamp_raw = byteutils_get_long(packet, 8);
ntp_timestamp_remote = raop_ntp_timestamp_to_nano_seconds(ntp_timestamp_raw, false);
if (first_packet) {
uint64_t offset = raop_ntp_get_local_time() - ntp_timestamp_remote;
raop_ntp_set_video_arrival_offset(raop_rtp_mirror->ntp, &offset);
first_packet = false;
}
/* packet[4] + packet[5] identify the payload type: values seen are: *
* 0x00 0x00: encrypted packet containing a non-IDR type 1 VCL NAL unit *
@@ -388,11 +405,11 @@ raop_rtp_mirror_thread(void *arg)
// counting nano seconds since last boot.
ntp_timestamp_local = raop_ntp_convert_remote_time(raop_rtp_mirror->ntp, ntp_timestamp_remote);
if (logger_debug) {
uint64_t ntp_now = raop_ntp_get_local_time(raop_rtp_mirror->ntp);
int64_t latency = ((int64_t) ntp_now) - ((int64_t) ntp_timestamp_local);
if (logger_debug_data) {
uint64_t ntp_now = raop_ntp_get_local_time();
int64_t latency = (ntp_timestamp_local ? ((int64_t) ntp_now) - ((int64_t) ntp_timestamp_local) : 0);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG,
"raop_rtp video: now = %8.6f, ntp = %8.6f, latency = %8.6f, ts = %8.6f, %s %s",
"raop_rtp video: now = %8.6f, ntp = %8.6f, latency = %9.6f, ts = %8.6f, %s %s",
(double) ntp_now / SEC, (double) ntp_timestamp_local / SEC, (double) latency / SEC,
(double) ntp_timestamp_remote / SEC, packet_description, h265_video ? h265 : h264);
}
@@ -562,6 +579,9 @@ raop_rtp_mirror_thread(void *arg)
" payload_size %d header %s ts_client = %8.6f",
payload_size, packet_description, (double) ntp_timestamp_remote / SEC);
if (packet[6] == 0x56 || packet[6] == 0x5e) {
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "This packet indicates video stream is stopping");
}
if (!video_stream_suspended && (packet[6] == 0x56 || packet[6] == 0x5e)) {
video_stream_suspended = true;
raop_rtp_mirror->callbacks.video_pause(raop_rtp_mirror->callbacks.cls);
@@ -570,7 +590,6 @@ raop_rtp_mirror_thread(void *arg)
video_stream_suspended = false;
}
codec = VIDEO_CODEC_UNKNOWN;
assert (raop_rtp_mirror->callbacks.video_set_codec);
ntp_timestamp_nal = ntp_timestamp_raw;
@@ -611,9 +630,21 @@ raop_rtp_mirror_thread(void *arg)
if (!memcmp(payload + 4, hvc1, 4)) {
/* hvc1 HEVC detected */
codec = VIDEO_CODEC_H265;
h265_video = true;
raop_rtp_mirror->callbacks.video_set_codec(raop_rtp_mirror->callbacks.cls, codec);
if (codec == VIDEO_CODEC_UNKNOWN) {
codec = VIDEO_CODEC_H265;
h265_video = true;
if (raop_rtp_mirror->callbacks.video_set_codec(raop_rtp_mirror->callbacks.cls, codec) < 0) {
logger_log(raop_rtp_mirror->logger, LOGGER_ERR, "failed to set video codec as H265 ");
/* drop connection */
conn_reset = true;
break;
}
} else if (codec != VIDEO_CODEC_H265) {
logger_log(raop_rtp_mirror->logger, LOGGER_ERR, "invalid video codec change to H265: codec was set previously");
/* drop connection */
conn_reset = true;
break;
}
unsigned char vps_start_code[] = { 0xa0, 0x00, 0x01, 0x00 };
unsigned char sps_start_code[] = { 0xa1, 0x00, 0x01, 0x00 };
unsigned char pps_start_code[] = { 0xa2, 0x00, 0x01, 0x00 };
@@ -636,7 +667,7 @@ raop_rtp_mirror_thread(void *arg)
vps = ptr;
if (logger_debug) {
char *str = utils_data_to_string(vps, vps_size, 16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "h265 vps size %d\n%s",vps_size, str);
logger_log(raop_rtp_mirror->logger, LOGGER_INFO, "h265 vps size %d\n%s",vps_size, str);
free(str);
}
ptr += vps_size;
@@ -650,7 +681,7 @@ raop_rtp_mirror_thread(void *arg)
sps = ptr;
if (logger_debug) {
char *str = utils_data_to_string(sps, sps_size, 16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "h265 sps size %d\n%s",vps_size, str);
logger_log(raop_rtp_mirror->logger, LOGGER_INFO, "h265 sps size %d\n%s", sps_size, str);
free(str);
}
ptr += sps_size;
@@ -664,7 +695,7 @@ raop_rtp_mirror_thread(void *arg)
pps = ptr;
if (logger_debug) {
char *str = utils_data_to_string(pps, pps_size, 16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "h265 pps size %d\n%s",pps_size, str);
logger_log(raop_rtp_mirror->logger, LOGGER_INFO, "h265 pps size %d\n%s",pps_size, str);
free(str);
}
@@ -684,9 +715,21 @@ raop_rtp_mirror_thread(void *arg)
ptr += 4;
memcpy(ptr, pps, pps_size);
} else {
codec = VIDEO_CODEC_H264;
h265_video = false;
raop_rtp_mirror->callbacks.video_set_codec(raop_rtp_mirror->callbacks.cls, codec);
if (codec == VIDEO_CODEC_UNKNOWN) {
codec = VIDEO_CODEC_H264;
h265_video = false;
if (raop_rtp_mirror->callbacks.video_set_codec(raop_rtp_mirror->callbacks.cls, codec) < 0) {
logger_log(raop_rtp_mirror->logger, LOGGER_ERR, "failed to set video codec as H264 ");
/* drop connection */
conn_reset = true;
break;
}
} else if (codec != VIDEO_CODEC_H264) {
logger_log(raop_rtp_mirror->logger, LOGGER_ERR, "invalid codec change to H264: codec was set previously");
/* drop connection */
conn_reset = true;
break;
}
short sps_size = byteutils_get_short_be(payload,6);
unsigned char *sequence_parameter_set = payload + 8;
short pps_size = byteutils_get_short_be(payload, sps_size + 9);
@@ -694,23 +737,23 @@ raop_rtp_mirror_thread(void *arg)
int data_size = 6;
if (logger_debug) {
char *str = utils_data_to_string(payload, data_size, 16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror: SPS+PPS header size = %d", data_size);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror h264 SPS+PPS header:\n%s", str);
logger_log(raop_rtp_mirror->logger, LOGGER_INFO, "raop_rtp_mirror: SPS+PPS header size = %d", data_size);
logger_log(raop_rtp_mirror->logger, LOGGER_INFO, "raop_rtp_mirror h264 SPS+PPS header:\n%s", str);
free(str);
str = utils_data_to_string(sequence_parameter_set, sps_size,16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror SPS NAL size = %d", sps_size);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror h264 Sequence Parameter Set:\n%s", str);
logger_log(raop_rtp_mirror->logger, LOGGER_INFO, "raop_rtp_mirror SPS NAL size = %d", sps_size);
logger_log(raop_rtp_mirror->logger, LOGGER_INFO, "raop_rtp_mirror h264 Sequence Parameter Set:\n%s", str);
free(str);
str = utils_data_to_string(picture_parameter_set, pps_size, 16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror PPS NAL size = %d", pps_size);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror h264 Picture Parameter Set:\n%s", str);
logger_log(raop_rtp_mirror->logger, LOGGER_INFO, "raop_rtp_mirror PPS NAL size = %d", pps_size);
logger_log(raop_rtp_mirror->logger, LOGGER_INFO, "raop_rtp_mirror h264 Picture Parameter Set:\n%s", str);
free(str);
}
data_size = payload_size - sps_size - pps_size - 11;
if (data_size > 0 && logger_debug) {
char *str = utils_data_to_string (picture_parameter_set + pps_size, data_size, 16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "remainder size = %d", data_size);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "remainder of SPS+PPS packet:\n%s", str);
logger_log(raop_rtp_mirror->logger, LOGGER_INFO, "remainder size = %d", data_size);
logger_log(raop_rtp_mirror->logger, LOGGER_INFO, "remainder of SPS+PPS packet:\n%s", str);
free(str);
} else if (data_size < 0) {
logger_log(raop_rtp_mirror->logger, LOGGER_ERR, " pps_sps error: packet remainder size = %d < 0", data_size);
@@ -804,9 +847,8 @@ raop_rtp_mirror_thread(void *arg)
MUTEX_UNLOCK(raop_rtp_mirror->run_mutex);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror exiting TCP thread");
if (conn_reset && raop_rtp_mirror->callbacks.conn_reset) {
const bool video_reset = false; /* leave "frozen video" showing */
raop_rtp_mirror->callbacks.conn_reset(raop_rtp_mirror->callbacks.cls, 0, video_reset);
if (conn_reset&& raop_rtp_mirror->callbacks.conn_reset) {
raop_rtp_mirror->callbacks.conn_reset(raop_rtp_mirror->callbacks.cls, 1);
}
if (unsupported_codec) {
@@ -899,14 +941,14 @@ void raop_rtp_mirror_stop(raop_rtp_mirror_t *raop_rtp_mirror) {
raop_rtp_mirror->running = 0;
MUTEX_UNLOCK(raop_rtp_mirror->run_mutex);
/* Join the thread */
THREAD_JOIN(raop_rtp_mirror->thread_mirror);
if (raop_rtp_mirror->mirror_data_sock != -1) {
closesocket(raop_rtp_mirror->mirror_data_sock);
raop_rtp_mirror->mirror_data_sock = -1;
}
/* Join the thread */
THREAD_JOIN(raop_rtp_mirror->thread_mirror);
/* Mark thread as joined */
MUTEX_LOCK(raop_rtp_mirror->run_mutex);
raop_rtp_mirror->joined = 1;


@@ -186,14 +186,14 @@ char *utils_parse_hex(const char *str, int str_len, int *data_len) {
return data;
}
char *utils_pk_to_string(const unsigned char *pk, int pk_len) {
char *pk_str = (char *) malloc(2*pk_len + 1);
char* pos = pk_str;
for (int i = 0; i < pk_len; i++) {
snprintf(pos, 3, "%2.2x", *(pk + i));
char *utils_hex_to_string(const unsigned char *hex, int hex_len) {
char *hex_str = (char *) malloc(2*hex_len + 1);
char* pos = hex_str;
for (int i = 0; i < hex_len; i++) {
snprintf(pos, 3, "%2.2x", *(hex + i));
pos +=2;
}
return pk_str;
return hex_str;
}
char *utils_data_to_string(const unsigned char *data, int datalen, int chars_per_line) {
@@ -283,6 +283,67 @@ int utils_ipaddress_to_string(int addresslen, const unsigned char *address, unsi
return ret;
}
char *utils_strip_data_from_plist_xml(char *plist_xml) {
/* returns NULL if no data needs to be stripped out.
* returns pointer to newly-allocated stripped text char *xml
* WHICH (like plist_xml) MUST BE FREED AFTER USE*/
assert(plist_xml);
int len = (int) strlen(plist_xml);
char *last = plist_xml + len * sizeof(char); // position of null termination of plist_xml
char *eol;
char *eol_data;
char *xml = NULL;
int nchars;
char line[81];
int count;
char *begin = strstr(plist_xml, "<data>");
char *end;
if (!begin) {
/* there are no data lines, nothing to do */
return NULL;
} else {
xml = (char *) calloc((len + 1), sizeof(char));
}
char *ptr1 = plist_xml;
char *ptr2 = xml;
do {
eol = strchr(begin,'\n');
nchars = eol + 1 - ptr1;
memcpy(ptr2, ptr1, nchars);
ptr2 += nchars;
ptr1 += nchars;
end = strstr(ptr1, "</data>");
assert(end);
count = 0;
do {
eol_data = eol;
eol = strchr(eol + 1, '\n');
count++;
} while (eol < end);
count--; // last '\n' counted ends the first non-data line (contains "</data>")
if (count > 1) {
snprintf(line, sizeof(line), " (%d lines data omitted, 64 chars/line)\n", count);
nchars = strlen(line);
memcpy(ptr2, line, nchars);
ptr2 += nchars;
ptr1 = eol_data + 1;
} else {
nchars = eol_data + 1 - ptr1;
memcpy(ptr2, ptr1, nchars);
ptr2 += nchars;
ptr1 += nchars;
}
begin = strstr(ptr1, "<data>");
if (begin == NULL) {
nchars = (int) (last + 1 - ptr1);
memcpy(ptr2, ptr1, nchars); //includes the null terminator
break;
}
} while (ptr1 <= last);
return xml;
}
const char *gmt_time_string() {
static char date_buf[64];
memset(date_buf, 0, 64);


@@ -25,7 +25,7 @@ int utils_read_file(char **dst, const char *pemstr);
int utils_hwaddr_raop(char *str, int strlen, const char *hwaddr, int hwaddrlen);
int utils_hwaddr_airplay(char *str, int strlen, const char *hwaddr, int hwaddrlen);
char *utils_parse_hex(const char *str, int str_len, int *data_len);
char *utils_pk_to_string(const unsigned char *pk, int pk_len);
char *utils_hex_to_string(const unsigned char *hex, int hex_len);
char *utils_data_to_string(const unsigned char *data, int datalen, int chars_per_line);
char *utils_data_to_text(const char *data, int datalen);
void ntp_timestamp_to_time(uint64_t ntp_timestamp, char *timestamp, size_t maxsize);
@@ -33,4 +33,5 @@ void ntp_timestamp_to_seconds(uint64_t ntp_timestamp, char *timestamp, size_t ma
const char *gmt_time_string();
int utils_ipaddress_to_string(int addresslen, const unsigned char *address,
unsigned int zone_id, char *string, int len);
char *utils_strip_data_from_plist_xml(char * plist_xml);
#endif


@@ -1,4 +1,3 @@
cmake_minimum_required(VERSION 3.5)
if (APPLE )
set( ENV{PKG_CONFIG_PATH} "/Library/FrameWorks/GStreamer.framework/Libraries/pkgconfig" ) # GStreamer.framework, preferred


@@ -25,6 +25,7 @@
#include "video_renderer.h"
#define SECOND_IN_NSECS 1000000000UL
#define SECOND_IN_MICROSECS 1000000
#ifdef X_DISPLAY_FIX
#include <gst/video/navigation.h>
#include "x_display_fix.h"
@@ -45,14 +46,39 @@ static bool use_x11 = false;
#endif
static bool logger_debug = false;
static bool video_terminate = false;
static gint64 hls_requested_start_position = 0;
static gint64 hls_seek_start = 0;
static gint64 hls_seek_end = 0;
static gint64 hls_duration;
static gboolean hls_seek_enabled;
static gboolean hls_playing;
static gboolean hls_buffer_empty;
static gboolean hls_buffer_full;
#define NCODECS 2 /* renderers for h264 and h265 */
typedef enum {
//GST_PLAY_FLAG_VIDEO = (1 << 0),
//GST_PLAY_FLAG_AUDIO = (1 << 1),
//GST_PLAY_FLAG_TEXT = (1 << 2),
//GST_PLAY_FLAG_VIS = (1 << 3),
//GST_PLAY_FLAG_SOFT_VOLUME = (1 << 4),
//GST_PLAY_FLAG_NATIVE_AUDIO = (1 << 5),
//GST_PLAY_FLAG_NATIVE_VIDEO = (1 << 6),
GST_PLAY_FLAG_DOWNLOAD = (1 << 7),
GST_PLAY_FLAG_BUFFERING = (1 << 8),
//GST_PLAY_FLAG_DEINTERLACE = (1 << 9),
//GST_PLAY_FLAG_SOFT_COLORBALANCE = (1 << 10),
//GST_PLAY_FLAG_FORCE_FILTERS = (1 << 11),
//GST_PLAY_FLAG_FORCE_SW_DECODERS = (1 << 12),
} GstPlayFlags;
#define NCODECS 3 /* renderers for h264,h265, and jpeg images */
struct video_renderer_s {
GstElement *appsrc, *pipeline;
GstBus *bus;
const char *codec;
bool autovideo, state_pending;
bool autovideo;
int id;
gboolean terminate;
gint64 duration;
@@ -69,7 +95,8 @@ static video_renderer_t *renderer_type[NCODECS] = {0};
static int n_renderers = NCODECS;
static char h264[] = "h264";
static char h265[] = "h265";
static char hls[] = "hls";
static char hls[] = "hls";
static char jpeg[] = "jpeg";
static void append_videoflip (GString *launch, const videoflip_t *flip, const videoflip_t *rot) {
/* videoflip image transform */
@@ -138,6 +165,7 @@ static void append_videoflip (GString *launch, const videoflip_t *flip, const vi
* closest used by GStreamer < 1.20.4 is BT709, 2:3:5:1 with * // now use sRGB = 1:1:7:1
* range = 2 -> GST_VIDEO_COLOR_RANGE_16_235 ("limited RGB") */
static const char jpeg_caps[]="image/jpeg";
static const char h264_caps[]="video/x-h264,stream-format=(string)byte-stream,alignment=(string)au";
static const char h265_caps[]="video/x-h265,stream-format=(string)byte-stream,alignment=(string)au";
@@ -156,7 +184,7 @@ GstElement *make_video_sink(const char *videosink, const char *videosink_options
return NULL;
}
/* process the video_sink_optons */
/* process the video_sink_options */
size_t len = strlen(videosink_options);
if (!len) {
return video_sink;
@@ -193,7 +221,7 @@ GstElement *make_video_sink(const char *videosink, const char *videosink_options
void video_renderer_init(logger_t *render_logger, const char *server_name, videoflip_t videoflip[2], const char *parser,
const char *decoder, const char *converter, const char *videosink, const char *videosink_options,
bool initial_fullscreen, bool video_sync, bool h265_support, const char *uri) {
bool initial_fullscreen, bool video_sync, bool h265_support, guint playbin_version, const char *uri) {
GError *error = NULL;
GstCaps *caps = NULL;
hls_video = (uri != NULL);
@@ -203,6 +231,14 @@ void video_renderer_init(logger_t *render_logger, const char *server_name, vide
logger = render_logger;
logger_debug = (logger_get_level(logger) >= LOGGER_DEBUG);
video_terminate = false;
hls_seek_enabled = FALSE;
hls_playing = FALSE;
hls_seek_start = -1;
hls_seek_end = -1;
hls_duration = -1;
hls_buffer_empty = TRUE;
hls_buffer_full = FALSE;
/* this call to g_set_application_name makes server_name appear in the X11 display window title bar, */
/* (instead of the program name uxplay taken from (argv[0]). It is only set one time. */
@@ -212,33 +248,45 @@ void video_renderer_init(logger_t *render_logger, const char *server_name, vide
appname = NULL;
/* the renderer for hls video will only be built if a HLS uri is provided in
* the call to video_renderer_init, in which case the h264 and 265 mirror-mode
* renderers will not be built. This is because it appears that we cannot
* the call to video_renderer_init, in which case the h264/h265 mirror-mode and jpeg
* audio-mode renderers will not be built. This is because it appears that we cannot
* put playbin into GST_STATE_READY before knowing the uri (?), so cannot use a
* unified renderer structure with h264, h265 and hls */
* unified renderer structure with h264, h265, jpeg and hls */
if (hls_video) {
n_renderers = 1;
/* renderer[0]: playbin (hls) */
} else {
n_renderers = h265_support ? 2 : 1;
n_renderers = h265_support ? 3 : 2;
/* renderer[0]: jpeg; [1]: h264; [2]: h265 */
}
g_assert (n_renderers <= NCODECS);
for (int i = 0; i < n_renderers; i++) {
g_assert (i < 2);
g_assert (i < 3);
renderer_type[i] = (video_renderer_t *) calloc(1, sizeof(video_renderer_t));
g_assert(renderer_type[i]);
renderer_type[i]->autovideo = auto_videosink;
renderer_type[i]->id = i;
renderer_type[i]->bus = NULL;
renderer_type[i]->id = i;
renderer_type[i]->bus = NULL;
if (hls_video) {
/* use playbin3 to play HLS video: replace "playbin3" by "playbin" to use playbin2 */
renderer_type[i]->pipeline = gst_element_factory_make("playbin3", "hls-playbin3");
/* use playbin3 to play HLS video: replace "playbin3" by "playbin" to use playbin2 */
switch (playbin_version) {
case 2:
renderer_type[i]->pipeline = gst_element_factory_make("playbin", "hls-playbin2");
break;
case 3:
renderer_type[i]->pipeline = gst_element_factory_make("playbin3", "hls-playbin3");
break;
default:
logger_log(logger, LOGGER_ERR, "video_renderer_init: invalid playbin version %u", playbin_version);
g_assert(0);
}
logger_log(logger, LOGGER_INFO, "Will use GStreamer playbin version %u to play HLS streamed video", playbin_version);
g_assert(renderer_type[i]->pipeline);
renderer_type[i]->appsrc = NULL;
renderer_type[i]->codec = hls;
/* if we are not using autovideosink, build a videossink based on the stricng "videosink" */
if(strcmp(videosink, "autovideosink")) {
GstElement *playbin_videosink = make_video_sink(videosink, videosink_options);
/* if we are not using an autovideosink, build a videosink based on the string "videosink" */
if (!auto_videosink) {
GstElement *playbin_videosink = make_video_sink(videosink, videosink_options);
if (!playbin_videosink) {
logger_log(logger, LOGGER_ERR, "video_renderer_init: failed to create playbin_videosink");
} else {
@@ -246,15 +294,25 @@ void video_renderer_init(logger_t *render_logger, const char *server_name, vide
g_object_set(G_OBJECT (renderer_type[i]->pipeline), "video-sink", playbin_videosink, NULL);
}
}
g_object_set (G_OBJECT (renderer_type[i]->pipeline), "uri", uri, NULL);
gint flags;
g_object_get(renderer_type[i]->pipeline, "flags", &flags, NULL);
flags |= GST_PLAY_FLAG_DOWNLOAD;
flags |= GST_PLAY_FLAG_BUFFERING; // set by default in playbin3, but not in playbin2; is it needed?
g_object_set(renderer_type[i]->pipeline, "flags", flags, NULL);
g_object_set (G_OBJECT (renderer_type[i]->pipeline), "uri", uri, NULL);
} else {
bool jpeg_pipeline = false;
switch (i) {
case 0:
jpeg_pipeline = true;
renderer_type[i]->codec = jpeg;
caps = gst_caps_from_string(jpeg_caps);
break;
case 1:
renderer_type[i]->codec = h264;
caps = gst_caps_from_string(h264_caps);
break;
case 1:
case 2:
renderer_type[i]->codec = h265;
caps = gst_caps_from_string(h265_caps);
break;
@@ -262,22 +320,29 @@ void video_renderer_init(logger_t *render_logger, const char *server_name, vide
g_assert(0);
}
GString *launch = g_string_new("appsrc name=video_source ! ");
g_string_append(launch, "queue ! ");
g_string_append(launch, parser);
g_string_append(launch, " ! ");
g_string_append(launch, decoder);
if (jpeg_pipeline) {
g_string_append(launch, "jpegdec ");
} else {
g_string_append(launch, "queue ! ");
g_string_append(launch, parser);
g_string_append(launch, " ! ");
g_string_append(launch, decoder);
}
g_string_append(launch, " ! ");
append_videoflip(launch, &videoflip[0], &videoflip[1]);
g_string_append(launch, converter);
g_string_append(launch, " ! ");
g_string_append(launch, "videoscale ! ");
if (jpeg_pipeline) {
g_string_append(launch, " imagefreeze allow-replace=TRUE ! ");
}
g_string_append(launch, videosink);
g_string_append(launch, " name=");
g_string_append(launch, videosink);
g_string_append(launch, "_");
g_string_append(launch, renderer_type[i]->codec);
g_string_append(launch, videosink_options);
if (video_sync) {
if (video_sync && !jpeg_pipeline) {
g_string_append(launch, " sync=true");
sync = true;
} else {
@@ -302,7 +367,13 @@ void video_renderer_init(logger_t *render_logger, const char *server_name, vide
logger_log(logger, LOGGER_DEBUG, "GStreamer video pipeline %d:\n\"%s\"", i + 1, launch->str);
renderer_type[i]->pipeline = gst_parse_launch(launch->str, &error);
if (error) {
g_error ("get_parse_launch error (video) :\n %s\n",error->message);
logger_log(logger, LOGGER_ERR, "GStreamer gst_parse_launch failed to create video pipeline %d\n"
"*** error message from gst_parse_launch was:\n%s\n"
"launch string parsed was \n[%s]", i + 1, error->message, launch->str);
if (strstr(error->message, "no element")) {
logger_log(logger, LOGGER_ERR, "This error usually means that a uxplay option was mistyped\n"
" or some requested part of GStreamer is not installed\n");
}
g_clear_error (&error);
}
g_assert (renderer_type[i]->pipeline);
@@ -312,7 +383,6 @@ void video_renderer_init(logger_t *render_logger, const char *server_name, vide
gst_pipeline_use_clock(GST_PIPELINE_CAST(renderer_type[i]->pipeline), clock);
renderer_type[i]->appsrc = gst_bin_get_by_name (GST_BIN (renderer_type[i]->pipeline), "video_source");
g_assert(renderer_type[i]->appsrc);
g_object_set(renderer_type[i]->appsrc, "caps", caps, "stream-type", 0, "is-live", TRUE, "format", GST_FORMAT_TIME, NULL);
g_string_free(launch, TRUE);
gst_caps_unref(caps);
@@ -325,13 +395,16 @@ void video_renderer_init(logger_t *render_logger, const char *server_name, vide
renderer_type[i]->gst_window = NULL;
renderer_type[i]->use_x11 = false;
X11_search_attempts = 0;
/* setting char *x11_display_name to NULL means the value is taken from $DISPLAY in the environment
* (a uxplay option to specify a different value is possible) */
char *x11_display_name = NULL;
if (use_x11) {
if (i == 0) {
renderer_type[0]->gst_window = (X11_Window_t *) calloc(1, sizeof(X11_Window_t));
g_assert(renderer_type[0]->gst_window);
get_X11_Display(renderer_type[0]->gst_window);
get_X11_Display(renderer_type[0]->gst_window, x11_display_name);
if (renderer_type[0]->gst_window->display) {
renderer_type[i]->use_x11 = true;
renderer_type[0]->use_x11 = true;
} else {
free(renderer_type[0]->gst_window);
renderer_type[0]->gst_window = NULL;
@@ -385,18 +458,23 @@ void video_renderer_resume() {
}
void video_renderer_start() {
GstState state;
const gchar *state_name;
if (hls_video) {
renderer->bus = gst_element_get_bus(renderer->pipeline);
gst_element_set_state (renderer->pipeline, GST_STATE_PLAYING);
gst_element_set_state (renderer->pipeline, GST_STATE_PAUSED);
gst_element_get_state(renderer->pipeline, &state, NULL, 1000 * GST_MSECOND);
state_name= gst_element_state_get_name(state);
logger_log(logger, LOGGER_DEBUG, "video renderer_start: state %s", state_name);
return;
}
/* when not hls, start both h264 and h265 pipelines; will shut down the "wrong" one when we know the codec */
for (int i = 0; i < n_renderers; i++) {
gst_element_set_state (renderer_type[i]->pipeline, GST_STATE_PLAYING);
if (renderer_type[i]->appsrc) {
gst_video_pipeline_base_time = gst_element_get_base_time(renderer_type[i]->appsrc);
}
renderer_type[i]->bus = gst_element_get_bus(renderer_type[i]->pipeline);
gst_element_set_state (renderer_type[i]->pipeline, GST_STATE_PAUSED);
gst_element_get_state(renderer_type[i]->pipeline, &state, NULL, 1000 * GST_MSECOND);
state_name= gst_element_state_get_name(state);
logger_log(logger, LOGGER_DEBUG, "video renderer_start: renderer %d state %s", i, state_name);
}
renderer = NULL;
first_packet = true;
@@ -418,11 +496,24 @@ bool waiting_for_x11_window() {
return true; /* window still not found */
}
}
if (fullscreen) {
set_fullscreen(renderer->gst_window, &fullscreen);
}
#endif
return false;
}
void video_renderer_render_buffer(unsigned char* data, int *data_len, int *nal_count, uint64_t *ntp_time) {
void video_renderer_display_jpeg(const void *data, int *data_len) {
GstBuffer *buffer;
if (renderer && !strcmp(renderer->codec, jpeg)) {
buffer = gst_buffer_new_allocate(NULL, *data_len, NULL);
g_assert(buffer != NULL);
gst_buffer_fill(buffer, 0, data, *data_len);
gst_app_src_push_buffer (GST_APP_SRC(renderer->appsrc), buffer);
}
}
uint64_t video_renderer_render_buffer(unsigned char* data, int *data_len, int *nal_count, uint64_t *ntp_time) {
GstBuffer *buffer;
GstClockTime pts = (GstClockTime) *ntp_time; /*now in nsecs */
//GstClockTimeDiff latency = GST_CLOCK_DIFF(gst_element_get_current_clock_time (renderer->appsrc), pts);
@@ -430,9 +521,10 @@ void video_renderer_render_buffer(unsigned char* data, int *data_len, int *nal_c
if (pts >= gst_video_pipeline_base_time) {
pts -= gst_video_pipeline_base_time;
} else {
logger_log(logger, LOGGER_ERR, "*** invalid ntp_time < gst_video_pipeline_base_time\n%8.6f ntp_time\n%8.6f base_time",
// adjust timestamps to be >= gst_video_pipeline_base_time
logger_log(logger, LOGGER_DEBUG, "*** invalid ntp_time < gst_video_pipeline_base_time\n%8.6f ntp_time\n%8.6f base_time",
((double) *ntp_time) / SECOND_IN_NSECS, ((double) gst_video_pipeline_base_time) / SECOND_IN_NSECS);
return;
return (uint64_t) gst_video_pipeline_base_time - pts;
}
}
g_assert(data_len != 0);
@@ -469,6 +561,7 @@ void video_renderer_render_buffer(unsigned char* data, int *data_len, int *nal_c
}
#endif
}
return 0;
}
void video_renderer_flush() {
@@ -476,6 +569,7 @@ void video_renderer_flush() {
void video_renderer_stop() {
if (renderer) {
logger_log(logger, LOGGER_DEBUG,"video_renderer_stop");
if (renderer->appsrc) {
gst_app_src_end_of_stream (GST_APP_SRC(renderer->appsrc));
}
@@ -484,15 +578,21 @@ void video_renderer_stop() {
}
}
static void video_renderer_destroy_h26x(video_renderer_t *renderer) {
static void video_renderer_destroy_instance(video_renderer_t *renderer) {
if (renderer) {
logger_log(logger, LOGGER_DEBUG,"destroying renderer instance %p", renderer);
GstState state;
GstStateChangeReturn ret;
gst_element_get_state(renderer->pipeline, &state, NULL, 100 * GST_MSECOND);
logger_log(logger, LOGGER_DEBUG,"pipeline state is %s", gst_element_state_get_name(state));
if (state != GST_STATE_NULL) {
if (!hls_video) {
gst_app_src_end_of_stream (GST_APP_SRC(renderer->appsrc));
}
gst_element_set_state (renderer->pipeline, GST_STATE_NULL);
ret = gst_element_set_state (renderer->pipeline, GST_STATE_NULL);
logger_log(logger, LOGGER_DEBUG,"pipeline_state_change_return: %s",
gst_element_state_change_return_get_name(ret));
gst_element_get_state(renderer->pipeline, NULL, NULL, 1000 * GST_MSECOND);
}
gst_object_unref(renderer->bus);
if (renderer->appsrc) {
@@ -513,53 +613,139 @@ static void video_renderer_destroy_h26x(video_renderer_t *renderer) {
void video_renderer_destroy() {
for (int i = 0; i < n_renderers; i++) {
if (renderer_type[i]) {
video_renderer_destroy_h26x(renderer_type[i]);
video_renderer_destroy_instance(renderer_type[i]);
}
}
}
}
static void get_stream_status_name(GstStreamStatusType type, char *name, size_t len) {
switch (type) {
case GST_STREAM_STATUS_TYPE_CREATE:
strncpy(name, "CREATE", len);
return;
case GST_STREAM_STATUS_TYPE_ENTER:
strncpy(name, "ENTER", len);
return;
case GST_STREAM_STATUS_TYPE_LEAVE:
strncpy(name, "LEAVE", len);
return;
case GST_STREAM_STATUS_TYPE_DESTROY:
strncpy(name, "DESTROY", len);
return;
case GST_STREAM_STATUS_TYPE_START:
strncpy(name, "START", len);
return;
case GST_STREAM_STATUS_TYPE_PAUSE:
strncpy(name, "PAUSE", len);
return;
case GST_STREAM_STATUS_TYPE_STOP:
strncpy(name, "STOP", len);
return;
default:
strncpy(name, "", len);
return;
}
}
gboolean gstreamer_pipeline_bus_callback(GstBus *bus, GstMessage *message, void *loop) {
GstState old_state, new_state;
const gchar no_state[] = "";
const gchar *old_state_name = no_state, *new_state_name = no_state;
if (GST_MESSAGE_TYPE(message) == GST_MESSAGE_STATE_CHANGED) {
GstState old_state, new_state;
gst_message_parse_state_changed (message, &old_state, &new_state, NULL);
old_state_name = gst_element_state_get_name (old_state);
new_state_name = gst_element_state_get_name (new_state);
}
/* identify which pipeline sent the message */
/* identify which pipeline sent the message */
int type = -1;
for (int i = 0 ; i < n_renderers ; i ++ ) {
if (renderer_type[i]->bus == bus) {
if (renderer_type[i] && renderer_type[i]->bus == bus) {
type = i;
break;
}
}
g_assert(type != -1);
if (logger_debug) {
g_print("GStreamer %s bus message: %s %s\n", renderer_type[type]->codec, GST_MESSAGE_SRC_NAME(message), GST_MESSAGE_TYPE_NAME(message));
/* if the bus sending the message is not found, the renderer may already have been destroyed */
if (type == -1) {
if (logger_debug) {
g_print("GStreamer(UNKNOWN, now destroyed?) bus message: %s %s %s %s\n",
GST_MESSAGE_SRC_NAME(message), GST_MESSAGE_TYPE_NAME(message), old_state_name, new_state_name);
}
return TRUE;
}
if (logger_debug && hls_video) {
gint64 pos;
gst_element_query_position (renderer_type[type]->pipeline, GST_FORMAT_TIME, &pos);
if (logger_debug) {
gchar *name = NULL;
GstElement *element = NULL;
gchar type_name[8] = { 0 };
if (GST_MESSAGE_TYPE(message) == GST_MESSAGE_STREAM_STATUS) {
GstStreamStatusType type;
gst_message_parse_stream_status(message, &type, &element);
name = gst_element_get_name(element);
get_stream_status_name(type, type_name, 8);
old_state_name = name;
new_state_name = type_name;
}
gint64 pos = -1;
if (hls_video) {
gst_element_query_position (renderer_type[type]->pipeline, GST_FORMAT_TIME, &pos);
}
if (GST_CLOCK_TIME_IS_VALID(pos)) {
g_print("GStreamer bus message %s %s; position: %" GST_TIME_FORMAT "\n", GST_MESSAGE_SRC_NAME(message),
GST_MESSAGE_TYPE_NAME(message), GST_TIME_ARGS(pos));
g_print("GStreamer %s bus message %s %s %s %s; position: %" GST_TIME_FORMAT "\n" ,renderer_type[type]->codec,
GST_MESSAGE_SRC_NAME(message), GST_MESSAGE_TYPE_NAME(message), old_state_name, new_state_name, GST_TIME_ARGS(pos));
} else {
g_print("GStreamer bus message %s %s; position: none\n", GST_MESSAGE_SRC_NAME(message),
GST_MESSAGE_TYPE_NAME(message));
g_print("GStreamer %s bus message %s %s %s %s\n", renderer_type[type]->codec,
GST_MESSAGE_SRC_NAME(message), GST_MESSAGE_TYPE_NAME(message), old_state_name, new_state_name);
}
if (name) {
g_free(name);
}
}
/* monitor hls video position until seek to hls_start_position is achieved */
if (hls_video && hls_requested_start_position) {
if (strstr(GST_MESSAGE_SRC_NAME(message), "sink")) {
gint64 pos;
if (!GST_CLOCK_TIME_IS_VALID(hls_duration)) {
gst_element_query_duration (renderer->pipeline, GST_FORMAT_TIME, &hls_duration);
}
gst_element_query_position (renderer_type[type]->pipeline, GST_FORMAT_TIME, &pos);
//g_print("HLS position %" GST_TIME_FORMAT " requested_start_position %" GST_TIME_FORMAT " duration %" GST_TIME_FORMAT " %s\n",
// GST_TIME_ARGS(pos), GST_TIME_ARGS(hls_requested_start_position), GST_TIME_ARGS(hls_duration),
// (hls_seek_enabled ? "seek enabled" : "seek not enabled"));
if (pos > hls_requested_start_position) {
hls_requested_start_position = 0;
}
if ( hls_requested_start_position && pos < hls_requested_start_position && hls_seek_enabled) {
g_print("***************** seek to hls_requested_start_position %" GST_TIME_FORMAT "\n", GST_TIME_ARGS(hls_requested_start_position));
if (gst_element_seek_simple (renderer_type[type]->pipeline, GST_FORMAT_TIME,
GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, hls_requested_start_position)) {
hls_requested_start_position = 0;
}
}
}
}
switch (GST_MESSAGE_TYPE (message)) {
case GST_MESSAGE_DURATION:
renderer_type[type]->duration = GST_CLOCK_TIME_NONE;
hls_duration = GST_CLOCK_TIME_NONE;
break;
case GST_MESSAGE_BUFFERING:
if (hls_video) {
gint percent = -1;
gst_message_parse_buffering(message, &percent);
if (percent >= 0) {
hls_buffer_empty = TRUE;
hls_buffer_full = FALSE;
if (percent > 0) {
hls_buffer_empty = FALSE;
renderer_type[type]->buffering_level = percent;
logger_log(logger, LOGGER_DEBUG, "Buffering :%u percent done", percent);
logger_log(logger, LOGGER_DEBUG, "Buffering :%d percent done", percent);
if (percent < 100) {
gst_element_set_state (renderer_type[type]->pipeline, GST_STATE_PAUSED);
} else {
hls_buffer_full = TRUE;
gst_element_set_state (renderer_type[type]->pipeline, GST_STATE_PLAYING);
}
}
@@ -603,14 +789,31 @@ gboolean gstreamer_pipeline_bus_callback(GstBus *bus, GstMessage *message, void
}
break;
case GST_MESSAGE_STATE_CHANGED:
if (renderer_type[type]->state_pending && strstr(GST_MESSAGE_SRC_NAME(message), "pipeline")) {
GstState state;
gst_element_get_state(renderer_type[type]->pipeline, &state, NULL, 100 * GST_MSECOND);
if (state == GST_STATE_NULL) {
gst_element_set_state(renderer_type[type]->pipeline, GST_STATE_PLAYING);
} else if (state == GST_STATE_PLAYING) {
renderer_type[type]->state_pending = false;
if (hls_video && logger_debug && strstr(GST_MESSAGE_SRC_NAME(message), "hls-playbin")) {
GstState old_state, new_state;
gst_message_parse_state_changed (message, &old_state, &new_state, NULL);
g_print ("****** hls_playbin: Element %s changed state from %s to %s.\n", GST_OBJECT_NAME (message->src),
gst_element_state_get_name (old_state),
gst_element_state_get_name (new_state));
if (new_state != GST_STATE_PLAYING) {
break;
hls_playing = FALSE;
}
hls_playing = TRUE;
GstQuery *query;
query = gst_query_new_seeking(GST_FORMAT_TIME);
if (gst_element_query(renderer->pipeline, query)) {
gst_query_parse_seeking (query, NULL, &hls_seek_enabled, &hls_seek_start, &hls_seek_end);
if (hls_seek_enabled) {
g_print ("Seeking is ENABLED from %" GST_TIME_FORMAT " to %" GST_TIME_FORMAT "\n",
GST_TIME_ARGS (hls_seek_start), GST_TIME_ARGS (hls_seek_end));
} else {
g_print ("Seeking is DISABLED for this stream.\n");
}
} else {
g_printerr ("Seeking query failed.");
}
gst_query_unref (query);
}
if (renderer_type[type]->autovideo) {
char *sink = strstr(GST_MESSAGE_SRC_NAME(message), "-actual-sink-");
@@ -672,27 +875,52 @@ gboolean gstreamer_pipeline_bus_callback(GstBus *bus, GstMessage *message, void
return TRUE;
}
void video_renderer_choose_codec (bool video_is_h265) {
int video_renderer_choose_codec (bool video_is_jpeg, bool video_is_h265) {
video_renderer_t *renderer_used = NULL;
g_assert(!hls_video);
/* set renderer to h264 or h265, depending on pps/sps received by raop_rtp_mirror */
video_renderer_t *renderer_new = video_is_h265 ? renderer_type[1] : renderer_type[0];
if (renderer == renderer_new) {
return;
if (video_is_jpeg) {
renderer_used = renderer_type[0];
} else if (n_renderers == 2) {
if (video_is_h265) {
logger_log(logger, LOGGER_ERR, "video is h265 but the -h265 option was not used");
return -1;
}
renderer_used = renderer_type[1];
} else {
renderer_used = video_is_h265 ? renderer_type[2] : renderer_type[1];
}
video_renderer_t *renderer_prev = renderer;
renderer = renderer_new;
if (renderer_used == NULL) {
return -1;
} else if (renderer_used == renderer) {
return 0;
} else if (renderer) {
return -1;
}
renderer = renderer_used;
gst_element_set_state (renderer->pipeline, GST_STATE_PLAYING);
GstState old_state, new_state;
if (gst_element_get_state(renderer->pipeline, &old_state, &new_state, 100 * GST_MSECOND) == GST_STATE_CHANGE_FAILURE) {
g_error("video pipeline failed to go into playing state");
return -1;
}
logger_log(logger, LOGGER_DEBUG, "video_pipeline state change from %s to %s\n",
gst_element_state_get_name (old_state),gst_element_state_get_name (new_state));
gst_video_pipeline_base_time = gst_element_get_base_time(renderer->appsrc);
/* it seems unlikely that the codec will change between h264 and h265 during a connection,
* but in case it does, we set the previous renderer to GST_STATE_NULL, detect
* when this is finished by listening for the bus message, and then reset it to
* GST_STATE_READY, so it can be reused if the codec changes again. */
if (renderer_prev) {
gst_app_src_end_of_stream (GST_APP_SRC(renderer_prev->appsrc));
gst_bus_set_flushing(renderer_prev->bus, TRUE);
/* set state of previous renderer to GST_STATE_NULL to (hopefully?) close its video window */
gst_element_set_state (renderer_prev->pipeline, GST_STATE_NULL);
renderer_prev->state_pending = true; // will set state to PLAYING once state is NULL
if (n_renderers > 2 && renderer == renderer_type[2]) {
logger_log(logger, LOGGER_INFO, "*** video format is h265 high definition (HD/4K) video %dx%d", width, height);
}
/* destroy unused renderers */
for (int i = 1; i < n_renderers; i++) {
if (renderer_type[i] == renderer) {
continue;
}
if (renderer_type[i]) {
video_renderer_t *renderer_unused = renderer_type[i];
renderer_type[i] = NULL;
video_renderer_destroy_instance(renderer_unused);
}
}
return 0;
}
unsigned int video_reset_callback(void * loop) {
@@ -709,16 +937,18 @@ unsigned int video_reset_callback(void * loop) {
return (unsigned int) TRUE;
}
bool video_get_playback_info(double *duration, double *position, float *rate) {
bool video_get_playback_info(double *duration, double *position, float *rate, bool *buffer_empty, bool *buffer_full) {
gint64 pos = 0;
GstState state;
*duration = 0.0;
*position = -1.0;
*rate = 0.0f;
if (!renderer) {
return true;
}
*buffer_empty = (bool) hls_buffer_empty;
*buffer_full = (bool) hls_buffer_full;
gst_element_get_state(renderer->pipeline, &state, NULL, 0);
*rate = 0.0f;
switch (state) {
@@ -728,12 +958,12 @@ bool video_get_playback_info(double *duration, double *position, float *rate) {
break;
}
if (!GST_CLOCK_TIME_IS_VALID(renderer->duration)) {
if (!gst_element_query_duration (renderer->pipeline, GST_FORMAT_TIME, &renderer->duration)) {
if (!GST_CLOCK_TIME_IS_VALID(hls_duration)) {
if (!gst_element_query_duration (renderer->pipeline, GST_FORMAT_TIME, &hls_duration)) {
return true;
}
}
*duration = ((double) renderer->duration) / GST_SECOND;
*duration = ((double) hls_duration) / GST_SECOND;
if (*duration) {
if (gst_element_query_position (renderer->pipeline, GST_FORMAT_TIME, &pos) &&
GST_CLOCK_TIME_IS_VALID(pos)) {
@@ -742,19 +972,27 @@ bool video_get_playback_info(double *duration, double *position, float *rate) {
}
logger_log(logger, LOGGER_DEBUG, "********* video_get_playback_info: position %" GST_TIME_FORMAT " duration %" GST_TIME_FORMAT " %s *********",
GST_TIME_ARGS (pos), GST_TIME_ARGS (renderer->duration), gst_element_state_get_name(state));
GST_TIME_ARGS (pos), GST_TIME_ARGS (hls_duration), gst_element_state_get_name(state));
return true;
}
void video_renderer_set_start(float position) {
int pos_in_micros = (int) (position * SECOND_IN_MICROSECS);
hls_requested_start_position = (gint64) (pos_in_micros * GST_USECOND);
logger_log(logger, LOGGER_DEBUG, "register HLS video start position %f %lld", position,
hls_requested_start_position);
}
void video_renderer_seek(float position) {
double pos = (double) position;
pos *= GST_SECOND;
gint64 seek_position = (gint64) pos;
int pos_in_micros = (int) (position * SECOND_IN_MICROSECS);
gint64 seek_position = (gint64) (pos_in_micros * GST_USECOND);
/* don't seek to within 1 microsecond of beginning or end of video */
if (hls_duration < 2000) return;
seek_position = seek_position < 1000 ? 1000 : seek_position;
seek_position = seek_position > renderer->duration - 1000 ? renderer->duration - 1000: seek_position;
seek_position = seek_position > hls_duration - 1000 ? hls_duration - 1000 : seek_position;
g_print("SCRUB: seek to %f secs = %" GST_TIME_FORMAT ", duration = %" GST_TIME_FORMAT "\n", position,
GST_TIME_ARGS(seek_position), GST_TIME_ARGS(renderer->duration));
GST_TIME_ARGS(seek_position), GST_TIME_ARGS(hls_duration));
gboolean result = gst_element_seek_simple(renderer->pipeline, GST_FORMAT_TIME,
(GstSeekFlags)(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT),
seek_position);

View File

@@ -49,21 +49,23 @@ typedef struct video_renderer_s video_renderer_t;
void video_renderer_init (logger_t *logger, const char *server_name, videoflip_t videoflip[2], const char *parser,
const char *decoder, const char *converter, const char *videosink, const char *videosink_options,
bool initial_fullscreen, bool video_sync, bool h265_support, const char *uri);
bool initial_fullscreen, bool video_sync, bool h265_support, guint playbin_version, const char *uri);
void video_renderer_start ();
void video_renderer_stop ();
void video_renderer_pause ();
void video_renderer_seek(float position);
void video_renderer_set_start(float position);
void video_renderer_resume ();
bool video_renderer_is_paused();
void video_renderer_render_buffer (unsigned char* data, int *data_len, int *nal_count, uint64_t *ntp_time);
uint64_t video_renderer_render_buffer (unsigned char* data, int *data_len, int *nal_count, uint64_t *ntp_time);
void video_renderer_display_jpeg(const void *data, int *data_len);
void video_renderer_flush ();
unsigned int video_renderer_listen(void *loop, int id);
void video_renderer_destroy ();
void video_renderer_size(float *width_source, float *height_source, float *width, float *height);
bool waiting_for_x11_window();
bool video_get_playback_info(double *duration, double *position, float *rate);
void video_renderer_choose_codec(bool is_h265);
bool video_get_playback_info(double *duration, double *position, float *rate, bool *buffer_empty, bool *buffer_full);
int video_renderer_choose_codec (bool video_is_jpeg, bool video_is_h265);
unsigned int video_renderer_listen(void *loop, int id);
unsigned int video_reset_callback(void *loop);
#ifdef __cplusplus

View File

@@ -43,8 +43,8 @@ struct X11_Window_s {
Window window;
} typedef X11_Window_t;
static void get_X11_Display(X11_Window_t * X11) {
X11->display = XOpenDisplay(NULL);
static void get_X11_Display(X11_Window_t * X11, char *display_name) {
X11->display = XOpenDisplay(display_name);
X11->window = (Window) NULL;
}

View File

@@ -1,11 +1,11 @@
.TH UXPLAY "1" "December 2024" "1.71" "User Commands"
.TH UXPLAY "1" "May 2025" "1.72" "User Commands"
.SH NAME
uxplay \- start AirPlay server
.SH SYNOPSIS
.B uxplay
[\fI\,-n name\/\fR] [\fI\,-s wxh\/\fR] [\fI\,-p \/\fR[\fI\,n\/\fR]] [more \fI OPTIONS \/\fR ...]
.SH DESCRIPTION
UxPlay 1.71: An open\-source AirPlay mirroring (+ audio streaming) server:
UxPlay 1.72: An open\-source AirPlay mirroring (+ audio streaming) server:
.SH OPTIONS
.TP
.B
@@ -16,6 +16,8 @@ UxPlay 1.71: An open\-source AirPlay mirroring (+ audio streaming) server:
\fB\-h265\fR Support h265 (4K) video (with h265 versions of h264 plugins)
.TP
\fB\-hls\fI[v]\fR Support HTTP Live Streaming (currently YouTube video only)
.IP
v = 2 or 3 (default 3) optionally selects video player version
.TP
\fB\-pin\fI[xxxx]\fR Use a 4-digit pin code to control client access (default: no)
.IP
@@ -25,6 +27,13 @@ UxPlay 1.71: An open\-source AirPlay mirroring (+ audio streaming) server:
.IP
client pin-registration; (option: use file "fn" for this)
.TP
\fB\-pw\fI [pwd]\fR Require use of password "pwd" to control client access.
.IP
(with no \fIpwd\fR, pin entry is required at \fIeach\fR connection.)
.IP
(option "-pw" after "-pin" overrides it, and vice versa.)
.TP
\fB\-vsync\fI[x]\fR Mirror mode: sync audio to video using timestamps (default)
.IP
\fIx\fR is optional audio delay: millisecs, decimal, can be neg.
@@ -42,6 +51,8 @@ UxPlay 1.71: An open\-source AirPlay mirroring (+ audio streaming) server:
.TP
\fB\-taper\fR Use a "tapered" AirPlay volume-control profile.
.TP
\fB\-vol\fI v \fR Set initial audio-streaming volume: range [mute=0.0:1.0=full].
.TP
\fB\-s\fR wxh[@r] Request to client for video display resolution [refresh_rate]
.IP
default 1920x1080[@60] (or 3840x2160[@60] with -h265 option).
@@ -49,7 +60,7 @@ UxPlay 1.71: An open\-source AirPlay mirroring (+ audio streaming) server:
.TP
\fB\-o\fR Set display "overscanned" mode on (not usually needed)
.TP
\fB-fs\fR Full-screen (only works with X11, Wayland, VAAPI, D3D11)
\fB-fs\fR Full-screen (only with X11, Wayland, VAAPI, D3D11, kms)
.TP
\fB\-p\fR Use legacy ports UDP 6000:6001:7011 TCP 7000:7001:7100
.TP
@@ -104,13 +115,19 @@ UxPlay 1.71: An open\-source AirPlay mirroring (+ audio streaming) server:
.TP
\fB\-al\fR x Audio latency in seconds (default 0.25) reported to client.
.TP
\fB\-ca\fR Display cover-art in AirPlay Audio (ALAC) mode.
.TP
\fB\-ca\fI fn \fR In Airplay Audio (ALAC) mode, write cover-art to file fn.
.TP
\fB\-reset\fR n Reset after 3n seconds client silence (default 5, 0=never).
\fB\-md\fI fn \fR In Airplay Audio (ALAC) mode, write metadata text to file fn.
.TP
\fB\-reset\fR n Reset after n seconds client silence (default n=15, 0=never).
.TP
\fB\-nofreeze\fR Do NOT leave frozen screen in place after reset.
.TP
\fB\-nc\fR Do NOT close video window when client stops mirroring
\fB\-nc\fR Do NOT close video window when client stops mirroring.
.TP
\fB\-nc\fR no Cancel the -nc option (DO close video window).
.TP
\fB\-nohold\fR Drop current connection when new client connects.
.TP
@@ -163,11 +180,13 @@ UxPlay 1.71: An open\-source AirPlay mirroring (+ audio streaming) server:
audio packets are dumped. "aud"= unknown format.
.PP
.TP
\fB\-d\fR Enable debug logging
\fB\-d [n]\fR Enable debug logging; optional: n=1 to skip normal packet data.
.TP
\fB\-v\fR Displays version information
.TP
\fB\-h\fR Displays help information
.TP
\fB\-rc\fI fn\fR Read startup options from file "fn" instead of ~/.uxplayrc, etc
.SH
FILES
Options in one of $UXPLAYRC, or ~/.uxplayrc, or ~/.config/uxplayrc
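Putting the man page's new and changed options together, an invocation might look like this (the password and filename are illustrative, not defaults):

```shell
# stream YouTube HLS video, require the password "secret" from clients,
# start audio streaming at half volume, write ALAC cover art to cover.jpg,
# and enable debug logging without normal packet data
uxplay -hls -pw secret -vol 0.5 -ca cover.jpg -d 1
```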

File diff suppressed because it is too large

uxplay.service Normal file
View File

@@ -0,0 +1,13 @@
[Unit]
Description=AirPlay Unix mirroring server
Requires=avahi-daemon.service
After=avahi-daemon.service
[Service]
Type=simple
ExecStart=uxplay
Restart=on-failure
#StandardOutput=file:%h/uxplay.log
[Install]
WantedBy=default.target
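Since the unit installs under the docdir rather than a systemd search path, it can be run as a per-user service by copying it into place first (the source path below is an assumption; adjust to wherever your build installed `uxplay.service`):

```shell
# install and start uxplay as a per-user systemd service
mkdir -p ~/.config/systemd/user
cp /usr/local/share/doc/uxplay/systemd/uxplay.service ~/.config/systemd/user/
systemctl --user daemon-reload
systemctl --user enable --now uxplay
journalctl --user -u uxplay -f   # follow the server log
```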

View File

@@ -1,5 +1,5 @@
Name: uxplay
Version: 1.71.1
Version: 1.72.2
Release: 1%{?dist}
%global gittag v%{version}
@@ -135,6 +135,8 @@ cd build
%{_docdir}/%{name}/llhttp/LICENSE-MIT
%changelog
* Mon Jul 7 2025 UxPlay maintainer <https://github.com/FDH2/UxPlay>
Update for 1.72.2 release
* Fri Nov 15 2024 UxPlay maintainer <https://github.com/FDH2/UxPlay>
Initial uxplay.spec: tested on Fedora 38, Rocky Linux 9.2, OpenSUSE
Leap 15.5, Mageia 9, OpenMandriva ROME, PCLinuxOS