Merge pull request #133 from FDH2/master

Update to UxPlay1.64
This commit is contained in:
antimof
2023-05-06 21:16:12 +03:00
committed by GitHub
18 changed files with 908 additions and 576 deletions


@@ -1,6 +1,6 @@
<h1
id="uxplay-1.63-airplay-mirror-and-airplay-audio-server-for-linux-macos-and-unix-now-also-runs-on-windows.">UxPlay
1.63: AirPlay-Mirror and AirPlay-Audio server for Linux, macOS, and Unix
id="uxplay-1.64-airplay-mirror-and-airplay-audio-server-for-linux-macos-and-unix-now-also-runs-on-windows.">UxPlay
1.64: AirPlay-Mirror and AirPlay-Audio server for Linux, macOS, and Unix
(now also runs on Windows).</h1>
<h3
id="now-developed-at-the-github-site-httpsgithub.comfdh2uxplay-where-all-user-issues-should-be-posted.">Now
@@ -28,7 +28,7 @@ pipeline).</li>
<li>Support for server behind a firewall.</li>
<li>Raspberry Pi support <strong>both with and without hardware video
decoding</strong> by the Broadcom GPU. <em>Tested on Raspberry Pi 4
Model B.</em></li>
Model B and Pi 3 model B+.</em></li>
<li>Support for running on Microsoft Windows (builds with the MinGW-64
compiler in the unix-like MSYS2 environment).</li>
</ul>
@@ -83,12 +83,14 @@ posts updates pulled from the new main <a
href="https://github.com/FDH2/UxPlay">UxPlay site</a>).</p>
<p>UxPlay is tested on a number of systems, including (among others)
Debian 10.11 “Buster” and 11.2 “Bullseye”, Ubuntu 20.04 LTS and 22.04.1
LTS, Linux Mint 20.3, Pop!_OS 22.04 (NVIDIA edition), Rocky Linux 8.6 (a
CentOS successor), Fedora 36, OpenSUSE 15.4, Arch Linux 22.10, macOS
12.3 (Intel and M1), FreeBSD 13.1, Windows 10 and 11 (64 bit).</p>
LTS, (also Ubuntu derivatives Linux Mint 20.3, Pop!_OS 22.04 (NVIDIA
edition)), Rocky Linux 9.1 (a CentOS successor), Fedora 36, OpenSUSE
15.4, Arch Linux 22.10, macOS 13.3 (Intel and M2), FreeBSD 13.2, Windows
10 and 11 (64 bit).</p>
<p>On Raspberry Pi 4 model B, it is tested on Raspberry Pi OS (Bullseye)
(32- and 64-bit), Ubuntu 22.10, Manjaro RPi4 23.02, and (without
hardware video decoding) on OpenSUSE 15.4.</p>
(32- and 64-bit), Ubuntu 22.04 and 22.10, Manjaro RPi4 23.02, and
(without hardware video decoding) on OpenSUSE 15.4. Also tested on
Raspberry Pi 3 model B+.</p>
<p>Its main use is to act like an AppleTV for screen-mirroring (with
audio) of iOS/iPadOS/macOS clients (iPhone, iPod Touch, iPad, Mac
computers) on the server display of a host running Linux, macOS, or
@@ -328,7 +330,7 @@ plugins (Non-Debian-based Linux or *BSD)</h3>
<li><p><strong>Red Hat, or clones like CentOS (now continued as Rocky
Linux or Alma Linux):</strong> (sudo dnf install, or sudo yum install)
gstreamer1-libav gstreamer1-plugins-bad-free (+ gstreamer1-vaapi for
intel graphics). <em>You may need to get some of them (in particular
Intel/AMD graphics). <em>You may need to get some of them (in particular
gstreamer1-libav) from <a href="https://rpmfusion.org">rpmfusion.org</a>
(which provides packages including plugins that RedHat does not ship for
license reasons). [In recent <strong>Fedora</strong>, the libav plugin
@@ -340,29 +342,37 @@ fail to start, with error: <strong>no element “avdec_aac”</strong>
]</em>.</p></li>
<li><p><strong>OpenSUSE:</strong> (sudo zypper install)
gstreamer-plugins-libav gstreamer-plugins-bad (+ gstreamer-plugins-vaapi
for Intel graphics). <em>In some cases, you may need to use gstreamer or
libav* packages for OpenSUSE from <a
for Intel/AMD graphics). <em>In some cases, you may need to use
gstreamer or libav* packages for OpenSUSE from <a
href="https://ftp.gwdg.de/pub/linux/misc/packman/suse/">Packman</a>
“Essentials” (which provides packages including plugins that OpenSUSE
does not ship for license reasons; recommendation: after adding the
Packman repository, use the option in YaST Software management to switch
all system packages for multimedia to Packman).</em></p></li>
<li><p><strong>Arch Linux</strong> (sudo pacman -Syu) gst-plugins-good
gst-plugins-bad gst-libav (+ gstreamer-vaapi for Intel
gst-plugins-bad gst-libav (+ gstreamer-vaapi for Intel/AMD
graphics).</p></li>
<li><p><strong>FreeBSD:</strong> (sudo pkg install) gstreamer1-libav,
gstreamer1-plugins, gstreamer1-plugins-* (* = core, good, bad, x, gtk,
gl, vulkan, pulse, v4l2, …), (+ gstreamer1-vaapi for Intel
gl, vulkan, pulse, v4l2, …), (+ gstreamer1-vaapi for Intel/AMD
graphics).</p></li>
</ul>
<h3 id="starting-uxplay">Starting UxPlay</h3>
<p><strong>Finally, run uxplay in a terminal window</strong>. On some
systems, you can toggle into and out of fullscreen mode with F11 or
(held-down left Alt)+Enter keys. Use Ctrl-C (or close the window) to
terminate it when done. If the UxPlay server is not seen by the iOS
clients drop-down “Screen Mirroring” panel, check that your DNS-SD
server (usually avahi-daemon) is running: do this in a terminal window
with <code>systemctl status avahi-daemon</code>. If this shows the
<h3 id="starting-and-running-uxplay">Starting and running UxPlay</h3>
<p>Since UxPlay-1.64, UxPlay can be started with options read from a
configuration file, which will be the first found of (1) a file with a
path given by environment variable <code>$UXPLAYRC</code>, (2)
<code>~/.uxplayrc</code> in the user's home directory (“~”), (3)
<code>~/.config/uxplayrc</code>. The format is one option per line,
omitting the initial <code>"-"</code> of the command-line option. Lines
in the configuration file beginning with <code>"#"</code> are treated as
comments and ignored.</p>
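<p>As a concrete sketch of the format just described (the option names
<code>nohold</code> and <code>fps</code> are taken from this README; the
choice of values is illustrative, not a recommendation), a minimal
configuration file could be created like this:</p>

```shell
# A minimal uxplayrc sketch: one option per line, without the leading "-";
# lines beginning with "#" are comments. Written to $UXPLAYRC if set,
# otherwise to ~/.uxplayrc (the first two places UxPlay looks).
rcfile="${UXPLAYRC:-$HOME/.uxplayrc}"
cat > "$rcfile" <<'EOF'
# let a new client take over an existing connection
nohold
# allow 60 frames-per-second video where the client offers it
fps 60
EOF
```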
<p><strong>Run uxplay in a terminal window</strong>. On some systems,
you can toggle into and out of fullscreen mode with F11 or (held-down
left Alt)+Enter keys. Use Ctrl-C (or close the window) to terminate it
when done. If the UxPlay server is not seen by the iOS client's
drop-down “Screen Mirroring” panel, check that your DNS-SD server
(usually avahi-daemon) is running: do this in a terminal window with
<code>systemctl status avahi-daemon</code>. If this shows the
avahi-daemon is not running, control it with
<code>sudo systemctl [start,stop,enable,disable] avahi-daemon</code> (on
non-systemd systems, such as *BSD, use
@@ -375,35 +385,54 @@ mDNS queries) needed by Avahi</strong>. See <a
href="#troubleshooting">Troubleshooting</a> below for help with this or
other problems.</p>
<ul>
<li><p>you may find video is improved by the setting -fps 60 that allows
some video to be played at 60 frames per second. (You can see what
framerate is actually streaming by using -vs fpsdisplaysink, and/or
-FPSdata.)</p></li>
<li><p>By default, UxPlay is locked to its current client until that
client drops the connection; since UxPlay-1.58, the option
<code>-nohold</code> modifies this behavior so that when a new client
requests a connection, it removes the current client and takes
over.</p></li>
<li><p>In its default mode, Uxplay uses a simple GStreamer mode
(“sync=false”) that streams without using audio- and video-timestamps
for synchronization. UxPlay 1.63 also introduces <code>-vsync</code> and
<code>-async</code> as alternatives that use timestamps in Mirror and
Audio-Only modes respectively (GStreamers “sync=true” mode). Simple
default streaming in Mirror mode seems to maintain synchronisation of
audio with video on desktop systems, but you may wish to use
<code>-vsync</code>, which becomes essential in low-powered systems like
Raspberry Pi if hardware video decoding is not used (<strong>and is
likely to become the default in future releases of UxPlay</strong>).
These options also allow an optional positive (or negative) audio-delay
adjustment in <em>milliseconds</em> for fine-tuning :
<code>-vsync 20.5</code> delays audio relative to video by 0.0205 secs;
a negative value advances it.)</p></li>
<li><p>The <code>-async</code> option should be used if you want video
on the client to be synchronized with Audio-Only mode audio on the
server (<em>e.g.</em> for viewing song lyrics in Apple music while
listening to ALAC loss-free audio on the server); this introduces a
slight delay for events like pausing audio, changing tracks,
<em>etc.</em>, to be heard.</p></li>
<li><p>In Mirror mode, GStreamer has a choice of <strong>two</strong>
methods to play video with its accompanying audio: prior to UxPlay-1.64,
the video and audio streams were both played as soon as possible after
they arrived (the GStreamer “<em>sync=false</em>” method), with a
GStreamer internal clock used to try to keep them synchronized.
<strong>Starting with UxPlay-1.64, the other method (GStreamer's
“<em>sync=true</em>” mode), which uses timestamps in the audio and video
streams sent by the client, is the new default</strong>. On
low-decoding-power UxPlay hosts (such as Raspberry Pi 3 models) this
will drop video frames that cannot be decoded in time to play with the
audio, making the video jerky, but still synchronized.</p></li>
</ul>
<p>The older method which does not drop late video frames worked well on
more powerful systems, and is still available with the UxPlay option
“<code>-vsync no</code>”; this method is adapted to “live streaming”,
and may be better when using UxPlay as a second monitor for a Mac
computer, for example, while the new default timestamp-based method is
best for watching a video, to keep lip movements and voices
synchronized. (Without use of timestamps, video will eventually lag
behind audio if it cannot be decoded fast enough: hardware-accelerated
video-decoding helped to prevent this previously when timestamps were
not being used.)</p>
<ul>
<li>In Audio-only mode the GStreamer “sync=false” mode (not using
timestamps) is still the default, but if you want to keep the audio
playing on the server synchronized with the video showing on the client,
use the <code>-async</code> timestamp-based option. (An example might be
if you want to follow the Apple Music lyrics on the client while
listening to superior sound on the UxPlay server). This delays the video
on the client to match audio on the server, so leads to a slight delay
before a pause or track-change initiated on the client takes effect on
the audio played by the server.</li>
</ul>
<p>The <code>-vsync</code> and <code>-async</code> options also allow an
optional positive (or negative) audio-delay adjustment in
<em>milliseconds</em> for fine-tuning: <code>-vsync 20.5</code> delays
audio relative to video by 0.0205 secs; a negative value advances it.</p>
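<p>The millisecond-to-second arithmetic behind these adjustments can be
checked directly (the delay value is illustrative; the
<code>uxplay</code> invocations are shown only as comments):</p>

```shell
# -vsync/-async take an audio-delay in milliseconds: 20.5 ms = 0.0205 s.
#   uxplay -vsync 20.5     # Mirror mode: audio delayed 0.0205 s
#   uxplay -async -20.5    # Audio-Only mode: audio advanced 0.0205 s
delay_ms=20.5
awk -v ms="$delay_ms" 'BEGIN { printf "delay = %.4f s\n", ms / 1000 }'
```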
<ul>
<li><p>you may find video is improved by the setting -fps 60 that allows
some video to be played at 60 frames per second. (You can see what
framerate is actually streaming by using -vs fpsdisplaysink, and/or
-FPSdata.) When using this, you should use the default timestamp-based
synchronization option <code>-vsync</code>.</p></li>
<li><p>Since UxPlay-1.54, you can display the accompanying “Cover Art”
from sources like Apple Music in Audio-Only (ALAC) mode: run
<code>uxplay -ca &lt;name&gt; &amp;</code>” in the background, then run
@@ -420,15 +449,14 @@ the GStreamer VAAPI plugin. If your system uses the Wayland compositor
for graphics, use “<code>uxplay -vs waylandsink</code>”.</strong> See <a
href="#usage">Usage</a> for more run-time options.</p>
<h3
id="special-instructions-for-raspberry-pi-tested-on-r-pi-4-model-b-8gb"><strong>Special
instructions for Raspberry Pi (tested on R Pi 4 model B
8GB)</strong>:</h3>
id="special-instructions-for-raspberry-pi-tested-on-r-pi-4-model-b-8gb-and-r-pi-3-model-b"><strong>Special
instructions for Raspberry Pi (tested on R Pi 4 model B 8GB and R Pi 3
model B+)</strong>:</h3>
<ul>
<li><p>If you use the software-only (h264) video-decoding UxPlay option
<code>-avdec</code>, you also need option <code>-vsync</code>to keep
audio and video synchronized (<code>-vsync</code> is a new feature;
before it was introduced, software decoding on the Pi was not
viable.)</p></li>
<code>-avdec</code>, it now works better than earlier, with the new
default timestamp-based synchronization to keep audio and video
synchronized.</p></li>
<li><p>For best performance, the Raspberry Pi needs the GStreamer
Video4linux2 plugin to use its Broadcom GPU hardware for decoding h264
video. This needs the bcm2835_codec kernel module which is maintained
@@ -439,16 +467,24 @@ it include Raspberry Pi OS, Ubuntu, and Manjaro (all available from
Raspberry Pi with their Raspberry Pi Imager). Other distributions
generally do not provide it: <strong>without this kernel module, UxPlay
cannot use the decoding firmware in the GPU.</strong></p></li>
<li><p>The plugin in the latest GStreamer-1.22 release works well, but
</ul>
<p>For use of the GPU, use raspi-config “Performance Options” (on
Raspberry Pi OS; use a similar tool on other distributions) to allocate
sufficient memory for the GPU (on R. Pi 3 model B+, the maximum (256MB)
is suggested). Even with GPU video decoding, some frames may be dropped
by the lower-power 3 B+ to keep audio and video synchronized using
timestamps.</p>
<ul>
<li>The plugin in the latest GStreamer-1.22 release works well, but
older releases of GStreamer will not work unless patched with backports
from GStreamer-1.22. Raspberry Pi OS (Bullseye) now has a working
backport. For a fuller backport, or for other distributions, patches for
the GStreamer Video4Linux2 plugin are <a
href="https://github.com/FDH2/UxPlay/wiki/Gstreamer-Video4Linux2-plugin-patches">available
with instructions in the UxPlay Wiki</a>.</p></li>
with instructions in the UxPlay Wiki</a>.</li>
</ul>
<p>The basic uxplay options for R Pi are
<code>uxplay -vsync [-vs &lt;videosink&gt;]</code>. The choice
<code>uxplay [-vs &lt;videosink&gt;]</code>. The choice
<code>&lt;videosink&gt;</code> = <code>glimagesink</code> is sometimes
useful. On a system without X11 (like R Pi OS Lite) with framebuffer
video, use <code>&lt;videosink&gt;</code> = <code>kmssink</code>. With
@@ -458,10 +494,10 @@ hardware video decoding, an option <code>-v4l2</code> may be useful: for
convenience, this also comes combined with various videosink options as
<code>-rpi</code>, <code>-rpigl</code> <code>-rpifb</code>,
<code>-rpiwl</code>, respectively provided for X11, X11 with OpenGL,
framebuffer, and Wayland systems. You may find that just
<code>uxplay -vsync</code>”, (<em>without</em> <code>-v4l2</code> or
framebuffer, and Wayland systems. <strong>You may find that just
“<code>uxplay</code>”, (<em>without</em> <code>-v4l2</code> or
<code>-rpi*</code> options, which lets GStreamer try to find the best
video solution by itself) provides the best results (the
video solution by itself) provides the best results</strong> (the
<code>-rpi*</code> options may be removed in a future release of
UxPlay.)</p>
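<p>The convenience options just listed can be summarized in a small
sketch (the helper <code>pick_rpi_flags</code> is hypothetical; the
flags themselves are the <code>-rpi*</code> options from the text, and
plain “<code>uxplay</code>” is often the best choice):</p>

```shell
# Map a Raspberry Pi display stack to the convenience option above.
pick_rpi_flags() {
    case "$1" in
        x11)         echo "-rpi"   ;;  # X11
        x11-gl)      echo "-rpigl" ;;  # X11 with OpenGL
        framebuffer) echo "-rpifb" ;;  # e.g. R Pi OS Lite (kmssink)
        wayland)     echo "-rpiwl" ;;  # Wayland
        *)           echo ""       ;;  # let GStreamer decide
    esac
}
echo "uxplay $(pick_rpi_flags wayland)"
```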
<ul>
@@ -493,54 +529,74 @@ developer tools are installed (if Xcode is installed, open the Terminal,
type “sudo xcode-select --install” and accept the conditions).</p>
<p>It is also assumed that CMake &gt;= 3.13 is installed: this can be
done with package managers <a
href="http://www.macports.org">MacPorts</a>, <a
href="http://finkproject.org">Fink</a> or <a
href="http://brew.sh">Homebrew</a>, or by a download from <a
href="https://cmake.org/download/">https://cmake.org/download/</a>.</p>
<p>First install OpenSSL and libplist: static versions of these
libraries will be used, so they can be uninstalled after UxPlay is
built. These are available in MacPorts and Homebrew, or they can easily
be built from source (see instructions at the end of this README; this
requires development tools autoconf, automake, libtool, which can be
installed using MacPorts, HomeBrew, or Fink).</p>
href="http://www.macports.org">MacPorts</a>
(<code>sudo port install cmake</code>), <a
href="http://brew.sh">Homebrew</a> (<code>brew install cmake</code>), or
by a download from <a
href="https://cmake.org/download/">https://cmake.org/download/</a>. Also
install <code>git</code> if you will use it to fetch UxPlay.</p>
<p>Next install libplist and openssl-3.x. Note that static versions of
these libraries will be used in the macOS builds, so they can be
uninstalled after building uxplay, if you wish.</p>
<ul>
<li><p>If you use Homebrew:
<code>brew install libplist openssl@3</code></p></li>
<li><p>if you use MacPorts:
<code>sudo port install libplist-devel openssl3</code></p></li>
</ul>
<p>Otherwise, build libplist and openssl from source: see instructions
near the end of this README; requires development tools (autoconf,
automake, libtool, <em>etc.</em>) to be installed.</p>
<p>Next get the latest macOS release of GStreamer-1.0.</p>
<ul>
<li><p>recommended: install the “official” GStreamer release for macOS
from <a
<p><strong>Using “Official” GStreamer (Recommended for both MacPorts and
Homebrew users)</strong>: install the GStreamer release for macOS from
<a
href="https://gstreamer.freedesktop.org/download/">https://gstreamer.freedesktop.org/download/</a>.
The alternative is to install it from Homebrew. MacPorts packages of
GStreamer are compiled to use X11 and are <strong>NOT</strong>
recommended.</p></li>
<li><p>You could instead compile the “official” GStreamer release from
source: GStreamer-1.22.0 has been successfully built this way on a
system using MacPorts: see <a
href="https://github.com/FDH2/UxPlay/wiki/Building-GStreamer-from-Source-on-macOS-with-MacPorts">the
UxPlay Wiki</a></p></li>
</ul>
<p><strong>For the “official” release</strong>: install both the macOS
runtime and development installer packages. Assuming that the latest
release is 1.20.5 install
<code>gstreamer-1.0-1.20.5-universal.pkg</code> and
<code>gstreamer-1.0-devel-1.20.5-universal.pkg</code>. Click on them to
install (they install to /Library/FrameWorks/GStreamer.framework).</p>
(This release contains its own pkg-config, so you don't have to install
one.) Install both the gstreamer-1.0 and gstreamer-1.0-devel packages.
After downloading, Shift-Click on them to install (they install to
/Library/FrameWorks/GStreamer.framework). Homebrew or MacPorts users
should <strong>not</strong> install (or should uninstall) the GStreamer
supplied by their package manager, if they use the “official”
release.</p>
<ul>
<li><strong>ADDED 2023-01-25: v1.22.0 has just been released, but these
binaries seem to have problems, perhaps only on older macOS releases;
use v1.20.5 if they dont work for you.</strong></li>
<li><strong>ADDED 2023-01-25: in the latest release (now 1.22.2)
something in the GStreamer macOS binaries appears to not be working
(UxPlay starts receiving the AirPlay stream, but the video window does
not open)</strong>. If you have this problem, use the GStreamer-1.20.6
binary packages until a fix is found. <em>You could instead compile the
“official” GStreamer-1.22.x release from source: GStreamer-1.22.0 has
been successfully built this way on a system using MacPorts: see</em> <a
href="https://github.com/FDH2/UxPlay/wiki/Building-GStreamer-from-Source-on-macOS-with-MacPorts">the
UxPlay Wiki</a>.</li>
</ul>
<p><strong>For Homebrew</strong>: pkgconfig is needed (“brew install
pkgconfig”). Then “brew install gst-plugins-base gst-plugins-good
gst-plugins-bad gst-libav”. This appears to be functionally equivalent
to using GStreamer.framework, but causes a large number of extra
packages to be installed by Homebrew as dependencies. <strong>You may
need to set the environment variable
GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0 to point to the Homebrew
GStreamer installation.</strong></p>
<p><strong>Using Homebrew's GStreamer</strong>: pkg-config is needed
(“brew install pkg-config gstreamer”). This causes a large number of
extra packages to be installed by Homebrew as dependencies. The <a
href="https://formulae.brew.sh/formula/gstreamer#default">Homebrew
gstreamer installation</a> has recently been reworked into a single
“formula” named <code>gstreamer</code>, which now works without needing
GST_PLUGIN_PATH to be set in the environment. Homebrew installs gstreamer
to <code>(HOMEBREW)/lib/gstreamer-1.0</code> where
<code>(HOMEBREW)/*</code> is <code>/opt/homebrew/*</code> on Apple
Silicon Macs, and <code>/usr/local/*</code> on Intel Macs; do not put
any extra non-Homebrew plugins (that you build yourself) there, and
instead set GST_PLUGIN_PATH to point to their location (Homebrew does
not supply a complete GStreamer, but seems to have everything needed for
UxPlay).</p>
<p>Finally, build and install uxplay: open a terminal and change into
the UxPlay source directory (“UxPlay-master” for zipfile downloads,
“UxPlay” for “git clone” downloads) and build/install with “cmake . ;
make ; sudo make install” (same as for Linux).</p>
<ul>
<li><p>Running UxPlay while checking for GStreamer warnings (do this
with “export GST_DEBUG=2” before running UxPlay) reveals that with the
default (since UxPlay 1.64) use of timestamps for video synchronization,
many video frames are being dropped (only on macOS), perhaps due to
another error (about videometa) that shows up in the GStreamer warnings.
<strong>Recommendation: use the new UxPlay “no timestamp” option
<code>-vsync no</code></strong> (you can add a line “vsync no” in the
uxplayrc configuration file).</p></li>
<li><p>On macOS with this installation of GStreamer, the only videosinks
available seem to be glimagesink (default choice made by autovideosink)
and osxvideosink. The window title does not show the Airplay server
@@ -561,21 +617,21 @@ have the correct aspect ratio when it first opens.</p></li>
</ul>
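<p>The macOS workaround recommended above (checking GStreamer warnings
and disabling timestamp-based video synchronization) can be sketched as
follows; the idempotent append to the uxplayrc file is an assumption of
this sketch, not part of UxPlay itself:</p>

```shell
# Surface GStreamer warnings when uxplay is next run.
export GST_DEBUG=2
# Persist "-vsync no" via the configuration file ($UXPLAYRC, else
# ~/.uxplayrc), appending it only if it is not already present.
rcfile="${UXPLAYRC:-$HOME/.uxplayrc}"
grep -qx 'vsync no' "$rcfile" 2>/dev/null || printf 'vsync no\n' >> "$rcfile"
# then start uxplay as usual; it reads the option at startup
```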
<p><strong><em>Using GStreamer installed from MacPorts (not
recommended):</em></strong></p>
<p>To install: “sudo port install pkgconfig”; “sudo port install
<p>To install: “sudo port install pkgconf”; “sudo port install
gstreamer1-gst-plugins-base gstreamer1-gst-plugins-good
gstreamer1-gst-plugins-bad gstreamer1-gst-libav”. <strong>The MacPorts
GStreamer is built to use X11</strong>: use the special CMake option
<code>-DUSE_X11=ON</code> when building UxPlay. Then uxplay must be run
from an XQuartz terminal, and needs option “-vs ximagesink”. On an
unibody (non-retina) MacBook Pro, the default resolution wxh = 1920x1080
was too large, but using option “-s 800x600” worked. The MacPorts
GStreamer pipeline seems fragile against attempts to change the X11
window size, or to rotations that switch a connected client between
portrait and landscape mode while uxplay is running. Using the MacPorts
X11 GStreamer seems only possible if the image size is left unchanged
from the initial “-s wxh” setting (also use the iPad/iPhone setting that
locks the screen orientation against switching between portrait and
landscape mode as the device is rotated).</p>
GStreamer is old (v1.16.2) and built to use X11</strong>: use the
special CMake option <code>-DUSE_X11=ON</code> when building UxPlay.
Then uxplay must be run from an XQuartz terminal, and needs option “-vs
ximagesink”. On a unibody (non-retina) MacBook Pro, the default
resolution wxh = 1920x1080 was too large, but using option “-s 800x600”
worked. The MacPorts GStreamer pipeline seems fragile against attempts
to change the X11 window size, or to rotations that switch a connected
client between portrait and landscape mode while uxplay is running.
Using the MacPorts X11 GStreamer seems only possible if the image size
is left unchanged from the initial “-s wxh” setting (also use the
iPad/iPhone setting that locks the screen orientation against switching
between portrait and landscape mode as the device is rotated).</p>
<h2
id="building-uxplay-on-microsoft-windows-using-msys2-with-the-mingw-64-compiler.">Building
UxPlay on Microsoft Windows, using MSYS2 with the MinGW-64
@@ -689,15 +745,19 @@ the mirror display (X11) window.</p>
<p><strong>-nh</strong> Do not append “<span class="citation"
data-cites="_hostname_">@_hostname_</span>” at the end of the AirPlay
server name.</p>
<p><strong>-vsync [x]</strong> (In Mirror mode:) this option uses
timestamps to synchronize audio with video on the server, with an
optional audio delay in (decimal) milliseconds (<em>x</em> = “20.5”
means 0.0205 seconds delay: positive or negative delays less than a
second are allowed.) It is needed on low-power systems such as Raspberry
Pi without hardware video decoding. Standard desktop systems seem to
work well without this (streaming without use of timestamps was the only
behavior prior to UxPlay 1.63), but you may wish to use it there too.
(It may become the default in future releases.)</p>
<p><strong>-vsync [x]</strong> (In Mirror mode:) this option
(<strong>now the default</strong>) uses timestamps to synchronize audio
with video on the server, with an optional audio delay in (decimal)
milliseconds (<em>x</em> = “20.5” means 0.0205 seconds delay: positive
or negative delays less than a second are allowed.) It is needed on
low-power systems such as Raspberry Pi without hardware video
decoding.</p>
<p><strong>-vsync no</strong> (In Mirror mode:) this switches off
timestamp-based audio-video synchronization, restoring the default
behavior prior to UxPlay-1.64. Standard desktop systems seem to work
well without use of timestamps: this mode is appropriate for “live
streaming” such as using UxPlay as a second monitor for a mac computer,
or monitoring a webcam; with it, no video frames are dropped.</p>
<p><strong>-async [x]</strong> (In Audio-Only (ALAC) mode:) this option
uses timestamps to synchronize audio on the server with video on the
client, with an optional audio delay in (decimal) milliseconds
@@ -710,6 +770,10 @@ principle be mitigated by using the <code>-al</code> audio latency
setting to change the latency (default 0.25 secs) that the server
reports to the client, but at present changing this does not seem to
have any effect</em>.</p>
<p><strong>-async no</strong>. This is still the default behavior in
Audio-only mode, but this option may be useful as a command-line option
to switch off an <code>-async</code> option set in a “uxplayrc”
configuration file.</p>
<p><strong>-s wxh</strong> (e.g. -s 1920x1080 , which is the default )
sets the display resolution (width and height, in pixels). (This may be
a request made to the AirPlay client, and perhaps will not be the final
@@ -1017,13 +1081,13 @@ href="https://github.com/FDH2/UxPlay/wiki/Gstreamer-Video4Linux2-plugin-patches"
for patches). This is fixed in GStreamer-1.22, and by backport patches
from this in distributions such as Raspberry Pi OS (Bullseye):
<strong>use option <code>-bt709</code> with the GStreamer-1.18.4 from
Raspberry Pi OS</strong>.. This also needs the bcm2835-codec kernel
Raspberry Pi OS</strong>. This also needs the bcm2835-codec kernel
module that is not in the standard Linux kernel (it is available in
Raspberry Pi OS, Ubuntu and Manjaro).</p>
<ul>
<li><strong>If this kernel module is not available in your Raspberry Pi
operating system, or if GStreamer &lt; 1.22 is not patched, use options
<code>-avdec -vsync</code> for software h264-decoding.</strong></li>
operating system, or if GStreamer &lt; 1.22 is not patched, use option
<code>-avdec</code> for software h264-decoding.</strong></li>
</ul>
<p>Sometimes “autovideosink” may select the OpenGL renderer
“glimagesink” which may not work correctly on your system. Try the
@@ -1173,6 +1237,10 @@ the client by the AirPlay server) to be set.</p>
<p>The “features” code and other settings are set in
<code>UxPlay/lib/dnssdint.h</code>.</p>
<h1 id="changelog">Changelog</h1>
<p>1.64 2023-04-23 Timestamp-based synchronization of audio and video is
now the default in Mirror mode. (Use “-vsync no” to restore previous
behavior.) A configuration file can now be used for startup options.
Also some internal cleanups and a minor bugfix that fixes #192.</p>
<p>1.63 2023-02-12 Reworked audio-video synchronization, with new
options -vsync (for Mirror mode) and -async (for Audio-Only mode, to
sync with client video). Option -vsync makes software h264 decoding of

README.md

@@ -1,4 +1,4 @@
# UxPlay 1.63: AirPlay-Mirror and AirPlay-Audio server for Linux, macOS, and Unix (now also runs on Windows).
# UxPlay 1.64: AirPlay-Mirror and AirPlay-Audio server for Linux, macOS, and Unix (now also runs on Windows).
### Now developed at the GitHub site [https://github.com/FDH2/UxPlay](https://github.com/FDH2/UxPlay) (where all user issues should be posted).
@@ -19,7 +19,7 @@
to select different hardware-appropriate output "videosinks" and
"audiosinks", and a fully-user-configurable video streaming pipeline).
* Support for server behind a firewall.
* Raspberry Pi support **both with and without hardware video decoding** by the Broadcom GPU. _Tested on Raspberry Pi 4 Model B._
* Raspberry Pi support **both with and without hardware video decoding** by the Broadcom GPU. _Tested on Raspberry Pi 4 Model B and Pi 3 model B+._
* Support for running on Microsoft Windows (builds with the MinGW-64 compiler in the
unix-like MSYS2 environment).
@@ -60,11 +60,12 @@ development, but periodically posts updates pulled from the new
main [UxPlay site](https://github.com/FDH2/UxPlay)).
UxPlay is tested on a number of systems, including (among others) Debian 10.11 "Buster" and 11.2 "Bullseye",
Ubuntu 20.04 LTS and 22.04.1 LTS, Linux Mint 20.3, Pop!\_OS 22.04 (NVIDIA edition), Rocky Linux 8.6 (a CentOS successor), Fedora 36,
OpenSUSE 15.4, Arch Linux 22.10, macOS 12.3 (Intel and M1), FreeBSD 13.1, Windows 10 and 11 (64 bit).
Ubuntu 20.04 LTS and 22.04.1 LTS, (also Ubuntu derivatives Linux Mint 20.3, Pop!\_OS 22.04 (NVIDIA edition)),
Rocky Linux 9.1 (a CentOS successor), Fedora 36, OpenSUSE 15.4, Arch Linux 22.10, macOS 13.3 (Intel and M2),
FreeBSD 13.2, Windows 10 and 11 (64 bit).
On Raspberry Pi 4 model B, it is tested on Raspberry Pi OS (Bullseye) (32- and 64-bit), Ubuntu 22.10, Manjaro RPi4 23.02, and (without hardware
video decoding) on OpenSUSE 15.4.
On Raspberry Pi 4 model B, it is tested on Raspberry Pi OS (Bullseye) (32- and 64-bit), Ubuntu 22.04 and 22.10, Manjaro RPi4 23.02,
and (without hardware video decoding) on OpenSUSE 15.4. Also tested on Raspberry Pi 3 model B+.
Its main use is to act like an AppleTV for screen-mirroring (with audio) of iOS/iPadOS/macOS clients
(iPhone, iPod Touch, iPad, Mac computers) on the server display
@@ -279,7 +280,7 @@ installed, depending on how your audio is set up.
* **Red Hat, or clones like CentOS (now continued as Rocky Linux or Alma Linux):**
(sudo dnf install, or sudo yum install) gstreamer1-libav gstreamer1-plugins-bad-free (+ gstreamer1-vaapi
for intel graphics). _You may need to get some of them (in particular gstreamer1-libav) from [rpmfusion.org](https://rpmfusion.org)
for Intel/AMD graphics). _You may need to get some of them (in particular gstreamer1-libav) from [rpmfusion.org](https://rpmfusion.org)
(which provides packages including plugins that RedHat does not ship for license reasons).
[In recent **Fedora**, the libav plugin package is renamed to "gstreamer1-plugin-libav",
which now needs the RPM Fusion package ffmpeg-libs for the
@@ -290,7 +291,7 @@ error: **no element "avdec_aac"** ]_.
* **OpenSUSE:**
(sudo zypper install)
gstreamer-plugins-libav gstreamer-plugins-bad (+ gstreamer-plugins-vaapi
for Intel graphics). _In some cases, you may need to use gstreamer or libav* packages for OpenSUSE
for Intel/AMD graphics). _In some cases, you may need to use gstreamer or libav* packages for OpenSUSE
from [Packman](https://ftp.gwdg.de/pub/linux/misc/packman/suse/) "Essentials"
(which provides packages including plugins that OpenSUSE does not ship for license reasons; recommendation: after adding the
Packman repository, use the option in YaST Software management to switch
@@ -298,15 +299,20 @@ all system packages for multimedia to Packman)._
* **Arch Linux**
(sudo pacman -Syu) gst-plugins-good gst-plugins-bad gst-libav (+ gstreamer-vaapi
for Intel graphics).
for Intel/AMD graphics).
* **FreeBSD:** (sudo pkg install) gstreamer1-libav, gstreamer1-plugins, gstreamer1-plugins-*
(\* = core, good, bad, x, gtk, gl, vulkan, pulse, v4l2, ...), (+ gstreamer1-vaapi for Intel graphics).
(\* = core, good, bad, x, gtk, gl, vulkan, pulse, v4l2, ...), (+ gstreamer1-vaapi for Intel/AMD graphics).
### Starting UxPlay
### Starting and running UxPlay
**Finally, run uxplay in a terminal window**. On some systems, you can toggle into and out of fullscreen mode
Since UxPlay-1.64, UxPlay can be started with options read from a configuration file, which will be the first found of
(1) a file with a path given by environment variable `$UXPLAYRC`, (2) ``~/.uxplayrc`` in the user's home
directory ("~"), (3) ``~/.config/uxplayrc``. The format is one option per line, omitting the initial ``"-"`` of
the command-line option. Lines in the configuration file beginning with `"#"` are treated as comments and ignored.
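As a concrete sketch of this format (the option values below are illustrative, not recommendations), a configuration file could be created like this:

```shell
# Create a sample ~/.uxplayrc (a sketch; the option values are illustrative).
# Each line is a command-line option without its leading "-".
cat > ~/.uxplayrc <<'EOF'
# lines beginning with "#" are comments and are ignored
vsync 20.5
fps 60
vs glimagesink
EOF
```

UxPlay would then start as if `-vsync 20.5 -fps 60 -vs glimagesink` had been given on the command line.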
**Run uxplay in a terminal window**. On some systems, you can toggle into and out of fullscreen mode
with F11 or (held-down left Alt)+Enter keys. Use Ctrl-C (or close the window)
to terminate it when done. If the UxPlay server is not seen by the
iOS client's drop-down "Screen Mirroring" panel, check that your DNS-SD
@@ -322,26 +328,39 @@ are opened: **if a firewall is active, also open UDP port 5353 (for mDNS queries
needed by Avahi**. See [Troubleshooting](#troubleshooting) below for
help with this or other problems.
* you may find video is improved by the setting -fps 60 that allows some video to be played at 60 frames
per second. (You can see what framerate is actually streaming by using -vs fpsdisplaysink, and/or -FPSdata.)
* By default, UxPlay is locked to
its current client until that client drops the connection; since UxPlay-1.58, the option `-nohold` modifies this
behavior so that when a new client requests a connection, it removes the current client and takes over.
* In its default mode, Uxplay uses a simple GStreamer mode ("sync=false") that streams without using audio- and
video-timestamps for synchronization. UxPlay 1.63 also introduces `-vsync` and `-async` as alternatives that use timestamps
in Mirror and Audio-Only modes respectively (GStreamer's "sync=true" mode).
Simple default streaming in Mirror mode seems to maintain synchronisation of audio with video on desktop systems,
but you may wish to use `-vsync`, which becomes essential in low-powered systems like Raspberry Pi if hardware
video decoding is not used (**and is likely to become the default in future releases of UxPlay**). These options
* In Mirror mode, GStreamer has a choice of **two** methods to play video with its accompanying audio: prior to UxPlay-1.64,
the video and audio streams were both played as soon as possible after they arrived (the GStreamer "_sync=false_" method), with
a GStreamer internal clock used to try to keep them synchronized. **Starting with UxPlay-1.64, the other method
(GStreamer's "_sync=true_" mode), which uses timestamps in the audio and video streams sent by the client, is the new default**.
On low-decoding-power UxPlay hosts (such as Raspberry Pi 3 models) this will drop video frames that cannot be decoded in time
to play with the audio, making the video jerky, but still synchronized.
The older method, which does not drop late video frames,
worked well on more powerful systems, and is still available with the UxPlay option "`-vsync no`"; this method is adapted
to "live streaming", and may be better when using UxPlay as a second monitor for a Mac computer, for example, while the new default
timestamp-based method is best for watching a video, to keep lip movements and voices synchronized. (Without use of timestamps,
video will eventually lag behind audio if it cannot be decoded fast enough; hardware-accelerated video decoding previously
helped to prevent this when timestamps were not used.)
* In Audio-only mode the GStreamer "sync=false" mode (not using timestamps) is still the default, but if you want to keep the audio
playing on the server synchronized with the video showing on the client, use the `-async` timestamp-based option. (An example might be
if you want to follow the Apple Music lyrics on the client while listening to superior sound on the UxPlay server). This
delays the video on the client to match audio on the server, so leads to
a slight delay before a pause or track-change initiated on the client takes effect on the audio played by the server.
The -vsync and -async options
also allow an optional positive (or negative) audio-delay adjustment in _milliseconds_ for fine-tuning: `-vsync 20.5`
delays audio relative to video by 0.0205 secs; a negative value advances it.
* The `-async` option should be used if you want
video on the client to be synchronized with Audio-Only mode audio on the server (_e.g._ for viewing song lyrics in Apple music
while listening to ALAC loss-free audio on the server); this introduces a slight delay for events like pausing audio,
changing tracks, _etc._, to be heard.
* You may find video is improved by the setting -fps 60, which allows some video to be played at 60 frames
per second. (You can see what framerate is actually streaming by using -vs fpsdisplaysink, and/or -FPSdata.)
When using this, you should use the default timestamp-based synchronization option `-vsync`.
* Since UxPlay-1.54, you can display the accompanying "Cover Art" from sources like Apple Music in Audio-Only (ALAC) mode:
run "`uxplay -ca <name> &`" in the background, then run a image viewer with an autoreload feature: an example
@@ -357,11 +376,10 @@ your system uses the Wayland compositor for graphics, use "`uxplay -vs waylandsi
See [Usage](#usage) for more run-time options.
### **Special instructions for Raspberry Pi (tested on R Pi 4 model B 8GB)**:
### **Special instructions for Raspberry Pi (tested on R Pi 4 model B 8GB and R Pi 3 model B+)**:
* If you use the software-only (h264) video-decoding UxPlay option `-avdec`, you also need
option `-vsync`to keep audio and video synchronized (`-vsync` is a new feature; before
it was introduced, software decoding on the Pi was not viable.)
* If you use the software-only (h264) video-decoding UxPlay option `-avdec`, it now works
better than in earlier releases, because the new default timestamp-based synchronization keeps audio and video synchronized.
* For best performance, the Raspberry Pi needs the GStreamer Video4linux2 plugin to use
its Broadcom GPU hardware for decoding h264 video. This needs the bcm2835_codec kernel module
@@ -371,21 +389,26 @@ only distributions for R Pi that are known to supply it include Raspberry Pi OS,
from Raspberry Pi with their Raspberry Pi Imager). Other distributions generally do not
provide it: **without this kernel module, UxPlay cannot use the decoding firmware in the GPU.**
For use of the GPU, use raspi-config "Performance Options" (on Raspberry Pi OS; use a similar tool on other
distributions) to allocate sufficient memory for the GPU (on R. Pi 3 model B+, the maximum (256MB) is suggested).
Even with GPU video decoding, some frames may be dropped by the lower-powered 3 B+ to keep audio and video synchronized
using timestamps.
* The plugin in the latest GStreamer-1.22 release works well, but older releases of GStreamer will not
work unless patched with backports from GStreamer-1.22. Raspberry Pi OS (Bullseye) now has a
working backport. For a fuller backport, or for other distributions, patches for the GStreamer Video4Linux2 plugin
are [available with instructions in the UxPlay Wiki](https://github.com/FDH2/UxPlay/wiki/Gstreamer-Video4Linux2-plugin-patches).
The basic uxplay options for R Pi are ```uxplay -vsync [-vs <videosink>]```. The
The basic uxplay options for R Pi are ```uxplay [-vs <videosink>]```. The
choice `<videosink>` = ``glimagesink`` is sometimes useful.
On a system without X11 (like R Pi OS Lite) with framebuffer video, use `<videosink>` = ``kmssink``.
With the Wayland video compositor, use `<videosink>` = ``waylandsink``. When using the Video4Linux2
plugin to access hardware video decoding, an option `-v4l2` may be useful: for convenience, this also comes
combined with various videosink options as `-rpi`, ``-rpigl`` ``-rpifb``, ```-rpiwl```, respectively
provided for X11, X11 with OpenGL, framebuffer, and Wayland systems.
You may find that just "`uxplay -vsync`", (_without_ ``-v4l2`` or ```-rpi*``` options, which lets GStreamer
**You may find that just "`uxplay`" (_without_ ``-v4l2`` or ```-rpi*``` options, which lets GStreamer
try to find the best video solution by itself)
provides the best results (the `-rpi*` options may be removed in a future release of UxPlay.)
provides the best results** (the `-rpi*` options may be removed in a future release of UxPlay.)
* If you are using Raspberry Pi OS (Bullseye) with Video4Linux2 from unpatched GStreamer-1.18.4, you
need the `-bt709` option with UxPlay-1.56 or later.
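Pulling the Raspberry Pi advice above together, some typical invocations might look like the following (a sketch: which one works best depends on the distribution, the GStreamer version, and the video system in use):

```
uxplay                    # default: let GStreamer choose decoder and videosink
uxplay -v4l2 -vs kmssink  # hardware decoding with framebuffer video (e.g. R Pi OS Lite)
uxplay -rpiwl             # convenience option for Wayland systems
uxplay -v4l2 -bt709       # Raspberry Pi OS (Bullseye) with unpatched GStreamer-1.18.4
uxplay -avdec             # software h264 decoding if bcm2835_codec is unavailable
```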
@@ -412,41 +435,53 @@ These instructions for macOS assume that the Xcode command-line developer tools
installed, open the Terminal, type "sudo xcode-select --install" and accept the conditions).
It is also assumed that CMake >= 3.13 is installed:
this can be done with package managers [MacPorts](http://www.macports.org),
[Fink](http://finkproject.org) or [Homebrew](http://brew.sh), or by a download from
[https://cmake.org/download/](https://cmake.org/download/).
this can be done with package managers [MacPorts](http://www.macports.org) (`sudo port install cmake`),
[Homebrew](http://brew.sh) (`brew install cmake`), or by a download from
[https://cmake.org/download/](https://cmake.org/download/). Also install `git` if you will use it to fetch UxPlay.
First install OpenSSL and libplist: static versions of these libraries will be used, so they can be uninstalled after UxPlay is built.
These are available in MacPorts and Homebrew, or they can easily be built from source (see instructions at the end of this README; this
requires development tools autoconf, automake, libtool, which can be installed using MacPorts, HomeBrew, or Fink).
Next install libplist and openssl-3.x. Note that static versions of these libraries will be
used in the macOS builds, so they can be uninstalled after building uxplay, if you wish.
* If you use Homebrew: `brew install libplist openssl@3`
* if you use MacPorts: `sudo port install libplist-devel openssl3`
Otherwise, build libplist and openssl from source: see instructions near the end of this README; this
requires development tools (autoconf, automake, libtool, _etc._) to be installed.
Next get the latest macOS release of GStreamer-1.0.
* recommended: install the "official" GStreamer release for macOS
from [https://gstreamer.freedesktop.org/download/](https://gstreamer.freedesktop.org/download/). The alternative is to install it from Homebrew. MacPorts
packages of GStreamer are compiled to use X11 and are **NOT** recommended.
**Using "Official" GStreamer (Recommended for both MacPorts and Homebrew users)**: install
the GStreamer release for macOS
from [https://gstreamer.freedesktop.org/download/](https://gstreamer.freedesktop.org/download/).
(This release contains its own pkg-config,
so you don't have to install one.) Install both the gstreamer-1.0 and gstreamer-1.0-devel packages. After downloading, Shift-Click on them
to install (they install to /Library/FrameWorks/GStreamer.framework). Homebrew or MacPorts users should **not** install (or should uninstall) the GStreamer supplied by their package manager, if they use the "official" release.
* You could instead compile the "official" GStreamer release from source: GStreamer-1.22.0 has been successfully
built this way on a system using MacPorts: see [the UxPlay Wiki](https://github.com/FDH2/UxPlay/wiki/Building-GStreamer-from-Source-on-macOS-with-MacPorts)
* **ADDED 2023-01-25: in the latest release (now 1.22.2) something in the GStreamer macOS binaries appears to not be
working (UxPlay starts receiving the AirPlay stream, but the video window does not open)**. If
you have this problem, use the GStreamer-1.20.6 binary packages until a fix is found. _You could instead compile the "official" GStreamer-1.22.x release from source: GStreamer-1.22.0 has been successfully
built this way on a system using MacPorts: see_ [the UxPlay Wiki](https://github.com/FDH2/UxPlay/wiki/Building-GStreamer-from-Source-on-macOS-with-MacPorts).
**For the "official" release**: install both the macOS runtime and development installer packages. Assuming that the latest release is 1.20.5
install `gstreamer-1.0-1.20.5-universal.pkg` and ``gstreamer-1.0-devel-1.20.5-universal.pkg``. Click on them to
install (they install to /Library/FrameWorks/GStreamer.framework).
* **ADDED 2023-01-25: v1.22.0 has just been released, but these binaries
seem to have problems, perhaps only on older macOS releases; use v1.20.5 if they don't work for you.**
**For Homebrew**: pkgconfig is needed ("brew install pkgconfig").
Then
"brew install gst-plugins-base gst-plugins-good gst-plugins-bad gst-libav". This appears to be functionally equivalent
to using GStreamer.framework, but causes a large number of extra packages to be installed by Homebrew as dependencies.
**You may need to set the environment variable GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0 to point to the Homebrew GStreamer installation.**
**Using Homebrew's GStreamer**: pkg-config is needed ("brew install pkg-config gstreamer").
This causes a large number of extra packages to be installed by Homebrew as dependencies.
The [Homebrew gstreamer installation](https://formulae.brew.sh/formula/gstreamer#default) has recently been
reworked into a single "formula" named `gstreamer`, which now works without needing GST_PLUGIN_PATH to be
set in the environment. Homebrew installs gstreamer to `(HOMEBREW)/lib/gstreamer-1.0` where ``(HOMEBREW)/*`` is
`/opt/homebrew/*` on Apple Silicon Macs, and ``/usr/local/*`` on Intel Macs; do not put any
extra non-Homebrew plugins (that you build yourself) there, and instead set GST_PLUGIN_PATH to point to
their location (Homebrew does not supply a complete GStreamer, but seems to have everything needed for UxPlay).
Finally, build and install uxplay: open a terminal and change into the UxPlay source directory
("UxPlay-master" for zipfile downloads, "UxPlay" for "git clone" downloads) and build/install with
"cmake . ; make ; sudo make install " (same as for Linux).
"cmake . ; make ; sudo make install " (same as for Linux).
* Running UxPlay while checking for GStreamer warnings (do this with "export GST_DEBUG=2" before running UxPlay) reveals
that with the default (since UxPlay 1.64) use of timestamps for video synchronization, many video frames are being dropped
(only on macOS), perhaps due to another error (about videometa) that shows up in the GStreamer warnings. **Recommendation:
use the new UxPlay "no timestamp" option "`-vsync no`"** (you can add a line "vsync no" in the uxplayrc configuration file).
* On macOS with this installation of GStreamer, the only videosinks available seem to be glimagesink (default choice made by
autovideosink) and osxvideosink. The window title does not show the Airplay server name, but the window is visible to
@@ -466,15 +501,12 @@ Finally, build and install uxplay: open a terminal and change into the UxPlay so
***Using GStreamer installed from MacPorts (not recommended):***
To install: "sudo port install pkgconfig"; "sudo port install gstreamer1-gst-plugins-base gstreamer1-gst-plugins-good gstreamer1-gst-plugins-bad gstreamer1-gst-libav".
**The MacPorts GStreamer is built to use X11**: use the special CMake option `-DUSE_X11=ON` when building UxPlay.
Then uxplay must be run from an XQuartz terminal, and needs
option "-vs ximagesink". On an unibody (non-retina) MacBook Pro, the default resolution wxh = 1920x1080 was too large,
but using option "-s 800x600" worked. The MacPorts GStreamer pipeline seems fragile against attempts to change
the X11 window size, or to rotations that switch a connected client between portrait and landscape mode while uxplay is running.
Using the MacPorts X11 GStreamer seems only possible if the image size is left unchanged from the initial "-s wxh" setting
(also use the iPad/iPhone setting that locks the screen orientation against switching between portrait and landscape mode
as the device is rotated).
To install: "sudo port install pkgconf"; "sudo port install gstreamer1-gst-plugins-base gstreamer1-gst-plugins-good gstreamer1-gst-plugins-bad gstreamer1-gst-libav".
**The MacPorts GStreamer is old (v1.16.2) and built to use X11**: use the special CMake option `-DUSE_X11=ON`
when building UxPlay. Then uxplay must be run from an XQuartz terminal, and needs
option "-vs ximagesink". On a unibody (non-retina) MacBook Pro, the default resolution wxh = 1920x1080
was too large, but using option "-s 800x600" worked. The MacPorts GStreamer pipeline
seems fragile against attempts to change the X11 window size, or to rotations that switch a connected client between portrait and landscape mode while uxplay is running. Using the MacPorts X11 GStreamer seems only possible if the image size is left unchanged from the initial "-s wxh" setting (also use the iPad/iPhone setting that locks the screen orientation against switching between portrait and landscape mode as the device is rotated).
## Building UxPlay on Microsoft Windows, using MSYS2 with the MinGW-64 compiler.
@@ -573,12 +605,14 @@ Options:
**-nh** Do not append "@_hostname_" at the end of the AirPlay server name.
**-vsync [x]** (In Mirror mode:) this option uses timestamps to synchronize audio with video on the server,
**-vsync [x]** (In Mirror mode:) this option (**now the default**) uses timestamps to synchronize audio with video on the server,
with an optional audio delay in (decimal) milliseconds (_x_ = "20.5" means 0.0205 seconds delay: positive or
negative delays less than a second are allowed.) It is needed on low-power systems such as Raspberry Pi without hardware
video decoding. Standard desktop systems seem to work well without this (streaming without use of timestamps
was the only behavior prior to UxPlay 1.63), but you may wish to use it there too. (It may become the default in future releases.)
video decoding.
**-vsync no** (In Mirror mode:) this switches off timestamp-based audio-video synchronization, restoring the default behavior prior to
UxPlay-1.64. Standard desktop systems seem to work well without use of timestamps: this mode is appropriate for "live streaming" such as
using UxPlay as a second monitor for a Mac computer, or monitoring a webcam; with it, no video frames are dropped.
**-async [x]** (In Audio-Only (ALAC) mode:) this option uses timestamps to synchronize audio on the server with video on the client,
with an optional audio delay in (decimal) milliseconds (_x_ = "20.5" means 0.0205 seconds delay: positive or
@@ -588,6 +622,8 @@ Options:
immediately. _This might in principle be mitigated by using the `-al` audio latency setting to change the latency (default 0.25 secs)
that the server reports to the client, but at present changing this does not seem to have any effect_.
**-async no**. This is still the default behavior in Audio-only mode, but this option may be useful as a command-line option to switch off a
`-async` option set in a "uxplayrc" configuration file.
**-s wxh** (e.g. -s 1920x1080 , which is the default ) sets the display resolution (width and height,
in pixels). (This may be a
@@ -848,13 +884,14 @@ to guess what are the "best" plugins to use on your system).
A different reason for no audio occurred when a user with a firewall only opened two udp network
ports: **three** are required (the third one receives the audio data).
**Raspberry Pi** devices work best with hardware GPU h264 video decoding if the Video4Linux2 plugin in GStreamer v1.20.x or earlier has been patched
(see the UxPlay [Wiki](https://github.com/FDH2/UxPlay/wiki/Gstreamer-Video4Linux2-plugin-patches) for patches).
This is fixed in GStreamer-1.22, and by backport patches from this in distributions such as Raspberry Pi OS (Bullseye): **use option `-bt709` with the
GStreamer-1.18.4 from Raspberry Pi OS**..
**Raspberry Pi** devices work best with hardware GPU h264 video decoding if the Video4Linux2 plugin in GStreamer v1.20.x or earlier has
been patched (see the UxPlay [Wiki](https://github.com/FDH2/UxPlay/wiki/Gstreamer-Video4Linux2-plugin-patches) for patches).
This is fixed in GStreamer-1.22, and by backport patches from this in distributions such as Raspberry Pi OS (Bullseye): **use option `-bt709`
with the GStreamer-1.18.4 from Raspberry Pi OS**.
This also needs the bcm2835-codec kernel module that is not in the standard Linux kernel (it is available in Raspberry Pi OS, Ubuntu and Manjaro).
* **If this kernel module is not available in your Raspberry Pi operating system, or if GStreamer < 1.22 is not patched, use options `-avdec -vsync` for software h264-decoding.**
* **If this kernel module is not available in your Raspberry Pi operating system, or if GStreamer < 1.22 is not patched, use option `-avdec`
for software h264-decoding.**
Sometimes "autovideosink" may select the OpenGL renderer "glimagesink" which
may not work correctly on your system. Try the options "-vs ximagesink" or
@@ -974,6 +1011,10 @@ tvOS 12.2.1); it seems that the use of "legacy" protocol just requires bit 27 (l
The "features" code and other settings are set in `UxPlay/lib/dnssdint.h`.
# Changelog
1.64 2023-04-23 Timestamp-based synchronization of audio and video is now the default in Mirror mode.
(Use "-vsync no" to restore previous behavior.) A configuration file can now be used
for startup options. Also some internal cleanups and a minor bugfix that fixes #192.
1.63 2023-02-12 Reworked audio-video synchronization, with new options -vsync (for Mirror mode) and
-async (for Audio-Only mode, to sync with client video). Option -vsync makes software
h264 decoding of streamed videos with option -avdec viable on some recent Raspberry Pi models.

View File

@@ -1,4 +1,4 @@
# UxPlay 1.63: AirPlay-Mirror and AirPlay-Audio server for Linux, macOS, and Unix (now also runs on Windows).
# UxPlay 1.64: AirPlay-Mirror and AirPlay-Audio server for Linux, macOS, and Unix (now also runs on Windows).
### Now developed at the GitHub site <https://github.com/FDH2/UxPlay> (where all user issues should be posted).
@@ -22,7 +22,8 @@
pipeline).
- Support for server behind a firewall.
- Raspberry Pi support **both with and without hardware video
decoding** by the Broadcom GPU. *Tested on Raspberry Pi 4 Model B.*
decoding** by the Broadcom GPU. *Tested on Raspberry Pi 4 Model B
and Pi 3 model B+.*
- Support for running on Microsoft Windows (builds with the MinGW-64
compiler in the unix-like MSYS2 environment).
@@ -77,13 +78,15 @@ pulled from the new main [UxPlay site](https://github.com/FDH2/UxPlay)).
UxPlay is tested on a number of systems, including (among others) Debian
10.11 "Buster" and 11.2 "Bullseye", Ubuntu 20.04 LTS and 22.04.1 LTS,
Linux Mint 20.3, Pop!\_OS 22.04 (NVIDIA edition), Rocky Linux 8.6 (a
CentOS successor), Fedora 36, OpenSUSE 15.4, Arch Linux 22.10, macOS
12.3 (Intel and M1), FreeBSD 13.1, Windows 10 and 11 (64 bit).
(also Ubuntu derivatives Linux Mint 20.3, Pop!\_OS 22.04 (NVIDIA
edition)), Rocky Linux 9.1 (a CentOS successor), Fedora 36, OpenSUSE
15.4, Arch Linux 22.10, macOS 13.3 (Intel and M2), FreeBSD 13.2, Windows
10 and 11 (64 bit).
On Raspberry Pi 4 model B, it is tested on Raspberry Pi OS (Bullseye)
(32- and 64-bit), Ubuntu 22.10, Manjaro RPi4 23.02, and (without
hardware video decoding) on OpenSUSE 15.4.
(32- and 64-bit), Ubuntu 22.04 and 22.10, Manjaro RPi4 23.02, and
(without hardware video decoding) on OpenSUSE 15.4. Also tested on
Raspberry Pi 3 model B+.
Its main use is to act like an AppleTV for screen-mirroring (with audio)
of iOS/iPadOS/macOS clients (iPhone, iPod Touch, iPad, Mac computers) on
@@ -322,18 +325,19 @@ installed, depending on how your audio is set up.
- **Red Hat, or clones like CentOS (now continued as Rocky Linux or
Alma Linux):** (sudo dnf install, or sudo yum install)
gstreamer1-libav gstreamer1-plugins-bad-free (+ gstreamer1-vaapi for
intel graphics). *You may need to get some of them (in particular
gstreamer1-libav) from [rpmfusion.org](https://rpmfusion.org) (which
provides packages including plugins that RedHat does not ship for
license reasons). \[In recent **Fedora**, the libav plugin package
is renamed to "gstreamer1-plugin-libav", which now needs the RPM
Fusion package ffmpeg-libs for the patent-encumbered code which
RedHat does not provide: check with "`rpm -qi ffmpeg-libs`" that it
lists "Packager" as RPM Fusion; if this is not installed, uxplay
will fail to start, with error: **no element "avdec_aac"** \]*.
Intel/AMD graphics). *You may need to get some of them (in
particular gstreamer1-libav) from
[rpmfusion.org](https://rpmfusion.org) (which provides packages
including plugins that RedHat does not ship for license reasons).
\[In recent **Fedora**, the libav plugin package is renamed to
"gstreamer1-plugin-libav", which now needs the RPM Fusion package
ffmpeg-libs for the patent-encumbered code which RedHat does not
provide: check with "`rpm -qi ffmpeg-libs`" that it lists "Packager"
as RPM Fusion; if this is not installed, uxplay will fail to start,
with error: **no element "avdec_aac"** \]*.
- **OpenSUSE:** (sudo zypper install) gstreamer-plugins-libav
gstreamer-plugins-bad (+ gstreamer-plugins-vaapi for Intel
gstreamer-plugins-bad (+ gstreamer-plugins-vaapi for Intel/AMD
graphics). *In some cases, you may need to use gstreamer or libav\*
packages for OpenSUSE from
[Packman](https://ftp.gwdg.de/pub/linux/misc/packman/suse/)
@@ -344,21 +348,29 @@ installed, depending on how your audio is set up.
Packman).*
- **Arch Linux** (sudo pacman -Syu) gst-plugins-good gst-plugins-bad
gst-libav (+ gstreamer-vaapi for Intel graphics).
gst-libav (+ gstreamer-vaapi for Intel/AMD graphics).
- **FreeBSD:** (sudo pkg install) gstreamer1-libav,
gstreamer1-plugins, gstreamer1-plugins-\* (\* = core, good, bad, x,
gtk, gl, vulkan, pulse, v4l2, ...), (+ gstreamer1-vaapi for Intel
graphics).
gtk, gl, vulkan, pulse, v4l2, ...), (+ gstreamer1-vaapi for
Intel/AMD graphics).
### Starting UxPlay
### Starting and running UxPlay
**Finally, run uxplay in a terminal window**. On some systems, you can
toggle into and out of fullscreen mode with F11 or (held-down left
Alt)+Enter keys. Use Ctrl-C (or close the window) to terminate it when
done. If the UxPlay server is not seen by the iOS client's drop-down
"Screen Mirroring" panel, check that your DNS-SD server (usually
avahi-daemon) is running: do this in a terminal window with
Since UxPlay-1.64, UxPlay can be started with options read from a
configuration file, which will be the first found of (1) a file with a
path given by environment variable `$UXPLAYRC`, (2) `~/.uxplayrc` in the
user's home directory ("\~"), (3) `~/.config/uxplayrc`. The format is
one option per line, omitting the initial `"-"` of the command-line
option. Lines in the configuration file beginning with `"#"` are treated
as comments and ignored.
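The search order just described can be sketched as a small shell
function (illustrative only: this is not UxPlay's actual
implementation):

```shell
# Return the path of the first configuration file UxPlay (>= 1.64) would
# use: $UXPLAYRC, then ~/.uxplayrc, then ~/.config/uxplayrc.
find_uxplayrc() {
  for f in "$UXPLAYRC" "$HOME/.uxplayrc" "$HOME/.config/uxplayrc"; do
    if [ -n "$f" ] && [ -f "$f" ]; then
      printf '%s\n' "$f"
      return 0
    fi
  done
  return 1  # no configuration file found; UxPlay starts with defaults
}
```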
**Run uxplay in a terminal window**. On some systems, you can toggle
into and out of fullscreen mode with F11 or (held-down left Alt)+Enter
keys. Use Ctrl-C (or close the window) to terminate it when done. If the
UxPlay server is not seen by the iOS client's drop-down "Screen
Mirroring" panel, check that your DNS-SD server (usually avahi-daemon)
is running: do this in a terminal window with
`systemctl status avahi-daemon`. If this shows the avahi-daemon is not
running, control it with
`sudo systemctl [start,stop,enable,disable] avahi-daemon` (on
@@ -371,36 +383,53 @@ opened: **if a firewall is active, also open UDP port 5353 (for mDNS
queries) needed by Avahi**. See [Troubleshooting](#troubleshooting)
below for help with this or other problems.
- you may find video is improved by the setting -fps 60 that allows
some video to be played at 60 frames per second. (You can see what
framerate is actually streaming by using -vs fpsdisplaysink, and/or
-FPSdata.)
- By default, UxPlay is locked to its current client until that client
drops the connection; since UxPlay-1.58, the option `-nohold`
modifies this behavior so that when a new client requests a
connection, it removes the current client and takes over.
- In its default mode, Uxplay uses a simple GStreamer mode
("sync=false") that streams without using audio- and
video-timestamps for synchronization. UxPlay 1.63 also introduces
`-vsync` and `-async` as alternatives that use timestamps in Mirror
and Audio-Only modes respectively (GStreamer's "sync=true" mode).
Simple default streaming in Mirror mode seems to maintain
synchronisation of audio with video on desktop systems, but you may
wish to use `-vsync`, which becomes essential in low-powered systems
like Raspberry Pi if hardware video decoding is not used (**and is
likely to become the default in future releases of UxPlay**). These
options also allow an optional positive (or negative) audio-delay
adjustment in *milliseconds* for fine-tuning : `-vsync 20.5` delays
audio relative to video by 0.0205 secs; a negative value advances
it.)
- In Mirror mode, GStreamer has a choice of **two** methods to play
video with its accompanying audio: prior to UxPlay-1.64, the video
and audio streams were both played as soon as possible after they
arrived (the GStreamer "*sync=false*" method), with a GStreamer
internal clock used to try to keep them synchronized. **Starting
with UxPlay-1.64, the other method (GStreamer's "*sync=true*" mode),
which uses timestamps in the audio and video streams sent by the
client, is the new default**. On low-decoding-power UxPlay hosts
(such as Raspberry Pi 3 models) this will drop video frames that
cannot be decoded in time to play with the audio, making the video
jerky, but still synchronized.
- The `-async` option should be used if you want video on the client
to be synchronized with Audio-Only mode audio on the server (*e.g.*
for viewing song lyrics in Apple music while listening to ALAC
loss-free audio on the server); this introduces a slight delay for
events like pausing audio, changing tracks, *etc.*, to be heard.
The older method, which does not drop late video frames, worked well on
more powerful systems, and is still available with the UxPlay option
"`-vsync no`"; this method is adapted to "live streaming", and may be
better when using UxPlay as a second monitor for a Mac computer, for
example, while the new default timestamp-based method is best for
watching a video, to keep lip movements and voices synchronized.
(Without use of timestamps, video will eventually lag behind audio if it
cannot be decoded fast enough; hardware-accelerated video decoding
previously helped to prevent this when timestamps were not used.)
- In Audio-only mode the GStreamer "sync=false" mode (not using
timestamps) is still the default, but if you want to keep the audio
playing on the server synchronized with the video showing on the
client, use the `-async` timestamp-based option. (An example might
be if you want to follow the Apple Music lyrics on the client while
listening to superior sound on the UxPlay server). This delays the
video on the client to match audio on the server, so leads to a
slight delay before a pause or track-change initiated on the client
takes effect on the audio played by the server.
The -vsync and -async options also allow an optional positive (or
negative) audio-delay adjustment in *milliseconds* for fine-tuning:
`-vsync 20.5` delays audio relative to video by 0.0205 secs; a negative
value advances it.
- You may find video is improved by the setting -fps 60, which allows
  some video to be played at 60 frames per second. (You can see what
  framerate is actually streaming by using -vs fpsdisplaysink, and/or
  -FPSdata.) When using this, you should use the default
  timestamp-based synchronization option `-vsync`.
- Since UxPlay-1.54, you can display the accompanying "Cover Art" from
sources like Apple Music in Audio-Only (ALAC) mode: run
@@ -418,12 +447,12 @@ plugin. If your system uses the Wayland compositor for graphics, use
"`uxplay -vs waylandsink`".** See [Usage](#usage) for more run-time
options.
### **Special instructions for Raspberry Pi (tested on R Pi 4 model B 8GB)**:
### **Special instructions for Raspberry Pi (tested on R Pi 4 model B 8GB and R Pi 3 model B+)**:
- If you use the software-only (h264) video-decoding UxPlay option
`-avdec`, you also need option `-vsync`to keep audio and video
synchronized (`-vsync` is a new feature; before it was introduced,
software decoding on the Pi was not viable.)
`-avdec`, it now works better than before, thanks to the new default
timestamp-based method for keeping audio and video
synchronized.
- For best performance, the Raspberry Pi needs the GStreamer
Video4linux2 plugin to use its Broadcom GPU hardware for decoding
@@ -437,6 +466,13 @@ options.
provide it: **without this kernel module, UxPlay cannot use the
decoding firmware in the GPU.**
For use of the GPU, use raspi-config "Performance Options" (on Raspberry
Pi OS; use a similar tool on other distributions) to allocate sufficient
memory for the GPU (on R. Pi 3 model B+, the maximum (256MB) is
suggested). Even with GPU video decoding, some frames may be dropped by
the lower-power 3 B+ to keep audio and video synchronized using
timestamps.
- The plugin in the latest GStreamer-1.22 release works well, but
older releases of GStreamer will not work unless patched with
backports from GStreamer-1.22. Raspberry Pi OS (Bullseye) now has a
@@ -445,18 +481,18 @@ options.
instructions in the UxPlay
Wiki](https://github.com/FDH2/UxPlay/wiki/Gstreamer-Video4Linux2-plugin-patches).
The basic uxplay options for R Pi are `uxplay -vsync [-vs <videosink>]`.
The choice `<videosink>` = `glimagesink` is sometimes useful. On a
system without X11 (like R Pi OS Lite) with framebuffer video, use
The basic uxplay options for R Pi are `uxplay [-vs <videosink>]`. The
choice `<videosink>` = `glimagesink` is sometimes useful. On a system
without X11 (like R Pi OS Lite) with framebuffer video, use
`<videosink>` = `kmssink`. With the Wayland video compositor, use
`<videosink>` = `waylandsink`. When using the Video4Linux2 plugin to
access hardware video decoding, an option `-v4l2` may be useful: for
convenience, this also comes combined with various videosink options as
`-rpi`, `-rpigl` `-rpifb`, `-rpiwl`, respectively provided for X11, X11
with OpenGL, framebuffer, and Wayland systems. You may find that just
"`uxplay -vsync`", (*without* `-v4l2` or `-rpi*` options, which lets
GStreamer try to find the best video solution by itself) provides the
best results (the `-rpi*` options may be removed in a future release of
with OpenGL, framebuffer, and Wayland systems. **You may find that just
"`uxplay`" (*without* `-v4l2` or `-rpi*` options, which lets GStreamer
try to find the best video solution by itself) provides the best
results** (the `-rpi*` options may be removed in a future release of
UxPlay.)
- If you are using Raspberry Pi OS (Bullseye) with Video4Linux2 from
@@ -492,52 +528,73 @@ developer tools are installed (if Xcode is installed, open the Terminal,
type "sudo xcode-select --install" and accept the conditions).
It is also assumed that CMake \>= 3.13 is installed: this can be done
with package managers [MacPorts](http://www.macports.org),
[Fink](http://finkproject.org) or [Homebrew](http://brew.sh), or by a
download from <https://cmake.org/download/>.
with package managers [MacPorts](http://www.macports.org)
(`sudo port install cmake`), [Homebrew](http://brew.sh)
(`brew install cmake`), or by a download from
<https://cmake.org/download/>. Also install `git` if you will use it to
fetch UxPlay.
First install OpenSSL and libplist: static versions of these libraries
will be used, so they can be uninstalled after UxPlay is built. These
are available in MacPorts and Homebrew, or they can easily be built from
source (see instructions at the end of this README; this requires
development tools autoconf, automake, libtool, which can be installed
using MacPorts, HomeBrew, or Fink).
Next install libplist and openssl-3.x. Note that static versions of
these libraries will be used in the macOS builds, so they can be
uninstalled after building uxplay, if you wish.
- If you use Homebrew: `brew install libplist openssl@3`
- if you use MacPorts: `sudo port install libplist-devel openssl3`
Otherwise, build libplist and openssl from source: see instructions near
the end of this README; requires development tools (autoconf, automake,
libtool, *etc.*) to be installed.
Next get the latest macOS release of GStreamer-1.0.
- recommended: install the "official" GStreamer release for macOS from
<https://gstreamer.freedesktop.org/download/>. The alternative is to
install it from Homebrew. MacPorts packages of GStreamer are
compiled to use X11 and are **NOT** recommended.
**Using "Official" GStreamer (Recommended for both MacPorts and Homebrew
users)**: install the GStreamer release for macOS from
<https://gstreamer.freedesktop.org/download/>. (This release contains
its own pkg-config, so you don't have to install one.) Install both the
gstreamer-1.0 and gstreamer-1.0-devel packages. After downloading,
Shift-Click on them to install (they install to
/Library/FrameWorks/GStreamer.framework). Homebrew or MacPorts users
should **not** install (or should uninstall) the GStreamer supplied by
their package manager, if they use the "official" release.
- You could instead compile the "official" GStreamer release from
source: GStreamer-1.22.0 has been successfully built this way on a
system using MacPorts: see [the UxPlay
Wiki](https://github.com/FDH2/UxPlay/wiki/Building-GStreamer-from-Source-on-macOS-with-MacPorts)
- **ADDED 2023-01-25: in the latest release (now 1.22.2) something in
the GStreamer macOS binaries appears to not be working (UxPlay
starts receiving the AirPlay stream, but the video window does not
open)**. If you have this problem, use the GStreamer-1.20.6 binary
packages until a fix is found. *You could instead compile the
"official" GStreamer-1.22.x release from source: GStreamer-1.22.0
has been successfully built this way on a system using MacPorts:
see* [the UxPlay
Wiki](https://github.com/FDH2/UxPlay/wiki/Building-GStreamer-from-Source-on-macOS-with-MacPorts).
**For the "official" release**: install both the macOS runtime and
development installer packages. Assuming that the latest release is
1.20.5 install `gstreamer-1.0-1.20.5-universal.pkg` and
`gstreamer-1.0-devel-1.20.5-universal.pkg`. Click on them to install
(they install to /Library/FrameWorks/GStreamer.framework).
- **ADDED 2023-01-25: v1.22.0 has just been released, but these
binaries seem to have problems, perhaps only on older macOS
releases; use v1.20.5 if they dont work for you.**
**For Homebrew**: pkgconfig is needed ("brew install pkgconfig"). Then
"brew install gst-plugins-base gst-plugins-good gst-plugins-bad
gst-libav". This appears to be functionally equivalent to using
GStreamer.framework, but causes a large number of extra packages to be
installed by Homebrew as dependencies. **You may need to set the
environment variable GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0 to
point to the Homebrew GStreamer installation.**
**Using Homebrew's GStreamer**: pkg-config is needed ("brew install
pkg-config gstreamer"). This causes a large number of extra packages to
be installed by Homebrew as dependencies. The [Homebrew gstreamer
installation](https://formulae.brew.sh/formula/gstreamer#default) has
recently been reworked into a single "formula" named `gstreamer`, which
now works without needing GST_PLUGIN_PATH to be set in the environment.
Homebrew installs gstreamer to `(HOMEBREW)/lib/gstreamer-1.0` where
`(HOMEBREW)/*` is `/opt/homebrew/*` on Apple Silicon Macs, and
`/usr/local/*` on Intel Macs; do not put any extra non-Homebrew plugins
(that you build yourself) there, and instead set GST_PLUGIN_PATH to
point to their location (Homebrew does not supply a complete GStreamer,
but seems to have everything needed for UxPlay).
Finally, build and install uxplay: open a terminal and change into the
UxPlay source directory ("UxPlay-master" for zipfile downloads, "UxPlay"
for "git clone" downloads) and build/install with "cmake . ; make ; sudo
make install" (same as for Linux).
- Running UxPlay while checking for GStreamer warnings (do this with
"export GST_DEBUG=2" before running UxPlay) reveals that with the
default (since UxPlay 1.64) use of timestamps for video
synchronization, many video frames are being dropped (only on macOS),
perhaps due to another error (about videometa) that shows up in the
GStreamer warnings. **Recommendation: use the new UxPlay "no
timestamp" option "`-vsync no`"** (you can add a line "vsync no" in
the uxplayrc configuration file).
- On macOS with this installation of GStreamer, the only videosinks
available seem to be glimagesink (default choice made by
autovideosink) and osxvideosink. The window title does not show the
@@ -561,12 +618,12 @@ make install" (same as for Linux).
***Using GStreamer installed from MacPorts (not recommended):***
To install: "sudo port install pkgconfig"; "sudo port install
To install: "sudo port install pkgconf"; "sudo port install
gstreamer1-gst-plugins-base gstreamer1-gst-plugins-good
gstreamer1-gst-plugins-bad gstreamer1-gst-libav". **The MacPorts
GStreamer is built to use X11**: use the special CMake option
`-DUSE_X11=ON` when building UxPlay. Then uxplay must be run from an
XQuartz terminal, and needs option "-vs ximagesink". On an unibody
GStreamer is old (v1.16.2) and built to use X11**: use the special CMake
option `-DUSE_X11=ON` when building UxPlay. Then uxplay must be run from
an XQuartz terminal, and needs option "-vs ximagesink". On a unibody
(non-retina) MacBook Pro, the default resolution wxh = 1920x1080 was too
large, but using option "-s 800x600" worked. The MacPorts GStreamer
pipeline seems fragile against attempts to change the X11 window size,
@@ -695,15 +752,19 @@ will also now be the name shown above the mirror display (X11) window.
**-nh** Do not append "@_hostname_" at the end of the AirPlay server
name.
**-vsync \[x\]** (In Mirror mode:) this option uses timestamps to
synchronize audio with video on the server, with an optional audio delay
in (decimal) milliseconds (*x* = "20.5" means 0.0205 seconds delay:
positive or negative delays less than a second are allowed.) It is
needed on low-power systems such as Raspberry Pi without hardware video
decoding. Standard desktop systems seem to work well without this
(streaming without use of timestamps was the only behavior prior to
UxPlay 1.63), but you may wish to use it there too. (It may become the
default in future releases.)
**-vsync \[x\]** (In Mirror mode:) this option (**now the default**)
uses timestamps to synchronize audio with video on the server, with an
optional audio delay in (decimal) milliseconds (*x* = "20.5" means
0.0205 seconds delay: positive or negative delays less than a second are
allowed.) It is needed on low-power systems such as Raspberry Pi without
hardware video decoding.
**-vsync no** (In Mirror mode:) this switches off timestamp-based
audio-video synchronization, restoring the default behavior prior to
UxPlay-1.64. Standard desktop systems seem to work well without use of
timestamps: this mode is appropriate for "live streaming" such as using
UxPlay as a second monitor for a Mac computer, or monitoring a webcam;
with it, no video frames are dropped.
**-async \[x\]** (In Audio-Only (ALAC) mode:) this option uses
timestamps to synchronize audio on the server with video on the client,
@@ -717,6 +778,10 @@ using the `-al` audio latency setting to change the latency (default
0.25 secs) that the server reports to the client, but at present
changing this does not seem to have any effect*.
**-async no**. This is still the default behavior in Audio-only
mode, but this option may be useful as a command-line option to switch
off a `-async` option set in a "uxplayrc" configuration file.
**-s wxh** (e.g. -s 1920x1080 , which is the default ) sets the display
resolution (width and height, in pixels). (This may be a request made to
the AirPlay client, and perhaps will not be the final resolution you
@@ -1044,13 +1109,13 @@ patched (see the UxPlay
[Wiki](https://github.com/FDH2/UxPlay/wiki/Gstreamer-Video4Linux2-plugin-patches)
for patches). This is fixed in GStreamer-1.22, and by backport patches
from this in distributions such as Raspberry Pi OS (Bullseye): **use
option `-bt709` with the GStreamer-1.18.4 from Raspberry Pi OS**.. This
option `-bt709` with the GStreamer-1.18.4 from Raspberry Pi OS**. This
also needs the bcm2835-codec kernel module that is not in the standard
Linux kernel (it is available in Raspberry Pi OS, Ubuntu and Manjaro).
- **If this kernel module is not available in your Raspberry Pi
operating system, or if GStreamer \< 1.22 is not patched, use
options `-avdec -vsync` for software h264-decoding.**
operating system, or if GStreamer \< 1.22 is not patched, use option
`-avdec` for software h264-decoding.**
Sometimes "autovideosink" may select the OpenGL renderer "glimagesink"
which may not work correctly on your system. Try the options "-vs
@@ -1210,6 +1275,11 @@ The "features" code and other settings are set in
# Changelog
1.64 2023-04-23 Timestamp-based synchronization of audio and video is
now the default in Mirror mode. (Use "-vsync no" to restore previous
behavior.) A configuration file can now be used for startup options.
Also some internal cleanups and a minor bugfix that fixes #192.
1.63 2023-02-12 Reworked audio-video synchronization, with new options
-vsync (for Mirror mode) and -async (for Audio-Only mode, to sync with
client video). Option -vsync makes software h264 decoding of streamed

View File

@@ -276,14 +276,19 @@ http_request_get_header_string(http_request_t *request, char **header_str)
assert(str);
*header_str = str;
char *p = str;
int n = len + 1;
for (int i = 0; i < request->headers_size; i++) {
sprintf(p,"%s", request->headers[i]);
p += strlen(request->headers[i]);
int hlen = strlen(request->headers[i]);
snprintf(p, n, "%s", request->headers[i]);
n -= hlen;
p += hlen;
if (i%2 == 0) {
sprintf(p, ": ");
p +=2;
snprintf(p, n, ": ");
n -= 2;
p += 2;
} else {
sprintf(p, "\n");
snprintf(p, n, "\n");
n--;
p++;
}
}
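The rewritten loop above replaces `sprintf` with `snprintf`, tracking the remaining capacity `n` as it appends each header name and value. A standalone sketch of the same pattern (the `join_headers` helper is hypothetical, not part of the UxPlay API; even indices are header names, odd indices are values, as in the original loop):

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Join "name: value\n" pairs into buf, never writing past buflen.
 * Returns the number of characters written (excluding the NUL). */
static int join_headers(char *buf, size_t buflen,
                        const char **headers, int count) {
    char *p = buf;
    size_t n = buflen;          /* remaining space, including NUL */
    for (int i = 0; i < count; i++) {
        int w = snprintf(p, n, "%s%s", headers[i],
                         (i % 2 == 0) ? ": " : "\n");
        if (w < 0 || (size_t) w >= n)
            break;              /* output truncated: stop cleanly */
        p += w;
        n -= (size_t) w;
    }
    return (int)(p - buf);
}
```

`snprintf` returns the length the full output would have had, so checking `w >= n` detects truncation explicitly, something the original code avoids by sizing the buffer from the header lengths first.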

View File

@@ -10,6 +10,9 @@
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Lesser General Public License for more details.
*
*===============================================================
* modified by fduncanh 2023
*/
#include <stdlib.h>
@@ -61,6 +64,19 @@ logger_set_level(logger_t *logger, int level)
MUTEX_UNLOCK(logger->lvl_mutex);
}
int
logger_get_level(logger_t *logger)
{
int level;
assert(logger);
MUTEX_LOCK(logger->lvl_mutex);
level = logger->level;
MUTEX_UNLOCK(logger->lvl_mutex);
return level;
}
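The new `logger_get_level` copies the level under `lvl_mutex`, so callers can gate debug-only work thread-safely. A minimal POSIX-threads sketch of the same getter pattern (the struct and names here are illustrative, not the UxPlay types):

```c
#include <assert.h>
#include <pthread.h>

typedef struct {
    pthread_mutex_t lvl_mutex;
    int level;
} tiny_logger_t;

static int tiny_logger_get_level(tiny_logger_t *lg) {
    int level;
    pthread_mutex_lock(&lg->lvl_mutex);
    level = lg->level;            /* copy the value under the lock */
    pthread_mutex_unlock(&lg->lvl_mutex);
    return level;                 /* return the snapshot, not the field */
}
```

Returning a snapshot rather than reading the field directly avoids a data race with `logger_set_level`, which writes under the same mutex.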
void
logger_set_callback(logger_t *logger, logger_callback_t callback, void *cls)
{

View File

@@ -10,6 +10,9 @@
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Lesser General Public License for more details.
*
*=================================================================
* modified by fduncanh 2023
*/
#ifndef LOGGER_H
@@ -37,6 +40,7 @@ logger_t *logger_init();
void logger_destroy(logger_t *logger);
void logger_set_level(logger_t *logger, int level);
int logger_get_level(logger_t *logger);
void logger_set_callback(logger_t *logger, logger_callback_t callback, void *cls);
void logger_log(logger_t *logger, int level, const char *fmt, ...);

View File

@@ -28,7 +28,6 @@
#include <stdio.h>
#include <inttypes.h>
//#define DUMP_KEI_IV
struct mirror_buffer_s {
logger_t *logger;
aes_ctx_t *aes_ctx;
@@ -50,8 +49,8 @@ mirror_buffer_init_aes(mirror_buffer_t *mirror_buffer, const uint64_t *streamCon
/* AES key and IV */
// Need secondary processing to use
sprintf((char*) aeskey_video, "AirPlayStreamKey%" PRIu64, *streamConnectionID);
sprintf((char*) aesiv_video, "AirPlayStreamIV%" PRIu64, *streamConnectionID);
snprintf((char*) aeskey_video, sizeof(aeskey_video), "AirPlayStreamKey%" PRIu64, *streamConnectionID);
snprintf((char*) aesiv_video, sizeof(aesiv_video), "AirPlayStreamIV%" PRIu64, *streamConnectionID);
sha_ctx_t *ctx = sha_init();
sha_update(ctx, aeskey_video, strlen((char*) aeskey_video));
@@ -66,13 +65,6 @@ mirror_buffer_init_aes(mirror_buffer_t *mirror_buffer, const uint64_t *streamCon
// Need to be initialized externally
mirror_buffer->aes_ctx = aes_ctr_init(aeskey_video, aesiv_video);
#ifdef DUMP_KEI_IV
FILE* keyfile = fopen("/sdcard/111.keyiv", "wb");
fwrite(aeskey_video, 16, 1, keyfile);
fwrite(aesiv_video, 16, 1, keyfile);
fclose(keyfile);
#endif
}
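`mirror_buffer_init_aes` now bounds the key/IV seed-string construction by the destination size. A standalone sketch of just the seed construction (buffer handling only; in the real code the seeds are then hashed with session data before use, and the helper name here is hypothetical):

```c
#include <assert.h>
#include <inttypes.h>
#include <stdio.h>
#include <string.h>

/* Build the plaintext AES key/IV seed strings for a given
 * streamConnectionID, bounded by the destination sizes. */
static void build_stream_seeds(uint64_t id,
                               char *key, size_t keylen,
                               char *iv, size_t ivlen) {
    snprintf(key, keylen, "AirPlayStreamKey%" PRIu64, id);
    snprintf(iv, ivlen, "AirPlayStreamIV%" PRIu64, id);
}
```

The `PRIu64` macro keeps the format string portable across platforms where `uint64_t` is `unsigned long` versus `unsigned long long`.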
mirror_buffer_t *

View File

@@ -156,13 +156,13 @@ conn_init(void *opaque, unsigned char *local, int locallen, unsigned char *remot
static void
conn_request(void *ptr, http_request_t *request, http_response_t **response) {
raop_conn_t *conn = ptr;
logger_log(conn->raop->logger, LOGGER_DEBUG, "conn_request");
const char *method;
const char *url;
const char *cseq;
char *response_data = NULL;
int response_datalen = 0;
logger_log(conn->raop->logger, LOGGER_DEBUG, "conn_request");
bool logger_debug = (logger_get_level(conn->raop->logger) >= LOGGER_DEBUG);
method = http_request_get_method(request);
url = http_request_get_url(request);
@@ -180,7 +180,7 @@ conn_request(void *ptr, http_request_t *request, http_response_t **response) {
free(header_str);
int request_datalen;
const char *request_data = http_request_get_data(request, &request_datalen);
if (request_data) {
if (request_data && logger_debug) {
if (request_datalen > 0) {
if (data_is_plist) {
plist_t req_root_node = NULL;
@@ -333,7 +333,7 @@ conn_request(void *ptr, http_request_t *request, http_response_t **response) {
bool data_is_text = (strstr(header_str,"text/parameters") != NULL);
free(header_str);
if (response_data) {
if (response_datalen > 0) {
if (response_datalen > 0 && logger_debug) {
if (data_is_plist) {
plist_t res_root_node = NULL;
plist_from_bin(response_data, response_datalen, &res_root_node);
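The conn.c changes above compute `logger_debug` once and skip building plist/hex dumps entirely when the log level is below LOGGER_DEBUG, rather than formatting the dump and then discarding it. A standalone sketch of that gating pattern (levels and names here are illustrative, not the UxPlay API; the counter exists only to make the saved work observable):

```c
#include <assert.h>
#include <stdio.h>
#include <stdlib.h>

enum { LOG_INFO = 5, LOG_DEBUG = 6 };

static int g_format_calls = 0;   /* counts expensive formatting work */

static char *expensive_hex_dump(const unsigned char *d, int len) {
    g_format_calls++;
    char *s = malloc((size_t) len * 3 + 1);
    char *p = s;
    for (int i = 0; i < len; i++)
        p += sprintf(p, "%02x ", d[i]);
    *p = '\0';
    return s;
}

/* Only build the dump when the level warrants emitting it. */
static void log_packet(int level, const unsigned char *d, int len) {
    if (level >= LOG_DEBUG) {     /* gate, as conn_request() now does */
        char *str = expensive_hex_dump(d, len);
        fprintf(stderr, "packet:\n%s\n", str);
        free(str);
    }
}
```

At non-debug levels the allocation and formatting never happen, which matters on a hot path handling every AirPlay request.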

View File

@@ -79,14 +79,6 @@ raop_buffer_init(logger_t *logger,
// Need to be initialized internally
raop_buffer->aes_ctx = aes_cbc_init(aeskey, aesiv, AES_DECRYPT);
#ifdef DUMP_AUDIO
if (file_keyiv != NULL) {
fwrite(aeskey, 16, 1, file_keyiv);
fwrite(aesiv, 16, 1, file_keyiv);
fclose(file_keyiv);
}
#endif
for (int i = 0; i < RAOP_BUFFER_LENGTH; i++) {
raop_buffer_entry_t *entry = &raop_buffer->entries[i];
entry->payload_data = NULL;
@@ -113,15 +105,6 @@ raop_buffer_destroy(raop_buffer_t *raop_buffer)
free(raop_buffer);
}
#ifdef DUMP_AUDIO
if (file_aac != NULL) {
fclose(file_aac);
}
if (file_source != NULL) {
fclose(file_source);
}
#endif
}
static short
@@ -130,32 +113,11 @@ seqnum_cmp(unsigned short s1, unsigned short s2)
return (s1 - s2);
}
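`seqnum_cmp` above relies on the narrowing conversion to `short` to handle 16-bit RTP sequence-number wraparound (the same idea as serial-number arithmetic): a seqnum just past the 65535 → 0 wrap compares as "later". The same function with the cast made explicit, so the wraparound behavior can be checked:

```c
#include <assert.h>

/* Serial-number comparison for 16-bit RTP seqnums: narrowing the
 * difference to short makes values just past a wraparound compare
 * as later (positive result) rather than 65534 steps earlier. */
static short seqnum_cmp(unsigned short s1, unsigned short s2) {
    return (short)(s1 - s2);
}
```

This is why the resend logic can order packets correctly across the wrap, as long as the two seqnums are within 32767 of each other.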
//#define DUMP_AUDIO
#ifdef DUMP_AUDIO
static FILE* file_aac = NULL;
static FILE* file_source = NULL;
static FILE* file_keyiv = NULL;
#endif
int
raop_buffer_decrypt(raop_buffer_t *raop_buffer, unsigned char *data, unsigned char* output, unsigned int payload_size, unsigned int *outputlen)
{
assert(raop_buffer);
int encryptedlen;
#ifdef DUMP_AUDIO
if (file_aac == NULL) {
file_aac = fopen("/home/pi/Airplay.aac", "wb");
file_source = fopen("/home/pi/Airplay.source", "wb");
file_keyiv = fopen("/home/pi/Airplay.keyiv", "wb");
}
// Undecrypted file
if (file_source != NULL) {
fwrite(&data[12], payloadsize, 1, file_source);
}
#endif
if (DECRYPTION_TEST) {
char *str = utils_data_to_string(data,12,12);
logger_log(raop_buffer->logger, LOGGER_INFO, "encrypted 12 byte header %s", str);
@@ -199,13 +161,6 @@ raop_buffer_decrypt(raop_buffer_t *raop_buffer, unsigned char *data, unsigned ch
free(str);
}
}
#ifdef DUMP_AUDIO
// Decrypted file
if (file_aac != NULL) {
fwrite(output, payloadsize, 1, file_aac);
}
#endif
return 1;
}

View File

@@ -318,10 +318,10 @@ raop_handler_setup(raop_conn_t *conn,
int use_udp;
const char *dacp_id;
const char *active_remote_header;
bool logger_debug = (logger_get_level(conn->raop->logger) >= LOGGER_DEBUG);
const char *data;
int data_len;
data = http_request_get_data(request, &data_len);
dacp_id = http_request_get_header(request, "DACP-ID");
@@ -369,9 +369,11 @@ raop_handler_setup(raop_conn_t *conn,
memcpy(aesiv, eiv, 16);
free(eiv);
logger_log(conn->raop->logger, LOGGER_DEBUG, "eiv_len = %llu", eiv_len);
char* str = utils_data_to_string(aesiv, 16, 16);
logger_log(conn->raop->logger, LOGGER_DEBUG, "16 byte aesiv (needed for AES-CBC audio decryption iv):\n%s", str);
free(str);
if (logger_debug) {
char* str = utils_data_to_string(aesiv, 16, 16);
logger_log(conn->raop->logger, LOGGER_DEBUG, "16 byte aesiv (needed for AES-CBC audio decryption iv):\n%s", str);
free(str);
}
char* ekey = NULL;
uint64_t ekey_len = 0;
@@ -380,21 +382,26 @@ raop_handler_setup(raop_conn_t *conn,
free(ekey);
logger_log(conn->raop->logger, LOGGER_DEBUG, "ekey_len = %llu", ekey_len);
// eaeskey is 72 bytes, aeskey is 16 bytes
str = utils_data_to_string((unsigned char *) eaeskey, ekey_len, 16);
logger_log(conn->raop->logger, LOGGER_DEBUG, "ekey:\n%s", str);
free (str);
if (logger_debug) {
char *str = utils_data_to_string((unsigned char *) eaeskey, ekey_len, 16);
logger_log(conn->raop->logger, LOGGER_DEBUG, "ekey:\n%s", str);
free (str);
}
int ret = fairplay_decrypt(conn->fairplay, (unsigned char*) eaeskey, aeskey);
logger_log(conn->raop->logger, LOGGER_DEBUG, "fairplay_decrypt ret = %d", ret);
str = utils_data_to_string(aeskey, 16, 16);
logger_log(conn->raop->logger, LOGGER_DEBUG, "16 byte aeskey (fairplay-decrypted from ekey):\n%s", str);
free(str);
if (logger_debug) {
char *str = utils_data_to_string(aeskey, 16, 16);
logger_log(conn->raop->logger, LOGGER_DEBUG, "16 byte aeskey (fairplay-decrypted from ekey):\n%s", str);
free(str);
}
unsigned char ecdh_secret[X25519_KEY_SIZE];
pairing_get_ecdh_secret_key(conn->pairing, ecdh_secret);
str = utils_data_to_string(ecdh_secret, X25519_KEY_SIZE, 16);
logger_log(conn->raop->logger, LOGGER_DEBUG, "32 byte shared ecdh_secret:\n%s", str);
free(str);
if (logger_debug) {
char *str = utils_data_to_string(ecdh_secret, X25519_KEY_SIZE, 16);
logger_log(conn->raop->logger, LOGGER_DEBUG, "32 byte shared ecdh_secret:\n%s", str);
free(str);
}
const char *user_agent = http_request_get_header(request, "User-Agent");
logger_log(conn->raop->logger, LOGGER_INFO, "Client identified as User-Agent: %s", user_agent);
@@ -413,9 +420,11 @@ raop_handler_setup(raop_conn_t *conn,
sha_final(ctx, eaeskey, NULL);
sha_destroy(ctx);
memcpy(aeskey, eaeskey, 16);
str = utils_data_to_string(aeskey, 16, 16);
logger_log(conn->raop->logger, LOGGER_DEBUG, "16 byte aeskey after sha-256 hash with ecdh_secret:\n%s", str);
free(str);
if (logger_debug) {
char *str = utils_data_to_string(aeskey, 16, 16);
logger_log(conn->raop->logger, LOGGER_DEBUG, "16 byte aeskey after sha-256 hash with ecdh_secret:\n%s", str);
free(str);
}
}
// Time port
@@ -694,7 +703,7 @@ raop_handler_record(raop_conn_t *conn,
{
char audio_latency[12];
unsigned int ad = (unsigned int) (((uint64_t) conn->raop->audio_delay_micros) * AUDIO_SAMPLE_RATE / SECOND_IN_USECS);
sprintf(audio_latency, "%u", ad);
snprintf(audio_latency, sizeof(audio_latency), "%u", ad);
logger_log(conn->raop->logger, LOGGER_DEBUG, "raop_handler_record");
http_response_add_header(response, "Audio-Latency", audio_latency);
http_response_add_header(response, "Audio-Jack-Status", "connected; type=analog");
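The `Audio-Latency` header reports the delay in sample frames: the hunk above converts `audio_delay_micros` using a 64-bit intermediate so the multiplication cannot overflow. A standalone sketch of that arithmetic (constant values taken from the surrounding code: 44100 Hz sample rate, 10^6 microseconds per second; the helper name is hypothetical):

```c
#include <assert.h>
#include <stdint.h>

#define AUDIO_SAMPLE_RATE 44100u
#define SECOND_IN_USECS   1000000u

/* Microseconds of delay -> sample frames at 44.1 kHz, computed in
 * 64 bits first so large delays don't overflow the intermediate. */
static unsigned int delay_us_to_samples(unsigned int delay_us) {
    return (unsigned int) (((uint64_t) delay_us) * AUDIO_SAMPLE_RATE
                           / SECOND_IN_USECS);
}
```

With the default 0.25 s latency this reports 11025 frames, which fits comfortably in the 12-byte `audio_latency` string buffer now written with `snprintf`.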

View File

@@ -126,7 +126,8 @@ raop_ntp_parse_remote_address(raop_ntp_t *raop_ntp, const unsigned char *remote_
return -1;
}
memset(current, 0, sizeof(current));
sprintf(current, "%d.%d.%d.%d", remote_addr[0], remote_addr[1], remote_addr[2], remote_addr[3]);
snprintf(current, sizeof(current), "%d.%d.%d.%d", remote_addr[0], remote_addr[1],
remote_addr[2], remote_addr[3]);
logger_log(raop_ntp->logger, LOGGER_DEBUG, "raop_ntp parse remote ip = %s", current);
ret = netutils_parse_address(family, current,
&raop_ntp->remote_saddr,
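`raop_ntp_parse_remote_address` (and the analogous raop_rtp helpers later in this diff) now formats the client's four IPv4 address bytes with a bounded `snprintf`. A minimal sketch of that step (helper name hypothetical; a 16-byte buffer covers "255.255.255.255" plus the NUL):

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Render 4 address bytes as bounded dotted-quad text. */
static void ipv4_to_string(const unsigned char addr[4],
                           char *out, size_t outlen) {
    memset(out, 0, outlen);     /* zero first, as the original does */
    snprintf(out, outlen, "%d.%d.%d.%d",
             addr[0], addr[1], addr[2], addr[3]);
}
```

The resulting string is then handed to `netutils_parse_address`, so the bound guarantees it is always NUL-terminated regardless of the input bytes.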
@@ -271,6 +272,7 @@ raop_ntp_thread(void *arg)
const unsigned two_pow_n[RAOP_NTP_DATA_COUNT] = {2, 4, 8, 16, 32, 64, 128, 256};
int timeout_counter = 0;
bool conn_reset = false;
bool logger_debug = (logger_get_level(raop_ntp->logger) >= LOGGER_DEBUG);
while (1) {
MUTEX_LOCK(raop_ntp->run_mutex);
@@ -288,10 +290,12 @@ raop_ntp_thread(void *arg)
byteutils_put_ntp_timestamp(request, 24, send_time);
int send_len = sendto(raop_ntp->tsock, (char *)request, sizeof(request), 0,
(struct sockaddr *) &raop_ntp->remote_saddr, raop_ntp->remote_saddr_len);
char *str = utils_data_to_string(request, send_len, 16);
logger_log(raop_ntp->logger, LOGGER_DEBUG, "\nraop_ntp send time type_t=%d send_len = %d, now = %8.6f\n%s",
request[1] &~0x80, send_len, (double) send_time / SECOND_IN_NSECS, str);
free(str);
if (logger_debug) {
char *str = utils_data_to_string(request, sizeof(request), 16);
logger_log(raop_ntp->logger, LOGGER_DEBUG, "\nraop_ntp send time type_t=%d packetlen = %d, now = %8.6f\n%s",
request[1] &~0x80, (int) sizeof(request), (double) send_time / SECOND_IN_NSECS, str);
free(str);
}
if (send_len < 0) {
logger_log(raop_ntp->logger, LOGGER_ERR, "raop_ntp error sending request");
} else {
@@ -323,11 +327,14 @@ raop_ntp_thread(void *arg)
// Local time of the client when the response message leaves the client
int64_t t2 = (int64_t) byteutils_get_ntp_timestamp(response, 24);
char *str = utils_data_to_string(response, response_len, 16);
logger_log(raop_ntp->logger, LOGGER_DEBUG, "raop_ntp receive time type_t=%d packetlen = %d, now = %8.6f t1 = %8.6f, t2 = %8.6f\n%s",
response[1] &~0x80, response_len, (double) t3 / SECOND_IN_NSECS, (double) t1 / SECOND_IN_NSECS, (double) t2 / SECOND_IN_NSECS, str);
free(str);
if (logger_debug) {
char *str = utils_data_to_string(response, response_len, 16);
logger_log(raop_ntp->logger, LOGGER_DEBUG,
"raop_ntp receive time type_t=%d packetlen = %d, now = %8.6f t1 = %8.6f, t2 = %8.6f\n%s",
response[1] &~0x80, response_len, (double) t3 / SECOND_IN_NSECS, (double) t1 / SECOND_IN_NSECS,
(double) t2 / SECOND_IN_NSECS, str);
free(str);
}
// The iOS client device sends its time in seconds relative to an arbitrary Epoch (the last boot).
// For a little bonus confusion, they add SECONDS_FROM_1900_TO_1970.
// This means we have to expect some rather huge offset, but its growth or shrink over time should be small.

View File

@@ -136,7 +136,7 @@ raop_rtp_parse_remote(raop_rtp_t *raop_rtp, const unsigned char *remote, int rem
return -1;
}
memset(current, 0, sizeof(current));
sprintf(current, "%d.%d.%d.%d", remote[0], remote[1], remote[2], remote[3]);
snprintf(current, sizeof(current), "%d.%d.%d.%d", remote[0], remote[1], remote[2], remote[3]);
logger_log(raop_rtp->logger, LOGGER_DEBUG, "raop_rtp parse remote ip = %s", current);
ret = netutils_parse_address(family, current,
&raop_rtp->remote_saddr,
@@ -463,6 +463,7 @@ raop_rtp_thread_udp(void *arg)
unsigned short seqnum1 = 0, seqnum2 = 0;
assert(raop_rtp);
bool logger_debug = (logger_get_level(raop_rtp->logger) >= LOGGER_DEBUG);
raop_rtp->ntp_start_time = raop_ntp_get_local_time(raop_rtp->ntp);
raop_rtp->rtp_clock_started = false;
for (int i = 0; i < RAOP_RTP_SYNC_DATA_COUNT; i++) {
@@ -531,7 +532,7 @@ raop_rtp_thread_udp(void *arg)
logger_log(raop_rtp->logger, LOGGER_DEBUG, "raop_rtp resent audio packet: seqnum=%u", seqnum);
int result = raop_buffer_enqueue(raop_rtp->buffer, resent_packet, resent_packetlen, &ntp_time, &rtp_time, 1);
assert(result >= 0);
} else {
} else if (logger_debug) {
/* type_c = 0x56 packets with length 8 have been reported */
char *str = utils_data_to_string(packet, packetlen, 16);
logger_log(raop_rtp->logger, LOGGER_DEBUG, "Received empty resent audio packet length %d, seqnum=%u:\n%s",
@@ -557,15 +558,17 @@ raop_rtp_thread_udp(void *arg)
}
uint64_t sync_ntp_raw = byteutils_get_long_be(packet, 8);
uint64_t sync_ntp_remote = raop_ntp_timestamp_to_nano_seconds(sync_ntp_raw, true);
uint64_t sync_ntp_local = raop_ntp_convert_remote_time(raop_rtp->ntp, sync_ntp_remote);
char *str = utils_data_to_string(packet, packetlen, 20);
logger_log(raop_rtp->logger, LOGGER_DEBUG,
"raop_rtp sync: client ntp=%8.6f, ntp = %8.6f, ntp_start_time %8.6f\nts_client = %8.6f sync_rtp=%u\n%s",
(double) sync_ntp_remote / SEC, (double) sync_ntp_local / SEC,
(double) raop_rtp->ntp_start_time / SEC, (double) sync_ntp_remote / SEC, sync_rtp, str);
free(str);
if (logger_debug) {
uint64_t sync_ntp_local = raop_ntp_convert_remote_time(raop_rtp->ntp, sync_ntp_remote);
char *str = utils_data_to_string(packet, packetlen, 20);
logger_log(raop_rtp->logger, LOGGER_DEBUG,
"raop_rtp sync: client ntp=%8.6f, ntp = %8.6f, ntp_start_time %8.6f\nts_client = %8.6f sync_rtp=%u\n%s",
(double) sync_ntp_remote / SEC, (double) sync_ntp_local / SEC,
(double) raop_rtp->ntp_start_time / SEC, (double) sync_ntp_remote / SEC, sync_rtp, str);
free(str);
}
raop_rtp_sync_clock(raop_rtp, &sync_ntp_remote, &sync_rtp64);
} else {
} else if (logger_debug) {
char *str = utils_data_to_string(packet, packetlen, 16);
logger_log(raop_rtp->logger, LOGGER_DEBUG, "raop_rtp unknown udp control packet\n%s", str);
free(str);
@@ -619,9 +622,12 @@ raop_rtp_thread_udp(void *arg)
//logger_log(raop_rtp->logger, LOGGER_DEBUG, "raop_rtp_thread_udp type_d 0x%02x, packetlen = %d", type_d, packetlen);
if (packetlen < 12) {
char *str = utils_data_to_string(packet, packetlen, 16);
logger_log(raop_rtp->logger, LOGGER_DEBUG, "Received short type_d = 0x%2x packet with length %d:\n%s", packet[1] & ~0x80, packetlen, str);
free (str);
if (logger_debug) {
char *str = utils_data_to_string(packet, packetlen, 16);
logger_log(raop_rtp->logger, LOGGER_DEBUG, "Received short type_d = 0x%2x packet with length %d:\n%s",
packet[1] & ~0x80, packetlen, str);
free (str);
}
continue;
}
@@ -694,11 +700,14 @@ raop_rtp_thread_udp(void *arg)
}
raop_rtp->callbacks.audio_process(raop_rtp->callbacks.cls, raop_rtp->ntp, &audio_data);
free(payload);
uint64_t ntp_now = raop_ntp_get_local_time(raop_rtp->ntp);
int64_t latency = ((int64_t) ntp_now) - ((int64_t) audio_data.ntp_time_local);
logger_log(raop_rtp->logger, LOGGER_DEBUG, "raop_rtp audio: now = %8.6f, ntp = %8.6f, latency = %8.6f, rtp_time=%u seqnum = %u",
(double) ntp_now / SEC, (double) audio_data.ntp_time_local / SEC, (double) latency / SEC, (uint32_t) rtp64_timestamp,
seqnum);
if (logger_debug) {
uint64_t ntp_now = raop_ntp_get_local_time(raop_rtp->ntp);
int64_t latency = ((int64_t) ntp_now) - ((int64_t) audio_data.ntp_time_local);
logger_log(raop_rtp->logger, LOGGER_DEBUG,
"raop_rtp audio: now = %8.6f, ntp = %8.6f, latency = %8.6f, rtp_time=%u seqnum = %u",
(double) ntp_now / SEC, (double) audio_data.ntp_time_local / SEC, (double) latency / SEC,
(uint32_t) rtp64_timestamp, seqnum);
}
}
/* Handle possible resend requests */

View File

@@ -104,12 +104,6 @@ struct raop_rtp_mirror_s {
/* switch for displaying client FPS data */
uint8_t show_client_FPS_data;
/* SPS and PPS */
int sps_pps_len;
unsigned char* sps_pps;
bool sps_pps_waiting;
};
static int
@@ -127,7 +121,7 @@ raop_rtp_parse_remote(raop_rtp_mirror_t *raop_rtp_mirror, const unsigned char *r
return -1;
}
memset(current, 0, sizeof(current));
sprintf(current, "%d.%d.%d.%d", remote[0], remote[1], remote[2], remote[3]);
snprintf(current, sizeof(current), "%d.%d.%d.%d", remote[0], remote[1], remote[2], remote[3]);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror parse remote ip = %s", current);
ret = netutils_parse_address(family, current,
&raop_rtp_mirror->remote_saddr,
@@ -154,9 +148,6 @@ raop_rtp_mirror_t *raop_rtp_mirror_init(logger_t *logger, raop_callbacks_t *call
}
raop_rtp_mirror->logger = logger;
raop_rtp_mirror->ntp = ntp;
raop_rtp_mirror->sps_pps_len = 0;
raop_rtp_mirror->sps_pps = NULL;
raop_rtp_mirror->sps_pps_waiting = false;
memcpy(&raop_rtp_mirror->callbacks, callbacks, sizeof(raop_callbacks_t));
raop_rtp_mirror->buffer = mirror_buffer_init(logger, aeskey);
@@ -182,8 +173,6 @@ raop_rtp_init_mirror_aes(raop_rtp_mirror_t *raop_rtp_mirror, uint64_t *streamCon
mirror_buffer_init_aes(raop_rtp_mirror->buffer, streamConnectionID);
}
//#define DUMP_H264
#define RAOP_PACKET_LEN 32768
/**
* Mirror
@@ -197,6 +186,9 @@ raop_rtp_mirror_thread(void *arg)
int stream_fd = -1;
unsigned char packet[128];
memset(packet, 0 , 128);
unsigned char* sps_pps = NULL;
bool prepend_sps_pps = false;
int sps_pps_len = 0;
unsigned char* payload = NULL;
unsigned int readstart = 0;
bool conn_reset = false;
@@ -205,14 +197,7 @@ raop_rtp_mirror_thread(void *arg)
uint64_t ntp_timestamp_remote = 0;
uint64_t ntp_timestamp_local = 0;
unsigned char nal_start_code[4] = { 0x00, 0x00, 0x00, 0x01 };
#ifdef DUMP_H264
// C decrypted
FILE* file = fopen("/home/pi/Airplay.h264", "wb");
// Encrypted source file
FILE* file_source = fopen("/home/pi/Airplay.source", "wb");
FILE* file_len = fopen("/home/pi/Airplay.len", "wb");
#endif
bool logger_debug = (logger_get_level(raop_rtp_mirror->logger) >= LOGGER_DEBUG);
while (1) {
fd_set rfds;
@@ -257,7 +242,8 @@ raop_rtp_mirror_thread(void *arg)
saddrlen = sizeof(saddr);
stream_fd = accept(raop_rtp_mirror->mirror_data_sock, (struct sockaddr *)&saddr, &saddrlen);
if (stream_fd == -1) {
logger_log(raop_rtp_mirror->logger, LOGGER_ERR, "raop_rtp_mirror error in accept %d %s", errno, strerror(errno));
logger_log(raop_rtp_mirror->logger, LOGGER_ERR,
"raop_rtp_mirror error in accept %d %s", errno, strerror(errno));
break;
}
@@ -266,26 +252,31 @@ raop_rtp_mirror_thread(void *arg)
tv.tv_sec = 0;
tv.tv_usec = 5000;
if (setsockopt(stream_fd, SOL_SOCKET, SO_RCVTIMEO, CAST &tv, sizeof(tv)) < 0) {
logger_log(raop_rtp_mirror->logger, LOGGER_ERR, "raop_rtp_mirror could not set stream socket timeout %d %s", errno, strerror(errno));
logger_log(raop_rtp_mirror->logger, LOGGER_ERR,
"raop_rtp_mirror could not set stream socket timeout %d %s", errno, strerror(errno));
break;
}
int option;
option = 1;
if (setsockopt(stream_fd, SOL_SOCKET, SO_KEEPALIVE, CAST &option, sizeof(option)) < 0) {
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING, "raop_rtp_mirror could not set stream socket keepalive %d %s", errno, strerror(errno));
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING,
"raop_rtp_mirror could not set stream socket keepalive %d %s", errno, strerror(errno));
}
option = 60;
if (setsockopt(stream_fd, SOL_TCP, TCP_KEEPIDLE, CAST &option, sizeof(option)) < 0) {
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING, "raop_rtp_mirror could not set stream socket keepalive time %d %s", errno, strerror(errno));
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING,
"raop_rtp_mirror could not set stream socket keepalive time %d %s", errno, strerror(errno));
}
option = 10;
if (setsockopt(stream_fd, SOL_TCP, TCP_KEEPINTVL, CAST &option, sizeof(option)) < 0) {
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING, "raop_rtp_mirror could not set stream socket keepalive interval %d %s", errno, strerror(errno));
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING,
"raop_rtp_mirror could not set stream socket keepalive interval %d %s", errno, strerror(errno));
}
option = 6;
if (setsockopt(stream_fd, SOL_TCP, TCP_KEEPCNT, CAST &option, sizeof(option)) < 0) {
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING, "raop_rtp_mirror could not set stream socket keepalive probes %d %s", errno, strerror(errno));
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING,
"raop_rtp_mirror could not set stream socket keepalive probes %d %s", errno, strerror(errno));
}
readstart = 0;
}

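The four setsockopt calls above enable TCP keepalive on the accepted stream socket and tune it to probe after 60 s idle, every 10 s, giving up after 6 failed probes. A condensed sketch of the same sequence (Linux option names; the TCP_KEEP* constants are not available on every platform):

```c
#include <netinet/in.h>
#include <netinet/tcp.h>
#include <sys/socket.h>

/* Enable keepalive: first probe after 60 s idle, then every 10 s,
 * declare the peer dead after 6 unanswered probes.
 * Returns 0 on success, -1 if any option could not be set. */
static int set_stream_keepalive(int fd) {
    int on = 1, idle = 60, intvl = 10, cnt = 6;
    if (setsockopt(fd, SOL_SOCKET, SO_KEEPALIVE, &on, sizeof(on)) < 0) return -1;
    if (setsockopt(fd, IPPROTO_TCP, TCP_KEEPIDLE, &idle, sizeof(idle)) < 0) return -1;
    if (setsockopt(fd, IPPROTO_TCP, TCP_KEEPINTVL, &intvl, sizeof(intvl)) < 0) return -1;
    if (setsockopt(fd, IPPROTO_TCP, TCP_KEEPCNT, &cnt, sizeof(cnt)) < 0) return -1;
    return 0;
}
```

The diff treats keepalive failures as warnings rather than fatal errors, since mirroring can continue without them; this sketch folds them into one return code for brevity.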
@@ -301,13 +292,15 @@ raop_rtp_mirror_thread(void *arg)
}
if (payload == NULL && ret == 0) {
logger_log(raop_rtp_mirror->logger, LOGGER_ERR, "raop_rtp_mirror tcp socket is closed, got %d bytes of 128 byte header",readstart);
logger_log(raop_rtp_mirror->logger, LOGGER_ERR,
"raop_rtp_mirror tcp socket is closed, got %d bytes of 128 byte header",readstart);
FD_CLR(stream_fd, &rfds);
stream_fd = -1;
continue;
} else if (payload == NULL && ret == -1) {
if (errno == EAGAIN || errno == EWOULDBLOCK) continue; // Timeouts can happen even if the connection is fine
logger_log(raop_rtp_mirror->logger, LOGGER_ERR, "raop_rtp_mirror error in header recv: %d %s", errno, strerror(errno));
logger_log(raop_rtp_mirror->logger, LOGGER_ERR,
"raop_rtp_mirror error in header recv: %d %s", errno, strerror(errno));
if (errno == ECONNRESET) conn_reset = true;
break;
}
@@ -316,32 +309,36 @@ raop_rtp_mirror_thread(void *arg)
int payload_size = byteutils_get_int(packet, 0);
char packet_description[13] = {0};
char *p = packet_description;
int n = sizeof(packet_description);
for (int i = 4; i < 8; i++) {
sprintf(p, "%2.2x ", (unsigned int) packet[i]);
snprintf(p, n, "%2.2x ", (unsigned int) packet[i]);
n -= 3;
p += 3;
}
ntp_timestamp_raw = byteutils_get_long(packet, 8);
ntp_timestamp_remote = raop_ntp_timestamp_to_nano_seconds(ntp_timestamp_raw, false);
/* packet[4] appears to have one of three possible values: *
* 0x00 : encrypted packet *
* 0x01 : unencrypted packet with a SPS and a PPS NAL, sent initially, and also when *
* a change in video format (e.g., width, height) subsequently occurs *
* 0x05 : unencrypted packet with a "streaming report", sent once per second */
/* encrypted packets have packet[5] = 0x00 or 0x10, and packet[6]= packet[7] = 0x00; *
* encrypted packets immediately following an unencrypted SPS/PPS packet appear to *
* be the only ones with packet[5] = 0x10, and almost always have packet[5] = 0x10, *
* but occasionally have packet[5] = 0x00. */
/* packet[4] + packet[5] identify the payload type: values seen are: *
* 0x00 0x00: encrypted packet containing a non-IDR type 1 VCL NAL unit *
* 0x00 0x10: encrypted packet containing an IDR type 5 VCL NAL unit *
* 0x01 0x00 unencrypted packet containing a type 7 SPS NAL + a type 8 PPS NAL unit *
* 0x05 0x00 unencrypted packet with a "streaming report", sent once per second. */
/* unencrypted SPS/PPS packets have packet[4:7] = 0x01 0x00 (0x16 or 0x56) 0x01 *
* they are followed by an encrypted packet with the same timestamp in packet[8:15] */
/* packet[6] + packet[7] may list a payload "option": values seen are: *
* 0x00 0x00 : encrypted and "streaming report" packets *
* 0x16 0x01 : seen in most unencrypted SPS+PPS packets *
* 0x56 0x01 : occasionally seen in unencrypted SPS+PPS packets (why different?) */
/* "streaming report" packages have packet[4:7] = 0x05 0x00 0x00 0x00, and have no *
* timestamp in packet[8:15] */
/* unencrypted packets with a SPS and a PPS NAL are sent initially, and also when a *
* change in video format (e.g. width, height) subsequently occurs. They seem always *
* to be followed by a packet with a type 5 encrypted IDR VCL NAL, with an identical *
* timestamp. On M1/M2 Mac clients, this type 5 NAL is prepended with a type 6 SEI *
* NAL unit. Here we prepend the SPS+PPS NALs to the next encrypted packet, which *
* always has the same timestamp, and is (almost?) always an IDR NAL unit. */
//unsigned short payload_type = byteutils_get_short(packet, 4) & 0xff;
//unsigned short payload_option = byteutils_get_short(packet, 6);
/* Unencrypted SPS/PPS packets also have image-size data in (parts of) packet[16:127] */
/* "streaming report" packets have no timestamp in packet[8:15] */
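The payload types catalogued in the comments above can be expressed as a small classifier over the 128-byte header. This is a hypothetical helper (the enum names are illustrative, not from the source):

```c
typedef enum {
    PKT_VIDEO_NONIDR,   /* 0x00 0x00: encrypted non-IDR type 1 VCL NAL */
    PKT_VIDEO_IDR,      /* 0x00 0x10: encrypted IDR type 5 VCL NAL */
    PKT_SPS_PPS,        /* 0x01 ....: unencrypted SPS (7) + PPS (8) NALs */
    PKT_STREAM_REPORT,  /* 0x05 0x00: once-per-second streaming report */
    PKT_UNKNOWN
} pkt_type_t;

static pkt_type_t classify_packet(const unsigned char header[128]) {
    switch (header[4]) {
    case 0x00: return header[5] == 0x10 ? PKT_VIDEO_IDR : PKT_VIDEO_NONIDR;
    case 0x01: return PKT_SPS_PPS;
    case 0x05: return PKT_STREAM_REPORT;
    default:   return PKT_UNKNOWN;
    }
}
```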
if (payload == NULL) {
payload = malloc(payload_size);
@@ -376,38 +373,45 @@ raop_rtp_mirror_thread(void *arg)
// counting nano seconds since last boot.
ntp_timestamp_local = raop_ntp_convert_remote_time(raop_rtp_mirror->ntp, ntp_timestamp_remote);
uint64_t ntp_now = raop_ntp_get_local_time(raop_rtp_mirror->ntp);
int64_t latency = ((int64_t) ntp_now) - ((int64_t) ntp_timestamp_local);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp video: now = %8.6f, ntp = %8.6f, latency = %8.6f, ts = %8.6f, %s",
(double) ntp_now / SEC, (double) ntp_timestamp_local / SEC, (double) latency / SEC, (double) ntp_timestamp_remote / SEC, packet_description);
if (logger_debug) {
uint64_t ntp_now = raop_ntp_get_local_time(raop_rtp_mirror->ntp);
int64_t latency = ((int64_t) ntp_now) - ((int64_t) ntp_timestamp_local);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG,
"raop_rtp video: now = %8.6f, ntp = %8.6f, latency = %8.6f, ts = %8.6f, %s",
(double) ntp_now / SEC, (double) ntp_timestamp_local / SEC, (double) latency / SEC,
(double) ntp_timestamp_remote / SEC, packet_description);
}
#ifdef DUMP_H264
fwrite(payload, payload_size, 1, file_source);
fwrite(&readstart, sizeof(readstart), 1, file_len);
#endif
unsigned char* payload_out;
unsigned char* payload_decrypted;
if (!raop_rtp_mirror->sps_pps_waiting && packet[5] != 0x00) {
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING, "unexpected: packet[5] = %2.2x, but not preceded by SPS+PPS packet", packet[5]);
}
/* if a previous unencrypted packet contains an SPS (type 7) and PPS (type 8) NAL which has not
* yet been sent, it should be prepended to the current NAL. In this case packet[5] is usually
* 0x10; however, the M1 Macs have increased the h264 level, and now the encrypted packet after the
* unencrypted SPS+PPS packet may contain a SEI (type 6) NAL prepended to the next VCL NAL, with
* packet[5] = 0x00. Now the flag raop_rtp_mirror->sps_pps_waiting = true will signal that a
* previous packet contained a SPS NAL + a PPS NAL, that has not yet been sent. This will trigger
* prepending it to the current NAL, and the sps_pps_waiting flag will be set to false after
* it has been prepended. It is not clear if the case packet[5] = 0x10 will occur when
* raop_rtp_mirror->sps_pps = false, but if it does, the current code will prepend the stored
* PPS + SPS NAL to the current encrypted NAL, and issue a warning message */
/*
* nal_types: 1 Coded non-partitioned slice of a non-IDR picture
* 5 Coded non-partitioned slice of an IDR picture
* 6 Supplemental enhancement information (SEI)
* 7 Sequence parameter set (SPS)
* 8 Picture parameter set (PPS)
*
* if a previous unencrypted packet contains an SPS (type 7) and PPS (type 8) NAL which has not
* yet been sent, it should be prepended to the current NAL. The M1 Macs have increased the h264 level,
* and now the first encrypted packet after the unencrypted SPS+PPS packet may also contain a SEI (type 6) NAL
* prepended to its VCL NAL.
*
* The flag prepend_sps_pps = true will signal that the previous packet contained a SPS NAL + a PPS NAL,
* that has not yet been sent. This will trigger prepending it to the current NAL, and the prepend_sps_pps
* flag will be set to false after it has been prepended. */
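The prepend step described above amounts to allocating one buffer large enough for both pieces, copying the stored Annex-B SPS+PPS block to the front, and decrypting the new payload into the remainder. A sketch under those assumptions (names are illustrative):

```c
#include <stdlib.h>
#include <string.h>

/* Returns a new buffer of sps_pps_len + payload_len bytes with the stored
 * SPS+PPS block at the front; *decrypt_dst points at the region where the
 * decrypted payload should then be written. Caller frees the result. */
static unsigned char *prepend_stored_sps_pps(const unsigned char *sps_pps,
                                             int sps_pps_len, int payload_len,
                                             unsigned char **decrypt_dst) {
    unsigned char *out = malloc((size_t)(sps_pps_len + payload_len));
    if (!out) return NULL;
    memcpy(out, sps_pps, (size_t) sps_pps_len);
    *decrypt_dst = out + sps_pps_len;
    return out;
}
```

Decrypting directly into the tail of the combined buffer avoids a second copy when the SPS+PPS NALs must precede the IDR NAL with the matching timestamp.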
bool prepend_sps_pps = (raop_rtp_mirror->sps_pps_waiting || packet[5] != 0x00);
if (prepend_sps_pps) {
assert(raop_rtp_mirror->sps_pps);
payload_out = (unsigned char*) malloc(payload_size + raop_rtp_mirror->sps_pps_len);
payload_decrypted = payload_out + raop_rtp_mirror->sps_pps_len;
memcpy(payload_out, raop_rtp_mirror->sps_pps, raop_rtp_mirror->sps_pps_len);
raop_rtp_mirror->sps_pps_waiting = false;
assert(sps_pps);
payload_out = (unsigned char*) malloc(payload_size + sps_pps_len);
payload_decrypted = payload_out + sps_pps_len;
if (ntp_timestamp_raw != ntp_timestamp_nal) {
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING,
"raop_rtp_mirror: prepended sps_pps timestamp does not match timestamp of "
"video payload\n%llu\n%llu", ntp_timestamp_raw, ntp_timestamp_nal);
}
memcpy(payload_out, sps_pps, sps_pps_len);
free (sps_pps);
sps_pps = NULL;
} else {
payload_out = (unsigned char*) malloc(payload_size);
payload_decrypted = payload_out;
@@ -420,7 +424,6 @@ raop_rtp_mirror_thread(void *arg)
bool valid_data = true;
int nalu_size = 0;
int nalus_count = 0;
int nalu_type; /* 0x01 non-IDR VCL, 0x05 IDR VCL, 0x06 SEI 0x07 SPS, 0x08 PPS */
while (nalu_size < payload_size) {
int nc_len = byteutils_get_int_be(payload_decrypted, nalu_size);
if (nc_len < 0 || nalu_size + 4 > payload_size) {
@@ -430,22 +433,49 @@ raop_rtp_mirror_thread(void *arg)
memcpy(payload_decrypted + nalu_size, nal_start_code, 4);
nalu_size += 4;
nalus_count++;
if (payload_decrypted[nalu_size] & 0x80) valid_data = false; /* first bit of h264 nalu MUST be 0 ("forbidden_zero_bit") */
nalu_type = payload_decrypted[nalu_size] & 0x1f;
nalu_size += nc_len;
if (nalu_type != 1) {
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "nalu_type = %d, nalu_size = %d, processed bytes %d, payloadsize = %d nalus_count = %d",
nalu_type, nc_len, nalu_size, payload_size, nalus_count);
/* first bit of h264 nalu MUST be 0 ("forbidden_zero_bit") */
if (payload_decrypted[nalu_size] & 0x80) {
valid_data = false;
break;
}
}
int nalu_type = payload_decrypted[nalu_size] & 0x1f;
int ref_idc = (payload_decrypted[nalu_size] >> 5);
switch (nalu_type) {
case 5: /*IDR, slice_layer_without_partitioning */
case 1: /*non-IDR, slice_layer_without_partitioning */
break;
case 2: /* slice data partition A */
case 3: /* slice data partition B */
case 4: /* slice data partition C */
logger_log(raop_rtp_mirror->logger, LOGGER_INFO,
"unexpected partitioned VCL NAL unit: nalu_type = %d, ref_idc = %d, nalu_size = %d,"
" processed bytes %d, payloadsize = %d nalus_count = %d",
nalu_type, ref_idc, nc_len, nalu_size, payload_size, nalus_count);
break;
case 6:
if (logger_debug) {
char *str = utils_data_to_string(payload_decrypted + nalu_size, nc_len, 16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror SEI NAL size = %d", nc_len);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG,
"raop_rtp_mirror h264 Supplemental Enhancement Information:\n%s", str);
free(str);
}
break;
default:
logger_log(raop_rtp_mirror->logger, LOGGER_INFO,
"unexpected non-VCL NAL unit: nalu_type = %d, ref_idc = %d, nalu_size = %d,"
" processed bytes %d, payloadsize = %d nalus_count = %d",
nalu_type, ref_idc, nc_len, nalu_size, payload_size, nalus_count);
break;
}
nalu_size += nc_len;
}
if (nalu_size != payload_size) valid_data = false;
if(!valid_data) {
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "nalu marked as invalid");
payload_out[0] = 1; /* mark video data as invalid h264 (failed decryption) */
}
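The decode loop above walks the decrypted payload as length-prefixed (AVCC-style) NAL units and overwrites each 4-byte big-endian length field with an Annex-B start code in place, validating the forbidden_zero_bit as it goes. A standalone sketch of the core rewrite:

```c
#include <string.h>

static const unsigned char nal_start_code[4] = { 0x00, 0x00, 0x00, 0x01 };

/* Rewrite 4-byte NAL length prefixes as Annex-B start codes, in place.
 * Returns the NAL unit count, or -1 if the lengths do not add up. */
static int avcc_to_annexb(unsigned char *buf, int size) {
    int pos = 0, count = 0;
    while (pos < size) {
        if (pos + 4 > size) return -1;
        int len = (buf[pos] << 24) | (buf[pos+1] << 16)
                | (buf[pos+2] << 8) | buf[pos+3];
        if (len < 0 || pos + 4 + len > size) return -1;
        memcpy(buf + pos, nal_start_code, 4);  /* replace length with 00 00 00 01 */
        pos += 4 + len;
        count++;
    }
    return count;
}
```

Because prefix and start code are both 4 bytes, the conversion needs no reallocation, which is why the diff can mark an undersized or oversized run simply by comparing the final offset against payload_size.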
#ifdef DUMP_H264
fwrite(payload_decrypted, payload_size, 1, file);
#endif
payload_decrypted = NULL;
h264_decode_struct h264_data;
h264_data.ntp_time_local = ntp_timestamp_local;
@@ -454,11 +484,9 @@ raop_rtp_mirror_thread(void *arg)
h264_data.data_len = payload_size;
h264_data.data = payload_out;
if (prepend_sps_pps) {
h264_data.data_len += raop_rtp_mirror->sps_pps_len;
h264_data.data_len += sps_pps_len;
h264_data.nal_count += 2;
if (ntp_timestamp_raw != ntp_timestamp_nal) {
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING, "raop_rtp_mirror: prepended sps_pps timestamp does not match that of video payload");
}
prepend_sps_pps = false;
}
raop_rtp_mirror->callbacks.video_process(raop_rtp_mirror->callbacks.cls, raop_rtp_mirror->ntp, &h264_data);
free(payload_out);
@@ -466,7 +494,8 @@ raop_rtp_mirror_thread(void *arg)
case 0x01:
// The information in the payload contains an SPS and a PPS NAL
// The sps_pps is not encrypted
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "\nReceived unencrypted codec packet from client: payload_size %d header %s ts_client = %8.6f",
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "\nReceived unencrypted codec packet from client:"
" payload_size %d header %s ts_client = %8.6f",
payload_size, packet_description, (double) ntp_timestamp_remote / SEC);
if (payload_size == 0) {
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror, discard type 0x01 packet with no payload");
@@ -478,8 +507,8 @@ raop_rtp_mirror_thread(void *arg)
float width_source = byteutils_get_float(packet, 40);
float height_source = byteutils_get_float(packet, 44);
if (width != width_source || height != height_source) {
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror: Unexpected : data %f, %f != width_source = %f, height_source = %f",
width, height, width_source, height_source);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror: Unexpected : data %f,"
" %f != width_source = %f, height_source = %f", width, height, width_source, height_source);
}
width = byteutils_get_float(packet, 48);
height = byteutils_get_float(packet, 52);
@@ -496,44 +525,44 @@ raop_rtp_mirror_thread(void *arg)
unsigned char *sequence_parameter_set = payload + 8;
short pps_size = byteutils_get_short_be(payload, sps_size + 9);
unsigned char *picture_parameter_set = payload + sps_size + 11;
int data_size = 6;
char *str = utils_data_to_string(payload, data_size, 16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror: sps/pps header size = %d", data_size);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror h264 sps/pps header:\n%s", str);
free(str);
str = utils_data_to_string(sequence_parameter_set, sps_size,16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror sps size = %d", sps_size);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror h264 Sequence Parameter Set:\n%s", str);
free(str);
str = utils_data_to_string(picture_parameter_set, pps_size, 16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror pps size = %d", pps_size);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror h264 Picture Parameter Set:\n%s", str);
free(str);
int data_size = 6;
if (logger_debug) {
char *str = utils_data_to_string(payload, data_size, 16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror: SPS+PPS header size = %d", data_size);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror h264 SPS+PPS header:\n%s", str);
free(str);
str = utils_data_to_string(sequence_parameter_set, sps_size,16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror SPS NAL size = %d", sps_size);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror h264 Sequence Parameter Set:\n%s", str);
free(str);
str = utils_data_to_string(picture_parameter_set, pps_size, 16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror PPS NAL size = %d", pps_size);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror h264 Picture Parameter Set:\n%s", str);
free(str);
}
data_size = payload_size - sps_size - pps_size - 11;
if (data_size > 0) {
str = utils_data_to_string (picture_parameter_set + pps_size, data_size, 16);
if (data_size > 0 && logger_debug) {
char *str = utils_data_to_string (picture_parameter_set + pps_size, data_size, 16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "remainder size = %d", data_size);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "remainder of sps+pps packet:\n%s", str);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "remainder of SPS+PPS packet:\n%s", str);
free(str);
} else if (data_size < 0) {
logger_log(raop_rtp_mirror->logger, LOGGER_ERR, " pps_sps error: packet remainder size = %d < 0", data_size);
}
// Copy the sps and pps into a buffer to prepend to the next NAL unit.
raop_rtp_mirror->sps_pps_len = sps_size + pps_size + 8;
if (raop_rtp_mirror->sps_pps) {
free(raop_rtp_mirror->sps_pps);
if (sps_pps) {
free(sps_pps);
sps_pps = NULL;
}
raop_rtp_mirror->sps_pps = (unsigned char*) malloc(raop_rtp_mirror->sps_pps_len);
assert(raop_rtp_mirror->sps_pps);
memcpy(raop_rtp_mirror->sps_pps, nal_start_code, 4);
memcpy(raop_rtp_mirror->sps_pps + 4, sequence_parameter_set, sps_size);
memcpy(raop_rtp_mirror->sps_pps + sps_size + 4, nal_start_code, 4);
memcpy(raop_rtp_mirror->sps_pps + sps_size + 8, payload + sps_size + 11, pps_size);
raop_rtp_mirror->sps_pps_waiting = true;
#ifdef DUMP_H264
fwrite(raop_rtp_mirror->sps_pps, raop_rtp_mirror->sps_pps_len, 1, file);
#endif
sps_pps_len = sps_size + pps_size + 8;
sps_pps = (unsigned char*) malloc(sps_pps_len);
assert(sps_pps);
memcpy(sps_pps, nal_start_code, 4);
memcpy(sps_pps + 4, sequence_parameter_set, sps_size);
memcpy(sps_pps + sps_size + 4, nal_start_code, 4);
memcpy(sps_pps + sps_size + 8, payload + sps_size + 11, pps_size);
prepend_sps_pps = true;
// h264codec_t h264;
// h264.version = payload[0];
@@ -552,8 +581,8 @@ raop_rtp_mirror_thread(void *arg)
break;
case 0x05:
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "\nReceived video streaming performance info packet from client: payload_size %d header %s ts_raw = %llu",
payload_size, packet_description, ntp_timestamp_raw);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "\nReceived video streaming performance info packet from client:"
" payload_size %d header %s ts_raw = %llu", payload_size, packet_description, ntp_timestamp_raw);
/* payloads with packet[4] = 0x05 have no timestamp, and carry video info from the client as a binary plist *
* Sometimes (e.g, when the client has a locked screen), there is a 25kB trailer attached to the packet. *
* This 25000 Byte trailer with unidentified content seems to be the same data each time it is sent. */
@@ -566,9 +595,12 @@ raop_rtp_mirror_thread(void *arg)
int plist_size = payload_size;
if (payload_size > 25000) {
plist_size = payload_size - 25000;
char *str = utils_data_to_string(payload + plist_size, 16, 16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "video_info packet had 25kB trailer; first 16 bytes are:\n%s", str);
if (logger_debug) {
char *str = utils_data_to_string(payload + plist_size, 16, 16);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG,
"video_info packet had 25kB trailer; first 16 bytes are:\n%s", str);
free(str);
}
}
if (plist_size) {
char *plist_xml;
@@ -582,8 +614,8 @@ raop_rtp_mirror_thread(void *arg)
}
break;
default:
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING, "\nReceived unexpected TCP packet from client, size %d, %s ts_raw = %llu",
payload_size, packet_description, ntp_timestamp_raw);
logger_log(raop_rtp_mirror->logger, LOGGER_WARNING, "\nReceived unexpected TCP packet from client, "
"size %d, %s ts_raw = %llu", payload_size, packet_description, ntp_timestamp_raw);
break;
}
@@ -599,12 +631,6 @@ raop_rtp_mirror_thread(void *arg)
closesocket(stream_fd);
}
#ifdef DUMP_H264
fclose(file);
fclose(file_source);
fclose(file_len);
#endif
// Ensure running reflects the actual state
MUTEX_LOCK(raop_rtp_mirror->run_mutex);
raop_rtp_mirror->running = false;
@@ -639,7 +665,8 @@ raop_rtp_init_mirror_sockets(raop_rtp_mirror_t *raop_rtp_mirror, int use_ipv6)
/* Set port values */
raop_rtp_mirror->mirror_data_lport = dport;
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror local data port socket %d port TCP %d", dsock, dport);
logger_log(raop_rtp_mirror->logger, LOGGER_DEBUG, "raop_rtp_mirror local data port socket %d port TCP %d",
dsock, dport);
return 0;
sockets_cleanup:
@@ -648,7 +675,8 @@ raop_rtp_init_mirror_sockets(raop_rtp_mirror_t *raop_rtp_mirror, int use_ipv6)
}
void
raop_rtp_start_mirror(raop_rtp_mirror_t *raop_rtp_mirror, int use_udp, unsigned short *mirror_data_lport, uint8_t show_client_FPS_data)
raop_rtp_start_mirror(raop_rtp_mirror_t *raop_rtp_mirror, int use_udp, unsigned short *mirror_data_lport,
uint8_t show_client_FPS_data)
{
logger_log(raop_rtp_mirror->logger, LOGGER_INFO, "raop_rtp_mirror starting mirroring");
int use_ipv6 = 0;
@@ -716,9 +744,6 @@ void raop_rtp_mirror_destroy(raop_rtp_mirror_t *raop_rtp_mirror) {
raop_rtp_mirror_stop(raop_rtp_mirror);
MUTEX_DESTROY(raop_rtp_mirror->run_mutex);
mirror_buffer_destroy(raop_rtp_mirror->buffer);
if (raop_rtp_mirror->sps_pps) {
free(raop_rtp_mirror->sps_pps);
}
free(raop_rtp_mirror);
}
}


@@ -187,19 +187,28 @@ char *utils_parse_hex(const char *str, int str_len, int *data_len) {
}
char *utils_data_to_string(const unsigned char *data, int datalen, int chars_per_line) {
int len = 3*datalen + ((datalen-1)/chars_per_line ) + 1;
assert(datalen >= 0);
assert(chars_per_line > 0);
int len = 3*datalen + 1;
if (datalen > chars_per_line) {
len += (datalen-1)/chars_per_line;
}
char *str = (char *) calloc(len + 1, sizeof(char));
assert(str);
char *p = str;
int n = len + 1;
for (int i = 0; i < datalen; i++) {
if (i > 0 && i % chars_per_line == 0) {
sprintf(p,"\n");
snprintf(p, n, "\n");
n--;
p++;
}
sprintf(p,"%2.2x ", (unsigned int) data[i]);
snprintf(p, n, "%2.2x ", (unsigned int) data[i]);
n -= 3;
p += 3;
}
sprintf(p,"\n");
snprintf(p, n, "\n");
n--;
p++;
assert(p == &(str[len]));
assert(len == strlen(str));
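The reworked utils_data_to_string above sizes its output as three characters per byte ("xx "), one newline per completed chars_per_line group, plus a trailing newline, and decrements the snprintf bound n in step with the write pointer. A minimal standalone version with the same layout and bounds accounting:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hex-dump datalen bytes, chars_per_line bytes per line. Caller frees. */
static char *hex_dump(const unsigned char *data, int datalen, int chars_per_line) {
    int len = 3 * datalen + 1;                       /* "xx " per byte + final '\n' */
    if (datalen > chars_per_line)
        len += (datalen - 1) / chars_per_line;       /* interior newlines */
    char *str = calloc((size_t) len + 1, 1);         /* + 1 for the NUL */
    char *p = str;
    int n = len + 1;                                 /* remaining space incl. NUL */
    for (int i = 0; i < datalen; i++) {
        if (i > 0 && i % chars_per_line == 0) {
            snprintf(p, n, "\n"); n--; p++;
        }
        snprintf(p, n, "%2.2x ", (unsigned int) data[i]); n -= 3; p += 3;
    }
    snprintf(p, n, "\n");
    return str;
}
```

With datalen = 4 and chars_per_line = 2 the formula gives len = 13 + 1 = 14, exactly the length of "00 01 \n02 03 \n", matching the asserts in the diff.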


@@ -19,6 +19,7 @@ if ( X11_FOUND )
if ( GST120_FOUND )
message( "-- ZOOMFIX will NOT be applied as Gstreamer version is >= 1.20" )
else()
message( "-- Failure to find Gstreamer >= 1.20 is NOT an error!" )
message( "-- ZOOMFIX will be applied as Gstreamer version is < 1.20" )
add_definitions( -DZOOM_WINDOW_NAME_FIX )
endif()


@@ -328,7 +328,8 @@ gboolean gstreamer_pipeline_bus_callback(GstBus *bus, GstMessage *message, gpoin
"*** If you are letting the default autovideosink select the videosink,\n"
"*** GStreamer may be trying to use non-functional hardware h264 video decoding.\n"
"*** Try using option -avdec to force software decoding or use -vs <videosink>\n"
"*** to select a videosink of your choice (see \"man uxplay\")");
"*** to select a videosink of your choice (see \"man uxplay\").\n"
"*** Raspberry Pi OS with GStreamer-1.18.4 needs \"-bt709\" uxplay option");
}
g_error_free (err);
g_free (debug);


@@ -1,23 +1,27 @@
.TH UXPLAY "1" "February 2023" "1.63" "User Commands"
.TH UXPLAY "1" "April 2023" "1.64" "User Commands"
.SH NAME
uxplay \- start AirPlay server
.SH SYNOPSIS
.B uxplay
[\fI\,-n name\/\fR] [\fI\,-s wxh\/\fR] [\fI\,-p \/\fR[\fI\,n\/\fR]] [more \fI OPTIONS \/\fR ...]
.SH DESCRIPTION
UxPlay 1.63: An open\-source AirPlay mirroring (+ audio streaming) server.
UxPlay 1.64: An open\-source AirPlay mirroring (+ audio streaming) server:
.SH OPTIONS
.TP
.B
\fB\-n\fR name Specify the network name of the AirPlay server
.TP
\fB\-nh\fR Do \fBNOT\fR append "@\fIhostname\fR" at end of the AirPlay server name
\fB\-nh\fR Do \fBNOT\fR append "@\fIhostname\fR" at end of AirPlay server name
.TP
\fB\-vsync\fR Mirror mode: sync audio to video (default: stream w/o sync)
\fB\-vsync\fI[x]\fR Mirror mode: sync audio to video using timestamps (default)
.IP
\fIx\fR is optional audio delay: millisecs, decimal, can be neg.
.TP
\fB\-vsync\fI[x]\fR \fIx\fR is optional audio delay in millisecs, can be neg., decimal.
\fB\-vsync\fR no Switch off audio/(server)video timestamp synchronization.
.TP
\fB\-async\fR[\fIx\fR] Audio-Only mode: sync audio to client video (default: no sync).
\fB\-async\fR[\fIx\fR] Audio-Only mode: sync audio to client video (default: no).
.TP
\fB\-async\fR no Switch off audio/(client)video timestamp synchronization.
.TP
\fB\-s\fR wxh[@r]Set display resolution [refresh_rate] default 1920x1080[@60]
.TP
@@ -121,3 +125,25 @@ UxPlay 1.63: An open\-source AirPlay mirroring (+ audio streaming) server.
\fB\-v\fR Displays version information
.TP
\fB\-h\fR Displays help information
.SH
FILES
.TP
Options in one of $UXPLAYRC, or ~/.uxplayrc, or ~/.config/uxplayrc
.TP
are applied first (command-line options may modify them). uxplayrc format:
.TP
one option per line,\fI no\fR initial "-"; lines beginning with "#" ignored.
.SH
AUTHORS
.TP
Various, see website or distribution.
.SH
COPYRIGHT
.TP
Various, see website or distribution. License: GPL v3+: GNU GPL version 3 or later.
.TP
(some parts LGPL v.2.1 and MIT).
.SH
SEE ALSO
.TP
Website: <https://github.com/FDH2/UxPlay>


@@ -28,6 +28,9 @@
#include <string>
#include <vector>
#include <fstream>
#include <sstream>
#include <iterator>
#include <sys/stat.h>
#ifdef _WIN32 /*modifications for Windows compilation */
#include <glib.h>
@@ -39,6 +42,8 @@
#include <sys/utsname.h>
#include <sys/socket.h>
#include <ifaddrs.h>
#include <sys/types.h>
#include <pwd.h>
# ifdef __linux__
# include <netpacket/packet.h>
# else
@@ -54,7 +59,7 @@
#include "renderers/video_renderer.h"
#include "renderers/audio_renderer.h"
#define VERSION "1.63"
#define VERSION "1.64"
#define SECOND_IN_USECS 1000000
#define SECOND_IN_NSECS 1000000000UL
@@ -70,7 +75,7 @@ static dnssd_t *dnssd = NULL;
static raop_t *raop = NULL;
static logger_t *render_logger = NULL;
static bool audio_sync = false;
static bool video_sync = false;
static bool video_sync = true;
static int64_t audio_delay_alac = 0;
static int64_t audio_delay_aac = 0;
static bool relaunch_video = false;
@@ -269,6 +274,37 @@ static int parse_hw_addr (std::string str, std::vector<char> &hw_addr) {
return 0;
}
static std::string find_uxplay_config_file() {
std::string no_config_file = "";
const char *homedir = NULL;
const char *uxplayrc = NULL;
std::string config0, config1, config2;
struct stat sb;
uxplayrc = getenv("UXPLAYRC"); /* first look for $UXPLAYRC */
if (uxplayrc) {
config0 = uxplayrc;
if (stat(config0.c_str(), &sb) == 0) return config0;
}
homedir = getenv("XDG_CONFIG_HOMEDIR");
if (homedir == NULL) {
homedir = getenv("HOME");
}
#ifndef _WIN32
if (homedir == NULL){
homedir = getpwuid(getuid())->pw_dir;
}
#endif
if (homedir) {
config1 = homedir;
config1.append("/.uxplayrc");
if (stat(config1.c_str(), &sb) == 0) return config1; /* look for ~/.uxplayrc */
config2 = homedir;
config2.append("/.config/uxplayrc"); /* look for ~/.config/uxplayrc */
if (stat(config2.c_str(), &sb) == 0) return config2;
}
return no_config_file;
}
static std::string find_mac () {
/* finds the MAC address of a network interface *
* in a Windows, Linux, *BSD or macOS system. */
@@ -296,7 +332,7 @@ static std::string find_mac () {
}
mac.erase();
for (int i = 0; i < 6; i++) {
sprintf(str,"%02x", int(address->PhysicalAddress[i]));
snprintf(str, sizeof(str), "%02x", int(address->PhysicalAddress[i]));
mac = mac + str;
if (i < 5) mac = mac + ":";
}
@@ -329,7 +365,7 @@ static std::string find_mac () {
if (non_null_octets) {
mac.erase();
for (int i = 0; i < 6 ; i++) {
sprintf(str,"%02x", octet[i]);
snprintf(str, sizeof(str), "%02x", octet[i]);
mac = mac + str;
if (i < 5) mac = mac + ":";
}
@@ -362,14 +398,17 @@ static std::string random_mac () {
}
static void print_info (char *name) {
printf("UxPlay %s: An open-source AirPlay mirroring server based on RPiPlay\n", VERSION);
printf("Usage: %s [-n name] [-s wxh] [-p [n]]\n", name);
printf("UxPlay %s: An open-source AirPlay mirroring server.\n", VERSION);
printf("=========== Website: https://github.com/FDH2/UxPlay ==========\n");
printf("Usage: %s [-n name] [-s wxh] [-p [n]] [(other options)]\n", name);
printf("Options:\n");
printf("-n name Specify the network name of the AirPlay server\n");
printf("-nh Do not add \"@hostname\" at the end of the AirPlay server name\n");
printf("-vsync [x]Mirror mode: sync audio to video (default: stream w/o sync)\n");
printf(" x is optional audio delay in millisecs, can be neg., decimal\n");
printf("-async [x]Audio-Only mode: sync audio to client video (default: no sync)\n");
printf("-nh Do not add \"@hostname\" at the end of AirPlay server name\n");
printf("-vsync [x]Mirror mode: sync audio to video using timestamps (default)\n");
printf(" x is optional audio delay: millisecs, decimal, can be neg.\n");
printf("-vsync no Switch off audio/(server)video timestamp synchronization \n");
printf("-async [x]Audio-Only mode: sync audio to client video (default: no)\n");
printf("-async no Switch off audio/(client)video timestamp synchronization\n");
printf("-s wxh[@r]Set display resolution [refresh_rate] default 1920x1080[@60]\n");
printf("-o Set display \"overscanned\" mode on (not usually needed)\n");
printf("-fs Full-screen (only works with X11, Wayland and VAAPI)\n");
@@ -390,7 +429,7 @@ static void print_info (char *name) {
printf(" gtksink,waylandsink,osximagesink,kmssink,d3d11videosink etc.\n");
printf("-vs 0 Streamed audio only, with no video display window\n");
printf("-v4l2 Use Video4Linux2 for GPU hardware h264 decoding\n");
printf("-bt709 A workaround (bt709 color) that may be needed with -rpi\n");
printf("-bt709 A workaround (bt709 color) sometimes needed on RPi\n");
printf("-rpi Same as \"-v4l2\" (for RPi=Raspberry Pi).\n");
printf("-rpigl Same as \"-rpi -vs glimagesink\" for RPi.\n");
printf("-rpifb Same as \"-rpi -vs kmssink\" for RPi using framebuffer.\n");
@@ -420,6 +459,9 @@ static void print_info (char *name) {
printf("-d Enable debug logging\n");
printf("-v Displays version information\n");
printf("-h Displays this help\n");
printf("Startup options in $UXPLAYRC, ~/.uxplayrc, or ~/.config/uxplayrc are\n");
printf("applied first (command-line options may modify them): format is one \n");
printf("option per line, no initial \"-\"; lines starting with \"#\" are ignored.\n");
}
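Following the rc-file format announced in the help text above (one option per line, no leading "-", "#" starts a comment), a hypothetical ~/.uxplayrc might look like:

```
# sample uxplayrc (illustrative values)
n MyAirPlayServer
vsync 20.5
s 1280x720@60
```

Command-line options given at launch still override or extend whatever the rc file sets, since the rc file is parsed first.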
bool option_has_value(const int i, const int argc, std::string option, const char *next_arg) {
@@ -566,6 +608,11 @@ static void parse_arguments (int argc, char *argv[]) {
} else if (arg == "-async") {
audio_sync = true;
if (i < argc - 1) {
if (strlen(argv[i+1]) == 2 && strncmp(argv[i+1], "no", 2) == 0) {
audio_sync = false;
i++;
continue;
}
char *end;
int n = (int) (strtof(argv[i + 1], &end) * 1000);
if (*end == '\0') {
@@ -581,6 +628,11 @@ static void parse_arguments (int argc, char *argv[]) {
} else if (arg == "-vsync") {
video_sync = true;
if (i < argc - 1) {
if (strlen(argv[i+1]) == 2 && strncmp(argv[i+1], "no", 2) == 0) {
video_sync = false;
i++;
continue;
}
char *end;
int n = (int) (strtof(argv[i + 1], &end) * 1000);
if (*end == '\0') {
@@ -1252,9 +1304,47 @@ static void stop_raop_server () {
return;
}
/* read startup options from a uxplayrc configuration file: each line becomes
 * "-<first token>" followed by its remaining tokens, and the resulting
 * synthetic argv is handed to parse_arguments() */
static void read_config_file(const char * filename, const char * uxplay_name) {
std::string config_file = filename;
std::string option_char = "-";
std::vector<std::string> options;
options.push_back(uxplay_name);
std::ifstream file(config_file);
if (file.is_open()) {
fprintf(stdout,"UxPlay: reading configuration from %s\n", config_file.c_str());
std::string line;
while (std::getline(file, line)) {
if (line.empty() || line[0] == '#') continue;
std::stringstream ss(line);
std::istream_iterator<std::string> begin(ss);
std::istream_iterator<std::string> end;
std::vector<std::string> tokens(begin,end);
if (tokens.size() > 0) {
options.push_back(option_char + tokens[0]);
for (size_t i = 1; i < tokens.size(); i++) {
options.push_back(tokens[i]);
}
}
}
file.close();
} else {
fprintf(stderr,"UxPlay: failed to open configuration file at %s\n", config_file.c_str());
}
if (options.size() > 1) {
int argc = options.size();
char **argv = (char **) malloc(sizeof(char*) * argc);
for (int i = 0; i < argc; i++) {
/* each entry points into a string owned by options, which outlives
 * the parse_arguments() call below */
argv[i] = (char *) options[i].c_str();
}
parse_arguments (argc, argv);
free (argv);
}
}
int main (int argc, char *argv[]) {
std::vector<char> server_hw_addr;
std::string mac_address;
std::string config_file = "";
#ifdef SUPPRESS_AVAHI_COMPAT_WARNING
// suppress avahi_compat nag message. avahi emits a "nag" warning (once)
@@ -1263,6 +1353,10 @@ int main (int argc, char *argv[]) {
if (!getenv("AVAHI_COMPAT_NOWARN")) putenv(avahi_compat_nowarn);
#endif
config_file = find_uxplay_config_file();
if (config_file.length()) {
read_config_file(config_file.c_str(), argv[0]);
}
parse_arguments (argc, argv);
#ifdef _WIN32 /* use utf-8 terminal output; don't buffer stdout in WIN32 when debug_log = false */