diff --git a/README.html b/README.html index 5031122..8f2e2c5 100644 --- a/README.html +++ b/README.html @@ -1,8 +1,8 @@ -

UxPlay 1.40

+

UxPlay 1.41

This project is a GPLv3 unix AirPlay server which now also works on macOS. Its main use is to act like an AppleTV for screen-mirroring (with audio) of iOS/macOS clients (iPads, iPhones, MacBooks) in a window on the server display (with the possibility of sharing that window on screen-sharing applications such as Zoom) on a host running Linux, macOS, or other unix, using Apple’s AirPlay Mirror protocol first available in iOS 5. (Details of what is known about the AirPlay2 protocol can be found here and here).

The UxPlay server and its client must be on the same local area network, on which a Bonjour/Zeroconf mDNS/DNS-SD server is also running (only the DNS-SD “Service Discovery” service is necessary; the local network does not also need to be of the “.local” mDNS-based type). On Linux and BSD Unix servers this is usually provided by Avahi, through the avahi-daemon service, which is included in most Linux distributions (the service can also be provided by macOS, iOS or Windows servers).
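One quick way to confirm that the required DNS-SD service is present on a Linux server is to query Avahi from the command line (a sketch; the avahi-browse tool ships in the avahi-utils package on Debian/Ubuntu-style systems):

```shell
# Check for a DNS-SD responder; on most Linux systems this is Avahi.
if command -v avahi-browse >/dev/null 2>&1; then
    # List all services currently advertised on the local network;
    # a running UxPlay server shows up under _airplay._tcp (and _raop._tcp).
    avahi-browse --all --terminate
else
    echo "avahi-browse not found: install avahi-utils (or your distro's equivalent)"
fi
```

On macOS the same check can be done with the built-in dns-sd tool, e.g. “dns-sd -B _airplay._tcp”.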

-

New: UxPlay 1.40 now also supports the Airplay audio-only protocol as well as AirPlay Mirror protocol, and (when the client screen is not being mirrored) can play Apple Lossless (ALAC) 44100/16/2 audio streamed from the client in 2-channel stereo without video (the accompanying cover-art and metadata is received by the server, but not displayed). The initial connection to the client can be in AirPlay audio mode, or an initial Airplay Mirror connection can be changed to Airplay audio by closing the Mirror window and reconnecting in audio-only mode (this changes back to AAC audio if screen mirroring is restarted).

-

UxPlay 1.40 is based on https://github.com/FD-/RPiPlay, with GStreamer integration from https://github.com/antimof/UxPlay. (UxPlay only uses GStreamer, and does not contain the alternative Raspberry-Pi-specific audio and video renderers also found in RPiPlay.) Tested on Ubuntu 20.04, Linux Mint 20.2, OpenSUSE 15.3, macOS 10.15.

+

New: UxPlay > 1.38 now also supports the Airplay audio-only protocol as well as AirPlay Mirror protocol, and (when the client screen is not being mirrored) can play Apple Lossless (ALAC) 44100/16/2 audio streamed from the client in 2-channel stereo without video (the accompanying cover-art and metadata is received by the server, but not displayed). The initial connection to the client can be in AirPlay audio mode, or an initial Airplay Mirror connection can be changed to Airplay audio by closing the Mirror window and reconnecting in audio-only mode (this changes back to AAC audio if screen mirroring is restarted).

+

UxPlay 1.41 is based on https://github.com/FD-/RPiPlay, with GStreamer integration from https://github.com/antimof/UxPlay. (UxPlay only uses GStreamer, and does not contain the alternative Raspberry-Pi-specific audio and video renderers also found in RPiPlay.) Tested on Ubuntu 20.04, Linux Mint 20.2, OpenSUSE 15.3, macOS 10.15.

Features: 1. Based on GStreamer. 2. Video and audio are supported out of the box. 3. GStreamer decoding is plugin-agnostic: accelerated decoders are used if available. VAAPI is preferable (but don’t use VAAPI with nVidia). 4. Automatic screen orientation.
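To see which decoder plugins GStreamer can actually use on a given host, the plugin registry can be queried with GStreamer's standard gst-inspect-1.0 tool (a sketch; package names for the command-line tools vary by distribution):

```shell
if command -v gst-inspect-1.0 >/dev/null 2>&1; then
    # any listed VAAPI elements indicate hardware-accelerated decoding is available
    gst-inspect-1.0 | grep -i vaapi || echo "no VAAPI plugins installed"
    # avdec_h264 is the libav software H.264 decoder used as a fallback
    gst-inspect-1.0 avdec_h264 | head -n 5
else
    echo "gst-inspect-1.0 not found: install the GStreamer command-line tools"
fi
```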

Getting UxPlay:

Either download and unzip UxPlay-master.zip, or (if git is installed): “git clone https://github.com/FDH2/UxPlay”.
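Once the source is obtained, a typical out-of-tree CMake build would look like the following (a sketch, assuming the project's CMakeLists.txt and that the GStreamer and Avahi development packages are already installed):

```shell
# Fetch the source (or download and unzip UxPlay-master.zip instead)
git clone https://github.com/FDH2/UxPlay
cd UxPlay

# Standard out-of-tree CMake build
mkdir build
cd build
cmake ..
make

# Optional: install system-wide (uxplay can also run from the build directory)
sudo make install
```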

@@ -69,6 +69,7 @@

-as 0 or -a suppresses playing of streamed audio, but displays streamed video.

-t timeout will cause the server to relaunch (without stopping uxplay) if no connections have been present during the previous timeout seconds. (You may wish to use this because the Server may not be visible to new Clients that were inactive when the Server was launched, and an idle Bonjour registration also eventually becomes unavailable for new connections.) The timer only starts once a Client has first made a mirror connection and then disconnected with “Stop Mirroring”. This option should not be used if the display window is an OpenGL window on macOS, as such an OpenGL window created by GStreamer does not terminate correctly (it causes a segfault) if it is still open when the GStreamer pipeline is closed.
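Putting the options described above together, a hypothetical invocation that renames the server, enables a 60-second relaunch timer, and suppresses audio playback might look like:

```shell
# -n sets the advertised name (shown as MyShare@hostname),
# -t 60 relaunches the server after 60 seconds with no connections,
# -as 0 displays streamed video but suppresses streamed audio
uxplay -n MyShare -t 60 -as 0
```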

ChangeLog

+

1.41 2021-11-11 Further cleanups of multiple audio format support (internal changes, separated RAOP and GStreamer audio/video startup)

1.40 2021-11-09 Cleanup segfault in ALAC support, manpage location fix, show request Plists in debug mode.

1.39 2021-11-06 Added support for Apple Lossless (ALAC) audio streams.

1.38 2021-10-08 Add -as audiosink option to allow the user to choose the GStreamer audiosink.

@@ -91,6 +92,7 @@
  • The avahi_compat “nag” warning on startup is suppressed by placing “AVAHI_COMPAT_NOWARN=1” into the runtime environment when uxplay starts. (This uses a call to putenv() in a form that is believed to be safe against memory leaks, at least in modern Linux; if for any reason you don’t want this fix, comment out the line in CMakeLists.txt that activates it when uxplay is compiled.) On macOS, Avahi is not used.

  • UxPlay now builds on macOS.

  • The hostname of the server running uxplay is now appended to the AirPlay server name, which is now displayed as name@hostname, where name is “UxPlay”, (or whatever is set with the -n option).

  • +
  • Added support for audio-only streaming with the original (non-Mirror) AirPlay protocol, with Apple Lossless (ALAC) audio.

  • Disclaimer

    All the resources in this repository are written using only freely available information from the internet. The code and related resources are meant for educational purposes only. It is the responsibility of the user to make sure all local laws are adhered to.

    diff --git a/README.md b/README.md index fb305e2..914b87c 100644 --- a/README.md +++ b/README.md @@ -1,5 +1,5 @@ -# UxPlay 1.40 +# UxPlay 1.41 This project is a GPLv3 unix AirPlay server which now also works on macOS. Its main use is to act like an AppleTV for screen-mirroring (with audio) of iOS/macOS clients @@ -16,12 +16,13 @@ On Linux and BSD Unix servers, this is usually provided by [Avahi](https://www.a through the avahi-daemon service, and is included in most Linux distributions (this service can also be provided by macOS, iOS or Windows servers). -_New: UxPlay 1.40 now also supports the Airplay audio-only protocol as well as AirPlay Mirror protocol, and (when the client screen is not being mirrored) +_New: UxPlay > 1.38 now also supports the Airplay audio-only protocol as well as AirPlay Mirror protocol, and (when the client screen is not being mirrored) can play Apple Lossless (ALAC) 44100/16/2 audio streamed from the client in 2-channel stereo without video (the accompanying cover-art and metadata is received by the server, -but not displayed). The initial connection to the client can be in AirPlay audio mode, or an initial Airplay Mirror connection can be changed to Airplay audio by closing the Mirror window and reconnecting in audio-only mode (this changes back to AAC audio if screen mirroring is restarted)._ +but not displayed). The initial connection to the client can be in AirPlay audio mode, or an initial Airplay Mirror connection can be changed to +Airplay audio by closing the Mirror window and reconnecting in audio-only mode (this changes back to AAC audio if screen mirroring is restarted)._ -UxPlay 1.40 is based on https://github.com/FD-/RPiPlay, with GStreamer integration from +UxPlay 1.41 is based on https://github.com/FD-/RPiPlay, with GStreamer integration from https://github.com/antimof/UxPlay. (UxPlay only uses GStreamer, and does not contain the alternative Raspberry-Pi-specific audio and video renderers also found in RPiPlay.) 
@@ -320,6 +321,9 @@ Also: image transforms that had been added to RPiPlay have been ported to UxPlay # ChangeLog +1.41 2021-11-11 Further cleanups of multiple audio format support (internal changes, + separated RAOP and GStreamer audio/video startup) + 1.40 2021-11-09 Cleanup segfault in ALAC support, manpage location fix, show request Plists in debug mode. 1.39 2021-11-06 Added support for Apple Lossless (ALAC) audio streams. @@ -414,6 +418,8 @@ is compiled.) On macOS, Avahi is not used. 11. The hostname of the server running uxplay is now appended to the AirPlay server name, which is now displayed as _name_@hostname, where _name_ is "UxPlay", (or whatever is set with the **-n** option). +12. Added support for audio-only streaming with original (non-Mirror) AirPlay protocal, with Apple Lossless (ALAC) audio. + # Disclaimer All the resources in this repository are written using only freely available information from the internet. The code and related resources are meant for educational purposes only. It is the responsibility of the user to make sure all local laws are adhered to. diff --git a/README.txt b/README.txt index 91fe9ef..6bd57e4 100644 --- a/README.txt +++ b/README.txt @@ -1,4 +1,4 @@ -UxPlay 1.40 +UxPlay 1.41 =========== This project is a GPLv3 unix AirPlay server which now also works on @@ -21,8 +21,8 @@ Linux and BSD Unix servers, this is usually provided by included in most Linux distributions (this service can also be provided by macOS, iOS or Windows servers). -*New: UxPlay 1.40 now also supports the Airplay audio-only protocol as -well as AirPlay Mirror protocol, and (when the client screen is not +*New: UxPlay \> 1.38 now also supports the Airplay audio-only protocol +as well as AirPlay Mirror protocol, and (when the client screen is not being mirrored) can play Apple Lossless (ALAC) 44100/16/2 audio streamed from the client in 2-channel stereo without video (the accompanying cover-art and metadata is received by the server, but not displayed). 
@@ -31,7 +31,7 @@ initial Airplay Mirror connection can be changed to Airplay audio by closing the Mirror window and reconnecting in audio-only mode (this changes back to AAC audio if screen mirroring is restarted).* -UxPlay 1.40 is based on https://github.com/FD-/RPiPlay, with GStreamer +UxPlay 1.41 is based on https://github.com/FD-/RPiPlay, with GStreamer integration from https://github.com/antimof/UxPlay. (UxPlay only uses GStreamer, and does not contain the alternative Raspberry-Pi-specific audio and video renderers also found in RPiPlay.) Tested on Ubuntu @@ -391,6 +391,9 @@ still open when the GStreamer pipeline is closed.* ChangeLog ========= +1.41 2021-11-11 Further cleanups of multiple audio format support +(internal changes, separated RAOP and GStreamer audio/video startup) + 1.40 2021-11-09 Cleanup segfault in ALAC support, manpage location fix, show request Plists in debug mode. @@ -501,6 +504,9 @@ Improvements since the original UxPlay by antimof: where *name* is "UxPlay", (or whatever is set with the **-n** option). +12. Added support for audio-only streaming with original (non-Mirror) + AirPlay protocal, with Apple Lossless (ALAC) audio. 
+ Disclaimer ========== diff --git a/lib/raop.h b/lib/raop.h index 67e418b..8b8a957 100755 --- a/lib/raop.h +++ b/lib/raop.h @@ -36,7 +36,7 @@ struct raop_callbacks_s { void (*audio_process)(void *cls, raop_ntp_t *ntp, aac_decode_struct *data); void (*video_process)(void *cls, raop_ntp_t *ntp, h264_decode_struct *data); - void (*audio_setup)(void *cls, unsigned char *compression_type); + /* Optional but recommended callback functions */ void (*conn_init)(void *cls); @@ -48,6 +48,7 @@ struct raop_callbacks_s { void (*audio_set_coverart)(void *cls, const void *buffer, int buflen); void (*audio_remote_control_id)(void *cls, const char *dacp_id, const char *active_remote_header); void (*audio_set_progress)(void *cls, unsigned int start, unsigned int curr, unsigned int end); + void (*audio_get_format)(void *cls, unsigned char *ct, unsigned short *spf, bool *usingScreen, bool *isMedia, uint64_t *audioFormat); }; typedef struct raop_callbacks_s raop_callbacks_t; diff --git a/lib/raop_handlers.h b/lib/raop_handlers.h index 5766c96..aee8d90 100755 --- a/lib/raop_handlers.h +++ b/lib/raop_handlers.h @@ -437,11 +437,47 @@ raop_handler_setup(raop_conn_t *conn, } case 96: { // Audio unsigned short cport = conn->raop->control_lport, dport = conn->raop->data_lport; - uint64_t ct; - /* get audio compression type */ - plist_t req_stream_ct_node = plist_dict_get_item(req_stream_node, "ct"); - plist_get_uint_val(req_stream_ct_node, &ct); - conn->raop->callbacks.audio_setup(conn->raop->callbacks.cls, (unsigned char*) &ct); + + uint64_t audioFormat; + unsigned char ct; + unsigned short spf; + bool isMedia; + bool usingScreen; + + if (conn->raop->callbacks.audio_get_format) { + /* get audio compression type */ + uint64_t uint_val; + uint8_t bool_val; + + plist_t req_stream_ct_node = plist_dict_get_item(req_stream_node, "ct"); + plist_get_uint_val(req_stream_ct_node, &uint_val); + ct = (unsigned char) uint_val; + + plist_t req_stream_spf_node = plist_dict_get_item(req_stream_node, 
"spf"); + plist_get_uint_val(req_stream_spf_node, &uint_val); + spf = (unsigned short) uint_val; + + plist_t req_stream_audio_format_node = plist_dict_get_item(req_stream_node, "audioFormat"); + plist_get_uint_val(req_stream_audio_format_node, &audioFormat); + + plist_t req_stream_ismedia_node = plist_dict_get_item(req_stream_node, "isMedia"); + if (req_stream_ismedia_node) { + plist_get_bool_val(req_stream_ismedia_node, &bool_val); + isMedia = (bool) bool_val; + } else { + isMedia = false; + } + + plist_t req_stream_usingscreen_node = plist_dict_get_item(req_stream_node, "usingScreen"); + if (req_stream_usingscreen_node) { + plist_get_bool_val(req_stream_usingscreen_node, &bool_val); + usingScreen = (bool) bool_val; + } else { + usingScreen = false; + } + + conn->raop->callbacks.audio_get_format(conn->raop->callbacks.cls, &ct, &spf, &usingScreen, &isMedia, &audioFormat); + } if (conn->raop_rtp) { raop_rtp_start_audio(conn->raop_rtp, use_udp, remote_cport, &cport, &dport); diff --git a/renderers/audio_renderer.h b/renderers/audio_renderer.h index 14c3956..558c3f3 100644 --- a/renderers/audio_renderer.h +++ b/renderers/audio_renderer.h @@ -25,19 +25,17 @@ extern "C" { #endif #include -#include -#include "../lib/logger.h" #include "../lib/raop_ntp.h" -#include "video_renderer.h" -typedef struct audio_renderer_s audio_renderer_t; -audio_renderer_t *audio_renderer_init(logger_t *logger, unsigned char *compression_type, const char* audiosink); -void audio_renderer_start(audio_renderer_t *renderer); -void audio_renderer_render_buffer(audio_renderer_t *renderer, raop_ntp_t *ntp, unsigned char* data, int data_len, uint64_t pts); -void audio_renderer_set_volume(audio_renderer_t *renderer, float volume); -void audio_renderer_flush(audio_renderer_t *renderer); -void audio_renderer_destroy(audio_renderer_t *renderer); + +void audio_renderer_init(const char* audiosink); +void audio_renderer_start(unsigned char* compression_type); +void audio_renderer_stop(); +void 
audio_renderer_render_buffer(raop_ntp_t *ntp, unsigned char* data, int data_len, uint64_t pts); +void audio_renderer_set_volume(float volume); +void audio_renderer_flush(); +void audio_renderer_destroy(); #ifdef __cplusplus } diff --git a/renderers/audio_renderer_gstreamer.c b/renderers/audio_renderer_gstreamer.c index f5f9ab6..56f42ab 100644 --- a/renderers/audio_renderer_gstreamer.c +++ b/renderers/audio_renderer_gstreamer.c @@ -17,12 +17,14 @@ * Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA */ -#include "audio_renderer.h" #include #include +#include #include +#include "audio_renderer.h" -/* GStreamer Caps strings for Airplay-defined connection types (ct) */ + +/* GStreamer Caps strings for Airplay-defined audio compression types (ct) */ /* ct = 1; linear PCM (uncompressed): 44100/16/2, S16LE */ static const char lpcm[]="audio/x-raw,rate=(int)44100,channels=(int)2,format=S16LE,layout=interleaved"; @@ -37,141 +39,175 @@ static const char aac_lc[] ="audio/mpeg,mpegversion=(int)4,channnels=(int)2,rate /* ct = 8; codec_data from MPEG v4 ISO 14996-3 Section 1.6.2.1: AAC_ELD 44100/2 spf = 480 */ static const char aac_eld[] ="audio/mpeg,mpegversion=(int)4,channnels=(int)2,rate=(int)44100,stream-format=raw,codec_data=(buffer)f8e85000"; -struct audio_renderer_s { - logger_t *logger; +typedef struct audio_renderer_s { GstElement *appsrc; GstElement *pipeline; GstElement *volume; -}; + unsigned char ct; +} audio_renderer_t ; + static gboolean check_plugins (void) { - int i; - gboolean ret; - GstRegistry *registry; - const gchar *needed[] = {"app", "libav", "playback", "autodetect", NULL}; + int i; + gboolean ret; + GstRegistry *registry; + const gchar *needed[] = { "app", "libav", "playback", "autodetect", NULL}; - registry = gst_registry_get (); - ret = TRUE; - for (i = 0; i < g_strv_length ((gchar **) needed); i++) { - GstPlugin *plugin; - plugin = gst_registry_find_plugin (registry, needed[i]); - if (!plugin) { - g_print ("Required gstreamer plugin 
'%s' not found\n", needed[i]); - ret = FALSE; - continue; + registry = gst_registry_get (); + ret = TRUE; + for (i = 0; i < g_strv_length ((gchar **) needed); i++) { + GstPlugin *plugin; + plugin = gst_registry_find_plugin (registry, needed[i]); + if (!plugin) { + g_print ("Required gstreamer plugin '%s' not found\n", needed[i]); + ret = FALSE; + continue; + } + gst_object_unref (plugin); + plugin = NULL; } - gst_object_unref (plugin); - } - return ret; + return ret; } -audio_renderer_t *audio_renderer_init(logger_t *logger, unsigned char *compression_type, const char* audiosink) { - audio_renderer_t *renderer; +#define NFORMATS 4 +static audio_renderer_t *renderer_type[NFORMATS]; +static audio_renderer_t *renderer = NULL; +const char * format[NFORMATS]; + +void audio_renderer_init(const char* audiosink) { GError *error = NULL; GstCaps *caps = NULL; - switch (*compression_type) { - case 1: /* uncompressed PCM */ - case 2: /* Apple lossless ALAC */ - case 4: /* AAC-LC */ - case 8: /* AAC-ELD */ - logger_log(logger, LOGGER_INFO , "audio_renderer_init: compression_type ct = %d", *compression_type); - break; - default: - logger_log(logger, LOGGER_ERR, "audio_renderer_init: unsupported compression_type ct = %d", *compression_type); - return NULL; - } + gst_init(NULL,NULL); + assert(check_plugins ()); + for (int i = 0; i < NFORMATS ; i++) { + renderer_type[i] = (audio_renderer_t *) calloc(1,sizeof(audio_renderer_t)); + GString *launch = g_string_new("appsrc name=audio_source stream-type=0 format=GST_FORMAT_TIME is-live=true ! queue ! "); + switch (i) { + case 0: /* AAC-ELD */ + case 2: /* AAC-LC */ + g_string_append(launch, "avdec_aac ! "); + break; + case 1: /* ALAC */ + g_string_append(launch, "avdec_alac ! "); + break; + case 3: /*PCM*/ + break; + } + g_string_append (launch, "audioconvert ! volume name=volume ! level ! 
"); + g_string_append (launch, audiosink); + g_string_append (launch, " sync=false"); + renderer_type[i]->pipeline = gst_parse_launch(launch->str, &error); + if (error) { + g_error ("get_parse_launch error:\n %s\n",error->message); + g_clear_error (&error); + } + g_assert (renderer_type[i]->pipeline); + g_string_free(launch, TRUE); + renderer_type[i]->appsrc = gst_bin_get_by_name (GST_BIN (renderer_type[i]->pipeline), "audio_source"); + renderer_type[i]->volume = gst_bin_get_by_name (GST_BIN (renderer_type[i]->pipeline), "volume"); + switch (i) { + case 0: + caps = gst_caps_from_string(aac_eld); + renderer_type[i]->ct = 8; + format[i] = "AAC-ELD 44100/2"; + break; + case 1: + caps = gst_caps_from_string(alac); + renderer_type[i]->ct = 2; + format[i] = "ALAC 44100/16/2"; + break; + case 2: + caps = gst_caps_from_string(aac_lc); + renderer_type[i]->ct = 4; + format[i] = "AAC-LC 44100/2"; + break; + case 3: + caps = gst_caps_from_string(lpcm); + renderer_type[i]->ct = 1; + format[i] = "PCM 44100/16/2 S16LE"; + break; + } + g_message ("supported audio format %d: %s",i+1,format[i]); + g_object_set(renderer_type[i]->appsrc, "caps", caps, NULL); + gst_caps_unref(caps); + } +} + +void audio_renderer_stop() { + if (renderer) { + gst_app_src_end_of_stream(GST_APP_SRC(renderer->appsrc)); + gst_element_set_state (renderer->pipeline, GST_STATE_NULL); + renderer = NULL; + } +} + +void audio_renderer_start(unsigned char *ct) { + unsigned char compression_type = 0, id; + for (int i = 0; i < NFORMATS; i++) { + if(renderer_type[i]->ct == *ct) { + compression_type = *ct; + id = i; + break; + } + } + if (compression_type && renderer) { + if(compression_type != renderer->ct) { + gst_app_src_end_of_stream(GST_APP_SRC(renderer->appsrc)); + gst_element_set_state (renderer->pipeline, GST_STATE_NULL); + g_message ("changed audio connection, format %s\n", format[id]); + renderer = renderer_type[id]; + gst_element_set_state (renderer->pipeline, GST_STATE_PLAYING); + } + } else if 
(compression_type) { + g_message ("start audio connection, format %s", format[id]); + renderer = renderer_type[id]; + gst_element_set_state (renderer->pipeline, GST_STATE_PLAYING); + } else { + g_error( "unknown audio compression type ct = %d\n", *ct); + } - renderer = calloc(1, sizeof(audio_renderer_t)); - if (!renderer) { - return NULL; - } - renderer->logger = logger; - - assert(check_plugins ()); - - GString *launch = g_string_new("appsrc name=audio_source stream-type=0 format=GST_FORMAT_TIME is-live=true ! queue ! "); - if (*compression_type == 8 || *compression_type == 4) { - g_string_append(launch, "avdec_aac ! "); - } else if (*compression_type == 2) { - g_string_append(launch, "avdec_alac ! "); - } - g_string_append(launch, "audioconvert ! volume name=volume ! level ! "); - g_string_append(launch, audiosink); - g_string_append(launch, " sync=false"); - renderer->pipeline = gst_parse_launch(launch->str, &error); - g_assert (renderer->pipeline); - g_string_free(launch, TRUE); - - renderer->appsrc = gst_bin_get_by_name (GST_BIN (renderer->pipeline), "audio_source"); - renderer->volume = gst_bin_get_by_name (GST_BIN (renderer->pipeline), "volume"); - - if (*compression_type == 8) { - logger_log(logger, LOGGER_INFO, "AAC-ELD 44100/2"); - caps = gst_caps_from_string(aac_eld); - } else if (*compression_type == 2) { - logger_log(logger, LOGGER_INFO, "ALAC 44100/16/2"); - caps = gst_caps_from_string(alac);; - } else if (*compression_type == 4) { - logger_log(logger, LOGGER_INFO, "AAC-LC 44100/2"); - caps = gst_caps_from_string(aac_lc); - logger_log(logger, LOGGER_INFO, "uncompressed PCM 44100/16/2"); - } else if (*compression_type == 1) { - caps = gst_caps_from_string(lpcm); - } - - g_object_set(renderer->appsrc, "caps", caps, NULL); - gst_caps_unref(caps); - - return renderer; } -void audio_renderer_start(audio_renderer_t *renderer) { - //g_signal_connect( renderer->pipeline, "deep-notify", G_CALLBACK(gst_object_default_deep_notify ), NULL ); - 
gst_element_set_state (renderer->pipeline, GST_STATE_PLAYING); -} - -void audio_renderer_render_buffer(audio_renderer_t *renderer, raop_ntp_t *ntp, unsigned char* data, int data_len, uint64_t pts) { +void audio_renderer_render_buffer(raop_ntp_t *ntp, unsigned char* data, int data_len, uint64_t pts) { GstBuffer *buffer; - - if (data_len == 0) return; + if (data_len == 0 || renderer == NULL) return; /* all audio received seems to be either ct = 8 (AAC_ELD 44100/2 spf 460 ) AirPlay Mirror protocol */ /* or ct = 2 (ALAC 44100/16/2 spf 352) AirPlay protocol */ /* first byte data[0] of ALAC frame is 0x20, first byte of AAC_ELD is 0x8d or 0x8e, AAC_LC is 0xff (ADTS) */ - /* GStreamer caps_filter could be used here to switch the appsrc caps between aac_eld and alac */ - /* depending on the initial byte of the buffer, with a pipeline using decodebin */ buffer = gst_buffer_new_and_alloc(data_len); assert(buffer != NULL); GST_BUFFER_DTS(buffer) = (GstClockTime)pts; gst_buffer_fill(buffer, 0, data, data_len); gst_app_src_push_buffer(GST_APP_SRC(renderer->appsrc), buffer); - } -void audio_renderer_set_volume(audio_renderer_t *renderer, float volume) { +void audio_renderer_set_volume(float volume) { float avol; if (fabs(volume) < 28) { - avol=floorf(((28-fabs(volume))/28)*10)/10; + avol=floorf(((28-fabs(volume))/28)*10)/10; g_object_set(renderer->volume, "volume", avol, NULL); } } -void audio_renderer_flush(audio_renderer_t *renderer) { +void audio_renderer_flush() { } -void audio_renderer_destroy(audio_renderer_t *renderer) { - if(renderer) { - gst_app_src_end_of_stream (GST_APP_SRC(renderer->appsrc)); - gst_element_set_state (renderer->pipeline, GST_STATE_NULL); - gst_object_unref (renderer->appsrc); - gst_object_unref (renderer->pipeline); - gst_object_unref (renderer->volume); - free (renderer); - renderer = NULL; +void audio_renderer_destroy() { + audio_renderer_stop(); + for (int i = 0; i < NFORMATS ; i++ ) { + gst_object_unref (renderer_type[i]->volume); + 
renderer_type[i]->volume = NULL; + gst_object_unref (renderer_type[i]->appsrc); + renderer_type[i]->appsrc = NULL; + gst_object_unref (renderer_type[i]->pipeline); + renderer_type[i]->pipeline = NULL; + free(renderer_type[i]); } } + diff --git a/renderers/video_renderer.h b/renderers/video_renderer.h index f320ed0..4daef6b 100644 --- a/renderers/video_renderer.h +++ b/renderers/video_renderer.h @@ -44,15 +44,16 @@ typedef enum videoflip_e { typedef struct video_renderer_s video_renderer_t; -video_renderer_t *video_renderer_init (logger_t *logger, const char *server_name, videoflip_t videoflip[2], const char *videosink); -void video_renderer_start (video_renderer_t *renderer); -void video_renderer_render_buffer (video_renderer_t *renderer, raop_ntp_t *ntp, unsigned char* data, int data_len, uint64_t pts, int type); -void video_renderer_flush (video_renderer_t *renderer); -unsigned int video_renderer_listen(void *loop, video_renderer_t *renderer); -void video_renderer_destroy (video_renderer_t *renderer); +void video_renderer_init (const char *server_name, videoflip_t videoflip[2], const char *videosink); +void video_renderer_start (); +void video_renderer_stop (); +void video_renderer_render_buffer (raop_ntp_t *ntp, unsigned char* data, int data_len, uint64_t pts, int type); +void video_renderer_flush (); +unsigned int video_renderer_listen(void *loop); +void video_renderer_destroy (); /* not implemented for gstreamer */ -void video_renderer_update_background (video_renderer_t *renderer, int type); +void video_renderer_update_background (int type); #ifdef __cplusplus } diff --git a/renderers/video_renderer_gstreamer.c b/renderers/video_renderer_gstreamer.c index 3b06229..7f2c547 100644 --- a/renderers/video_renderer_gstreamer.c +++ b/renderers/video_renderer_gstreamer.c @@ -27,7 +27,6 @@ #endif struct video_renderer_s { - logger_t *logger; GstElement *appsrc, *pipeline, *sink; GstBus *bus; #ifdef X_DISPLAY_FIX @@ -36,27 +35,6 @@ struct video_renderer_s { #endif }; 
-static gboolean check_plugins (void) -{ - int i; - gboolean ret; - GstRegistry *registry; - const gchar *needed[] = { "app", "libav", "playback", "autodetect", NULL}; - - registry = gst_registry_get (); - ret = TRUE; - for (i = 0; i < g_strv_length ((gchar **) needed); i++) { - GstPlugin *plugin; - plugin = gst_registry_find_plugin (registry, needed[i]); - if (!plugin) { - g_print ("Required gstreamer plugin '%s' not found\n", needed[i]); - ret = FALSE; - continue; - } - gst_object_unref (plugin); - } - return ret; -} static void append_videoflip (GString *launch, const videoflip_t *flip, const videoflip_t *rot) { /* videoflip image transform */ @@ -115,10 +93,12 @@ static void append_videoflip (GString *launch, const videoflip_t *flip, const vi } } -video_renderer_t *video_renderer_init(logger_t *logger, const char *server_name, videoflip_t videoflip[2], const char *videosink) { - video_renderer_t *renderer; - GError *error = NULL; +static video_renderer_t *renderer = NULL; +void video_renderer_init(const char *server_name, videoflip_t videoflip[2], const char *videosink) { + + GError *error = NULL; + /* this call to g_set_application_name makes server_name appear in the X11 display window title bar, */ /* (instead of the program name uxplay taken from (argv[0]). It is only set one time. */ @@ -127,12 +107,8 @@ video_renderer_t *video_renderer_init(logger_t *logger, const char *server_name, renderer = calloc(1, sizeof(video_renderer_t)); assert(renderer); - gst_init(NULL, NULL); - - renderer->logger = logger; - - assert(check_plugins ()); - + gst_init(NULL,NULL); + GString *launch = g_string_new("appsrc name=video_source stream-type=0 format=GST_FORMAT_TIME is-live=true !" "queue ! decodebin ! videoconvert ! 
"); append_videoflip(launch, &videoflip[0], &videoflip[1]); @@ -143,30 +119,33 @@ video_renderer_t *video_renderer_init(logger_t *logger, const char *server_name, g_string_free(launch, TRUE); renderer->appsrc = gst_bin_get_by_name (GST_BIN (renderer->pipeline), "video_source"); + assert(renderer->appsrc); renderer->sink = gst_bin_get_by_name (GST_BIN (renderer->pipeline), "video_sink"); - + assert(renderer->sink); + #ifdef X_DISPLAY_FIX renderer->server_name = server_name; + renderer->gst_window = NULL; bool x_display_fix = false; - if (strcmp(videosink,"autovideosink") == 0) x_display_fix = true; - if (strcmp(videosink,"ximagesink") == 0) x_display_fix = true; - if (strcmp(videosink,"xvimagesink") == 0) x_display_fix = true; + if (strcmp(videosink,"autovideosink") == 0 || + strcmp(videosink,"ximagesink") == 0 || + strcmp(videosink,"xvimagesink") == 0) { + x_display_fix = true; + } if (x_display_fix) { renderer->gst_window = calloc(1, sizeof(X11_Window_t)); assert(renderer->gst_window); get_X11_Display(renderer->gst_window); } #endif - return renderer; } -void video_renderer_start(video_renderer_t *renderer) { - //g_signal_connect( renderer->pipeline, "deep-notify", G_CALLBACK(gst_object_default_deep_notify ), NULL ); +void video_renderer_start() { gst_element_set_state (renderer->pipeline, GST_STATE_PLAYING); renderer->bus = gst_element_get_bus(renderer->pipeline); } -void video_renderer_render_buffer(video_renderer_t *renderer, raop_ntp_t *ntp, unsigned char* data, int data_len, uint64_t pts, int type) { +void video_renderer_render_buffer(raop_ntp_t *ntp, unsigned char* data, int data_len, uint64_t pts, int type) { GstBuffer *buffer; assert(data_len != 0); @@ -187,16 +166,30 @@ void video_renderer_render_buffer(video_renderer_t *renderer, raop_ntp_t *ntp, u void video_renderer_flush(video_renderer_t *renderer) { } -void video_renderer_destroy(video_renderer_t *renderer) { +void video_renderer_stop() { + if (renderer) { + gst_app_src_end_of_stream 
(GST_APP_SRC(renderer->appsrc)); + gst_element_set_state (renderer->pipeline, GST_STATE_NULL); + } +} + +void video_renderer_destroy() { if (renderer) { - gst_app_src_end_of_stream (GST_APP_SRC(renderer->appsrc)); - gst_element_set_state (renderer->pipeline, GST_STATE_NULL); + GstState state; + gst_element_get_state(renderer->pipeline, &state, NULL, 0); + if (state != GST_STATE_NULL) { + gst_app_src_end_of_stream (GST_APP_SRC(renderer->appsrc)); + gst_element_set_state (renderer->pipeline, GST_STATE_NULL); + } + gst_object_unref(renderer->bus); + gst_object_unref(renderer->sink); gst_object_unref (renderer->appsrc); - gst_object_unref (renderer->bus); gst_object_unref (renderer->pipeline); - gst_object_unref (renderer->sink); #ifdef X_DISPLAY_FIX - if(renderer->gst_window) free(renderer->gst_window); + if (renderer->gst_window) { + free(renderer->gst_window); + renderer->gst_window = NULL; + } #endif free (renderer); renderer = NULL; @@ -204,7 +197,7 @@ void video_renderer_destroy(video_renderer_t *renderer) { } /* not implemented for gstreamer */ -void video_renderer_update_background(video_renderer_t *renderer, int type) { +void video_renderer_update_background(int type) { } gboolean gstreamer_pipeline_bus_callback(GstBus *bus, GstMessage *message, gpointer loop) { @@ -212,17 +205,22 @@ gboolean gstreamer_pipeline_bus_callback(GstBus *bus, GstMessage *message, gpoin case GST_MESSAGE_ERROR: { GError *err; gchar *debug; + gboolean flushing; gst_message_parse_error (message, &err, &debug); g_print ("GStreamer error: %s\n", err->message); g_error_free (err); g_free (debug); - g_main_loop_quit( (GMainLoop *) loop); + gst_app_src_end_of_stream (GST_APP_SRC(renderer->appsrc)); + flushing = TRUE; + gst_bus_set_flushing(bus, flushing); + gst_element_set_state (renderer->pipeline, GST_STATE_NULL); + g_main_loop_quit( (GMainLoop *) loop); break; } case GST_MESSAGE_EOS: /* end-of-stream */ g_print("GStreamer: End-Of-Stream\n"); - g_main_loop_quit( (GMainLoop *) loop); + // 
g_main_loop_quit( (GMainLoop *) loop); break; default: /* unhandled message */ @@ -231,8 +229,7 @@ gboolean gstreamer_pipeline_bus_callback(GstBus *bus, GstMessage *message, gpoin return TRUE; } -unsigned int video_renderer_listen(void *loop, video_renderer_t *renderer) { +unsigned int video_renderer_listen(void *loop) { return (unsigned int) gst_bus_add_watch(renderer->bus, (GstBusFunc) gstreamer_pipeline_bus_callback, (gpointer) loop); } - diff --git a/uxplay.cpp b/uxplay.cpp index 0046045..cbf8fc1 100755 --- a/uxplay.cpp +++ b/uxplay.cpp @@ -26,6 +26,7 @@ #include #include #include +#include #include "log.h" #include "lib/raop.h" @@ -35,23 +36,20 @@ #include "renderers/video_renderer.h" #include "renderers/audio_renderer.h" -#define VERSION "1.40" +#define VERSION "1.41" #define DEFAULT_NAME "UxPlay" #define DEFAULT_DEBUG_LOG false #define LOWEST_ALLOWED_PORT 1024 #define HIGHEST_PORT 65535 -static int start_server (std::vector hw_addr, std::string name, unsigned short display[5], - unsigned short tcp[3], unsigned short udp[3], videoflip_t videoflip[2], - bool debug_log, std::string videosink); +static int start_raop_server (std::vector hw_addr, std::string name, unsigned short display[5], + unsigned short tcp[3], unsigned short udp[3], bool debug_log); -static int stop_server (); +static int stop_raop_server (); static dnssd_t *dnssd = NULL; static raop_t *raop = NULL; -static video_renderer_t *video_renderer = NULL; -static audio_renderer_t *audio_renderer = NULL; static logger_t *render_logger = NULL; static bool relaunch_server = false; @@ -96,7 +94,7 @@ static void main_loop() { if (server_timeout) { connection_watch_id = g_timeout_add_seconds(1, (GSourceFunc) connection_callback, (gpointer) loop); } - if (use_video) gst_bus_watch_id = (guint) video_renderer_listen((void *)loop, video_renderer); + if (use_video) gst_bus_watch_id = (guint) video_renderer_listen((void *)loop); guint sigterm_watch_id = g_unix_signal_add(SIGTERM, (GSourceFunc) sigterm_callback, 
(gpointer) loop); guint sigint_watch_id = g_unix_signal_add(SIGINT, (GSourceFunc) sigint_callback, (gpointer) loop); relaunch_server = true; @@ -161,15 +159,16 @@ static void print_info (char *name) { printf("-p Use legacy ports UDP 6000:6001:7011 TCP 7000:7001:7100\n"); printf("-p n Use TCP and UDP ports n,n+1,n+2. range %d-%d\n", LOWEST_ALLOWED_PORT, HIGHEST_PORT); printf(" use \"-p n1,n2,n3\" to set each port, \"n1,n2\" for n3 = n2+1\n"); - printf(" \"-p tcp n\" or \"-p udp n\" sets TCP or UDP ports only\n"); + printf(" \"-p tcp n\" or \"-p udp n\" sets TCP or UDP ports separately\n"); printf("-m Use random MAC address (use for concurrent UxPlay's)\n"); printf("-t n Relaunch server if no connection existed in last n seconds\n"); printf("-vs Choose the GStreamer videosink; default \"autovideosink\"\n"); - printf(" choices: ximagesink,xvimagesink,vaapisink,glimagesink, etc.\n"); + printf(" some choices: ximagesink,xvimagesink,vaapisink,glimagesink,\n"); + printf(" gtksink,waylandsink,osximagesink,fpsdisplaysink, etc.\n"); printf("-vs 0 Streamed audio only, with no video display window\n"); printf("-as Choose the GStreamer audiosink; default \"autoaudiosink\"\n"); printf(" choices: pulsesink,alsasink,osssink,oss4sink,osxaudiosink,etc.\n"); - printf("-as 0 (or -a) Turn audio off, video output only\n"); + printf("-as 0 (or -a) Turn audio off, streamed video only\n"); printf("-d Enable debug logging\n"); printf("-v or -h Displays this help and version information\n"); } @@ -392,11 +391,29 @@ int main (int argc, char *argv[]) { } } - if(audiosink == "0") { + if (audiosink == "0") { use_audio = false; } - if(!use_audio) LOGI("audio_disabled"); + if (videosink == "0") { + use_video = false; + videosink.erase(); + videosink.append("fakesink"); + LOGI("video_disabled"); + display[3] = 1; /* set fps to 1 frame per sec when no video will be shown */ + } + + if (use_audio) { + audio_renderer_init(audiosink.c_str()); + } else { + LOGI("audio_disabled"); + } + + + if 
(use_video) { + video_renderer_init(server_name.c_str(), videoflip, videosink.c_str()); + video_renderer_start(); + } if (udp[0]) LOGI("using network ports UDP %d %d %d TCP %d %d %d\n", udp[0],udp[1], udp[2], tcp[0], tcp[1], tcp[2]); @@ -416,19 +433,28 @@ int main (int argc, char *argv[]) { relaunch: compression_type = 0; connections_stopped = false; - if (start_server(server_hw_addr, server_name, display, tcp, udp, - videoflip, debug_log, videosink)) { + if (start_raop_server(server_hw_addr, server_name, display, tcp, udp, debug_log)) { return 1; } main_loop(); if (relaunch_server) { + assert(use_video); LOGI("Re-launching server..."); - stop_server(); + stop_raop_server(); + video_renderer_destroy(); + video_renderer_init(server_name.c_str(), videoflip, videosink.c_str()); + video_renderer_start(); goto relaunch; } else { LOGI("Stopping..."); - stop_server(); + stop_raop_server(); + } + if (use_audio) { + audio_renderer_destroy(); + } + if (use_video) { + video_renderer_destroy(); } } @@ -437,11 +463,11 @@ extern "C" void conn_init (void *cls) { open_connections++; connections_stopped = false; LOGI("Open connections: %i", open_connections); - video_renderer_update_background(video_renderer, 1); + //video_renderer_update_background(1); } extern "C" void conn_destroy (void *cls) { - video_renderer_update_background(video_renderer, -1); + //video_renderer_update_background(-1); open_connections--; LOGI("Open connections: %i", open_connections); if(!open_connections) { @@ -450,45 +476,39 @@ extern "C" void conn_destroy (void *cls) { } extern "C" void audio_process (void *cls, raop_ntp_t *ntp, aac_decode_struct *data) { - if (audio_renderer != NULL) { - audio_renderer_render_buffer(audio_renderer, ntp, data->data, data->data_len, data->pts); + if (use_audio) { + audio_renderer_render_buffer(ntp, data->data, data->data_len, data->pts); } } extern "C" void video_process (void *cls, raop_ntp_t *ntp, h264_decode_struct *data) { - 
video_renderer_render_buffer(video_renderer, ntp, data->data, data->data_len, data->pts, data->frame_type); -} - -extern "C" void audio_flush (void *cls) { - audio_renderer_flush(audio_renderer); -} - -extern "C" void video_flush (void *cls) { - video_renderer_flush(video_renderer); -} - -extern "C" void audio_set_volume (void *cls, float volume) { - if (audio_renderer != NULL) { - audio_renderer_set_volume(audio_renderer, volume); + if (use_video) { + video_renderer_render_buffer(ntp, data->data, data->data_len, data->pts, data->frame_type); } } -extern "C" void audio_setup (void *cls, unsigned char *ct) { - if(use_audio) { - LOGI("new audio compression type %d (was %d)",*ct, compression_type); - if (*ct != compression_type) { - if (compression_type && audio_renderer) { - audio_renderer_destroy(audio_renderer); - LOGD("previous audio_renderer destroyed"); - } - compression_type = *ct; - audio_renderer = audio_renderer_init(render_logger, &compression_type, audiosink.c_str()); - if (audio_renderer) { - audio_renderer_start(audio_renderer); - } else { - LOGW("could not init audio_renderer"); - } - } +extern "C" void audio_flush (void *cls) { + if (use_audio) { + audio_renderer_flush(); + } +} + +extern "C" void video_flush (void *cls) { + if (use_video) { + video_renderer_flush(); + } +} + +extern "C" void audio_set_volume (void *cls, float volume) { + if (use_audio) { + audio_renderer_set_volume(volume); + } +} + +extern "C" void audio_get_format (void *cls, unsigned char *ct, unsigned short *spf, bool *usingScreen, bool *isMedia, uint64_t *audioFormat) { + LOGI("ct=%d spf=%d usingScreen=%d isMedia=%d audioFormat=0x%lx",*ct, *spf, *usingScreen, *isMedia, (unsigned long) *audioFormat); + if (use_audio) { + audio_renderer_start(ct); } } @@ -516,9 +536,8 @@ extern "C" void log_callback (void *cls, int level, const char *msg) { } -int start_server (std::vector hw_addr, std::string name, unsigned short display[5], - unsigned short tcp[3], unsigned short udp[3], 
videoflip_t videoflip[2], - bool debug_log, std::string videosink) { +int start_raop_server (std::vector hw_addr, std::string name, unsigned short display[5], + unsigned short tcp[3], unsigned short udp[3], bool debug_log) { raop_callbacks_t raop_cbs; memset(&raop_cbs, 0, sizeof(raop_cbs)); raop_cbs.conn_init = conn_init; @@ -528,7 +547,7 @@ int start_server (std::vector hw_addr, std::string name, unsigned short di raop_cbs.audio_flush = audio_flush; raop_cbs.video_flush = video_flush; raop_cbs.audio_set_volume = audio_set_volume; - raop_cbs.audio_setup = audio_setup; + raop_cbs.audio_get_format = audio_get_format; raop = raop_init(10, &raop_cbs); if (raop == NULL) { @@ -539,11 +558,6 @@ int start_server (std::vector hw_addr, std::string name, unsigned short di /* write desired display pixel width, pixel height, refresh_rate, max_fps, overscanned. */ /* use 0 for default values 1920,1080,60,30,0; these are sent to the Airplay client */ - if(videosink == "0") { - use_video = false; - display[3] = 1; /* set fps to 1 frame per sec when no video will be shown */ - } - raop_set_display(raop, display[0], display[1], display[2], display[3], display[4]); /* network port selection (ports listed as "0" will be dynamically assigned) */ @@ -553,23 +567,6 @@ int start_server (std::vector hw_addr, std::string name, unsigned short di raop_set_log_callback(raop, log_callback, NULL); raop_set_log_level(raop, debug_log ? RAOP_LOG_DEBUG : LOGGER_INFO); - render_logger = logger_init(); - if (render_logger == NULL) { - LOGE("Could not init render_logger\n"); - stop_server(); - return -1; - } - logger_set_callback(render_logger, log_callback, NULL); - logger_set_level(render_logger, debug_log ? 
LOGGER_DEBUG : LOGGER_INFO); - - if ((video_renderer = video_renderer_init(render_logger, name.c_str(), videoflip, videosink.c_str())) == NULL) { - LOGE("Could not init video renderer"); - stop_server(); - return -1; - } - - if (use_video && video_renderer) video_renderer_start(video_renderer); - unsigned short port = raop_get_port(raop); raop_start(raop, &port); raop_set_port(raop, port); @@ -578,7 +575,7 @@ int start_server (std::vector hw_addr, std::string name, unsigned short di dnssd = dnssd_init(name.c_str(), strlen(name.c_str()), hw_addr.data(), hw_addr.size(), &error); if (error) { LOGE("Could not initialize dnssd library!"); - stop_server(); + stop_raop_server(); return -2; } @@ -595,16 +592,12 @@ int start_server (std::vector hw_addr, std::string name, unsigned short di return 0; } -int stop_server () { +int stop_raop_server () { if (raop) raop_destroy(raop); if (dnssd) { dnssd_unregister_raop(dnssd); dnssd_unregister_airplay(dnssd); dnssd_destroy(dnssd); } - if (audio_renderer) audio_renderer_destroy(audio_renderer); - if (video_renderer) video_renderer_destroy(video_renderer); - compression_type = 0; - if (render_logger) logger_destroy(render_logger); return 0; }