Merge pull request #688 from jb-alvarado/master

v0.24.0
This commit is contained in:
jb-alvarado 2024-07-04 07:33:25 +00:00 committed by GitHub
commit 71c9c3bc73
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
106 changed files with 7521 additions and 7151 deletions

3 .gitignore vendored

@ -20,9 +20,10 @@
*.deb
*.rpm
ffplayout.1.gz
ffpapi.1.gz
/assets/*.db*
/dist/
/public/
tmp/
assets/playlist_template.json
advanced*.toml
ffplayout*.toml

4 .gitmodules vendored

@ -1,3 +1,3 @@
[submodule "ffplayout-frontend"]
path = ffplayout-frontend
[submodule "frontend"]
path = frontend
url = https://github.com/ffplayout/ffplayout-frontend.git


@ -19,7 +19,16 @@
},
"cSpell.words": [
"actix",
"ffpengine",
"flexi",
"lettre",
"libc",
"neli",
"paris",
"reqwest",
"rsplit",
"rustls",
"sqlx",
"starttls",
"tokio",
"uuids"

1212 Cargo.lock generated

File diff suppressed because it is too large


@ -1,10 +1,10 @@
[workspace]
members = ["ffplayout-api", "ffplayout-engine", "lib", "tests"]
default-members = ["ffplayout-api", "ffplayout-engine", "tests"]
members = ["ffplayout", "tests"]
default-members = ["ffplayout", "tests"]
resolver = "2"
[workspace.package]
version = "0.23.1"
version = "0.24.0-alpha1"
license = "GPL-3.0"
repository = "https://github.com/ffplayout/ffplayout"
authors = ["Jonathan Baecker <jonbae77@gmail.com>"]


@ -13,13 +13,13 @@ Check the [releases](https://github.com/ffplayout/ffplayout/releases/latest) for
### Features
- have all values in a separate config file
- start program with [web based frontend](https://github.com/ffplayout/ffplayout-frontend), or run playout in foreground mode without frontend
- dynamic playlist
- replace a missing playlist or clip with a single filler or multiple fillers from a folder; if no filler exists, create a dummy clip
- playing clips in [watched](/docs/folder_mode.md) folder mode
- send emails with error messages
- overlay a logo
- overlay text, controllable through [ffplayout-frontend](https://github.com/ffplayout/ffplayout-frontend) (needs ffmpeg with libzmq and enabled JSON RPC server)
- overlay text, controllable through [web frontend](https://github.com/ffplayout/ffplayout-frontend) (needs ffmpeg with libzmq and enabled JSON RPC server)
- loop playlist infinitely
- [remote source](/docs/remote_source.md)
- trim and fade the last clip, to get full 24 hours
@ -42,7 +42,6 @@ Check the [releases](https://github.com/ffplayout/ffplayout/releases/latest) for
- **desktop**
- **HLS**
- **null** (for debugging)
- JSON RPC server, to get information about what is playing and to control it
- [live ingest](/docs/live_ingest.md)
- image source (will loop until out duration is reached)
- extra audio source, which has priority over the audio from the video (experimental *)
@ -51,23 +50,19 @@ Check the [releases](https://github.com/ffplayout/ffplayout/releases/latest) for
- [custom filters](/docs/custom_filters.md) globally in config, or in playlist for specific clips
- import playlist from text or m3u file, with CLI or frontend
- audio only, for radio mode (experimental *)
- [Piggyback Mode](/ffplayout-api/README.md#piggyback-mode), mostly for non Linux systems (experimental *)
- generate playlist based on [template](/docs/playlist_gen.md) (experimental *)
- During playlist import, all video clips are validated and, if desired, checked to ensure that the audio track is not completely muted.
- run multiple channels (experimental *)
For preview stream, read: [/docs/preview_stream.md](/docs/preview_stream.md)
**\* Experimental features do not guarantee the same stability and may fail under unusual circumstances. Code and configuration options may change in the future.**
## **ffplayout-api (ffpapi)**
ffpapi serves the [frontend](https://github.com/ffplayout/ffplayout-frontend) and acts as a [REST API](/ffplayout-api/README.md) for controlling the engine, manipulating playlists, adding settings, etc.
### Requirements
- RAM and CPU usage depend on the video resolution; a minimum of 4 threads and 3GB RAM for 720p is recommended
- RAM and CPU usage depend on the video resolution; a minimum of 4 _dedicated_ threads and 3GB RAM for 720p is recommended
- **ffmpeg** v5.0+ and **ffprobe** (**ffplay** if you want to play on desktop)
- if you want to overlay text, ffmpeg needs to have **libzmq**
- if you want to overlay dynamic text, ffmpeg needs to have **libzmq**
### Install
@ -119,6 +114,7 @@ Check [install](docs/install.md) for details about how to install ffplayout.
]
}
```
If you are in playlist mode and move backwards or forwards in time, the time shift is saved so the playlist is still in sync. Bear in mind, however, that this may make your playlist too short. If you do not reset it, it will automatically reset the next day.
## **Warning**
@ -126,72 +122,6 @@ Check [install](docs/install.md) for details about how to install ffplayout.
-----
## HLS output
For outputting to HLS, output parameters should look like:
```yaml
out:
...
output_param: >-
...
-flags +cgop
-f hls
-hls_time 6
-hls_list_size 600
-hls_flags append_list+delete_segments+omit_endlist+program_date_time
-hls_segment_filename /var/www/html/live/stream-%09d.ts /var/www/html/live/stream.m3u8
```
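As a quick sanity check (not something ffplayout computes for you), the playlist retention window implied by these two parameters is simply their product:

```python
# Values taken from the HLS output parameters above.
hls_time = 6          # seconds per segment (-hls_time)
hls_list_size = 600   # number of segments kept in the playlist (-hls_list_size)

window_seconds = hls_time * hls_list_size
print(f"playlist window: {window_seconds} s ({window_seconds / 3600:.1f} h)")
# → playlist window: 3600 s (1.0 h)
```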
-----
## JSON RPC
The ffplayout engine can run a simple RPC server. A request looks like:
```Bash
curl -X POST -H "Content-Type: application/json" -H "Authorization: ---auth-key---" \
-d '{"control":"next"}' \
127.0.0.1:7070
```
At the moment these commands are possible:
```Bash
'{"media":"current"}' # get info about the current clip
'{"media":"next"}' # get info about the next clip
'{"media":"last"}' # get info about the last clip
'{"control":"next"}' # jump to next clip
'{"control":"back"}' # jump to last clip
'{"control":"reset"}' # reset playlist to old state
'{"control":"text", \
"message": {"text": "Hello from ffplayout", "x": "(w-text_w)/2", "y": "(h-text_h)/2", \
"fontsize": 24, "line_spacing": 4, "fontcolor": "#ffffff", "box": 1, \
"boxcolor": "#000000", "boxborderw": 4, "alpha": 1.0}}' # send text to drawtext filter from ffmpeg
```
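The same request can also be assembled programmatically. This is an illustrative sketch (not part of ffplayout) that only builds the JSON body for one of the commands listed above; actually sending it still needs the `Authorization` header from the curl example:

```python
import json

# Build the request body for a control command, e.g. {"control": "next"}.
def control_payload(action: str) -> bytes:
    return json.dumps({"control": action}).encode("utf-8")

body = control_payload("next")
print(body.decode())  # → {"control": "next"}
```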
Output from `{"media":"current"}` shows:
```JSON
{
"media": {
"category": "",
"duration": 154.2,
"out": 154.2,
"in": 0.0,
"source": "/opt/tv-media/clip.mp4"
},
"index": 39,
"mode": "playlist",
"ingest": false,
"played": 67.80771999300123
}
```
If you are in playlist mode and move backwards or forwards in time, the time shift is saved so the playlist is still in sync. Bear in mind, however, that this may make your playlist too short. If you do not reset it, it will automatically reset the next day.
## Funding
If you like this project and would like to make a donation, please use one of the options provided.


@ -1,5 +0,0 @@
# give user ffpu permission to control the ffplayout systemd service
ffpu ALL = NOPASSWD: /usr/bin/systemctl start ffplayout.service, /usr/bin/systemctl stop ffplayout.service, /usr/bin/systemctl restart ffplayout.service, /usr/bin/systemctl status ffplayout.service, /usr/bin/systemctl is-active ffplayout.service, /usr/bin/systemctl enable ffplayout.service, /usr/bin/systemctl disable ffplayout.service
ffpu ALL = NOPASSWD: /usr/bin/systemctl start ffplayout@*, /usr/bin/systemctl stop ffplayout@*, /usr/bin/systemctl restart ffplayout@*, /usr/bin/systemctl status ffplayout@*, /usr/bin/systemctl is-active ffplayout@*, /usr/bin/systemctl enable ffplayout@*, /usr/bin/systemctl disable ffplayout@*


@ -1,37 +0,0 @@
# Changing these settings is for advanced users only!
# There will be no support or guarantee that it will be stable after changing them.
[decoder]
input_param = ""
# output_param get also applied to ingest instance.
output_param = ""
[encoder]
input_param = ""
[filters]
deinterlace = "" # yadif=0:-1:0
pad_scale_w = "" # scale={}:-1
pad_scale_h = "" # scale=-1:{}
pad_video = "" # pad=max(iw\\,ih*({0}/{1})):ow/({0}/{1}):(ow-iw)/2:(oh-ih)/2
fps = "" # fps={}
scale = "" # scale={}:{}
set_dar = "" # setdar=dar={}
fade_in = "" # fade=in:st=0:d=0.5
fade_out = "" # fade=out:st={}:d=1.0
overlay_logo_scale = "" # scale={}
overlay_logo_fade_in = "" # fade=in:st=0:d=1.0:alpha=1
overlay_logo_fade_out = "" # fade=out:st={}:d=1.0:alpha=1
overlay_logo = "" # null[l];[v][l]overlay={}:shortest=1
tpad = "" # tpad=stop_mode=add:stop_duration={}
drawtext_from_file = "" # drawtext=text='{}':{}{}
drawtext_from_zmq = "" # zmq=b=tcp\\\\://'{}',drawtext@dyntext={}
aevalsrc = "" # aevalsrc=0:channel_layout=stereo:duration={}:sample_rate=48000
afade_in = "" # afade=in:st=0:d=0.5
afade_out = "" # afade=out:st={}:d=1.0
apad = "" # apad=whole_dur={}
volume = "" # volume={}
split = "" # split={}{}
[ingest]
input_param = ""


@ -1,12 +0,0 @@
[Unit]
Description=Rest API for ffplayout
After=network.target remote-fs.target
[Service]
ExecStart=/usr/bin/ffpapi -l 0.0.0.0:8787
Restart=always
RestartSec=1
User=ffpu
[Install]
WantedBy=multi-user.target


@ -3,7 +3,7 @@ Description=Rust and ffmpeg based playout solution
After=network.target remote-fs.target
[Service]
ExecStart=/usr/bin/ffplayout
ExecStart=/usr/bin/ffplayout -l 0.0.0.0:8787
Restart=always
StartLimitInterval=20
RestartSec=1


@ -1,168 +0,0 @@
[general]
help_text = """Sometimes it can happen that a file is corrupt but still playable; \
this can produce a streaming error over all following files. The only way \
out in this case is to stop ffplayout and start it again. Here we only handle \
when it stops; the starting process is in your hands. The best way is a \
systemd service on Linux.
'stop_threshold' stops ffplayout if it is out of sync in time above this \
value. A number below 3 can cause unexpected errors."""
stop_threshold = 11
stat_file = ".ffp_status"
[rpc_server]
help_text = """Run a JSON RPC server, for getting info about the currently playing clip \
and for some control functions."""
enable = true
address = "127.0.0.1:7070"
authorization = "av2Kx8g67lF9qj5wEH3ym1bI4cCs"
[mail]
help_text = """Send error messages to an email address, such as a missing playlist, invalid \
json format or a missing clip path. Leave the recipient blank if you don't need this.
'mail_level' can be INFO, WARNING or ERROR.
'interval' means seconds until a new mail will be sent."""
subject = "Playout Error"
smtp_server = "mail.example.org"
starttls = true
sender_addr = "ffplayout@example.org"
sender_pass = "abc123"
recipient = ""
mail_level = "ERROR"
interval = 30
[logging]
help_text = """If 'log_to_file' is true, log to file; when false, log to console.
'backup_count' says how many days log files will be kept.
Setting 'local_time' to false will set log timestamps to UTC. Use the path \
/var/log/ only if you run this program as a daemon.
'level' can be DEBUG, INFO, WARNING, ERROR.
'ffmpeg_level/ingest_level' can be INFO, WARNING, ERROR.
'detect_silence' logs an error message if the audio line is silent for 15 \
seconds during the validation process.
'ignore_lines' makes logging ignore strings that contain the matched lines; \
in the frontend it is a semicolon separated list."""
log_to_file = true
backup_count = 7
local_time = true
timestamp = true
path = "/var/log/ffplayout/"
level = "DEBUG"
ffmpeg_level = "ERROR"
ingest_level = "WARNING"
detect_silence = false
ignore_lines = [
"P sub_mb_type 4 out of range at",
"error while decoding MB",
"negative number of zero coeffs at",
"out of range intra chroma pred mode",
"non-existing SPS 0 referenced in buffering period",
]
[processing]
help_text = """Default processing for all clips, to have them unique. Mode can be playlist \
or folder.
'aspect' must be a float number. 'logo' is only used if the path exists.
'logo_scale' scales the logo to the target size; leave it blank when no scaling \
is needed. The format is 'width:height', for example '100:-1' for proportional \
scaling. With 'logo_opacity' the logo can become transparent.
With 'audio_tracks' it is possible to configure how many audio tracks should \
be processed. 'audio_channels' can be used if the audio has more channels than stereo.
With 'logo_position' in the format 'x:y' you set the logo position.
With 'custom_filter' it is possible to apply further filters. The filter \
outputs should end with [c_v_out] for video filters, and [c_a_out] for audio filters."""
mode = "playlist"
audio_only = false
copy_audio = false
copy_video = false
width = 1024
height = 576
aspect = 1.778
fps = 25
add_logo = true
logo = "/usr/share/ffplayout/logo.png"
logo_scale = ""
logo_opacity = 0.7
logo_position = "W-w-12:12"
audio_tracks = 1
audio_track_index = -1
audio_channels = 2
volume = 1
custom_filter = ""
[ingest]
help_text = """Run a server for an ingest stream. This stream will override the normal streaming \
until it is done. There is only a very simple authentication mechanism, which checks if the \
stream name is correct.
'custom_filter' can be used in the same way as the one in the processing section."""
enable = false
input_param = "-f live_flv -listen 1 -i rtmp://127.0.0.1:1936/live/stream"
custom_filter = ""
[playlist]
help_text = """'path' can be a path to a single file, or a directory. For a directory, give \
only the root folder, for example '/playlists'; subdirectories are read by the \
program. Subdirectories need this structure '/playlists/2018/01'.
'day_start' means at which time the playlist should start; leave day_start \
blank when the playlist should always start at the beginning. 'length' represents the \
target length of the playlist; when it is blank, the real length will not be considered.
'infinit: true' works with a single playlist file and loops it infinitely. """
path = "/var/lib/ffplayout/playlists"
day_start = "05:59:25"
length = "24:00:00"
infinit = false
[storage]
help_text = """'filler' is played instead of a missing file, or to fill the end to reach 24 \
hours; it can be a file or a folder, and it will loop when necessary.
'extensions' searches only for files with these extensions. Set 'shuffle' to 'true' \
to pick files randomly."""
path = "/var/lib/ffplayout/tv-media"
filler = "/var/lib/ffplayout/tv-media/filler/filler.mp4"
extensions = ["mp4", "mkv", "webm"]
shuffle = true
[text]
help_text = """Overlay text in combination with libzmq for remote text manipulation. \
On Windows the fontfile path needs to look like this: 'C\\:/WINDOWS/fonts/DejaVuSans.ttf'.
'text_from_filename' activates extracting the text from a filename. With 'style' \
you can define the drawtext parameters like position, color, etc. Text posted over the \
API will override this. With 'regex' you can format file names to get a title from them."""
add_text = true
text_from_filename = false
fontfile = "/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf"
style = "x=(w-tw)/2:y=(h-line_h)*0.9:fontsize=24:fontcolor=#ffffff:box=1:boxcolor=#000000:boxborderw=4"
regex = "^.+[/\\](.*)(.mp4|.mkv|.webm)$"
[task]
help_text = """Run an external program with a given media object. The media object is in json format \
and contains all the information about the current clip. The external program can be a script \
or a binary, but should only run for a short time."""
enable = false
path = ""
[out]
help_text = """The final playout compression. Set the settings to your needs. 'mode' \
has the options 'desktop', 'hls', 'null', 'stream'. Use 'stream' and adjust the \
'output_param:' settings when you want to stream to an rtmp/rtsp/srt/... server.
In production don't serve the HLS playlist with ffpapi; use nginx or another web server!"""
mode = "hls"
output_param = """\
-c:v libx264
-crf 23
-x264-params keyint=50:min-keyint=25:scenecut=-1
-maxrate 1300k
-bufsize 2600k
-preset faster
-tune zerolatency
-profile:v Main
-level 3.1
-c:a aac
-ar 44100
-b:a 128k
-flags +cgop
-f hls
-hls_time 6
-hls_list_size 600
-hls_flags append_list+delete_segments+omit_endlist
-hls_segment_filename /usr/share/ffplayout/public/live/stream-%d.ts
/usr/share/ffplayout/public/live/stream.m3u8"""


@ -1,14 +0,0 @@
[Unit]
Description=Rust and ffmpeg based multi channel playout solution
After=network.target remote-fs.target
[Service]
ExecStart=/usr/bin/ffplayout %I
Restart=always
StartLimitInterval=20
RestartSec=1
KillMode=mixed
User=ffpu
[Install]
WantedBy=multi-user.target

3 debian/postinst vendored

@ -19,11 +19,10 @@ if [ ! -d "/usr/share/ffplayout/db" ]; then
IP=$(hostname -I | cut -d ' ' -f1)
/usr/bin/ffpapi -i -d "${IP}:8787"
/usr/bin/ffplayout -i -d "${IP}:8787"
chown -R ${sysUser}: "/usr/share/ffplayout"
chown -R ${sysUser}: "/var/lib/ffplayout"
chown -R ${sysUser}: "/etc/ffplayout"
fi
if [ ! -d "/var/log/ffplayout" ]; then

2 debian/postrm vendored

@ -6,7 +6,7 @@ sysUser="ffpu"
case "$1" in
abort-install|purge)
deluser $sysUser
rm -rf /usr/share/ffplayout /var/log/ffplayout /etc/ffplayout /var/lib/ffplayout /home/$sysUser
rm -rf /usr/share/ffplayout /var/log/ffplayout /var/lib/ffplayout /home/$sysUser
;;
remove)


@ -29,12 +29,9 @@ RUN dnf update -y && \
RUN [[ -f /tmp/ffplayout-${FFPLAYOUT_VERSION}-1.x86_64.rpm ]] || wget -q "https://github.com/ffplayout/ffplayout/releases/download/v${FFPLAYOUT_VERSION}/ffplayout-${FFPLAYOUT_VERSION}-1.x86_64.rpm" -P /tmp/ && \
dnf install -y /tmp/ffplayout-${FFPLAYOUT_VERSION}-1.x86_64.rpm && \
rm /tmp/ffplayout-${FFPLAYOUT_VERSION}-1.x86_64.rpm && \
sed -i "s/User=ffpu/User=root/g" /usr/lib/systemd/system/ffpapi.service && \
sed -i "s/User=ffpu/User=root/g" /usr/lib/systemd/system/ffplayout.service && \
sed -i "s/User=ffpu/User=root/g" /usr/lib/systemd/system/ffplayout@.service && \
systemctl enable ffplayout && \
systemctl enable ffpapi && \
ffpapi -u admin -p admin -m contact@example.com
ffplayout -u admin -p admin -m contact@example.com
EXPOSE 8787


@ -26,7 +26,7 @@ You can take a look at the [Dockerfile](Dockerfile)
## Storage
There are some folders/files that are important for ffplayout to work well such as:
- **/usr/share/ffplayout/db** => where all the data for the `ffpapi` are stored (user/pass etc)
- **/usr/share/ffplayout/db** => where all the data are stored (user/pass etc)
- **/var/lib/ffplayout/tv-media** => where the media are stored by default (configurable)
- **/var/lib/ffplayout/playlists** => where playlists are stored (configurable)
- **/etc/ffplayout/ffplayout.yml** => the core config file


@ -139,7 +139,6 @@ ENV LD_LIBRARY_PATH=/usr/local/lib64:/usr/local/lib
COPY --from=build /usr/local/ /usr/local/
ADD ./overide.conf /etc/systemd/system/ffplayout.service.d/overide.conf
ADD ./overide.conf /etc/systemd/system/ffpapi.service.d/overide.conf
RUN \
dnf update -y \
@ -155,8 +154,7 @@ RUN \
rm /tmp/ffplayout-${FFPLAYOUT_VERSION}-1.x86_64.rpm && \
mkdir -p /home/ffpu && chown -R ffpu: /home/ffpu && \
systemctl enable ffplayout && \
systemctl enable ffpapi && \
ffpapi -u admin -p admin -m contact@example.com
ffplayout -u admin -p admin -m contact@example.com
EXPOSE 8787


@ -102,8 +102,7 @@ RUN yum update -y \
&& echo 'Docker!' | passwd --stdin root \
&& rm /tmp/ffplayout-${FFPLAYOUT_VERSION}-1.x86_64.rpm \
&& mkdir -p /home/ffpu && chown -R ffpu: /home/ffpu \
&& systemctl enable ffplayout \
&& systemctl enable ffpapi
&& systemctl enable ffplayout
EXPOSE 8787
RUN echo "/usr/local/lib" >> /etc/ld.so.conf.d/nvidia.conf


@ -1,6 +1,6 @@
## Advanced settings
Within **/etc/ffplayout/advanced.yml** you can control all ffmpeg inputs/decoder output and filters.
With **advanced settings** you can control all ffmpeg inputs/decoder output and filters.
> **_Note:_** Changing these settings is for advanced users only! There will be no support or guarantee that it will work and be stable after changing them!


@ -1,9 +1,9 @@
## Possible endpoints
### Possible endpoints
Run the API through the systemd service, or like:
```BASH
ffpapi -l 127.0.0.1:8787
ffplayout -l 127.0.0.1:8787
```
For all endpoints, (Bearer) authentication is required.\
@ -72,7 +72,7 @@ curl -X GET 'http://127.0.0.1:8787/api/user/2' -H 'Content-Type: application/jso
-H 'Authorization: Bearer <TOKEN>'
```
#### ffpapi Settings
#### Settings
**Get Settings from Channel**
@ -87,9 +87,7 @@ curl -X GET http://127.0.0.1:8787/api/channel/1 -H "Authorization: Bearer <TOKEN
"id": 1,
"name": "Channel 1",
"preview_url": "http://localhost/live/preview.m3u8",
"config_path": "/etc/ffplayout/ffplayout.yml",
"extra_extensions": "jpg,jpeg,png",
"service": "ffplayout.service",
"utc_offset": "+120"
}
```
@ -104,7 +102,7 @@ curl -X GET http://127.0.0.1:8787/api/channels -H "Authorization: Bearer <TOKEN>
```BASH
curl -X PATCH http://127.0.0.1:8787/api/channel/1 -H "Content-Type: application/json" \
-d '{ "id": 1, "name": "Channel 1", "preview_url": "http://localhost/live/stream.m3u8", "config_path": "/etc/ffplayout/ffplayout.yml", "extra_extensions": "jpg,jpeg,png"}' \
-d '{ "id": 1, "name": "Channel 1", "preview_url": "http://localhost/live/stream.m3u8", "extra_extensions": "jpg,jpeg,png"}' \
-H "Authorization: Bearer <TOKEN>"
```
@ -112,7 +110,7 @@ curl -X PATCH http://127.0.0.1:8787/api/channel/1 -H "Content-Type: application/
```BASH
curl -X POST http://127.0.0.1:8787/api/channel/ -H "Content-Type: application/json" \
-d '{ "name": "Channel 2", "preview_url": "http://localhost/live/channel2.m3u8", "config_path": "/etc/ffplayout/channel2.yml", "extra_extensions": "jpg,jpeg,png", "service": "ffplayout@channel2.service" }' \
-d '{ "name": "Channel 2", "preview_url": "http://localhost/live/channel2.m3u8", "extra_extensions": "jpg,jpeg,png" }' \
-H "Authorization: Bearer <TOKEN>"
```
@ -124,13 +122,28 @@ curl -X DELETE http://127.0.0.1:8787/api/channel/2 -H "Authorization: Bearer <TO
#### ffplayout Config
**Get Advanced Config**
```BASH
curl -X GET http://127.0.0.1:8787/api/playout/advanced/1 -H 'Authorization: Bearer <TOKEN>'
```
Response is a JSON object
**Update Advanced Config**
```BASH
curl -X PUT http://127.0.0.1:8787/api/playout/advanced/1 -H "Content-Type: application/json" \
-d { <CONFIG DATA> } -H 'Authorization: Bearer <TOKEN>'
```
**Get Config**
```BASH
curl -X GET http://127.0.0.1:8787/api/playout/config/1 -H 'Authorization: Bearer <TOKEN>'
```
Response is a JSON object from the ffplayout.yml
Response is a JSON object
**Update Config**
@ -161,7 +174,7 @@ curl -X PUT http://127.0.0.1:8787/api/presets/1 -H 'Content-Type: application/js
**Add new Preset**
```BASH
curl -X POST http://127.0.0.1:8787/api/presets/ -H 'Content-Type: application/json' \
curl -X POST http://127.0.0.1:8787/api/presets/1/ -H 'Content-Type: application/json' \
-d '{ "name": "<PRESET NAME>", "text": "<TEXT>", "x": "<X>", "y": "<Y>", "fontsize": 24, "line_spacing": 4, "fontcolor": "#ffffff", "box": 1, "boxcolor": "#000000", "boxborderw": 4, "alpha": 1.0, "channel_id": 1 }' \
-H 'Authorization: Bearer <TOKEN>'
```
@ -210,38 +223,19 @@ curl -X GET http://127.0.0.1:8787/api/control/1/media/current
**Response:**
```JSON
{
"jsonrpc": "2.0",
"result": {
"current_media": {
{
"media": {
"category": "",
"duration": 154.2,
"out": 154.2,
"seek": 0.0,
"in": 0.0,
"source": "/opt/tv-media/clip.mp4"
},
"index": 39,
"play_mode": "playlist",
"played_sec": 67.80771999300123,
"remaining_sec": 86.39228000699876,
"start_sec": 24713.631999999998,
"start_time": "06:51:53.631"
},
"id": 1
}
```
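As a cross-check of the example response, `remaining_sec` is consistent with `out` minus `played_sec` (values copied from the JSON above):

```python
import math

out = 154.2
played_sec = 67.80771999300123
remaining_sec = 86.39228000699876

# remaining time = clip out point minus the seconds already played
assert math.isclose(out - played_sec, remaining_sec, abs_tol=1e-9)
```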
**Get next Clip**
```BASH
curl -X GET http://127.0.0.1:8787/api/control/1/media/next -H 'Authorization: Bearer <TOKEN>'
```
**Get last Clip**
```BASH
curl -X GET http://127.0.0.1:8787/api/control/1/media/last
-H 'Content-Type: application/json' -H 'Authorization: Bearer <TOKEN>'
"ingest": false,
"mode": "playlist",
"played": 67.808
}
```
#### ffplayout Process Control
@ -303,10 +297,10 @@ curl -X DELETE http://127.0.0.1:8787/api/playlist/1/2022-06-20
### Log file
**Read Log Life**
**Read Log File**
```BASH
curl -X GET http://127.0.0.1:8787/api/log/1
curl -X GET http://127.0.0.1:8787/api/log/1?date=2022-06-20
-H 'Content-Type: application/json' -H 'Authorization: Bearer <TOKEN>'
```
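For scripting, the request URL with its `date` query parameter can be built like this (illustrative only; endpoint and parameter are taken from the curl example above):

```python
from urllib.parse import urlencode

# Assemble the log endpoint URL with the date filter.
base = "http://127.0.0.1:8787/api/log/1"
url = f"{base}?{urlencode({'date': '2022-06-20'})}"
print(url)  # → http://127.0.0.1:8787/api/log/1?date=2022-06-20
```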


@ -7,32 +7,26 @@ ffplayout provides ***.deb** and ***.rpm** packages, which makes it easier to
3. install ffmpeg/ffprobe, or compile and copy it to **/usr/local/bin/**
4. activate systemd services:
- `systemctl enable ffplayout`
- `systemctl enable --now ffpapi`
5. add admin user to ffpapi:
- `ffpapi -a`
5. initialize defaults and add a global admin user:
- `sudo -u ffpu ffplayout -i`
6. use a reverse proxy for SSL; the port is **8787**.
7. log in with your browser; the address without a proxy would be: **http://[IP ADDRESS]:8787**
The default location for playlists and media files is **/var/lib/ffplayout/**.
When you don't need the frontend and API, skip enabling the systemd service **ffpapi**.
When playlists are created and the ffplayout output is configured, you can start the process: `systemctl start ffplayout`, or click start in frontend.
If you want to configure ffplayout over terminal, you can edit **/etc/ffplayout/ffplayout.yml**.
### Manual Install
-----
- install ffmpeg/ffprobe, or compile and copy it to **/usr/local/bin/**
- download the latest archive from [release](https://github.com/ffplayout/ffplayout/releases/latest) page
- copy the ffplayout and ffpapi binary to `/usr/bin/`
- copy the ffplayout binary to `/usr/bin/`
- copy **assets/ffplayout.yml** to `/etc/ffplayout`
- create folder `/var/log/ffplayout`
- create system user **ffpu**
- give ownership from `/etc/ffplayout` and `/var/log/ffplayout` to **ffpu**
- copy **assets/ffpapi.service**, **assets/ffplayout.service** and **assets/ffplayout@.service** to `/etc/systemd/system`
- copy **assets/11-ffplayout** to `/etc/sudoers.d/`
- copy **assets/ffpapi.1.gz** and **assets/ffplayout.1.gz** to `/usr/share/man/man1/`
- copy **assets/ffplayout.service** to `/etc/systemd/system`
- copy **assets/ffplayout.1.gz** to `/usr/share/man/man1/`
- copy **public** folder to `/usr/share/ffplayout/`
- activate service and run it: `systemctl enable --now ffpapi ffplayout`
- activate service and run it: `systemctl enable --now ffplayout`


@ -72,7 +72,7 @@ In this mode you can output directly to an HLS playlist. The nice thing here is,
HLS output is currently the default, mostly because it works out of the box and doesn't need a streaming target. With default settings it saves the segments to **/usr/share/ffplayout/public/live/**.
**It is recommended to serve the HLS stream with nginx or another web server, and not with ffpapi (which is more meant for previewing).**
**It is recommended to serve the HLS stream with nginx or another web server, and not with ffplayout (which is more meant for previewing).**
**HLS multiple outputs example:**


@ -1,60 +0,0 @@
[package]
name = "ffplayout-api"
description = "Rest API for ffplayout"
readme = "README.md"
version.workspace = true
license.workspace = true
authors.workspace = true
repository.workspace = true
edition.workspace = true
[features]
default = ["embed_frontend"]
embed_frontend = []
[dependencies]
ffplayout-lib = { path = "../lib" }
actix-files = "0.6"
actix-multipart = "0.6"
actix-web = "4"
actix-web-grants = "4"
actix-web-httpauth = "0.8"
actix-web-lab = "0.20"
actix-web-static-files = "4.0"
argon2 = "0.5"
chrono = { version = "0.4", default-features = false, features = ["clock", "std"] }
clap = { version = "4.3", features = ["derive"] }
derive_more = "0.99"
faccess = "0.2"
futures-util = { version = "0.3", default-features = false, features = ["std"] }
home = "0.5"
jsonwebtoken = "9"
lazy_static = "1.4"
lexical-sort = "0.3"
local-ip-address = "0.6"
once_cell = "1.18"
parking_lot = "0.12"
path-clean = "1.0"
rand = "0.8"
regex = "1"
relative-path = "1.8"
reqwest = { version = "0.12", default-features = false, features = ["blocking", "json", "rustls-tls"] }
rpassword = "7.2"
sanitize-filename = "0.5"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
simplelog = { version = "0.12", features = ["paris"] }
static-files = "0.2"
sysinfo ={ version = "0.30", features = ["linux-netdevs"] }
sqlx = { version = "0.7", features = ["runtime-tokio", "sqlite"] }
tokio = { version = "1.29", features = ["full"] }
tokio-stream = "0.1"
toml_edit = {version ="0.22", features = ["serde"]}
uuid = "1.8"
[build-dependencies]
static-files = "0.2"
[[bin]]
name = "ffpapi"
path = "src/main.rs"


@ -1,63 +0,0 @@
**ffplayout-api**
================
ffplayout-api (ffpapi) is a non-strict REST API for ffplayout. It makes it possible to control the engine, read and manipulate the config, save playlists, etc.
To be able to use the API it is necessary to initialize the settings database first. To do that, run:
```BASH
ffpapi -i
```
Then add an admin user:
```BASH
ffpapi -u <USERNAME> -p <PASSWORD> -m <MAIL ADDRESS>
```
Then run the API through the systemd service, or like:
```BASH
ffpapi -l 127.0.0.1:8787
```
Possible Arguments
-----
```BASH
OPTIONS:
-a, --ask ask for user credentials
-d, --domain <DOMAIN> domain name for initialization
-h, --help Print help information
-i, --init Initialize Database
-l, --listen <LISTEN> Listen on IP:PORT, like: 127.0.0.1:8787
-m, --mail <MAIL> Admin mail address
-p, --password <PASSWORD> Admin password
-u, --username <USERNAME> Create admin user
-V, --version Print version information
```
If you plan to run ffpapi with systemd, set the ownership of **/usr/share/ffplayout** and its content to user **ffpu:ffpu**. The user **ffpu** has to be created first.
**For possible endpoints read: [api endpoints](/docs/api.md)**
ffpapi can also serve the browser-based frontend; just open `127.0.0.1:8787` in your browser.
"Piggyback" Mode
-----
ffplayout was originally planned to run under Linux as a systemd service. It is also designed so that the engine and ffpapi run completely independently of each other. This increases flexibility and stability.
Nevertheless, programs compiled in Rust can basically run on all systems supported by the language, so this repo also offers binaries for other platforms.
In the past, however, it was only possible under Linux to start/stop/restart the ffplayout engine process through ffpapi. This limitation no longer exists since v0.17.0, when the "piggyback" mode was introduced. ffpapi recognizes which platform it is running on, and if it is not Linux, it starts the engine as a child process. Thus it is now possible to control the ffplayout engine completely on all platforms. The disadvantage is that the engine process depends on ffpapi; if ffpapi closes or crashes, the engine also closes.
Under Linux, this mode can be simulated by starting ffpapi with the environment variable `PIGGYBACK_MODE=true`. This scenario is also conceivable in container operation, for example.
**Run in piggyback mode:**
```BASH
PIGGYBACK_MODE=True ffpapi -l 127.0.0.1:8787
```
This function is experimental, use it with caution.


@ -1,352 +0,0 @@
use std::env;
use argon2::{
password_hash::{rand_core::OsRng, SaltString},
Argon2, PasswordHasher,
};
use rand::{distributions::Alphanumeric, Rng};
use simplelog::*;
use sqlx::{migrate::MigrateDatabase, sqlite::SqliteQueryResult, Pool, Sqlite};
use tokio::task;
use crate::db::{
db_pool,
models::{Channel, TextPreset, User},
};
use crate::utils::{db_path, local_utc_offset, GlobalSettings, Role};
async fn create_schema(conn: &Pool<Sqlite>) -> Result<SqliteQueryResult, sqlx::Error> {
let query = "PRAGMA foreign_keys = ON;
CREATE TABLE IF NOT EXISTS global
(
id INTEGER PRIMARY KEY AUTOINCREMENT,
secret TEXT NOT NULL,
UNIQUE(secret)
);
CREATE TABLE IF NOT EXISTS roles
(
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT NOT NULL,
UNIQUE(name)
);
CREATE TABLE IF NOT EXISTS channels
(
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT NOT NULL,
preview_url TEXT NOT NULL,
config_path TEXT NOT NULL,
extra_extensions TEXT NOT NULL,
service TEXT NOT NULL,
UNIQUE(name, service)
);
CREATE TABLE IF NOT EXISTS presets
(
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT NOT NULL,
text TEXT NOT NULL,
x TEXT NOT NULL,
y TEXT NOT NULL,
fontsize TEXT NOT NULL,
line_spacing TEXT NOT NULL,
fontcolor TEXT NOT NULL,
box TEXT NOT NULL,
boxcolor TEXT NOT NULL,
boxborderw TEXT NOT NULL,
alpha TEXT NOT NULL,
channel_id INTEGER NOT NULL DEFAULT 1,
FOREIGN KEY (channel_id) REFERENCES channels (id) ON UPDATE SET NULL ON DELETE SET NULL,
UNIQUE(name)
);
CREATE TABLE IF NOT EXISTS user
(
id INTEGER PRIMARY KEY AUTOINCREMENT,
mail TEXT NOT NULL,
username TEXT NOT NULL,
password TEXT NOT NULL,
role_id INTEGER NOT NULL DEFAULT 2,
channel_id INTEGER NOT NULL DEFAULT 1,
FOREIGN KEY (role_id) REFERENCES roles (id) ON UPDATE SET NULL ON DELETE SET NULL,
FOREIGN KEY (channel_id) REFERENCES channels (id) ON UPDATE SET NULL ON DELETE SET NULL,
UNIQUE(mail, username)
);";
sqlx::query(query).execute(conn).await
}
pub async fn db_init(domain: Option<String>) -> Result<&'static str, Box<dyn std::error::Error>> {
let db_path = db_path()?;
if !Sqlite::database_exists(db_path).await.unwrap_or(false) {
Sqlite::create_database(db_path).await.unwrap();
let pool = db_pool().await?;
match create_schema(&pool).await {
Ok(_) => info!("Database created successfully"),
Err(e) => panic!("{e}"),
}
}
let secret: String = rand::thread_rng()
.sample_iter(&Alphanumeric)
.take(80)
.map(char::from)
.collect();
let url = match domain {
Some(d) => format!("http://{d}/live/stream.m3u8"),
None => "http://localhost/live/stream.m3u8".to_string(),
};
let config_path = if env::consts::OS == "linux" {
"/etc/ffplayout/ffplayout.toml"
} else {
"./assets/ffplayout.toml"
};
let query = "CREATE TRIGGER global_row_count
BEFORE INSERT ON global
WHEN (SELECT COUNT(*) FROM global) >= 1
BEGIN
SELECT RAISE(FAIL, 'Database is already initialized!');
END;
INSERT INTO global(secret) VALUES($1);
INSERT INTO channels(name, preview_url, config_path, extra_extensions, service)
VALUES('Channel 1', $2, $3, 'jpg,jpeg,png', 'ffplayout.service');
INSERT INTO roles(name) VALUES('admin'), ('user'), ('guest');
INSERT INTO presets(name, text, x, y, fontsize, line_spacing, fontcolor, box, boxcolor, boxborderw, alpha, channel_id)
VALUES('Default', 'Welcome to ffplayout messenger!', '(w-text_w)/2', '(h-text_h)/2', '24', '4', '#ffffff@0xff', '0', '#000000@0x80', '4', '1.0', '1'),
('Empty Text', '', '0', '0', '24', '4', '#000000', '0', '#000000', '0', '0', '1'),
('Bottom Text fade in', 'The upcoming event will be delayed by a few minutes.', '(w-text_w)/2', '(h-line_h)*0.9', '24', '4', '#ffffff',
'1', '#000000@0x80', '4', 'ifnot(ld(1),st(1,t));if(lt(t,ld(1)+1),0,if(lt(t,ld(1)+2),(t-(ld(1)+1))/1,if(lt(t,ld(1)+8),1,if(lt(t,ld(1)+9),(1-(t-(ld(1)+8)))/1,0))))', '1'),
('Scrolling Text', 'We have a very important announcement to make.', 'ifnot(ld(1),st(1,t));if(lt(t,ld(1)+1),w+4,w-w/12*mod(t-ld(1),12*(w+tw)/w))', '(h-line_h)*0.9',
'24', '4', '#ffffff', '1', '#000000@0x80', '4', '1.0', '1');";
let pool = db_pool().await?;
sqlx::query(query)
.bind(secret)
.bind(url)
.bind(config_path)
.execute(&pool)
.await?;
Ok("Database initialized!")
}
pub async fn select_global(conn: &Pool<Sqlite>) -> Result<GlobalSettings, sqlx::Error> {
let query = "SELECT secret FROM global WHERE id = 1";
sqlx::query_as(query).fetch_one(conn).await
}
pub async fn select_channel(conn: &Pool<Sqlite>, id: &i32) -> Result<Channel, sqlx::Error> {
let query = "SELECT * FROM channels WHERE id = $1";
let mut result: Channel = sqlx::query_as(query).bind(id).fetch_one(conn).await?;
result.utc_offset = local_utc_offset();
Ok(result)
}
pub async fn select_all_channels(conn: &Pool<Sqlite>) -> Result<Vec<Channel>, sqlx::Error> {
let query = "SELECT * FROM channels";
let mut results: Vec<Channel> = sqlx::query_as(query).fetch_all(conn).await?;
for result in results.iter_mut() {
result.utc_offset = local_utc_offset();
}
Ok(results)
}
pub async fn update_channel(
conn: &Pool<Sqlite>,
id: i32,
channel: Channel,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query = "UPDATE channels SET name = $2, preview_url = $3, config_path = $4, extra_extensions = $5 WHERE id = $1";
sqlx::query(query)
.bind(id)
.bind(channel.name)
.bind(channel.preview_url)
.bind(channel.config_path)
.bind(channel.extra_extensions)
.execute(conn)
.await
}
pub async fn insert_channel(conn: &Pool<Sqlite>, channel: Channel) -> Result<Channel, sqlx::Error> {
let query = "INSERT INTO channels (name, preview_url, config_path, extra_extensions, service) VALUES($1, $2, $3, $4, $5)";
let result = sqlx::query(query)
.bind(channel.name)
.bind(channel.preview_url)
.bind(channel.config_path)
.bind(channel.extra_extensions)
.bind(channel.service)
.execute(conn)
.await?;
sqlx::query_as("SELECT * FROM channels WHERE id = $1")
.bind(result.last_insert_rowid())
.fetch_one(conn)
.await
}
pub async fn delete_channel(
conn: &Pool<Sqlite>,
id: &i32,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query = "DELETE FROM channels WHERE id = $1";
sqlx::query(query).bind(id).execute(conn).await
}
pub async fn select_last_channel(conn: &Pool<Sqlite>) -> Result<i32, sqlx::Error> {
let query = "SELECT id FROM channels ORDER BY id DESC LIMIT 1;";
sqlx::query_scalar(query).fetch_one(conn).await
}
pub async fn select_role(conn: &Pool<Sqlite>, id: &i32) -> Result<Role, sqlx::Error> {
let query = "SELECT name FROM roles WHERE id = $1";
let result: Role = sqlx::query_as(query).bind(id).fetch_one(conn).await?;
Ok(result)
}
pub async fn select_login(conn: &Pool<Sqlite>, user: &str) -> Result<User, sqlx::Error> {
let query = "SELECT id, mail, username, password, role_id FROM user WHERE username = $1";
sqlx::query_as(query).bind(user).fetch_one(conn).await
}
pub async fn select_user(conn: &Pool<Sqlite>, user: &str) -> Result<User, sqlx::Error> {
let query = "SELECT id, mail, username, role_id FROM user WHERE username = $1";
sqlx::query_as(query).bind(user).fetch_one(conn).await
}
pub async fn select_user_by_id(conn: &Pool<Sqlite>, id: i32) -> Result<User, sqlx::Error> {
let query = "SELECT id, mail, username, role_id FROM user WHERE id = $1";
sqlx::query_as(query).bind(id).fetch_one(conn).await
}
pub async fn select_users(conn: &Pool<Sqlite>) -> Result<Vec<User>, sqlx::Error> {
let query = "SELECT id, username FROM user";
sqlx::query_as(query).fetch_all(conn).await
}
pub async fn insert_user(
conn: &Pool<Sqlite>,
user: User,
) -> Result<SqliteQueryResult, sqlx::Error> {
let password_hash = task::spawn_blocking(move || {
let salt = SaltString::generate(&mut OsRng);
let hash = Argon2::default()
.hash_password(user.password.clone().as_bytes(), &salt)
.unwrap();
hash.to_string()
})
.await
.unwrap();
let query = "INSERT INTO user (mail, username, password, role_id) VALUES($1, $2, $3, $4)";
sqlx::query(query)
.bind(user.mail)
.bind(user.username)
.bind(password_hash)
.bind(user.role_id)
.execute(conn)
.await
}
pub async fn update_user(
conn: &Pool<Sqlite>,
id: i32,
fields: String,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query = format!("UPDATE user SET {fields} WHERE id = $1");
sqlx::query(&query).bind(id).execute(conn).await
}
pub async fn delete_user(
conn: &Pool<Sqlite>,
name: &str,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query = "DELETE FROM user WHERE username = $1;";
sqlx::query(query).bind(name).execute(conn).await
}
pub async fn select_presets(conn: &Pool<Sqlite>, id: i32) -> Result<Vec<TextPreset>, sqlx::Error> {
let query = "SELECT * FROM presets WHERE channel_id = $1";
sqlx::query_as(query).bind(id).fetch_all(conn).await
}
pub async fn update_preset(
conn: &Pool<Sqlite>,
id: &i32,
preset: TextPreset,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query =
"UPDATE presets SET name = $1, text = $2, x = $3, y = $4, fontsize = $5, line_spacing = $6,
fontcolor = $7, alpha = $8, box = $9, boxcolor = $10, boxborderw = $11 WHERE id = $12";
sqlx::query(query)
.bind(preset.name)
.bind(preset.text)
.bind(preset.x)
.bind(preset.y)
.bind(preset.fontsize)
.bind(preset.line_spacing)
.bind(preset.fontcolor)
.bind(preset.alpha)
.bind(preset.r#box)
.bind(preset.boxcolor)
.bind(preset.boxborderw)
.bind(id)
.execute(conn)
.await
}
pub async fn insert_preset(
conn: &Pool<Sqlite>,
preset: TextPreset,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query =
"INSERT INTO presets (channel_id, name, text, x, y, fontsize, line_spacing, fontcolor, alpha, box, boxcolor, boxborderw)
VALUES($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12)";
sqlx::query(query)
.bind(preset.channel_id)
.bind(preset.name)
.bind(preset.text)
.bind(preset.x)
.bind(preset.y)
.bind(preset.fontsize)
.bind(preset.line_spacing)
.bind(preset.fontcolor)
.bind(preset.alpha)
.bind(preset.r#box)
.bind(preset.boxcolor)
.bind(preset.boxborderw)
.execute(conn)
.await
}
pub async fn delete_preset(
conn: &Pool<Sqlite>,
id: &i32,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query = "DELETE FROM presets WHERE id = $1;";
sqlx::query(query).bind(id).execute(conn).await
}


@@ -1,118 +0,0 @@
use regex::Regex;
use serde::{
de::{self, Visitor},
Deserialize, Serialize,
};
#[derive(Debug, Deserialize, Serialize, sqlx::FromRow)]
pub struct User {
#[sqlx(default)]
#[serde(skip_deserializing)]
pub id: i32,
#[sqlx(default)]
#[serde(skip_serializing_if = "Option::is_none")]
pub mail: Option<String>,
pub username: String,
#[sqlx(default)]
#[serde(skip_serializing, default = "empty_string")]
pub password: String,
#[sqlx(default)]
#[serde(skip_serializing)]
pub role_id: Option<i32>,
#[sqlx(default)]
#[serde(skip_serializing)]
pub channel_id: Option<i32>,
#[sqlx(default)]
#[serde(skip_serializing_if = "Option::is_none")]
pub token: Option<String>,
}
fn empty_string() -> String {
"".to_string()
}
#[derive(Debug, Deserialize, Serialize, Clone)]
pub struct LoginUser {
pub id: i32,
pub username: String,
}
impl LoginUser {
pub fn new(id: i32, username: String) -> Self {
Self { id, username }
}
}
#[derive(Debug, Deserialize, Serialize, Clone, sqlx::FromRow)]
pub struct TextPreset {
#[sqlx(default)]
#[serde(skip_deserializing)]
pub id: i32,
pub channel_id: i32,
pub name: String,
pub text: String,
pub x: String,
pub y: String,
#[serde(deserialize_with = "deserialize_number_or_string")]
pub fontsize: String,
#[serde(deserialize_with = "deserialize_number_or_string")]
pub line_spacing: String,
pub fontcolor: String,
pub r#box: String,
pub boxcolor: String,
#[serde(deserialize_with = "deserialize_number_or_string")]
pub boxborderw: String,
#[serde(deserialize_with = "deserialize_number_or_string")]
pub alpha: String,
}
/// Deserialize number or string
pub fn deserialize_number_or_string<'de, D>(deserializer: D) -> Result<String, D::Error>
where
D: serde::Deserializer<'de>,
{
struct StringOrNumberVisitor;
impl<'de> Visitor<'de> for StringOrNumberVisitor {
type Value = String;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
formatter.write_str("a string or a number")
}
fn visit_str<E: de::Error>(self, value: &str) -> Result<Self::Value, E> {
let re = Regex::new(r"0,([0-9]+)").unwrap();
let clean_string = re.replace_all(value, "0.$1").to_string();
Ok(clean_string)
}
fn visit_u64<E: de::Error>(self, value: u64) -> Result<Self::Value, E> {
Ok(value.to_string())
}
fn visit_i64<E: de::Error>(self, value: i64) -> Result<Self::Value, E> {
Ok(value.to_string())
}
fn visit_f64<E: de::Error>(self, value: f64) -> Result<Self::Value, E> {
Ok(value.to_string())
}
}
deserializer.deserialize_any(StringOrNumberVisitor)
}
#[derive(Debug, Deserialize, Serialize, sqlx::FromRow)]
pub struct Channel {
#[serde(skip_deserializing)]
pub id: i32,
pub name: String,
pub preview_url: String,
pub config_path: String,
pub extra_extensions: String,
pub service: String,
#[sqlx(default)]
#[serde(default)]
pub utc_offset: i32,
}


@@ -1,36 +0,0 @@
use std::path::PathBuf;
use clap::Parser;
#[derive(Parser, Debug, Clone)]
#[clap(version,
about = "REST API for ffplayout",
long_about = None)]
pub struct Args {
#[clap(short, long, help = "ask for user credentials")]
pub ask: bool,
#[clap(long, help = "path to database file")]
pub db: Option<PathBuf>,
#[clap(long, help = "path to public files")]
pub public: Option<PathBuf>,
#[clap(short, long, help = "Listen on IP:PORT, like: 127.0.0.1:8787")]
pub listen: Option<String>,
#[clap(short, long, help = "Initialize Database")]
pub init: bool,
#[clap(short, long, help = "domain name for initialization")]
pub domain: Option<String>,
#[clap(short, long, help = "Create admin user")]
pub username: Option<String>,
#[clap(short, long, help = "Admin mail address")]
pub mail: Option<String>,
#[clap(short, long, help = "Admin password")]
pub password: Option<String>,
}


@@ -1,74 +0,0 @@
use std::{fs, path::PathBuf};
use rand::prelude::*;
use simplelog::*;
use sqlx::{Pool, Sqlite};
use crate::utils::{
control::{control_service, ServiceCmd},
errors::ServiceError,
};
use ffplayout_lib::utils::PlayoutConfig;
use crate::db::{handles, models::Channel};
use crate::utils::playout_config;
pub async fn create_channel(
conn: &Pool<Sqlite>,
target_channel: Channel,
) -> Result<Channel, ServiceError> {
if !target_channel.service.starts_with("ffplayout@") {
return Err(ServiceError::BadRequest("Bad service name!".to_string()));
}
if !target_channel.config_path.starts_with("/etc/ffplayout") {
return Err(ServiceError::BadRequest("Bad config path!".to_string()));
}
let channel_name = target_channel.name.to_lowercase().replace(' ', "");
let channel_num = match handles::select_last_channel(conn).await {
Ok(num) => num + 1,
Err(_) => rand::thread_rng().gen_range(71..99),
};
let mut config = PlayoutConfig::new(
Some(PathBuf::from("/usr/share/ffplayout/ffplayout.toml.orig")),
None,
);
config.general.stat_file = format!(".ffp_{channel_name}",);
config.logging.path = config.logging.path.join(&channel_name);
config.rpc_server.address = format!("127.0.0.1:70{:7>2}", channel_num);
config.playlist.path = config.playlist.path.join(channel_name);
config.out.output_param = config
.out
.output_param
.replace("stream.m3u8", &format!("stream{channel_num}.m3u8"))
.replace("stream-%d.ts", &format!("stream{channel_num}-%d.ts"));
let toml_string = toml_edit::ser::to_string(&config)?;
fs::write(&target_channel.config_path, toml_string)?;
let new_channel = handles::insert_channel(conn, target_channel).await?;
control_service(conn, &config, new_channel.id, &ServiceCmd::Enable, None).await?;
Ok(new_channel)
}
pub async fn delete_channel(conn: &Pool<Sqlite>, id: i32) -> Result<(), ServiceError> {
let channel = handles::select_channel(conn, &id).await?;
let (config, _) = playout_config(conn, &id).await?;
control_service(conn, &config, channel.id, &ServiceCmd::Stop, None).await?;
control_service(conn, &config, channel.id, &ServiceCmd::Disable, None).await?;
if let Err(e) = fs::remove_file(channel.config_path) {
error!("{e}");
};
handles::delete_channel(conn, &id).await?;
Ok(())
}


@@ -1,345 +0,0 @@
use std::{
collections::HashMap,
env, fmt,
str::FromStr,
sync::atomic::{AtomicBool, Ordering},
};
use actix_web::web;
use reqwest::{header::AUTHORIZATION, Client, Response};
use serde::{Deserialize, Serialize};
use sqlx::{Pool, Sqlite};
use tokio::{
process::{Child, Command},
sync::Mutex,
};
use crate::db::handles::select_channel;
use crate::utils::errors::ServiceError;
use ffplayout_lib::{utils::PlayoutConfig, vec_strings};
#[derive(Debug, Deserialize, Serialize, Clone)]
struct TextParams {
control: String,
message: HashMap<String, String>,
}
#[derive(Debug, Deserialize, Serialize, Clone)]
pub struct ControlParams {
pub control: String,
}
#[derive(Debug, Deserialize, Serialize, Clone)]
struct MediaParams {
media: String,
}
/// ffplayout engine process
///
/// When not running on Linux, or with the environment variable `PIGGYBACK_MODE=true`,
/// the engine gets started and controlled by ffpapi.
pub struct ProcessControl {
pub engine_child: Mutex<Option<Child>>,
pub is_running: AtomicBool,
pub piggyback: AtomicBool,
}
impl ProcessControl {
pub fn new() -> Self {
let piggyback = if env::consts::OS != "linux" || env::var("PIGGYBACK_MODE").is_ok() {
AtomicBool::new(true)
} else {
AtomicBool::new(false)
};
Self {
engine_child: Mutex::new(None),
is_running: AtomicBool::new(false),
piggyback,
}
}
}
impl ProcessControl {
pub async fn start(&self) -> Result<String, ServiceError> {
#[cfg(not(debug_assertions))]
let engine_path = "ffplayout";
#[cfg(debug_assertions)]
let engine_path = "./target/debug/ffplayout";
match Command::new(engine_path).kill_on_drop(true).spawn() {
Ok(proc) => *self.engine_child.lock().await = Some(proc),
Err(_) => return Err(ServiceError::InternalServerError),
};
self.is_running.store(true, Ordering::SeqCst);
Ok("Success".to_string())
}
pub async fn stop(&self) -> Result<String, ServiceError> {
if let Some(proc) = self.engine_child.lock().await.as_mut() {
if proc.kill().await.is_err() {
return Err(ServiceError::InternalServerError);
};
}
self.wait().await?;
self.is_running.store(false, Ordering::SeqCst);
Ok("Success".to_string())
}
pub async fn restart(&self) -> Result<String, ServiceError> {
self.stop().await?;
self.start().await?;
self.is_running.store(true, Ordering::SeqCst);
Ok("Success".to_string())
}
/// Wait for the process to close properly.
/// This prevents orphaned/zombie processes in the system.
pub async fn wait(&self) -> Result<String, ServiceError> {
if let Some(proc) = self.engine_child.lock().await.as_mut() {
if proc.wait().await.is_err() {
return Err(ServiceError::InternalServerError);
};
}
Ok("Success".to_string())
}
pub fn status(&self) -> Result<String, ServiceError> {
if self.is_running.load(Ordering::SeqCst) {
Ok("active".to_string())
} else {
Ok("not running".to_string())
}
}
}
impl Default for ProcessControl {
fn default() -> Self {
Self::new()
}
}
#[derive(Debug, Serialize, Deserialize, Clone, Eq, PartialEq)]
#[serde(rename_all = "snake_case")]
pub enum ServiceCmd {
Enable,
Disable,
Start,
Stop,
Restart,
Status,
}
impl FromStr for ServiceCmd {
type Err = String;
fn from_str(input: &str) -> Result<Self, Self::Err> {
match input.to_lowercase().as_str() {
"enable" => Ok(Self::Enable),
"disable" => Ok(Self::Disable),
"start" => Ok(Self::Start),
"stop" => Ok(Self::Stop),
"restart" => Ok(Self::Restart),
"status" => Ok(Self::Status),
_ => Err(format!("Command '{input}' not found!")),
}
}
}
impl fmt::Display for ServiceCmd {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
match *self {
Self::Enable => write!(f, "enable"),
Self::Disable => write!(f, "disable"),
Self::Start => write!(f, "start"),
Self::Stop => write!(f, "stop"),
Self::Restart => write!(f, "restart"),
Self::Status => write!(f, "status"),
}
}
}
#[derive(Debug, Deserialize, Serialize, Clone)]
pub struct Process {
pub command: ServiceCmd,
}
struct SystemD {
service: String,
cmd: Vec<String>,
}
impl SystemD {
async fn new(conn: &Pool<Sqlite>, id: i32) -> Result<Self, ServiceError> {
let channel = select_channel(conn, &id).await?;
Ok(Self {
service: channel.service,
cmd: vec_strings!["/usr/bin/systemctl"],
})
}
fn enable(mut self) -> Result<String, ServiceError> {
self.cmd
.append(&mut vec!["enable".to_string(), self.service]);
Command::new("sudo").args(self.cmd).spawn()?;
Ok("Success".to_string())
}
fn disable(mut self) -> Result<String, ServiceError> {
self.cmd
.append(&mut vec!["disable".to_string(), self.service]);
Command::new("sudo").args(self.cmd).spawn()?;
Ok("Success".to_string())
}
fn start(mut self) -> Result<String, ServiceError> {
self.cmd
.append(&mut vec!["start".to_string(), self.service]);
Command::new("sudo").args(self.cmd).spawn()?;
Ok("Success".to_string())
}
fn stop(mut self) -> Result<String, ServiceError> {
self.cmd.append(&mut vec!["stop".to_string(), self.service]);
Command::new("sudo").args(self.cmd).spawn()?;
Ok("Success".to_string())
}
fn restart(mut self) -> Result<String, ServiceError> {
self.cmd
.append(&mut vec!["restart".to_string(), self.service]);
Command::new("sudo").args(self.cmd).spawn()?;
Ok("Success".to_string())
}
async fn status(mut self) -> Result<String, ServiceError> {
self.cmd
.append(&mut vec!["is-active".to_string(), self.service]);
let output = Command::new("sudo").args(self.cmd).output().await?;
Ok(String::from_utf8_lossy(&output.stdout).trim().to_string())
}
}
async fn post_request<T>(config: &PlayoutConfig, obj: T) -> Result<Response, ServiceError>
where
T: Serialize,
{
let url = format!("http://{}", config.rpc_server.address);
let client = Client::new();
match client
.post(&url)
.header(AUTHORIZATION, &config.rpc_server.authorization)
.json(&obj)
.send()
.await
{
Ok(result) => Ok(result),
Err(e) => Err(ServiceError::ServiceUnavailable(e.to_string())),
}
}
pub async fn send_message(
config: &PlayoutConfig,
message: HashMap<String, String>,
) -> Result<Response, ServiceError> {
let json_obj = TextParams {
control: "text".into(),
message,
};
post_request(config, json_obj).await
}
pub async fn control_state(
config: &PlayoutConfig,
command: &str,
) -> Result<Response, ServiceError> {
let json_obj = ControlParams {
control: command.to_owned(),
};
post_request(config, json_obj).await
}
pub async fn media_info(config: &PlayoutConfig, command: String) -> Result<Response, ServiceError> {
let json_obj = MediaParams { media: command };
post_request(config, json_obj).await
}
pub async fn control_service(
conn: &Pool<Sqlite>,
config: &PlayoutConfig,
id: i32,
command: &ServiceCmd,
engine: Option<web::Data<ProcessControl>>,
) -> Result<String, ServiceError> {
if let Some(en) = engine {
if en.piggyback.load(Ordering::SeqCst) {
match command {
ServiceCmd::Start => en.start().await,
ServiceCmd::Stop => {
if control_state(config, "stop_all").await.is_ok() {
en.stop().await
} else {
Err(ServiceError::NoContent("Nothing to stop".to_string()))
}
}
ServiceCmd::Restart => {
if control_state(config, "stop_all").await.is_ok() {
en.restart().await
} else {
Err(ServiceError::NoContent("Nothing to restart".to_string()))
}
}
ServiceCmd::Status => en.status(),
_ => Err(ServiceError::Conflict(
"Engine runs in piggyback mode; this command is not allowed in this mode."
.to_string(),
)),
}
} else {
execute_systemd(conn, id, command).await
}
} else {
execute_systemd(conn, id, command).await
}
}
async fn execute_systemd(
conn: &Pool<Sqlite>,
id: i32,
command: &ServiceCmd,
) -> Result<String, ServiceError> {
let system_d = SystemD::new(conn, id).await?;
match command {
ServiceCmd::Enable => system_d.enable(),
ServiceCmd::Disable => system_d.disable(),
ServiceCmd::Start => system_d.start(),
ServiceCmd::Stop => system_d.stop(),
ServiceCmd::Restart => system_d.restart(),
ServiceCmd::Status => system_d.status().await,
}
}


@@ -1,390 +0,0 @@
use std::{
env,
error::Error,
fmt,
fs::{self, metadata, File},
io::{stdin, stdout, Read, Write},
path::{Path, PathBuf},
str::FromStr,
};
use chrono::{format::ParseErrorKind, prelude::*};
use faccess::PathExt;
use once_cell::sync::OnceCell;
use path_clean::PathClean;
use rpassword::read_password;
use serde::{de, Deserialize, Deserializer, Serialize};
use simplelog::*;
use sqlx::{sqlite::SqliteRow, FromRow, Pool, Row, Sqlite};
use crate::ARGS;
pub mod args_parse;
pub mod channels;
pub mod control;
pub mod errors;
pub mod files;
pub mod playlist;
pub mod system;
use crate::db::{
db_pool,
handles::{db_init, insert_user, select_channel, select_global},
models::{Channel, User},
};
use crate::utils::errors::ServiceError;
use ffplayout_lib::utils::{time_to_sec, PlayoutConfig};
#[derive(Clone, Debug, Eq, Hash, PartialEq, Serialize, Deserialize)]
pub enum Role {
Admin,
User,
Guest,
}
impl Role {
pub fn set_role(role: &str) -> Self {
match role {
"admin" => Role::Admin,
"user" => Role::User,
_ => Role::Guest,
}
}
}
impl FromStr for Role {
type Err = String;
fn from_str(input: &str) -> Result<Self, Self::Err> {
match input {
"admin" => Ok(Self::Admin),
"user" => Ok(Self::User),
_ => Ok(Self::Guest),
}
}
}
impl fmt::Display for Role {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
match *self {
Self::Admin => write!(f, "admin"),
Self::User => write!(f, "user"),
Self::Guest => write!(f, "guest"),
}
}
}
impl<'r> sqlx::decode::Decode<'r, ::sqlx::Sqlite> for Role
where
&'r str: sqlx::decode::Decode<'r, sqlx::Sqlite>,
{
fn decode(
value: <sqlx::Sqlite as sqlx::database::HasValueRef<'r>>::ValueRef,
) -> Result<Role, Box<dyn Error + 'static + Send + Sync>> {
let value = <&str as sqlx::decode::Decode<sqlx::Sqlite>>::decode(value)?;
Ok(value.parse()?)
}
}
impl FromRow<'_, SqliteRow> for Role {
fn from_row(row: &SqliteRow) -> sqlx::Result<Self> {
match row.get("name") {
"admin" => Ok(Self::Admin),
"user" => Ok(Self::User),
_ => Ok(Self::Guest),
}
}
}
#[derive(Debug, sqlx::FromRow)]
pub struct GlobalSettings {
pub secret: String,
}
impl GlobalSettings {
async fn new(conn: &Pool<Sqlite>) -> Self {
let global_settings = select_global(conn);
match global_settings.await {
Ok(g) => g,
Err(_) => GlobalSettings {
secret: String::new(),
},
}
}
pub fn global() -> &'static GlobalSettings {
INSTANCE.get().expect("Config is not initialized")
}
}
static INSTANCE: OnceCell<GlobalSettings> = OnceCell::new();
pub async fn init_config(conn: &Pool<Sqlite>) {
let config = GlobalSettings::new(conn).await;
INSTANCE.set(config).unwrap();
}
pub fn db_path() -> Result<&'static str, Box<dyn std::error::Error>> {
if let Some(path) = ARGS.db.clone() {
let absolute_path = if path.is_absolute() {
path
} else {
env::current_dir()?.join(path)
}
.clean();
if let Some(abs_path) = absolute_path.parent() {
if abs_path.writable() {
return Ok(Box::leak(
absolute_path.to_string_lossy().to_string().into_boxed_str(),
));
}
error!("Given database path is not writable!");
}
}
let sys_path = Path::new("/usr/share/ffplayout/db");
let mut db_path = "./ffplayout.db";
if sys_path.is_dir() && !sys_path.writable() {
error!("Path {} is not writable!", sys_path.display());
}
if sys_path.is_dir() && sys_path.writable() {
db_path = "/usr/share/ffplayout/db/ffplayout.db";
} else if Path::new("./assets").is_dir() {
db_path = "./assets/ffplayout.db";
}
Ok(db_path)
}
pub fn public_path() -> PathBuf {
let path = PathBuf::from("./ffplayout-frontend/.output/public/");
if cfg!(debug_assertions) && path.is_dir() {
return path;
}
let path = PathBuf::from("/usr/share/ffplayout/public/");
if path.is_dir() {
return path;
}
PathBuf::from("./public/")
}
pub async fn run_args() -> Result<(), i32> {
let mut args = ARGS.clone();
if !args.init && args.listen.is_none() && !args.ask && args.username.is_none() {
error!("Wrong number of arguments! Run ffpapi --help for more information.");
return Err(0);
}
if args.init {
if let Err(e) = db_init(args.domain).await {
panic!("{e}");
};
return Err(0);
}
if args.ask {
let mut user = String::new();
print!("Username: ");
stdout().flush().unwrap();
stdin()
.read_line(&mut user)
.expect("Did not enter a correct name?");
if let Some('\n') = user.chars().next_back() {
user.pop();
}
if let Some('\r') = user.chars().next_back() {
user.pop();
}
args.username = Some(user);
print!("Password: ");
stdout().flush().unwrap();
let password = read_password();
args.password = password.ok();
let mut mail = String::new();
print!("Mail: ");
stdout().flush().unwrap();
stdin()
.read_line(&mut mail)
.expect("Did not enter a correct mail address?");
if let Some('\n') = mail.chars().next_back() {
mail.pop();
}
if let Some('\r') = mail.chars().next_back() {
mail.pop();
}
args.mail = Some(mail);
}
if let Some(username) = args.username {
if args.mail.is_none() || args.password.is_none() {
error!("Mail/password missing!");
return Err(1);
}
let user = User {
id: 0,
mail: Some(args.mail.unwrap()),
username: username.clone(),
password: args.password.unwrap(),
role_id: Some(1),
channel_id: Some(1),
token: None,
};
match db_pool().await {
Ok(conn) => {
if let Err(e) = insert_user(&conn, user).await {
error!("{e}");
return Err(1);
};
}
Err(e) => {
error!("{e}");
return Err(1);
}
};
info!("Create admin user \"{username}\" done...");
return Err(0);
}
Ok(())
}
pub fn read_playout_config(path: &str) -> Result<PlayoutConfig, Box<dyn Error>> {
let mut file = File::open(path)?;
let mut contents = String::new();
file.read_to_string(&mut contents)?;
let mut config: PlayoutConfig = toml_edit::de::from_str(&contents)?;
config.playlist.start_sec = Some(time_to_sec(&config.playlist.day_start));
config.playlist.length_sec = Some(time_to_sec(&config.playlist.length));
Ok(config)
}
pub async fn playout_config(
conn: &Pool<Sqlite>,
channel_id: &i32,
) -> Result<(PlayoutConfig, Channel), ServiceError> {
if let Ok(channel) = select_channel(conn, channel_id).await {
match read_playout_config(&channel.config_path.clone()) {
Ok(config) => return Ok((config, channel)),
Err(e) => error!("{e}"),
}
}
Err(ServiceError::BadRequest(
"Error in getting config!".to_string(),
))
}
pub async fn read_log_file(
conn: &Pool<Sqlite>,
channel_id: &i32,
date: &str,
) -> Result<String, ServiceError> {
if let Ok(channel) = select_channel(conn, channel_id).await {
let mut date_str = "".to_string();
if !date.is_empty() {
date_str.push('.');
date_str.push_str(date);
}
if let Ok(config) = read_playout_config(&channel.config_path) {
let mut log_path = Path::new(&config.logging.path)
.join("ffplayout.log")
.display()
.to_string();
log_path.push_str(&date_str);
let file_size = metadata(&log_path)?.len() as f64;
let file_content = if file_size > 5000000.0 {
error!("Log file too big: {}", sizeof_fmt(file_size));
format!("The log file is larger ({}) than the hard limit of 5MB; it is very likely that something is wrong with the playout. Check this on the server with `less {log_path}`.", sizeof_fmt(file_size))
} else {
fs::read_to_string(log_path)?
};
return Ok(file_content);
}
}
Err(ServiceError::NoContent(
"Requested log file does not exist, or is not readable.".to_string(),
))
}
/// get human readable file size
pub fn sizeof_fmt(mut num: f64) -> String {
let suffix = 'B';
for unit in ["", "Ki", "Mi", "Gi", "Ti", "Pi", "Ei", "Zi"] {
if num.abs() < 1024.0 {
return format!("{num:.1}{unit}{suffix}");
}
num /= 1024.0;
}
format!("{num:.1}Yi{suffix}")
}
pub fn local_utc_offset() -> i32 {
let mut offset = Local::now().format("%:z").to_string();
let operator = offset.remove(0);
let mut utc_offset = 0;
if let Some((r, f)) = offset.split_once(':') {
utc_offset = r.parse::<i32>().unwrap_or(0) * 60 + f.parse::<i32>().unwrap_or(0);
if operator == '-' && utc_offset > 0 {
utc_offset = -utc_offset;
}
}
utc_offset
}
pub fn naive_date_time_from_str<'de, D>(deserializer: D) -> Result<NaiveDateTime, D::Error>
where
D: Deserializer<'de>,
{
let s: String = Deserialize::deserialize(deserializer)?;
match NaiveDateTime::parse_from_str(&s, "%Y-%m-%dT%H:%M:%S") {
Ok(date_time) => Ok(date_time),
Err(e) => {
if e.kind() == ParseErrorKind::TooShort {
NaiveDateTime::parse_from_str(&format!("{s}T00:00:00"), "%Y-%m-%dT%H:%M:%S")
.map_err(de::Error::custom)
} else {
NaiveDateTime::parse_from_str(&s, "%Y-%m-%dT%H:%M:%S%#z").map_err(de::Error::custom)
}
}
}
}


@@ -1,34 +0,0 @@
**ffplayout-engine**
================
Start with Arguments
-----
ffplayout also accepts the following command-line parameters:
```
OPTIONS:
-c, --config <CONFIG> File path to ffplayout.yml
-d, --date <DATE> Target date (YYYY-MM-DD) for text/m3u to playlist import
-f, --folder <FOLDER> Play folder content
--fake-time <FAKE_TIME> fake date time, for debugging
-g, --generate <YYYY-MM-DD>... Generate playlist for dates, like: 2022-01-01 - 2022-01-10
-h, --help Print help information
-i, --infinit Loop playlist infinitely
--import <IMPORT> Import a given text/m3u file and create a playlist from it
-l, --log <LOG> File path for logging
-m, --play-mode <PLAY_MODE> Playing mode: folder, playlist
-o, --output <OUTPUT> Set output mode: desktop, hls, stream
    -p, --playlist <PLAYLIST>       Path to playlist
    -s, --start <START>             Start time in 'hh:mm:ss', 'now' to start with the first clip
-t, --length <LENGTH> Set length in 'hh:mm:ss', 'none' for no length check
-v, --volume <VOLUME> Set audio volume
-V, --version Print version information
```
You can run the command like:
```Bash
./ffplayout -l none -p ~/playlist.json -o desktop
```


@@ -1,51 +0,0 @@
use std::{
sync::{atomic::AtomicBool, Arc},
thread,
};
use simplelog::*;
use ffplayout_lib::utils::{Media, PlayoutConfig, PlayoutStatus, ProcessMode::*};
pub mod folder;
pub mod ingest;
pub mod playlist;
pub use folder::watchman;
pub use ingest::ingest_server;
pub use playlist::CurrentProgram;
use ffplayout_lib::utils::{controller::PlayerControl, folder::FolderSource};
/// Create a source iterator from playlist, or from folder.
pub fn source_generator(
config: PlayoutConfig,
player_control: &PlayerControl,
playout_stat: PlayoutStatus,
is_terminated: Arc<AtomicBool>,
) -> Box<dyn Iterator<Item = Media>> {
match config.processing.mode {
Folder => {
info!("Playout in folder mode");
debug!(
"Monitor folder: <b><magenta>{:?}</></b>",
config.storage.path
);
let config_clone = config.clone();
let folder_source = FolderSource::new(&config, playout_stat.chain, player_control);
let node_clone = folder_source.player_control.current_list.clone();
// Spawn a thread to monitor folder for file changes.
thread::spawn(move || watchman(config_clone, is_terminated.clone(), node_clone));
Box::new(folder_source) as Box<dyn Iterator<Item = Media>>
}
Playlist => {
info!("Playout in playlist mode");
let program = CurrentProgram::new(&config, playout_stat, is_terminated, player_control);
Box::new(program) as Box<dyn Iterator<Item = Media>>
}
}
}
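Both arms of the match above yield the same item type, which is why they can be returned behind one boxed trait object. A minimal standalone sketch of this dispatch pattern (hypothetical types, not the real `Media`/`FolderSource`):

```rust
// Two playout modes, both producing the same item type.
enum Mode {
    Folder,
    Playlist,
}

// Return either source behind a single boxed iterator trait object,
// mirroring the Box<dyn Iterator<Item = Media>> pattern above.
fn source(mode: Mode) -> Box<dyn Iterator<Item = String>> {
    match mode {
        Mode::Folder => Box::new(vec!["a.mp4".to_string()].into_iter()),
        Mode::Playlist => Box::new(std::iter::once("playlist.json".to_string())),
    }
}

fn main() {
    let items: Vec<String> = source(Mode::Folder).collect();
    assert_eq!(items, vec!["a.mp4"]);
    println!("ok");
}
```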


@ -1,236 +0,0 @@
use std::{
fs::{self, File},
path::Path,
process::exit,
sync::{atomic::AtomicBool, Arc, Mutex},
thread,
};
#[cfg(debug_assertions)]
use chrono::prelude::*;
use serde::{Deserialize, Serialize};
use serde_json::json;
use simplelog::*;
use ffplayout::{
output::{player, write_hls},
rpc::run_server,
utils::{arg_parse::get_args, get_config},
};
use ffplayout_lib::utils::{
errors::ProcError, folder::fill_filler_list, generate_playlist, get_date, import::import_file,
init_logging, is_remote, send_mail, test_tcp_port, validate_ffmpeg, validate_playlist,
JsonPlaylist, OutputMode::*, PlayerControl, PlayoutStatus, ProcessControl,
};
#[cfg(debug_assertions)]
use ffplayout::utils::Args;
#[cfg(debug_assertions)]
use ffplayout_lib::utils::{mock_time, time_now};
const VERSION: &str = env!("CARGO_PKG_VERSION");
#[derive(Serialize, Deserialize)]
struct StatusData {
time_shift: f64,
date: String,
}
/// Here we create a status file in the temp folder.
/// We need it for reading and saving the program status.
/// For example, when we skip a playing file,
/// we save the time difference, so we stay in sync.
///
/// When the file does not exist we create it; when it exists we read its values.
fn status_file(stat_file: &str, playout_stat: &PlayoutStatus) -> Result<(), ProcError> {
debug!("Start ffplayout v{VERSION}, status file path: <b><magenta>{stat_file}</></b>");
if !Path::new(stat_file).exists() {
let data = json!({
"time_shift": 0.0,
"date": String::new(),
});
let json: String = serde_json::to_string(&data)?;
if let Err(e) = fs::write(stat_file, json) {
error!("Unable to write to status file <b><magenta>{stat_file}</></b>: {e}");
};
} else {
let stat_file = File::options().read(true).write(false).open(stat_file)?;
let data: StatusData = serde_json::from_reader(stat_file)?;
*playout_stat.time_shift.lock().unwrap() = data.time_shift;
*playout_stat.date.lock().unwrap() = data.date;
}
Ok(())
}
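The create-or-read behavior of `status_file` can be sketched without serde as a plain-string version (hypothetical helper name `init_status`; the real code deserializes the content into `StatusData` and updates `PlayoutStatus`):

```rust
use std::{env, fs, path::Path};

// Create the status file with default values when it is missing,
// otherwise return its current content unchanged.
fn init_status(stat_file: &str) -> std::io::Result<String> {
    if !Path::new(stat_file).exists() {
        fs::write(stat_file, r#"{"time_shift": 0.0, "date": ""}"#)?;
    }
    fs::read_to_string(stat_file)
}

fn main() -> std::io::Result<()> {
    let path = env::temp_dir().join("ffplayout_status_demo.json");
    let stat_file = path.to_string_lossy().to_string();
    let _ = fs::remove_file(&path);

    let content = init_status(&stat_file)?;
    assert!(content.contains("time_shift"));
    println!("ok");
    Ok(())
}
```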
/// Set fake time for debugging.
/// When no time is given, we use the current time.
/// When a time is given, we use this time instead.
#[cfg(debug_assertions)]
fn fake_time(args: &Args) {
if let Some(fake_time) = &args.fake_time {
mock_time::set_mock_time(fake_time);
} else {
let local: DateTime<Local> = time_now();
mock_time::set_mock_time(&local.format("%Y-%m-%dT%H:%M:%S").to_string());
}
}
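The `mock_time` module lives in `ffplayout_lib`; its idea can be sketched with a thread-local override (hypothetical simplified version that stores the timestamp as a string instead of a chrono `DateTime`):

```rust
use std::cell::RefCell;

thread_local! {
    // Optional fake timestamp; None means "use the real clock".
    static MOCK_TIME: RefCell<Option<String>> = RefCell::new(None);
}

fn set_mock_time(value: &str) {
    MOCK_TIME.with(|t| *t.borrow_mut() = Some(value.to_string()));
}

fn time_now() -> String {
    MOCK_TIME.with(|t| {
        t.borrow()
            .clone()
            // in the real implementation the system clock is queried here
            .unwrap_or_else(|| "1970-01-01T00:00:00".to_string())
    })
}

fn main() {
    set_mock_time("2024-07-04T00:00:00");
    assert_eq!(time_now(), "2024-07-04T00:00:00");
    println!("ok");
}
```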
/// Main function.
/// Here we check the command line arguments and start the player.
/// We also start a JSON RPC server if enabled.
fn main() -> Result<(), ProcError> {
let args = get_args();
// use fake time function only in debugging mode
#[cfg(debug_assertions)]
fake_time(&args);
let mut config = get_config(args.clone())?;
let play_control = PlayerControl::new();
let playout_stat = PlayoutStatus::new();
let proc_control = ProcessControl::new();
let play_ctl1 = play_control.clone();
let play_ctl2 = play_control.clone();
let play_stat = playout_stat.clone();
let proc_ctl1 = proc_control.clone();
let proc_ctl2 = proc_control.clone();
let messages = Arc::new(Mutex::new(Vec::new()));
// try to create the logging folder, if it does not exist
if config.logging.log_to_file
&& !config.logging.path.is_dir()
&& !config.logging.path.ends_with(".log")
{
if let Err(e) = fs::create_dir_all(&config.logging.path) {
eprintln!("Logging path does not exist! {e}");
exit(1);
}
}
let logging = init_logging(&config, Some(proc_ctl1), Some(messages.clone()));
CombinedLogger::init(logging)?;
if let Err(e) = validate_ffmpeg(&mut config) {
error!("{e}");
exit(1);
};
let config_clone1 = config.clone();
let config_clone2 = config.clone();
if !matches!(config.processing.audio_channels, 2 | 4 | 6 | 8) {
error!(
"Encoding {} channel(s) is not allowed. Only 2, 4, 6 and 8 channels are supported!",
config.processing.audio_channels
);
exit(1);
}
if config.general.generate.is_some() {
// run a simple playlist generator and save the playlists to disk
if let Err(e) = generate_playlist(&config, None) {
error!("{e}");
exit(1);
};
exit(0);
}
if let Some(path) = args.import {
if args.date.is_none() {
error!("Import needs date parameter!");
exit(1);
}
// convert text/m3u file to playlist
match import_file(&config, &args.date.unwrap(), None, &path) {
Ok(m) => {
info!("{m}");
exit(0);
}
Err(e) => {
error!("{e}");
exit(1);
}
}
}
if args.validate {
let play_ctl3 = play_control.clone();
let mut playlist_path = config.playlist.path.clone();
let start_sec = config.playlist.start_sec.unwrap();
let date = get_date(false, start_sec, false);
if playlist_path.is_dir() || is_remote(&playlist_path.to_string_lossy()) {
let d: Vec<&str> = date.split('-').collect();
playlist_path = playlist_path
.join(d[0])
.join(d[1])
.join(date.clone())
.with_extension("json");
}
let f = File::options()
.read(true)
.write(false)
.open(&playlist_path)?;
let playlist: JsonPlaylist = serde_json::from_reader(f)?;
validate_playlist(
config,
play_ctl3,
playlist,
Arc::new(AtomicBool::new(false)),
);
exit(0);
}
if config.rpc_server.enable {
// If the RPC server is enabled, we also fire up a JSON RPC server.
if !test_tcp_port(&config.rpc_server.address) {
exit(1)
}
thread::spawn(move || run_server(config_clone1, play_ctl1, play_stat, proc_ctl2));
}
status_file(&config.general.stat_file, &playout_stat)?;
debug!(
"Use config: <b><magenta>{}</></b>",
config.general.config_path
);
// Fill the filler list; the filler can also be a single file.
thread::spawn(move || {
fill_filler_list(&config_clone2, Some(play_ctl2));
});
match config.out.mode {
// write files/playlist to HLS m3u8 playlist
HLS => write_hls(&config, play_control, playout_stat, proc_control),
// play on desktop or stream to a remote target
_ => player(&config, &play_control, playout_stat, proc_control),
}
info!("Playout done...");
let msg = messages.lock().unwrap();
if !msg.is_empty() {
send_mail(&config, msg.join("\n"));
}
drop(msg);
Ok(())
}


@ -1,275 +0,0 @@
/*
This module writes the files directly to an HLS (m3u8) playlist,
without pre- and post-processing.
Example config:
out:
output_param: >-
...
-flags +cgop
-f hls
-hls_time 6
-hls_list_size 600
-hls_flags append_list+delete_segments+omit_endlist+program_date_time
-hls_segment_filename /var/www/html/live/stream-%d.ts /var/www/html/live/stream.m3u8
*/
use std::{
io::{BufRead, BufReader, Error},
process::{exit, Command, Stdio},
sync::atomic::Ordering,
thread::{self, sleep},
time::Duration,
};
use simplelog::*;
use crate::input::source_generator;
use crate::utils::{log_line, prepare_output_cmd, task_runner, valid_stream};
use ffplayout_lib::{
utils::{
controller::ProcessUnit::*, get_delta, sec_to_time, stderr_reader, test_tcp_port, Media,
PlayerControl, PlayoutConfig, PlayoutStatus, ProcessControl,
},
vec_strings,
};
/// Ingest Server for HLS
fn ingest_to_hls_server(
config: PlayoutConfig,
playout_stat: PlayoutStatus,
proc_control: ProcessControl,
) -> Result<(), Error> {
let playlist_init = playout_stat.list_init;
let mut server_prefix = vec_strings!["-hide_banner", "-nostats", "-v", "level+info"];
let stream_input = config.ingest.input_cmd.clone().unwrap();
let mut dummy_media = Media::new(0, "Live Stream", false);
dummy_media.unit = Ingest;
if let Some(ingest_input_cmd) = config
.advanced
.as_ref()
.and_then(|a| a.ingest.input_cmd.clone())
{
server_prefix.append(&mut ingest_input_cmd.clone());
}
server_prefix.append(&mut stream_input.clone());
let mut is_running;
if let Some(url) = stream_input.iter().find(|s| s.contains("://")) {
if !test_tcp_port(url) {
proc_control.stop_all();
exit(1);
}
info!("Start ingest server, listening on: <b><magenta>{url}</></b>");
};
loop {
dummy_media.add_filter(&config, &playout_stat.chain);
let server_cmd = prepare_output_cmd(&config, server_prefix.clone(), &dummy_media.filter);
debug!(
"Server CMD: <bright-blue>\"ffmpeg {}\"</>",
server_cmd.join(" ")
);
let proc_ctl = proc_control.clone();
let mut server_proc = match Command::new("ffmpeg")
.args(server_cmd.clone())
.stderr(Stdio::piped())
.spawn()
{
Err(e) => {
error!("couldn't spawn ingest server: {e}");
panic!("couldn't spawn ingest server: {e}");
}
Ok(proc) => proc,
};
let server_err = BufReader::new(server_proc.stderr.take().unwrap());
*proc_control.server_term.lock().unwrap() = Some(server_proc);
is_running = false;
for line in server_err.lines() {
let line = line?;
if line.contains("rtmp") && line.contains("Unexpected stream") && !valid_stream(&line) {
if let Err(e) = proc_ctl.stop(Ingest) {
error!("{e}");
};
}
if !is_running {
proc_control.server_is_running.store(true, Ordering::SeqCst);
playlist_init.store(true, Ordering::SeqCst);
is_running = true;
info!("Switch from {} to live ingest", config.processing.mode);
if let Err(e) = proc_control.stop(Decoder) {
error!("{e}");
}
}
log_line(&line, &config.logging.ffmpeg_level);
}
if proc_control.server_is_running.load(Ordering::SeqCst) {
info!("Switch from live ingest to {}", config.processing.mode);
}
proc_control
.server_is_running
.store(false, Ordering::SeqCst);
if let Err(e) = proc_control.wait(Ingest) {
error!("{e}")
}
if proc_control.is_terminated.load(Ordering::SeqCst) {
break;
}
}
Ok(())
}
/// HLS Writer
///
/// Write with single ffmpeg instance directly to a HLS playlist.
pub fn write_hls(
config: &PlayoutConfig,
player_control: PlayerControl,
playout_stat: PlayoutStatus,
proc_control: ProcessControl,
) {
let config_clone = config.clone();
let ff_log_format = format!("level+{}", config.logging.ffmpeg_level.to_lowercase());
let play_stat = playout_stat.clone();
let play_stat2 = playout_stat.clone();
let proc_control_c = proc_control.clone();
let get_source = source_generator(
config.clone(),
&player_control,
playout_stat,
proc_control.is_terminated.clone(),
);
// spawn a thread for the ffmpeg ingest server
if config.ingest.enable {
thread::spawn(move || ingest_to_hls_server(config_clone, play_stat, proc_control_c));
}
for node in get_source {
*player_control.current_media.lock().unwrap() = Some(node.clone());
let ignore = config.logging.ignore_lines.clone();
let mut cmd = match &node.cmd {
Some(cmd) => cmd.clone(),
None => break,
};
if !node.process.unwrap() {
continue;
}
info!(
"Play for <yellow>{}</>: <b><magenta>{}</></b>",
sec_to_time(node.out - node.seek),
node.source
);
if config.task.enable {
if config.task.path.is_file() {
let task_config = config.clone();
let task_node = node.clone();
let server_running = proc_control.server_is_running.load(Ordering::SeqCst);
let stat = play_stat2.clone();
thread::spawn(move || {
task_runner::run(task_config, task_node, stat, server_running)
});
} else {
error!(
"<bright-blue>{:?}</> executable does not exist!",
config.task.path
);
}
}
let mut enc_prefix = vec_strings!["-hide_banner", "-nostats", "-v", &ff_log_format];
if let Some(encoder_input_cmd) = config
.advanced
.as_ref()
.and_then(|a| a.encoder.input_cmd.clone())
{
enc_prefix.append(&mut encoder_input_cmd.clone());
}
let mut read_rate = 1.0;
if let Some(begin) = &node.begin {
let (delta, _) = get_delta(config, begin);
let duration = node.out - node.seek;
let speed = duration / (duration + delta);
if node.seek == 0.0
&& speed > 0.0
&& speed < 1.3
&& delta < config.general.stop_threshold
{
read_rate = speed;
}
}
enc_prefix.append(&mut vec_strings!["-readrate", read_rate]);
enc_prefix.append(&mut cmd);
let enc_cmd = prepare_output_cmd(config, enc_prefix, &node.filter);
debug!(
"HLS writer CMD: <bright-blue>\"ffmpeg {}\"</>",
enc_cmd.join(" ")
);
let mut dec_proc = match Command::new("ffmpeg")
.args(enc_cmd)
.stderr(Stdio::piped())
.spawn()
{
Ok(proc) => proc,
Err(e) => {
error!("couldn't spawn ffmpeg process: {e}");
panic!("couldn't spawn ffmpeg process: {e}")
}
};
let enc_err = BufReader::new(dec_proc.stderr.take().unwrap());
*proc_control.decoder_term.lock().unwrap() = Some(dec_proc);
if let Err(e) = stderr_reader(enc_err, ignore, Decoder, proc_control.clone()) {
error!("{e:?}")
};
if let Err(e) = proc_control.wait(Decoder) {
error!("{e}");
}
while proc_control.server_is_running.load(Ordering::SeqCst) {
sleep(Duration::from_secs(1));
}
}
sleep(Duration::from_secs(1));
proc_control.stop_all();
}
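The `-readrate` factor computed in the loop above nudges the playout back toward the schedule: a positive delta yields a factor below 1.0, a negative delta a factor above 1.0, and values outside (0, 1.3) fall back to 1.0. A standalone sketch of that calculation (hypothetical function, mirroring the conditions in the loop):

```rust
// Compute the ffmpeg read-rate factor from the clip duration and the
// schedule delta, falling back to realtime (1.0) when out of bounds.
fn read_rate(duration: f64, delta: f64, seek: f64, stop_threshold: f64) -> f64 {
    let speed = duration / (duration + delta);

    if seek == 0.0 && speed > 0.0 && speed < 1.3 && delta < stop_threshold {
        speed
    } else {
        1.0
    }
}

fn main() {
    // 30 s clip, 3 s positive delta -> read slightly slower than realtime
    assert!(read_rate(30.0, 3.0, 0.0, 11.0) < 1.0);
    // negative delta -> read slightly faster than realtime
    assert!(read_rate(30.0, -3.0, 0.0, 11.0) > 1.0);
    // a seeked clip keeps realtime speed
    assert_eq!(read_rate(30.0, 3.0, 5.0, 11.0), 1.0);
    println!("ok");
}
```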


@ -1,5 +0,0 @@
mod server;
mod zmq_cmd;
pub use server::run_server;
pub use zmq_cmd::zmq_send;


@ -1,587 +0,0 @@
use std::{fmt, sync::atomic::Ordering};
use regex::Regex;
extern crate serde;
extern crate serde_json;
extern crate tiny_http;
use futures::executor::block_on;
use serde::{
de::{self, Visitor},
Deserialize, Serialize,
};
use serde_json::{json, Map};
use simplelog::*;
use std::collections::HashMap;
use std::io::{Cursor, Error as IoError};
use tiny_http::{Header, Method, Request, Response, Server};
use crate::rpc::zmq_send;
use crate::utils::{get_data_map, get_media_map};
use ffplayout_lib::utils::{
get_delta, write_status, Ingest, OutputMode::*, PlayerControl, PlayoutConfig, PlayoutStatus,
ProcessControl,
};
#[derive(Default, Deserialize, Clone, Debug)]
struct TextFilter {
text: Option<String>,
#[serde(default, deserialize_with = "deserialize_number_or_string")]
x: Option<String>,
#[serde(default, deserialize_with = "deserialize_number_or_string")]
y: Option<String>,
#[serde(default, deserialize_with = "deserialize_number_or_string")]
fontsize: Option<String>,
#[serde(default, deserialize_with = "deserialize_number_or_string")]
line_spacing: Option<String>,
fontcolor: Option<String>,
#[serde(default, deserialize_with = "deserialize_number_or_string")]
alpha: Option<String>,
#[serde(default, deserialize_with = "deserialize_number_or_string")]
r#box: Option<String>,
boxcolor: Option<String>,
#[serde(default, deserialize_with = "deserialize_number_or_string")]
boxborderw: Option<String>,
}
/// Deserialize number or string
pub fn deserialize_number_or_string<'de, D>(deserializer: D) -> Result<Option<String>, D::Error>
where
D: serde::Deserializer<'de>,
{
struct StringOrNumberVisitor;
impl<'de> Visitor<'de> for StringOrNumberVisitor {
type Value = Option<String>;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
formatter.write_str("a string or a number")
}
fn visit_str<E: de::Error>(self, value: &str) -> Result<Self::Value, E> {
let re = Regex::new(r"0,([0-9]+)").unwrap();
let clean_string = re.replace_all(value, "0.$1").to_string();
Ok(Some(clean_string))
}
fn visit_u64<E: de::Error>(self, value: u64) -> Result<Self::Value, E> {
Ok(Some(value.to_string()))
}
fn visit_i64<E: de::Error>(self, value: i64) -> Result<Self::Value, E> {
Ok(Some(value.to_string()))
}
fn visit_f64<E: de::Error>(self, value: f64) -> Result<Self::Value, E> {
Ok(Some(value.to_string()))
}
}
deserializer.deserialize_any(StringOrNumberVisitor)
}
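The regex above normalizes comma decimal separators like `0,5` to `0.5` before the value reaches ffmpeg. The same transformation can be sketched without the `regex` crate (ASCII-only, hypothetical helper name):

```rust
// Replace every "0,<digit>" sequence with "0.<digit>", leaving all
// other characters untouched (ASCII input assumed).
fn normalize_decimal(value: &str) -> String {
    let bytes = value.as_bytes();
    let mut out = String::with_capacity(value.len());
    let mut i = 0;

    while i < bytes.len() {
        if bytes[i] == b'0'
            && i + 2 < bytes.len() + 1
            && i + 2 <= bytes.len() - 1
            && bytes[i + 1] == b','
            && bytes[i + 2].is_ascii_digit()
        {
            out.push_str("0.");
            i += 2;
        } else {
            out.push(bytes[i] as char);
            i += 1;
        }
    }

    out
}

fn main() {
    assert_eq!(normalize_decimal("0,75"), "0.75");
    assert_eq!(normalize_decimal("x=0,5:y=10"), "x=0.5:y=10");
    println!("ok");
}
```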
impl fmt::Display for TextFilter {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
let escaped_text = self
.text
.clone()
.unwrap_or_default()
// escape backslashes first, so the escapes added below are not escaped twice
.replace('\\', "\\\\\\\\")
.replace('\'', "'\\\\\\''")
.replace('%', "\\\\\\%")
.replace(':', "\\:");
let mut s = format!("text='{escaped_text}'");
if let Some(v) = &self.x {
if !v.is_empty() {
s.push_str(&format!(":x='{v}'"));
}
}
if let Some(v) = &self.y {
if !v.is_empty() {
s.push_str(&format!(":y='{v}'"));
}
}
if let Some(v) = &self.fontsize {
if !v.is_empty() {
s.push_str(&format!(":fontsize={v}"));
}
}
if let Some(v) = &self.line_spacing {
if !v.is_empty() {
s.push_str(&format!(":line_spacing={v}"));
}
}
if let Some(v) = &self.fontcolor {
if !v.is_empty() {
s.push_str(&format!(":fontcolor={v}"));
}
}
if let Some(v) = &self.alpha {
if !v.is_empty() {
s.push_str(&format!(":alpha='{v}'"));
}
}
if let Some(v) = &self.r#box {
if !v.is_empty() {
s.push_str(&format!(":box={v}"));
}
}
if let Some(v) = &self.boxcolor {
if !v.is_empty() {
s.push_str(&format!(":boxcolor={v}"));
}
}
if let Some(v) = &self.boxborderw {
if !v.is_empty() {
s.push_str(&format!(":boxborderw={v}"));
}
}
write!(f, "{s}")
}
}
/// Convert JSON string to ffmpeg filter command.
fn filter_from_json(raw_text: serde_json::Value) -> String {
let filter: TextFilter = serde_json::from_value(raw_text).unwrap_or_default();
filter.to_string()
}
#[derive(Debug, Serialize, Deserialize)]
struct ResponseData {
message: String,
}
/// Read the request body and convert it to a string
fn read_request_body(request: &mut Request) -> Result<String, IoError> {
let mut buffer = String::new();
let body = request.as_reader();
match body.read_to_string(&mut buffer) {
Ok(_) => Ok(buffer),
Err(error) => Err(error),
}
}
/// create client response in JSON format
fn json_response(data: serde_json::Map<String, serde_json::Value>) -> Response<Cursor<Vec<u8>>> {
let response_body = serde_json::to_string(&data).unwrap();
// create HTTP-Response
Response::from_string(response_body)
.with_status_code(200)
.with_header(Header::from_bytes(&b"Content-Type"[..], &b"application/json"[..]).unwrap())
}
/// create client error message
fn error_response(answer: &str, code: i32) -> Response<Cursor<Vec<u8>>> {
error!("RPC: {answer}");
Response::from_string(answer)
.with_status_code(code)
.with_header(Header::from_bytes(&b"Content-Type"[..], &b"text/plain"[..]).unwrap())
}
/// control playout: jump to last clip
fn control_back(
config: &PlayoutConfig,
play_control: &PlayerControl,
playout_stat: &PlayoutStatus,
proc: &ProcessControl,
) -> Response<Cursor<Vec<u8>>> {
let current_date = playout_stat.current_date.lock().unwrap().clone();
let current_list = play_control.current_list.lock().unwrap();
let mut date = playout_stat.date.lock().unwrap();
let index = play_control.current_index.load(Ordering::SeqCst);
let mut time_shift = playout_stat.time_shift.lock().unwrap();
if index > 1 && current_list.len() > 1 {
if let Some(proc) = proc.decoder_term.lock().unwrap().as_mut() {
if let Err(e) = proc.kill() {
error!("Decoder {e:?}")
};
if let Err(e) = proc.wait() {
error!("Decoder {e:?}")
};
info!("Move to last clip");
let mut data_map = Map::new();
let mut media = current_list[index - 2].clone();
play_control.current_index.fetch_sub(2, Ordering::SeqCst);
if let Err(e) = media.add_probe(false) {
error!("{e:?}");
};
let (delta, _) = get_delta(config, &media.begin.unwrap_or(0.0));
*time_shift = delta;
date.clone_from(&current_date);
write_status(config, &current_date, delta);
data_map.insert("operation".to_string(), json!("move_to_last"));
data_map.insert("shifted_seconds".to_string(), json!(delta));
data_map.insert("media".to_string(), get_media_map(media));
return json_response(data_map);
}
return error_response("Jump to last clip failed!", 500);
}
error_response("Clip index out of range!", 400)
}
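`current_index` points at the clip that will play next, so "last clip" is `index - 2` and the guard `index > 1` keeps the subtraction in range. That arithmetic can be isolated as a small helper (hypothetical, for illustration):

```rust
// Index of the previous clip, given that current_index already points
// at the clip that will play next; None when there is no previous clip.
fn last_clip_index(current_index: usize) -> Option<usize> {
    if current_index > 1 {
        Some(current_index - 2)
    } else {
        None
    }
}

fn main() {
    assert_eq!(last_clip_index(5), Some(3));
    // index 0 or 1 means no clip to jump back to
    assert_eq!(last_clip_index(1), None);
    println!("ok");
}
```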
/// control playout: jump to next clip
fn control_next(
config: &PlayoutConfig,
play_control: &PlayerControl,
playout_stat: &PlayoutStatus,
proc: &ProcessControl,
) -> Response<Cursor<Vec<u8>>> {
let current_date = playout_stat.current_date.lock().unwrap().clone();
let current_list = play_control.current_list.lock().unwrap();
let mut date = playout_stat.date.lock().unwrap();
let index = play_control.current_index.load(Ordering::SeqCst);
let mut time_shift = playout_stat.time_shift.lock().unwrap();
if index < current_list.len() {
if let Some(proc) = proc.decoder_term.lock().unwrap().as_mut() {
if let Err(e) = proc.kill() {
error!("Decoder {e:?}")
};
if let Err(e) = proc.wait() {
error!("Decoder {e:?}")
};
info!("Move to next clip");
let mut data_map = Map::new();
let mut media = current_list[index].clone();
if let Err(e) = media.add_probe(false) {
error!("{e:?}");
};
let (delta, _) = get_delta(config, &media.begin.unwrap_or(0.0));
*time_shift = delta;
date.clone_from(&current_date);
write_status(config, &current_date, delta);
data_map.insert("operation".to_string(), json!("move_to_next"));
data_map.insert("shifted_seconds".to_string(), json!(delta));
data_map.insert("media".to_string(), get_media_map(media));
return json_response(data_map);
}
return error_response("Jump to next clip failed!", 500);
}
error_response("Last clip cannot be skipped!", 400)
}
/// control playout: reset playlist state
fn control_reset(
config: &PlayoutConfig,
playout_stat: &PlayoutStatus,
proc: &ProcessControl,
) -> Response<Cursor<Vec<u8>>> {
let current_date = playout_stat.current_date.lock().unwrap().clone();
let mut date = playout_stat.date.lock().unwrap();
let mut time_shift = playout_stat.time_shift.lock().unwrap();
if let Some(proc) = proc.decoder_term.lock().unwrap().as_mut() {
if let Err(e) = proc.kill() {
error!("Decoder {e:?}")
};
if let Err(e) = proc.wait() {
error!("Decoder {e:?}")
};
info!("Reset playout to original state");
let mut data_map = Map::new();
*time_shift = 0.0;
date.clone_from(&current_date);
playout_stat.list_init.store(true, Ordering::SeqCst);
write_status(config, &current_date, 0.0);
data_map.insert("operation".to_string(), json!("reset_playout_state"));
return json_response(data_map);
}
error_response("Reset playout state failed!", 400)
}
/// control playout: stop playout
fn control_stop(proc: &ProcessControl) -> Response<Cursor<Vec<u8>>> {
proc.stop_all();
let mut data_map = Map::new();
data_map.insert("message".to_string(), json!("Stop playout!"));
json_response(data_map)
}
/// control playout: create text filter for ffmpeg
fn control_text(
data: HashMap<String, serde_json::Value>,
config: &PlayoutConfig,
playout_stat: &PlayoutStatus,
proc: &ProcessControl,
) -> Response<Cursor<Vec<u8>>> {
if data.contains_key("message") {
let filter = filter_from_json(data["message"].clone());
debug!("Got drawtext command: <bright-blue>\"{filter}\"</>");
let mut data_map = Map::new();
if !filter.is_empty() && config.text.zmq_stream_socket.is_some() {
if let Some(clips_filter) = playout_stat.chain.clone() {
*clips_filter.lock().unwrap() = vec![filter.clone()];
}
if config.out.mode == HLS {
if proc.server_is_running.load(Ordering::SeqCst) {
let filter_server = format!("drawtext@dyntext reinit {filter}");
if let Ok(reply) = block_on(zmq_send(
&filter_server,
&config.text.zmq_server_socket.clone().unwrap(),
)) {
data_map.insert("message".to_string(), json!(reply));
return json_response(data_map);
};
} else if let Err(e) = proc.stop(Ingest) {
error!("Ingest {e:?}")
}
}
if config.out.mode != HLS || !proc.server_is_running.load(Ordering::SeqCst) {
let filter_stream = format!("drawtext@dyntext reinit {filter}");
if let Ok(reply) = block_on(zmq_send(
&filter_stream,
&config.text.zmq_stream_socket.clone().unwrap(),
)) {
data_map.insert("message".to_string(), json!(reply));
return json_response(data_map);
};
}
}
}
error_response("text message missing!", 400)
}
/// media info: get infos about current clip
fn media_current(
config: &PlayoutConfig,
playout_stat: &PlayoutStatus,
play_control: &PlayerControl,
proc: &ProcessControl,
) -> Response<Cursor<Vec<u8>>> {
if let Some(media) = play_control.current_media.lock().unwrap().clone() {
let data_map = get_data_map(
config,
media,
playout_stat,
proc.server_is_running.load(Ordering::SeqCst),
);
return json_response(data_map);
};
error_response("No current clip...", 204)
}
/// media info: get infos about next clip
fn media_next(
config: &PlayoutConfig,
playout_stat: &PlayoutStatus,
play_control: &PlayerControl,
) -> Response<Cursor<Vec<u8>>> {
let index = play_control.current_index.load(Ordering::SeqCst);
let current_list = play_control.current_list.lock().unwrap();
if index < current_list.len() {
let media = current_list[index].clone();
let data_map = get_data_map(config, media, playout_stat, false);
return json_response(data_map);
}
error_response("There is no next clip", 500)
}
/// media info: get infos about last clip
fn media_last(
config: &PlayoutConfig,
playout_stat: &PlayoutStatus,
play_control: &PlayerControl,
) -> Response<Cursor<Vec<u8>>> {
let index = play_control.current_index.load(Ordering::SeqCst);
let current_list = play_control.current_list.lock().unwrap();
if index > 1 && index - 2 < current_list.len() {
let media = current_list[index - 2].clone();
let data_map = get_data_map(config, media, playout_stat, false);
return json_response(data_map);
}
error_response("There is no last clip", 500)
}
/// response builder
/// convert request body to struct and create response according to the request values
fn build_response(
mut request: Request,
config: &PlayoutConfig,
play_control: &PlayerControl,
playout_stat: &PlayoutStatus,
proc_control: &ProcessControl,
) {
if let Ok(body) = read_request_body(&mut request) {
if let Ok(data) = serde_json::from_str::<HashMap<String, serde_json::Value>>(&body) {
if let Some(control_value) = data.get("control").and_then(|c| c.as_str()) {
match control_value {
"back" => {
let _ = request.respond(control_back(
config,
play_control,
playout_stat,
proc_control,
));
}
"next" => {
let _ = request.respond(control_next(
config,
play_control,
playout_stat,
proc_control,
));
}
"reset" => {
let _ = request.respond(control_reset(config, playout_stat, proc_control));
}
"stop_all" => {
let _ = request.respond(control_stop(proc_control));
}
"text" => {
let _ =
request.respond(control_text(data, config, playout_stat, proc_control));
}
_ => (),
}
} else if let Some(media_value) = data.get("media").and_then(|m| m.as_str()) {
match media_value {
"current" => {
let _ = request.respond(media_current(
config,
playout_stat,
play_control,
proc_control,
));
}
"next" => {
let _ = request.respond(media_next(config, playout_stat, play_control));
}
"last" => {
let _ = request.respond(media_last(config, playout_stat, play_control));
}
_ => (),
}
}
} else {
error!("Error parsing JSON request.");
let _ = request.respond(error_response("Invalid JSON request", 400));
}
} else {
error!("Error reading request body.");
let _ = request.respond(error_response("Invalid JSON request", 500));
}
}
/// request handler
/// check if authorization header with correct value exists and forward traffic to build_response()
fn handle_request(
request: Request,
config: &PlayoutConfig,
play_control: &PlayerControl,
playout_stat: &PlayoutStatus,
proc_control: &ProcessControl,
) {
// Check Authorization-Header
match request
.headers()
.iter()
.find(|h| h.field.equiv("Authorization"))
{
Some(header) => {
let auth_value = header.value.as_str();
if auth_value == config.rpc_server.authorization {
// create and send response
build_response(request, config, play_control, playout_stat, proc_control)
} else {
let _ = request.respond(error_response("Unauthorized", 401));
}
}
None => {
let _ = request.respond(error_response("Missing authorization", 401));
}
}
}
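The Authorization check above can be distilled into a small pure function (hypothetical; the real code matches against `tiny_http` headers and responds directly):

```rust
// Compare the Authorization header value against the configured secret,
// distinguishing a wrong value from a missing header.
fn is_authorized(
    header_value: Option<&str>,
    expected: &str,
) -> Result<(), (&'static str, i32)> {
    match header_value {
        Some(v) if v == expected => Ok(()),
        Some(_) => Err(("Unauthorized", 401)),
        None => Err(("Missing authorization", 401)),
    }
}

fn main() {
    assert!(is_authorized(Some("secret"), "secret").is_ok());
    assert_eq!(is_authorized(Some("wrong"), "secret"), Err(("Unauthorized", 401)));
    assert_eq!(is_authorized(None, "secret"), Err(("Missing authorization", 401)));
    println!("ok");
}
```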
/// JSON RPC Server
///
/// A simple RPC server for getting status information and controlling the player:
///
/// - current clip information
/// - jump to next clip
/// - get last clip
/// - reset player state to the original clip
pub fn run_server(
config: PlayoutConfig,
play_control: PlayerControl,
playout_stat: PlayoutStatus,
proc_control: ProcessControl,
) {
let addr = config.rpc_server.address.clone();
info!("RPC server listening on {addr}");
let server = Server::http(addr).expect("Failed to start server");
for request in server.incoming_requests() {
match request.method() {
Method::Post => handle_request(
request,
&config,
&play_control,
&playout_stat,
&proc_control,
),
_ => {
// Method not allowed
let response = Response::from_string("Method not allowed")
.with_status_code(405)
.with_header(
Header::from_bytes(&b"Content-Type"[..], &b"text/plain"[..]).unwrap(),
);
let _ = request.respond(response);
}
}
}
}


@ -1,14 +0,0 @@
use std::error::Error;
use zeromq::Socket;
use zeromq::{SocketRecv, SocketSend, ZmqMessage};
pub async fn zmq_send(msg: &str, socket_addr: &str) -> Result<String, Box<dyn Error>> {
let mut socket = zeromq::ReqSocket::new();
socket.connect(&format!("tcp://{socket_addr}")).await?;
socket.send(msg.into()).await?;
let repl: ZmqMessage = socket.recv().await?;
let response = String::from_utf8(repl.into_vec()[0].to_vec())?;
Ok(response)
}


@ -1,109 +0,0 @@
use std::path::PathBuf;
use clap::Parser;
use ffplayout_lib::utils::{OutputMode, ProcessMode};
#[derive(Parser, Debug, Clone)]
#[clap(version,
about = "ffplayout, Rust based 24/7 playout solution.",
override_usage = "Run without any command to use config file only, or with commands to override parameters:
\n ffplayout (ARGS) [OPTIONS]\n\n Pass the channel name only in a multi-channel environment!",
long_about = None)]
pub struct Args {
#[clap(long, help = "File path to advanced.toml")]
pub advanced_config: Option<PathBuf>,
#[clap(index = 1, value_parser, help = "Channel name")]
pub channel: Option<String>,
#[clap(short, long, help = "File path to ffplayout.toml")]
pub config: Option<PathBuf>,
#[clap(short, long, help = "File path for logging")]
pub log: Option<PathBuf>,
#[clap(
short,
long,
help = "Target date (YYYY-MM-DD) for text/m3u to playlist import"
)]
pub date: Option<String>,
#[cfg(debug_assertions)]
#[clap(long, help = "fake date time, for debugging")]
pub fake_time: Option<String>,
#[clap(short, long, help = "Play folder content")]
pub folder: Option<PathBuf>,
#[clap(
short,
long,
help = "Generate playlist for dates, like: 2022-01-01 - 2022-01-10",
name = "YYYY-MM-DD",
num_args = 1..,
)]
pub generate: Option<Vec<String>>,
#[clap(
long,
help = "Import a given text/m3u file and create a playlist from it"
)]
pub import: Option<PathBuf>,
#[clap(short, long, help = "Loop playlist infinitely")]
pub infinit: bool,
#[clap(
short = 't',
long,
help = "Set length in 'hh:mm:ss', 'none' for no length check"
)]
pub length: Option<String>,
#[clap(long, help = "Override logging level")]
pub level: Option<String>,
#[clap(long, help = "Optional path list for playlist generations", num_args = 1..)]
pub paths: Option<Vec<PathBuf>>,
#[clap(short = 'm', long, help = "Playing mode: folder, playlist")]
pub play_mode: Option<ProcessMode>,
#[clap(short, long, help = "Path to playlist, or playlist root folder.")]
pub playlist: Option<PathBuf>,
#[clap(
short,
long,
help = "Start time in 'hh:mm:ss', 'now' for start with first"
)]
pub start: Option<String>,
#[clap(short = 'T', long, help = "JSON Template file for generating playlist")]
pub template: Option<PathBuf>,
#[clap(short, long, help = "Set output mode: desktop, hls, null, stream")]
pub output: Option<OutputMode>,
#[clap(short, long, help = "Set audio volume")]
pub volume: Option<f64>,
#[clap(long, help = "Skip validation process")]
pub skip_validation: bool,
#[clap(long, help = "validate given playlist")]
pub validate: bool,
}
/// Get arguments from command line, and return them.
#[cfg(not(test))]
pub fn get_args() -> Args {
Args::parse()
}
#[cfg(test)]
pub fn get_args() -> Args {
Args::parse_from(["ffplayout", "-o", "desktop"])
}


@ -1,298 +0,0 @@
use std::{
env,
fs::File,
path::{Path, PathBuf},
};
use regex::Regex;
use serde_json::{json, Map, Value};
use simplelog::*;
pub mod arg_parse;
pub mod task_runner;
pub use arg_parse::Args;
use ffplayout_lib::{
filter::Filters,
utils::{
config::Template, errors::ProcError, parse_log_level_filter, time_in_seconds, time_to_sec,
Media, OutputMode::*, PlayoutConfig, PlayoutStatus, ProcessMode::*,
},
vec_strings,
};
/// Read command line arguments, and override the config with them.
pub fn get_config(args: Args) -> Result<PlayoutConfig, ProcError> {
let cfg_path = match args.channel {
Some(c) => {
let path = PathBuf::from(format!("/etc/ffplayout/{c}.toml"));
if !path.is_file() {
return Err(ProcError::Custom(format!(
"Config file \"{c}\" under \"/etc/ffplayout/\" not found.\n\nCheck arguments!"
)));
}
Some(path)
}
None => args.config,
};
let mut adv_config_path = PathBuf::from("/etc/ffplayout/advanced.toml");
if let Some(adv_path) = args.advanced_config {
adv_config_path = adv_path;
} else if !adv_config_path.is_file() {
if Path::new("./assets/advanced.toml").is_file() {
adv_config_path = PathBuf::from("./assets/advanced.toml")
} else if let Some(p) = env::current_exe().ok().as_ref().and_then(|op| op.parent()) {
adv_config_path = p.join("advanced.toml")
};
}
let mut config = PlayoutConfig::new(cfg_path, Some(adv_config_path));
if let Some(gen) = args.generate {
config.general.generate = Some(gen);
}
if args.validate {
config.general.validate = true;
}
if let Some(template_file) = args.template {
let f = File::options()
.read(true)
.write(false)
.open(template_file)?;
let mut template: Template = serde_json::from_reader(f)?;
template.sources.sort_by(|d1, d2| d1.start.cmp(&d2.start));
config.general.template = Some(template);
}
if let Some(paths) = args.paths {
config.storage.paths = paths;
}
if let Some(log_path) = args.log {
if log_path != Path::new("none") {
config.logging.log_to_file = true;
config.logging.path = log_path;
} else {
config.logging.log_to_file = false;
config.logging.timestamp = false;
}
}
if let Some(playlist) = args.playlist {
config.playlist.path = playlist;
}
if let Some(mode) = args.play_mode {
config.processing.mode = mode;
}
if let Some(folder) = args.folder {
config.storage.path = folder;
config.processing.mode = Folder;
}
if let Some(start) = args.start {
config.playlist.day_start.clone_from(&start);
config.playlist.start_sec = Some(time_to_sec(&start));
}
if let Some(length) = args.length {
config.playlist.length.clone_from(&length);
if length.contains(':') {
config.playlist.length_sec = Some(time_to_sec(&length));
} else {
config.playlist.length_sec = Some(86400.0);
}
}
if let Some(level) = args.level {
if let Ok(filter) = parse_log_level_filter(&level) {
config.logging.level = filter;
}
}
if args.infinit {
config.playlist.infinit = args.infinit;
}
if let Some(output) = args.output {
config.out.mode = output;
if config.out.mode == Null {
config.out.output_count = 1;
config.out.output_filter = None;
config.out.output_cmd = Some(vec_strings!["-f", "null", "-"]);
}
}
config.general.skip_validation = args.skip_validation;
if let Some(volume) = args.volume {
config.processing.volume = volume;
}
Ok(config)
}
/// Format ingest and HLS logging output
pub fn log_line(line: &str, level: &str) {
if line.contains("[info]") && level.to_lowercase() == "info" {
info!("<bright black>[Server]</> {}", line.replace("[info] ", ""))
} else if line.contains("[warning]")
&& (level.to_lowercase() == "warning" || level.to_lowercase() == "info")
{
warn!(
"<bright black>[Server]</> {}",
line.replace("[warning] ", "")
)
} else if line.contains("[error]")
&& !line.contains("Input/output error")
&& !line.contains("Broken pipe")
{
error!("<bright black>[Server]</> {}", line.replace("[error] ", ""));
} else if line.contains("[fatal]") {
error!("<bright black>[Server]</> {}", line.replace("[fatal] ", ""))
}
}
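The dispatch above can be sketched with a std-only helper; `relog_level` is a hypothetical name used for illustration (the real code logs through `simplelog` macros instead of returning a value):

```rust
// Map an ffmpeg server log line to the level it should be re-logged at,
// honoring the configured ingest level ("info" also passes warnings through).
fn relog_level(line: &str, level: &str) -> Option<&'static str> {
    let level = level.to_lowercase();
    if line.contains("[info]") && level == "info" {
        Some("info")
    } else if line.contains("[warning]") && (level == "warning" || level == "info") {
        Some("warn")
    } else if line.contains("[error]")
        && !line.contains("Input/output error")
        && !line.contains("Broken pipe")
    {
        Some("error")
    } else if line.contains("[fatal]") {
        Some("error")
    } else {
        None
    }
}

fn main() {
    assert_eq!(relog_level("[warning] buffer underrun", "info"), Some("warn"));
    // Broken pipe errors are intentionally swallowed
    assert_eq!(relog_level("[error] Broken pipe", "info"), None);
}
```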
/// Compare the incoming stream name with the expected name, ignoring whitespace and a trailing question mark.
pub fn valid_stream(msg: &str) -> bool {
if let Some((unexpected, expected)) = msg.split_once(',') {
let re = Regex::new(r".*Unexpected stream|expecting|[\s]+|\?$").unwrap();
let unexpected = re.replace_all(unexpected, "");
let expected = re.replace_all(expected, "");
if unexpected == expected {
return true;
}
}
false
}
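The normalization-and-compare idea behind `valid_stream` can be sketched without the `regex` crate; this simplified version (hypothetical helpers, not the shipped implementation) only strips the literal prefixes, whitespace, and trailing question marks:

```rust
// Strip the ffmpeg boilerplate words, all whitespace, and trailing '?'
// so "live/stream?" and "live/stream" compare equal.
fn normalize(s: &str) -> String {
    s.replace("Unexpected stream", "")
        .replace("expecting", "")
        .chars()
        .filter(|c| !c.is_whitespace())
        .collect::<String>()
        .trim_end_matches('?')
        .to_string()
}

fn valid_stream_sketch(msg: &str) -> bool {
    match msg.split_once(',') {
        Some((unexpected, expected)) => normalize(unexpected) == normalize(expected),
        None => false,
    }
}

fn main() {
    assert!(valid_stream_sketch(
        "Unexpected stream live/stream?, expecting live/stream"
    ));
    assert!(!valid_stream_sketch(
        "Unexpected stream live/other, expecting live/stream"
    ));
}
```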
/// Prepare output parameters
///
/// Scan for multiple outputs and add mappings for them.
pub fn prepare_output_cmd(
config: &PlayoutConfig,
mut cmd: Vec<String>,
filters: &Option<Filters>,
) -> Vec<String> {
let mut output_params = config.out.clone().output_cmd.unwrap();
let mut new_params = vec![];
let mut count = 0;
let re_v = Regex::new(r"\[?0:v(:0)?\]?").unwrap();
if let Some(mut filter) = filters.clone() {
for (i, param) in output_params.iter().enumerate() {
if filter.video_out_link.len() > count && re_v.is_match(param) {
// replace mapping with link from filter struct
new_params.push(filter.video_out_link[count].clone());
} else {
new_params.push(param.clone());
}
// Check if the parameter is an output
if i > 0
&& !param.starts_with('-')
&& !output_params[i - 1].starts_with('-')
&& i < output_params.len() - 1
{
count += 1;
if filter.video_out_link.len() > count
&& !output_params.contains(&"-map".to_string())
{
new_params.append(&mut vec_strings![
"-map",
filter.video_out_link[count].clone()
]);
for i in 0..config.processing.audio_tracks {
new_params.append(&mut vec_strings!["-map", format!("0:a:{i}")]);
}
}
}
}
output_params = new_params;
cmd.append(&mut filter.cmd());
// add mapping at the beginning, if needed
if !filter.map().iter().all(|item| output_params.contains(item))
&& filter.output_chain.is_empty()
&& filter.video_out_link.is_empty()
{
cmd.append(&mut filter.map())
} else if &output_params[0] != "-map" && !filter.video_out_link.is_empty() {
cmd.append(&mut vec_strings!["-map", filter.video_out_link[0].clone()]);
for i in 0..config.processing.audio_tracks {
cmd.append(&mut vec_strings!["-map", format!("0:a:{i}")]);
}
}
}
cmd.append(&mut output_params);
cmd
}
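The output-boundary heuristic in the loop above (a token counts as a new output when neither it nor its predecessor is a `-` flag, and it is not the last token) can be isolated as a small std-only sketch; `count_outputs` is a hypothetical helper, not part of the codebase:

```rust
// Count output boundaries in an ffmpeg parameter list using the same
// positional heuristic as prepare_output_cmd: two consecutive non-flag
// tokens mark an output, except for the final one.
fn count_outputs(params: &[&str]) -> usize {
    let mut count = 0;
    for (i, param) in params.iter().enumerate() {
        if i > 0
            && !param.starts_with('-')
            && !params[i - 1].starts_with('-')
            && i < params.len() - 1
        {
            count += 1;
        }
    }
    count
}

fn main() {
    // two outputs: only the non-final one registers as a boundary
    let params = ["-c:v", "libx264", "out1.mp4", "-c:v", "libx264", "out2.mp4"];
    assert_eq!(count_outputs(&params), 1);
}
```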
/// Map the media struct to a JSON object.
pub fn get_media_map(media: Media) -> Value {
let mut obj = json!({
"in": media.seek,
"out": media.out,
"duration": media.duration,
"category": media.category,
"source": media.source,
});
if let Some(title) = media.title {
obj.as_object_mut()
.unwrap()
.insert("title".to_string(), Value::String(title));
}
obj
}
/// Prepare a JSON object for the response.
pub fn get_data_map(
config: &PlayoutConfig,
media: Media,
playout_stat: &PlayoutStatus,
server_is_running: bool,
) -> Map<String, Value> {
let mut data_map = Map::new();
let current_time = time_in_seconds();
let shift = *playout_stat.time_shift.lock().unwrap();
let begin = media.begin.unwrap_or(0.0) - shift;
let played_time = current_time - begin;
data_map.insert("index".to_string(), json!(media.index));
data_map.insert("ingest".to_string(), json!(server_is_running));
data_map.insert("mode".to_string(), json!(config.processing.mode));
data_map.insert(
"shift".to_string(),
json!((shift * 1000.0).round() / 1000.0),
);
data_map.insert(
"elapsed".to_string(),
json!((played_time * 1000.0).round() / 1000.0),
);
data_map.insert("media".to_string(), get_media_map(media));
data_map
}
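The `shift`/`elapsed` arithmetic above can be sketched std-only; `round_ms` and `elapsed` are hypothetical helpers that mirror the millisecond rounding used for the response fields:

```rust
// Round a seconds value to millisecond precision, as done for the
// "shift" and "elapsed" fields in get_data_map.
fn round_ms(v: f64) -> f64 {
    (v * 1000.0).round() / 1000.0
}

// Elapsed playing time: wall-clock seconds minus the clip's begin time,
// corrected by the stored time shift.
fn elapsed(current_time: f64, begin: f64, shift: f64) -> f64 {
    round_ms(current_time - (begin - shift))
}

fn main() {
    assert_eq!(elapsed(100.25, 90.0, 0.0), 10.25);
    assert_eq!(elapsed(100.25, 90.0, 2.25), 12.5);
}
```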


@ -1,25 +0,0 @@
use std::process::Command;
use simplelog::*;
use crate::utils::get_data_map;
use ffplayout_lib::utils::{config::PlayoutConfig, Media, PlayoutStatus};
pub fn run(config: PlayoutConfig, node: Media, playout_stat: PlayoutStatus, server_running: bool) {
let obj =
serde_json::to_string(&get_data_map(&config, node, &playout_stat, server_running)).unwrap();
trace!("Run task: {obj}");
match Command::new(config.task.path).arg(obj).spawn() {
Ok(mut c) => {
let status = c.wait().expect("Error in waiting for the task process!");
if !status.success() {
error!("Process stops with error.");
}
}
Err(e) => {
error!("Couldn't spawn task runner: {e}")
}
}
}

@ -1 +0,0 @@
Subproject commit 8d63cc4f85f3cbd530d509d74494b6fefbb9bf2c


@ -1,40 +1,83 @@
[package]
name = "ffplayout"
description = "24/7 playout based on rust and ffmpeg"
readme = "README.md"
readme = "../README.md"
version.workspace = true
license.workspace = true
authors.workspace = true
repository.workspace = true
edition.workspace = true
default-run = "ffplayout"
[features]
default = ["embed_frontend"]
embed_frontend = []
[dependencies]
ffplayout-lib = { path = "../lib" }
chrono = { version = "0.4", default-features = false, features = ["clock", "std"] }
clap = { version = "4.3", features = ["derive"] }
actix-files = "0.6"
actix-multipart = "0.6"
actix-web = "4"
actix-web-grants = "4"
actix-web-httpauth = "0.8"
actix-web-lab = "0.20"
actix-web-static-files = "4.0"
argon2 = "0.5"
chrono = { version = "0.4", default-features = false, features = ["clock", "std", "serde"] }
clap = { version = "4.3", features = ["derive", "env"] }
crossbeam-channel = "0.5"
futures = "0.3"
itertools = "0.12"
derive_more = "0.99"
faccess = "0.2"
ffprobe = "0.4"
flexi_logger = { version = "0.28", features = ["kv", "colors"] }
futures-util = { version = "0.3", default-features = false, features = ["std"] }
home = "0.5"
jsonwebtoken = "9"
lazy_static = "1.4"
lettre = { version = "0.11", features = ["builder", "rustls-tls", "smtp-transport", "tokio1", "tokio1-rustls-tls"], default-features = false }
lexical-sort = "0.3"
local-ip-address = "0.6"
log = { version = "0.4", features = ["std", "serde", "kv", "kv_std", "kv_sval", "kv_serde"] }
notify = "6.0"
notify-debouncer-full = { version = "*", default-features = false }
num-traits = "0.2"
once_cell = "1"
paris = "1.5"
parking_lot = "0.12"
path-clean = "1.0"
rand = "0.8"
regex = "1"
relative-path = "1.8"
reqwest = { version = "0.12", default-features = false, features = ["blocking", "json", "rustls-tls"] }
rpassword = "7.2"
sanitize-filename = "0.5"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
simplelog = { version = "0.12", features = ["paris"] }
tiny_http = { version = "0.12", default-features = false }
zeromq = { version = "0.3", default-features = false, features = [
"async-std-runtime",
serde_with = "3.8"
shlex = "1.1"
static-files = "0.2"
sysinfo ={ version = "0.30", features = ["linux-netdevs", "linux-tmpfs"] }
sqlx = { version = "0.7", features = ["runtime-tokio", "sqlite"] }
time = { version = "0.3", features = ["formatting", "macros"] }
tokio = { version = "1.29", features = ["full"] }
tokio-stream = "0.1"
toml_edit = {version ="0.22", features = ["serde"]}
uuid = "1.8"
walkdir = "2"
zeromq = { version = "0.4", default-features = false, features = [
"tokio-runtime",
"tcp-transport",
] }
[target.'cfg(not(target_arch = "windows"))'.dependencies]
signal-child = "1"
[build-dependencies]
static-files = "0.2"
[[bin]]
name = "ffplayout"
path = "src/main.rs"
# DEBIAN DEB PACKAGE
[package.metadata.deb]
name = "ffplayout"
@ -42,61 +85,24 @@ priority = "optional"
section = "net"
license-file = ["../LICENSE", "0"]
depends = ""
recommends = "sudo"
suggests = "ffmpeg"
copyright = "Copyright (c) 2022, Jonathan Baecker. All rights reserved."
conf-files = ["/etc/ffplayout/ffplayout.toml", "/etc/ffplayout/advanced.toml"]
copyright = "Copyright (c) 2024, Jonathan Baecker. All rights reserved."
assets = [
[
"../target/x86_64-unknown-linux-musl/release/ffpapi",
"/usr/bin/",
"755",
],
[
"../target/x86_64-unknown-linux-musl/release/ffplayout",
"/usr/bin/",
"755",
],
[
"../assets/ffpapi.service",
"/lib/systemd/system/",
"644",
],
[
"../assets/ffplayout.service",
"/lib/systemd/system/",
"644",
],
[
"../assets/ffplayout@.service",
"/lib/systemd/system/",
"644",
],
[
"../assets/11-ffplayout",
"/etc/sudoers.d/",
"644",
],
[
"../assets/advanced.toml",
"/etc/ffplayout/",
"644",
],
[
"../assets/ffplayout.toml",
"/etc/ffplayout/",
"644",
],
[
"../assets/logo.png",
"/usr/share/ffplayout/",
"644",
],
[
"../assets/ffplayout.toml",
"/usr/share/ffplayout/ffplayout.toml.orig",
"644",
],
[
"../assets/ffplayout.conf",
"/usr/share/ffplayout/ffplayout.conf.example",
@ -107,11 +113,6 @@ assets = [
"/usr/share/doc/ffplayout/README",
"644",
],
[
"../assets/ffpapi.1.gz",
"/usr/share/man/man1/",
"644",
],
[
"../assets/ffplayout.1.gz",
"/usr/share/man/man1/",
@ -123,56 +124,21 @@ systemd-units = { enable = false, unit-scripts = "../assets" }
[package.metadata.deb.variants.arm64]
assets = [
[
"../target/aarch64-unknown-linux-gnu/release/ffpapi",
"/usr/bin/",
"755",
],
[
"../target/aarch64-unknown-linux-gnu/release/ffplayout",
"/usr/bin/",
"755",
],
[
"../assets/ffpapi.service",
"/lib/systemd/system/",
"644",
],
[
"../assets/ffplayout.service",
"/lib/systemd/system/",
"644",
],
[
"../assets/ffplayout@.service",
"/lib/systemd/system/",
"644",
],
[
"../assets/11-ffplayout",
"/etc/sudoers.d/",
"644",
],
[
"../assets/ffplayout.toml",
"/etc/ffplayout/",
"644",
],
[
"../assets/advanced.toml",
"/etc/ffplayout/",
"644",
],
[
"../assets/logo.png",
"/usr/share/ffplayout/",
"644",
],
[
"../assets/ffplayout.toml",
"/usr/share/ffplayout/ffplayout.toml.orig",
"644",
],
[
"../assets/ffplayout.conf",
"/usr/share/ffplayout/ffplayout.conf.example",
@ -183,11 +149,6 @@ assets = [
"/usr/share/doc/ffplayout/README",
"644",
],
[
"../assets/ffpapi.1.gz",
"/usr/share/man/man1/",
"644",
],
[
"../assets/ffplayout.1.gz",
"/usr/share/man/man1/",
@ -200,20 +161,12 @@ assets = [
name = "ffplayout"
license = "GPL-3.0"
assets = [
{ source = "../target/x86_64-unknown-linux-musl/release/ffpapi", dest = "/usr/bin/ffpapi", mode = "755" },
{ source = "../target/x86_64-unknown-linux-musl/release/ffplayout", dest = "/usr/bin/ffplayout", mode = "755" },
{ source = "../assets/advanced.toml", dest = "/etc/ffplayout/advanced.toml", mode = "644", config = true },
{ source = "../assets/ffplayout.toml", dest = "/etc/ffplayout/ffplayout.toml", mode = "644", config = true },
{ source = "../assets/ffpapi.service", dest = "/lib/systemd/system/ffpapi.service", mode = "644" },
{ source = "../assets/ffplayout.service", dest = "/lib/systemd/system/ffplayout.service", mode = "644" },
{ source = "../assets/ffplayout@.service", dest = "/lib/systemd/system/ffplayout@.service", mode = "644" },
{ source = "../assets/11-ffplayout", dest = "/etc/sudoers.d/11-ffplayout", mode = "644" },
{ source = "../README.md", dest = "/usr/share/doc/ffplayout/README", mode = "644" },
{ source = "../assets/ffpapi.1.gz", dest = "/usr/share/man/man1/ffpapi.1.gz", mode = "644", doc = true },
{ source = "../assets/ffplayout.1.gz", dest = "/usr/share/man/man1/ffplayout.1.gz", mode = "644", doc = true },
{ source = "../LICENSE", dest = "/usr/share/doc/ffplayout/LICENSE", mode = "644" },
{ source = "../assets/logo.png", dest = "/usr/share/ffplayout/logo.png", mode = "644" },
{ source = "../assets/ffplayout.toml", dest = "/usr/share/ffplayout/ffplayout.toml.orig", mode = "644" },
{ source = "../assets/ffplayout.conf", dest = "/usr/share/ffplayout/ffplayout.conf.example", mode = "644" },
{ source = "../debian/postinst", dest = "/usr/share/ffplayout/postinst", mode = "755" },
]


@ -2,10 +2,10 @@ use static_files::NpmBuild;
fn main() -> std::io::Result<()> {
if !cfg!(debug_assertions) && cfg!(feature = "embed_frontend") {
NpmBuild::new("../ffplayout-frontend")
NpmBuild::new("../frontend")
.install()?
.run("generate")?
.target("../ffplayout-frontend/.output/public")
.target("../frontend/.output/public")
.change_detection()
.to_resource_dir()
.build()


@ -0,0 +1,176 @@
use log::*;
use std::io::Write;
// use std::io::{Error, ErrorKind};
// use std::sync::{Arc, Mutex};
use flexi_logger::writers::{FileLogWriter, LogWriter};
use flexi_logger::{Age, Cleanup, Criterion, DeferredNow, FileSpec, Logger, Naming};
use paris::formatter::colorize_string;
pub struct LogMailer;
impl LogWriter for LogMailer {
fn write(&self, now: &mut DeferredNow, record: &Record<'_>) -> std::io::Result<()> {
println!("target: {:?}", record.target());
println!("key/value: {:?}", record.key_values().get("channel".into()));
println!(
"[{}] [{:>5}] Mail logger: {:?}",
now.now().format("%Y-%m-%d %H:%M:%S"),
record.level(),
record.args()
);
Ok(())
}
fn flush(&self) -> std::io::Result<()> {
Ok(())
}
}
pub struct LogConsole;
impl LogWriter for LogConsole {
fn write(&self, now: &mut DeferredNow, record: &Record<'_>) -> std::io::Result<()> {
console_formatter(&mut std::io::stderr(), now, record)?;
println!();
Ok(())
}
fn flush(&self) -> std::io::Result<()> {
Ok(())
}
}
pub fn file_logger(to_file: bool) -> Box<dyn LogWriter> {
if to_file {
Box::new(
FileLogWriter::builder(
FileSpec::default()
.suppress_timestamp()
// .directory("/var/log")
.basename("ffplayout"),
)
.append()
.format(file_formatter)
.rotate(
Criterion::Age(Age::Day),
Naming::Timestamps,
Cleanup::KeepLogFiles(7),
)
.print_message()
.try_build()
.unwrap(),
)
} else {
Box::new(LogConsole)
}
}
// struct MyWriter<F> {
// file: Arc<Mutex<F>>,
// }
// impl<F: std::io::Write + Send + Sync> LogWriter for MyWriter<F> {
// fn write(
// &self,
// now: &mut flexi_logger::DeferredNow,
// record: &flexi_logger::Record,
// ) -> std::io::Result<()> {
// let mut file = self
// .file
// .lock()
// .map_err(|e| Error::new(ErrorKind::Other, e.to_string()))?;
// flexi_logger::default_format(&mut *file, now, record)
// }
// fn flush(&self) -> std::io::Result<()> {
// let mut file = self
// .file
// .lock()
// .map_err(|e| Error::new(ErrorKind::Other, e.to_string()))?;
// file.flush()
// }
// }
// Define a macro for writing messages to the alert log and to the normal log
#[macro_use]
mod macros {
#[macro_export]
macro_rules! file_error {
($($arg:tt)*) => (
error!(target: "{File}", $($arg)*);
)
}
}
pub fn console_formatter(
w: &mut dyn Write,
now: &mut DeferredNow,
record: &Record,
) -> std::io::Result<()> {
let timestamp = colorize_string(format!(
"<dimmed>[{}]</>",
now.now().format("%Y-%m-%d %H:%M:%S%.6f")
));
let level = match record.level() {
Level::Debug => colorize_string("<bright magenta>[DEBUG]</>"),
Level::Error => colorize_string("<bright red>[ERROR]</>"),
Level::Info => colorize_string("<bright green>[ INFO]</>"),
Level::Trace => colorize_string("<bright yellow>[TRACE]</>"),
Level::Warn => colorize_string("<yellow>[ WARN]</>"),
};
write!(
w,
"{} {} {}",
timestamp,
level,
colorize_string(record.args().to_string()),
)
}
pub fn file_formatter(
w: &mut dyn Write,
now: &mut DeferredNow,
record: &Record,
) -> std::io::Result<()> {
let timestamp = format!("<dimmed>[{}]</>", now.now().format("%Y-%m-%d %H:%M:%S%.6f"));
let level = match record.level() {
Level::Debug => "<magenta>[DEBUG]</>",
Level::Error => "<red>[ERROR]</>",
Level::Info => "<green>[ INFO]</>",
Level::Trace => "<orange>[TRACE]</>",
Level::Warn => "<yellow>[ WARN]</>",
};
write!(w, "{} {} {}", timestamp, level, record.args())
}
fn main() {
let to_file = true;
Logger::try_with_str("WARN")
.expect("LogSpecification String has errors")
.format(console_formatter)
.print_message()
.log_to_stderr()
.add_writer("File", file_logger(to_file))
.add_writer("Mail", Box::new(LogMailer))
.start()
.unwrap();
// Explicitly send logs to different loggers
info!(target: "{Mail}", "This logs only to Mail");
warn!(target: "{File,Mail}", channel = 1; "This logs to File and Mail");
error!(target: "{File}", "This logs only to file");
error!(target: "{_Default}", "This logs to console");
file_error!("This is another file log");
error!("This is a <bright red>normal error</> message");
warn!("This is a warning");
info!("This is an info message");
debug!("This is a debug message");
trace!("This is a trace message");
}


@ -0,0 +1,85 @@
use flexi_logger::writers::{FileLogWriter, LogWriter};
use flexi_logger::{Age, Cleanup, Criterion, DeferredNow, FileSpec, Naming, Record};
use log::{debug, error, info, kv::Value, trace, warn};
use std::collections::HashMap;
use std::io;
use std::sync::{Arc, Mutex};
struct MultiFileLogger {
writers: Arc<Mutex<HashMap<String, Arc<Mutex<FileLogWriter>>>>>,
}
impl MultiFileLogger {
pub fn new() -> Self {
MultiFileLogger {
writers: Arc::new(Mutex::new(HashMap::new())),
}
}
fn get_writer(&self, channel: &str) -> io::Result<Arc<Mutex<FileLogWriter>>> {
let mut writers = self.writers.lock().unwrap();
if !writers.contains_key(channel) {
let writer = FileLogWriter::builder(
FileSpec::default()
.suppress_timestamp()
.basename("ffplayout"),
)
.append()
.rotate(
Criterion::Age(Age::Day),
Naming::TimestampsCustomFormat {
current_infix: Some(""),
format: "%Y-%m-%d",
},
Cleanup::KeepLogFiles(7),
)
.print_message()
.try_build()
.map_err(|e| io::Error::new(io::ErrorKind::Other, e.to_string()))?;
writers.insert(channel.to_string(), Arc::new(Mutex::new(writer)));
}
Ok(writers.get(channel).unwrap().clone())
}
}
impl LogWriter for MultiFileLogger {
fn write(&self, now: &mut DeferredNow, record: &Record) -> io::Result<()> {
let channel = record
.key_values()
.get("channel".into())
.unwrap_or(Value::null())
.to_string();
self.get_writer(&channel)?.lock().unwrap().write(now, record)
}
fn flush(&self) -> io::Result<()> {
let writers = self.writers.lock().unwrap();
for writer in writers.values() {
writer.lock().unwrap().flush()?;
}
Ok(())
}
}
fn main() {
let logger = MultiFileLogger::new();
flexi_logger::Logger::try_with_str("trace")
.expect("LogSpecification String has errors")
.print_message()
.add_writer("file", Box::new(logger))
.log_to_stderr()
.start()
.unwrap();
trace!(target: "{file}", channel = 1; "This is a trace message for file1");
trace!("This is a trace message for console");
debug!(target: "{file}", channel = 2; "This is a debug message for file2");
info!(target:"{file}", channel = 2; "This is an info message for file2");
warn!(target: "{file}", channel = 1; "This is a warning for file1");
error!(target: "{file}", channel = 2; "This is an error message for file2");
info!("This is a info message for console");
}


@ -4,7 +4,10 @@ use chrono::{TimeDelta, Utc};
use jsonwebtoken::{self, DecodingKey, EncodingKey, Header, Validation};
use serde::{Deserialize, Serialize};
use crate::utils::{GlobalSettings, Role};
use crate::{
db::models::{GlobalSettings, Role},
utils::errors::ServiceError,
};
// Token lifetime
const JWT_EXPIRATION_DAYS: i64 = 7;
@ -12,15 +15,17 @@ const JWT_EXPIRATION_DAYS: i64 = 7;
#[derive(Clone, Debug, Serialize, Deserialize, Eq, PartialEq)]
pub struct Claims {
pub id: i32,
pub channels: Vec<i32>,
pub username: String,
pub role: Role,
exp: i64,
}
impl Claims {
pub fn new(id: i32, username: String, role: Role) -> Self {
pub fn new(id: i32, channels: Vec<i32>, username: String, role: Role) -> Self {
Self {
id,
channels,
username,
role,
exp: (Utc::now() + TimeDelta::try_days(JWT_EXPIRATION_DAYS).unwrap()).timestamp(),
@ -29,17 +34,20 @@ impl Claims {
}
/// Create a json web token (JWT)
pub fn create_jwt(claims: Claims) -> Result<String, Error> {
pub async fn create_jwt(claims: Claims) -> Result<String, ServiceError> {
let config = GlobalSettings::global();
let encoding_key = EncodingKey::from_secret(config.secret.as_bytes());
jsonwebtoken::encode(&Header::default(), &claims, &encoding_key)
.map_err(|e| ErrorUnauthorized(e.to_string()))
let encoding_key = EncodingKey::from_secret(config.secret.clone().unwrap().as_bytes());
Ok(jsonwebtoken::encode(
&Header::default(),
&claims,
&encoding_key,
)?)
}
/// Decode a json web token (JWT)
pub async fn decode_jwt(token: &str) -> Result<Claims, Error> {
let config = GlobalSettings::global();
let decoding_key = DecodingKey::from_secret(config.secret.as_bytes());
let decoding_key = DecodingKey::from_secret(config.secret.clone().unwrap().as_bytes());
jsonwebtoken::decode::<Claims>(token, &decoding_key, &Validation::default())
.map(|data| data.claims)
.map_err(|e| ErrorUnauthorized(e.to_string()))

ffplayout/src/db/handles.rs Normal file

@ -0,0 +1,479 @@
use argon2::{
password_hash::{rand_core::OsRng, SaltString},
Argon2, PasswordHasher,
};
use log::*;
use rand::{distributions::Alphanumeric, Rng};
use sqlx::{sqlite::SqliteQueryResult, Pool, Row, Sqlite};
use tokio::task;
use super::models::{AdvancedConfiguration, Configuration};
use crate::db::models::{Channel, GlobalSettings, Role, TextPreset, User};
use crate::utils::{advanced_config::AdvancedConfig, config::PlayoutConfig, local_utc_offset};
pub async fn db_migrate(conn: &Pool<Sqlite>) -> Result<&'static str, Box<dyn std::error::Error>> {
match sqlx::migrate!("../migrations").run(conn).await {
Ok(_) => info!("Database migration successful"),
Err(e) => panic!("{e}"),
}
if select_global(conn).await.is_err() {
let secret: String = rand::thread_rng()
.sample_iter(&Alphanumeric)
.take(80)
.map(char::from)
.collect();
let query = "CREATE TRIGGER global_row_count
BEFORE INSERT ON global
WHEN (SELECT COUNT(*) FROM global) >= 1
BEGIN
SELECT RAISE(FAIL, 'Database is already initialized!');
END;
INSERT INTO global(secret) VALUES($1);";
sqlx::query(query).bind(secret).execute(conn).await?;
}
Ok("Database migrated!")
}
pub async fn select_global(conn: &Pool<Sqlite>) -> Result<GlobalSettings, sqlx::Error> {
let query = "SELECT id, secret, hls_path, logging_path, playlist_path, storage_path, shared_storage FROM global WHERE id = 1";
sqlx::query_as(query).fetch_one(conn).await
}
pub async fn update_global(
conn: &Pool<Sqlite>,
global: GlobalSettings,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query = "UPDATE global SET hls_path = $2, playlist_path = $3, storage_path = $4, logging_path = $5, shared_storage = $6 WHERE id = 1";
sqlx::query(query)
.bind(global.id)
.bind(global.hls_path)
.bind(global.playlist_path)
.bind(global.storage_path)
.bind(global.logging_path)
.bind(global.shared_storage)
.execute(conn)
.await
}
pub async fn select_channel(conn: &Pool<Sqlite>, id: &i32) -> Result<Channel, sqlx::Error> {
let query = "SELECT * FROM channels WHERE id = $1";
let mut result: Channel = sqlx::query_as(query).bind(id).fetch_one(conn).await?;
result.utc_offset = local_utc_offset();
Ok(result)
}
pub async fn select_related_channels(
conn: &Pool<Sqlite>,
user_id: Option<i32>,
) -> Result<Vec<Channel>, sqlx::Error> {
let query = match user_id {
Some(id) => format!(
"SELECT c.id, c.name, c.preview_url, c.extra_extensions, c.active, c.last_date, c.time_shift FROM channels c
left join user_channels uc on uc.channel_id = c.id
left join user u on u.id = uc.user_id
WHERE u.id = {id} ORDER BY c.id ASC;"
),
None => "SELECT * FROM channels ORDER BY id ASC;".to_string(),
};
let mut results: Vec<Channel> = sqlx::query_as(&query).fetch_all(conn).await?;
for result in results.iter_mut() {
result.utc_offset = local_utc_offset();
}
Ok(results)
}
pub async fn update_channel(
conn: &Pool<Sqlite>,
id: i32,
channel: Channel,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query =
"UPDATE channels SET name = $2, preview_url = $3, extra_extensions = $4 WHERE id = $1";
sqlx::query(query)
.bind(id)
.bind(channel.name)
.bind(channel.preview_url)
.bind(channel.extra_extensions)
.execute(conn)
.await
}
pub async fn update_stat(
conn: &Pool<Sqlite>,
id: i32,
last_date: String,
time_shift: f64,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query = "UPDATE channels SET last_date = $2, time_shift = $3 WHERE id = $1";
sqlx::query(query)
.bind(id)
.bind(last_date)
.bind(time_shift)
.execute(conn)
.await
}
pub async fn update_player(
conn: &Pool<Sqlite>,
id: i32,
active: bool,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query = "UPDATE channels SET active = $2 WHERE id = $1";
sqlx::query(query).bind(id).bind(active).execute(conn).await
}
pub async fn insert_channel(conn: &Pool<Sqlite>, channel: Channel) -> Result<Channel, sqlx::Error> {
let query = "INSERT INTO channels (name, preview_url, extra_extensions) VALUES($1, $2, $3)";
let result = sqlx::query(query)
.bind(channel.name)
.bind(channel.preview_url)
.bind(channel.extra_extensions)
.execute(conn)
.await?;
sqlx::query_as("SELECT * FROM channels WHERE id = $1")
.bind(result.last_insert_rowid())
.fetch_one(conn)
.await
}
pub async fn delete_channel(
conn: &Pool<Sqlite>,
id: &i32,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query = "DELETE FROM channels WHERE id = $1";
sqlx::query(query).bind(id).execute(conn).await
}
pub async fn select_last_channel(conn: &Pool<Sqlite>) -> Result<i32, sqlx::Error> {
let query = "SELECT seq FROM sqlite_sequence WHERE name = 'channels';";
sqlx::query_scalar(query).fetch_one(conn).await
}
pub async fn select_configuration(
conn: &Pool<Sqlite>,
channel: i32,
) -> Result<Configuration, sqlx::Error> {
let query = "SELECT * FROM configurations WHERE channel_id = $1";
sqlx::query_as(query).bind(channel).fetch_one(conn).await
}
pub async fn insert_configuration(
conn: &Pool<Sqlite>,
channel_id: i32,
output_param: String,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query = "INSERT INTO configurations (channel_id, output_param) VALUES($1, $2)";
sqlx::query(query)
.bind(channel_id)
.bind(output_param)
.execute(conn)
.await
}
pub async fn update_configuration(
conn: &Pool<Sqlite>,
id: i32,
config: PlayoutConfig,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query = "UPDATE configurations SET general_stop_threshold = $2, mail_subject = $3, mail_smtp = $4, mail_addr = $5, mail_pass = $6, mail_recipient = $7, mail_starttls = $8, mail_level = $9, mail_interval = $10, logging_ffmpeg_level = $11, logging_ingest_level = $12, logging_detect_silence = $13, logging_ignore = $14, processing_mode = $15, processing_audio_only = $16, processing_copy_audio = $17, processing_copy_video = $18, processing_width = $19, processing_height = $20, processing_aspect = $21, processing_fps = $22, processing_add_logo = $23, processing_logo = $24, processing_logo_scale = $25, processing_logo_opacity = $26, processing_logo_position = $27, processing_audio_tracks = $28, processing_audio_track_index = $29, processing_audio_channels = $30, processing_volume = $31, processing_filter = $32, ingest_enable = $33, ingest_param = $34, ingest_filter = $35, playlist_day_start = $36, playlist_length = $37, playlist_infinit = $38, storage_filler = $39, storage_extensions = $40, storage_shuffle = $41, text_add = $42, text_from_filename = $43, text_font = $44, text_style = $45, text_regex = $46, task_enable = $47, task_path = $48, output_mode = $49, output_param = $50 WHERE id = $1";
sqlx::query(query)
.bind(id)
.bind(config.general.stop_threshold)
.bind(config.mail.subject)
.bind(config.mail.smtp_server)
.bind(config.mail.sender_addr)
.bind(config.mail.sender_pass)
.bind(config.mail.recipient)
.bind(config.mail.starttls)
.bind(config.mail.mail_level.as_str())
.bind(config.mail.interval)
.bind(config.logging.ffmpeg_level)
.bind(config.logging.ingest_level)
.bind(config.logging.detect_silence)
.bind(config.logging.ignore_lines.join(";"))
.bind(config.processing.mode.to_string())
.bind(config.processing.audio_only)
.bind(config.processing.copy_audio)
.bind(config.processing.copy_video)
.bind(config.processing.width)
.bind(config.processing.height)
.bind(config.processing.aspect)
.bind(config.processing.fps)
.bind(config.processing.add_logo)
.bind(config.processing.logo)
.bind(config.processing.logo_scale)
.bind(config.processing.logo_opacity)
.bind(config.processing.logo_position)
.bind(config.processing.audio_tracks)
.bind(config.processing.audio_track_index)
.bind(config.processing.audio_channels)
.bind(config.processing.volume)
.bind(config.processing.custom_filter)
.bind(config.ingest.enable)
.bind(config.ingest.input_param)
.bind(config.ingest.custom_filter)
.bind(config.playlist.day_start)
.bind(config.playlist.length)
.bind(config.playlist.infinit)
.bind(config.storage.filler.to_string_lossy().to_string())
.bind(config.storage.extensions.join(";"))
.bind(config.storage.shuffle)
.bind(config.text.add_text)
.bind(config.text.text_from_filename)
.bind(config.text.fontfile)
.bind(config.text.style)
.bind(config.text.regex)
.bind(config.task.enable)
.bind(config.task.path.to_string_lossy().to_string())
.bind(config.output.mode.to_string())
.bind(config.output.output_param)
.execute(conn)
.await
}
pub async fn insert_advanced_configuration(
conn: &Pool<Sqlite>,
channel_id: i32,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query = "INSERT INTO advanced_configurations (channel_id) VALUES($1)";
sqlx::query(query).bind(channel_id).execute(conn).await
}
pub async fn update_advanced_configuration(
conn: &Pool<Sqlite>,
channel_id: i32,
config: AdvancedConfig,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query = "UPDATE advanced_configurations SET decoder_input_param = $2, decoder_output_param = $3, encoder_input_param = $4, ingest_input_param = $5, filter_deinterlace = $6, filter_pad_scale_w = $7, filter_pad_scale_h = $8, filter_pad_video = $9, filter_fps = $10, filter_scale = $11, filter_set_dar = $12, filter_fade_in = $13, filter_fade_out = $14, filter_overlay_logo_scale = $15, filter_overlay_logo_fade_in = $16, filter_overlay_logo_fade_out = $17, filter_overlay_logo = $18, filter_tpad = $19, filter_drawtext_from_file = $20, filter_drawtext_from_zmq = $21, filter_aevalsrc = $22, filter_afade_in = $23, filter_afade_out = $24, filter_apad = $25, filter_volume = $26, filter_split = $27 WHERE channel_id = $1";
sqlx::query(query)
.bind(channel_id)
.bind(config.decoder.input_param)
.bind(config.decoder.output_param)
.bind(config.encoder.input_param)
.bind(config.ingest.input_param)
.bind(config.filter.deinterlace)
.bind(config.filter.pad_scale_w)
.bind(config.filter.pad_scale_h)
.bind(config.filter.pad_video)
.bind(config.filter.fps)
.bind(config.filter.scale)
.bind(config.filter.set_dar)
.bind(config.filter.fade_in)
.bind(config.filter.fade_out)
.bind(config.filter.overlay_logo_scale)
.bind(config.filter.overlay_logo_fade_in)
.bind(config.filter.overlay_logo_fade_out)
.bind(config.filter.overlay_logo)
.bind(config.filter.tpad)
.bind(config.filter.drawtext_from_file)
.bind(config.filter.drawtext_from_zmq)
.bind(config.filter.aevalsrc)
.bind(config.filter.afade_in)
.bind(config.filter.afade_out)
.bind(config.filter.apad)
.bind(config.filter.volume)
.bind(config.filter.split)
.execute(conn)
.await
}
pub async fn select_advanced_configuration(
conn: &Pool<Sqlite>,
channel: i32,
) -> Result<AdvancedConfiguration, sqlx::Error> {
let query = "SELECT * FROM advanced_configurations WHERE channel_id = $1";
sqlx::query_as(query).bind(channel).fetch_one(conn).await
}
pub async fn select_role(conn: &Pool<Sqlite>, id: &i32) -> Result<Role, sqlx::Error> {
let query = "SELECT name FROM roles WHERE id = $1";
let result: Role = sqlx::query_as(query).bind(id).fetch_one(conn).await?;
Ok(result)
}
pub async fn select_login(conn: &Pool<Sqlite>, user: &str) -> Result<User, sqlx::Error> {
let query =
"SELECT u.id, u.mail, u.username, u.password, u.role_id, group_concat(uc.channel_id, ',') as channel_ids FROM user u
left join user_channels uc on uc.user_id = u.id
WHERE u.username = $1";
sqlx::query_as(query).bind(user).fetch_one(conn).await
}
pub async fn select_user(conn: &Pool<Sqlite>, id: i32) -> Result<User, sqlx::Error> {
let query = "SELECT u.id, u.mail, u.username, u.role_id, group_concat(uc.channel_id, ',') as channel_ids FROM user u
left join user_channels uc on uc.user_id = u.id
WHERE u.id = $1";
sqlx::query_as(query).bind(id).fetch_one(conn).await
}
pub async fn select_global_admins(conn: &Pool<Sqlite>) -> Result<Vec<User>, sqlx::Error> {
let query = "SELECT u.id, u.mail, u.username, u.role_id, group_concat(uc.channel_id, ',') as channel_ids FROM user u
left join user_channels uc on uc.user_id = u.id
WHERE u.role_id = 1";
sqlx::query_as(query).fetch_all(conn).await
}
pub async fn select_users(conn: &Pool<Sqlite>) -> Result<Vec<User>, sqlx::Error> {
let query = "SELECT id, username FROM user";
sqlx::query_as(query).fetch_all(conn).await
}
pub async fn insert_user(conn: &Pool<Sqlite>, user: User) -> Result<(), sqlx::Error> {
let password_hash = task::spawn_blocking(move || {
let salt = SaltString::generate(&mut OsRng);
let hash = Argon2::default()
.hash_password(user.password.clone().as_bytes(), &salt)
.unwrap();
hash.to_string()
})
.await
.unwrap();
let query =
"INSERT INTO user (mail, username, password, role_id) VALUES($1, $2, $3, $4) RETURNING id";
let user_id: i32 = sqlx::query(query)
.bind(user.mail)
.bind(user.username)
.bind(password_hash)
.bind(user.role_id)
.fetch_one(conn)
.await?
.get("id");
if let Some(channel_ids) = user.channel_ids {
insert_user_channel(conn, user_id, channel_ids).await?;
}
Ok(())
}
pub async fn update_user(
conn: &Pool<Sqlite>,
id: i32,
fields: String,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query = format!("UPDATE user SET {fields} WHERE id = $1");
sqlx::query(&query).bind(id).execute(conn).await
}
pub async fn insert_user_channel(
conn: &Pool<Sqlite>,
user_id: i32,
channel_ids: Vec<i32>,
) -> Result<(), sqlx::Error> {
for channel in &channel_ids {
let query = "INSERT OR IGNORE INTO user_channels (channel_id, user_id) VALUES ($1, $2);";
sqlx::query(query)
.bind(channel)
.bind(user_id)
.execute(conn)
.await?;
}
Ok(())
}
pub async fn delete_user(conn: &Pool<Sqlite>, id: i32) -> Result<SqliteQueryResult, sqlx::Error> {
let query = "DELETE FROM user WHERE id = $1;";
sqlx::query(query).bind(id).execute(conn).await
}
pub async fn select_presets(conn: &Pool<Sqlite>, id: i32) -> Result<Vec<TextPreset>, sqlx::Error> {
let query = "SELECT * FROM presets WHERE channel_id = $1";
sqlx::query_as(query).bind(id).fetch_all(conn).await
}
pub async fn update_preset(
conn: &Pool<Sqlite>,
id: &i32,
preset: TextPreset,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query =
"UPDATE presets SET name = $1, text = $2, x = $3, y = $4, fontsize = $5, line_spacing = $6,
fontcolor = $7, alpha = $8, box = $9, boxcolor = $10, boxborderw = $11 WHERE id = $12";
sqlx::query(query)
.bind(preset.name)
.bind(preset.text)
.bind(preset.x)
.bind(preset.y)
.bind(preset.fontsize)
.bind(preset.line_spacing)
.bind(preset.fontcolor)
.bind(preset.alpha)
.bind(preset.r#box)
.bind(preset.boxcolor)
.bind(preset.boxborderw)
.bind(id)
.execute(conn)
.await
}
pub async fn insert_preset(
conn: &Pool<Sqlite>,
preset: TextPreset,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query =
"INSERT INTO presets (channel_id, name, text, x, y, fontsize, line_spacing, fontcolor, alpha, box, boxcolor, boxborderw)
VALUES($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12)";
sqlx::query(query)
.bind(preset.channel_id)
.bind(preset.name)
.bind(preset.text)
.bind(preset.x)
.bind(preset.y)
.bind(preset.fontsize)
.bind(preset.line_spacing)
.bind(preset.fontcolor)
.bind(preset.alpha)
.bind(preset.r#box)
.bind(preset.boxcolor)
.bind(preset.boxborderw)
.execute(conn)
.await
}
pub async fn delete_preset(
conn: &Pool<Sqlite>,
id: &i32,
) -> Result<SqliteQueryResult, sqlx::Error> {
let query = "DELETE FROM presets WHERE id = $1;";
sqlx::query(query).bind(id).execute(conn).await
}


@ -1,4 +1,4 @@
use sqlx::{Pool, Sqlite, SqlitePool};
use sqlx::{migrate::MigrateDatabase, Pool, Sqlite, SqlitePool};
pub mod handles;
pub mod models;
@ -7,6 +7,11 @@ use crate::utils::db_path;
pub async fn db_pool() -> Result<Pool<Sqlite>, sqlx::Error> {
let db_path = db_path().unwrap();
if !Sqlite::database_exists(db_path).await.unwrap_or(false) {
Sqlite::create_database(db_path).await.unwrap();
}
let conn = SqlitePool::connect(db_path).await?;
Ok(conn)

ffplayout/src/db/models.rs Normal file

@ -0,0 +1,445 @@
use std::{error::Error, fmt, str::FromStr};
use once_cell::sync::OnceCell;
use regex::Regex;
use serde::{
de::{self, Visitor},
Deserialize, Serialize,
};
// use serde_with::{formats::CommaSeparator, serde_as, StringWithSeparator};
use sqlx::{sqlite::SqliteRow, FromRow, Pool, Row, Sqlite};
use crate::db::handles;
use crate::utils::config::PlayoutConfig;
#[derive(Clone, Debug, Deserialize, Serialize, sqlx::FromRow)]
pub struct GlobalSettings {
pub id: i32,
pub secret: Option<String>,
pub hls_path: String,
pub logging_path: String,
pub playlist_path: String,
pub storage_path: String,
pub shared_storage: bool,
}
impl GlobalSettings {
pub async fn new(conn: &Pool<Sqlite>) -> Self {
let global_settings = handles::select_global(conn);
match global_settings.await {
Ok(g) => g,
Err(_) => GlobalSettings {
id: 0,
secret: None,
hls_path: String::new(),
logging_path: String::new(),
playlist_path: String::new(),
storage_path: String::new(),
shared_storage: false,
},
}
}
pub fn global() -> &'static GlobalSettings {
INSTANCE.get().expect("Config is not initialized")
}
}
static INSTANCE: OnceCell<GlobalSettings> = OnceCell::new();
pub async fn init_globales(conn: &Pool<Sqlite>) {
let config = GlobalSettings::new(conn).await;
INSTANCE.set(config).unwrap();
}
// #[serde_as]
#[derive(Clone, Debug, Deserialize, Serialize)]
pub struct User {
#[serde(skip_deserializing)]
pub id: i32,
#[serde(skip_serializing_if = "Option::is_none")]
pub mail: Option<String>,
pub username: String,
#[serde(skip_serializing, default = "empty_string")]
pub password: String,
pub role_id: Option<i32>,
// #[serde_as(as = "StringWithSeparator::<CommaSeparator, i32>")]
pub channel_ids: Option<Vec<i32>>,
#[serde(skip_serializing_if = "Option::is_none")]
pub token: Option<String>,
}
impl FromRow<'_, SqliteRow> for User {
fn from_row(row: &SqliteRow) -> sqlx::Result<Self> {
Ok(Self {
id: row.try_get("id").unwrap_or_default(),
mail: row.try_get("mail").unwrap_or_default(),
username: row.try_get("username").unwrap_or_default(),
password: row.try_get("password").unwrap_or_default(),
role_id: row.try_get("role_id").unwrap_or_default(),
channel_ids: Some(
row.try_get::<String, &str>("channel_ids")
.unwrap_or_default()
.split(',')
.map(|i| i.parse::<i32>().unwrap_or_default())
.collect(),
),
token: None,
})
}
}
fn empty_string() -> String {
"".to_string()
}
#[derive(Debug, Deserialize, Serialize, Clone)]
pub struct UserMeta {
pub id: i32,
pub channels: Vec<i32>,
}
impl UserMeta {
pub fn new(id: i32, channels: Vec<i32>) -> Self {
Self { id, channels }
}
}
#[derive(Clone, Debug, Eq, Hash, PartialEq, Serialize, Deserialize)]
pub enum Role {
GlobalAdmin,
ChannelAdmin,
User,
Guest,
}
impl Role {
pub fn set_role(role: &str) -> Self {
match role {
"global_admin" => Role::GlobalAdmin,
"channel_admin" => Role::ChannelAdmin,
"user" => Role::User,
_ => Role::Guest,
}
}
}
impl FromStr for Role {
type Err = String;
fn from_str(input: &str) -> Result<Self, Self::Err> {
match input {
"global_admin" => Ok(Self::GlobalAdmin),
"channel_admin" => Ok(Self::ChannelAdmin),
"user" => Ok(Self::User),
_ => Ok(Self::Guest),
}
}
}
impl fmt::Display for Role {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
match *self {
Self::GlobalAdmin => write!(f, "global_admin"),
Self::ChannelAdmin => write!(f, "channel_admin"),
Self::User => write!(f, "user"),
Self::Guest => write!(f, "guest"),
}
}
}
impl<'r> sqlx::decode::Decode<'r, ::sqlx::Sqlite> for Role
where
&'r str: sqlx::decode::Decode<'r, sqlx::Sqlite>,
{
fn decode(
value: <sqlx::Sqlite as sqlx::database::HasValueRef<'r>>::ValueRef,
) -> Result<Role, Box<dyn Error + 'static + Send + Sync>> {
let value = <&str as sqlx::decode::Decode<sqlx::Sqlite>>::decode(value)?;
Ok(value.parse()?)
}
}
impl FromRow<'_, SqliteRow> for Role {
fn from_row(row: &SqliteRow) -> sqlx::Result<Self> {
match row.get("name") {
"global_admin" => Ok(Self::GlobalAdmin),
"channel_admin" => Ok(Self::ChannelAdmin),
"user" => Ok(Self::User),
_ => Ok(Self::Guest),
}
}
}
#[derive(Debug, Deserialize, Serialize, Clone, sqlx::FromRow)]
pub struct TextPreset {
#[sqlx(default)]
#[serde(skip_deserializing)]
pub id: i32,
pub channel_id: i32,
pub name: String,
pub text: String,
pub x: String,
pub y: String,
#[serde(deserialize_with = "deserialize_number_or_string")]
pub fontsize: String,
#[serde(deserialize_with = "deserialize_number_or_string")]
pub line_spacing: String,
pub fontcolor: String,
pub r#box: String,
pub boxcolor: String,
#[serde(deserialize_with = "deserialize_number_or_string")]
pub boxborderw: String,
#[serde(deserialize_with = "deserialize_number_or_string")]
pub alpha: String,
}
/// Deserialize number or string
pub fn deserialize_number_or_string<'de, D>(deserializer: D) -> Result<String, D::Error>
where
D: serde::Deserializer<'de>,
{
struct StringOrNumberVisitor;
impl<'de> Visitor<'de> for StringOrNumberVisitor {
type Value = String;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
formatter.write_str("a string or a number")
}
fn visit_str<E: de::Error>(self, value: &str) -> Result<Self::Value, E> {
let re = Regex::new(r"0,([0-9]+)").unwrap();
let clean_string = re.replace_all(value, "0.$1").to_string();
Ok(clean_string)
}
fn visit_u64<E: de::Error>(self, value: u64) -> Result<Self::Value, E> {
Ok(value.to_string())
}
fn visit_i64<E: de::Error>(self, value: i64) -> Result<Self::Value, E> {
Ok(value.to_string())
}
fn visit_f64<E: de::Error>(self, value: f64) -> Result<Self::Value, E> {
Ok(value.to_string())
}
}
deserializer.deserialize_any(StringOrNumberVisitor)
}
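The visitor above also rewrites comma decimals (`0,75` → `0.75`) via the regex `0,([0-9]+)` before handing the string on. A std-only sketch of that same substitution, using a hypothetical `normalize_decimal` helper instead of the `regex` crate:

```rust
// Sketch of the comma-decimal normalization done in visit_str above.
// Assumption: only "0,<digits>" sequences need rewriting, matching the
// regex `0,([0-9]+)` used by the real visitor.
fn normalize_decimal(value: &str) -> String {
    let chars: Vec<char> = value.chars().collect();
    let mut out = String::new();
    let mut i = 0;
    while i < chars.len() {
        if chars[i] == '0'
            && i + 2 < chars.len()
            && chars[i + 1] == ','
            && chars[i + 2].is_ascii_digit()
        {
            // Emit "0." and skip the comma; the digits follow as-is.
            out.push('0');
            out.push('.');
            i += 2;
        } else {
            out.push(chars[i]);
            i += 1;
        }
    }
    out
}

fn main() {
    assert_eq!(normalize_decimal("0,75"), "0.75");
    assert_eq!(normalize_decimal("10,5"), "10.5");
    assert_eq!(normalize_decimal("boxborderw=4"), "boxborderw=4");
    println!("{}", normalize_decimal("0,75"));
}
```

This mirrors the replacement `0.$1` that the visitor applies so that frontend values entered with a comma decimal separator still parse as ffmpeg filter parameters.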
#[derive(Clone, Debug, Default, Deserialize, Serialize, sqlx::FromRow)]
pub struct Channel {
#[serde(default = "default_id", skip_deserializing)]
pub id: i32,
pub name: String,
pub preview_url: String,
pub extra_extensions: String,
pub active: bool,
pub last_date: Option<String>,
pub time_shift: f64,
#[sqlx(default)]
#[serde(default)]
pub utc_offset: i32,
}
fn default_id() -> i32 {
1
}
#[derive(Clone, Debug, Deserialize, Serialize, sqlx::FromRow)]
pub struct Configuration {
pub id: i32,
pub channel_id: i32,
pub general_help: String,
pub general_stop_threshold: f64,
pub mail_help: String,
pub mail_subject: String,
pub mail_smtp: String,
pub mail_addr: String,
pub mail_pass: String,
pub mail_recipient: String,
pub mail_starttls: bool,
pub mail_level: String,
pub mail_interval: i64,
pub logging_help: String,
pub logging_ffmpeg_level: String,
pub logging_ingest_level: String,
pub logging_detect_silence: bool,
#[serde(default)]
pub logging_ignore: String,
pub processing_help: String,
pub processing_mode: String,
pub processing_audio_only: bool,
pub processing_copy_audio: bool,
pub processing_copy_video: bool,
pub processing_width: i64,
pub processing_height: i64,
pub processing_aspect: f64,
pub processing_fps: f64,
pub processing_add_logo: bool,
pub processing_logo: String,
pub processing_logo_scale: String,
pub processing_logo_opacity: f64,
pub processing_logo_position: String,
#[serde(default = "default_tracks")]
pub processing_audio_tracks: i32,
#[serde(default = "default_track_index")]
pub processing_audio_track_index: i32,
#[serde(default = "default_channels")]
pub processing_audio_channels: u8,
pub processing_volume: f64,
#[serde(default)]
pub processing_filter: String,
pub ingest_help: String,
pub ingest_enable: bool,
pub ingest_param: String,
#[serde(default)]
pub ingest_filter: String,
pub playlist_help: String,
pub playlist_day_start: String,
pub playlist_length: String,
pub playlist_infinit: bool,
pub storage_help: String,
pub storage_filler: String,
pub storage_extensions: String,
pub storage_shuffle: bool,
pub text_help: String,
pub text_add: bool,
pub text_from_filename: bool,
pub text_font: String,
pub text_style: String,
pub text_regex: String,
pub task_help: String,
pub task_enable: bool,
pub task_path: String,
pub output_help: String,
pub output_mode: String,
pub output_param: String,
}
impl Configuration {
pub fn from(id: i32, channel_id: i32, config: PlayoutConfig) -> Self {
Self {
id,
channel_id,
general_help: config.general.help_text,
general_stop_threshold: config.general.stop_threshold,
mail_help: config.mail.help_text,
mail_subject: config.mail.subject,
mail_smtp: config.mail.smtp_server,
mail_starttls: config.mail.starttls,
mail_addr: config.mail.sender_addr,
mail_pass: config.mail.sender_pass,
mail_recipient: config.mail.recipient,
mail_level: config.mail.mail_level.to_string(),
mail_interval: config.mail.interval,
logging_help: config.logging.help_text,
logging_ffmpeg_level: config.logging.ffmpeg_level,
logging_ingest_level: config.logging.ingest_level,
logging_detect_silence: config.logging.detect_silence,
logging_ignore: config.logging.ignore_lines.join(";"),
processing_help: config.processing.help_text,
processing_mode: config.processing.mode.to_string(),
processing_audio_only: config.processing.audio_only,
processing_audio_track_index: config.processing.audio_track_index,
processing_copy_audio: config.processing.copy_audio,
processing_copy_video: config.processing.copy_video,
processing_width: config.processing.width,
processing_height: config.processing.height,
processing_aspect: config.processing.aspect,
processing_fps: config.processing.fps,
processing_add_logo: config.processing.add_logo,
processing_logo: config.processing.logo,
processing_logo_scale: config.processing.logo_scale,
processing_logo_opacity: config.processing.logo_opacity,
processing_logo_position: config.processing.logo_position,
processing_audio_tracks: config.processing.audio_tracks,
processing_audio_channels: config.processing.audio_channels,
processing_volume: config.processing.volume,
processing_filter: config.processing.custom_filter,
ingest_help: config.ingest.help_text,
ingest_enable: config.ingest.enable,
ingest_param: config.ingest.input_param,
ingest_filter: config.ingest.custom_filter,
playlist_help: config.playlist.help_text,
playlist_day_start: config.playlist.day_start,
playlist_length: config.playlist.length,
playlist_infinit: config.playlist.infinit,
storage_help: config.storage.help_text,
storage_filler: config.storage.filler.to_string_lossy().to_string(),
storage_extensions: config.storage.extensions.join(";"),
storage_shuffle: config.storage.shuffle,
text_help: config.text.help_text,
text_add: config.text.add_text,
text_font: config.text.fontfile,
text_from_filename: config.text.text_from_filename,
text_style: config.text.style,
text_regex: config.text.regex,
task_help: config.task.help_text,
task_enable: config.task.enable,
task_path: config.task.path.to_string_lossy().to_string(),
output_help: config.output.help_text,
output_mode: config.output.mode.to_string(),
output_param: config.output.output_param,
}
}
}
fn default_track_index() -> i32 {
-1
}
fn default_tracks() -> i32 {
1
}
fn default_channels() -> u8 {
2
}
#[derive(Clone, Debug, Deserialize, Serialize, sqlx::FromRow)]
pub struct AdvancedConfiguration {
pub id: i32,
pub channel_id: i32,
pub decoder_input_param: Option<String>,
pub decoder_output_param: Option<String>,
pub encoder_input_param: Option<String>,
pub ingest_input_param: Option<String>,
pub filter_deinterlace: Option<String>,
pub filter_pad_scale_w: Option<String>,
pub filter_pad_scale_h: Option<String>,
pub filter_pad_video: Option<String>,
pub filter_fps: Option<String>,
pub filter_scale: Option<String>,
pub filter_set_dar: Option<String>,
pub filter_fade_in: Option<String>,
pub filter_fade_out: Option<String>,
pub filter_overlay_logo_scale: Option<String>,
pub filter_overlay_logo_fade_in: Option<String>,
pub filter_overlay_logo_fade_out: Option<String>,
pub filter_overlay_logo: Option<String>,
pub filter_tpad: Option<String>,
pub filter_drawtext_from_file: Option<String>,
pub filter_drawtext_from_zmq: Option<String>,
pub filter_aevalsrc: Option<String>,
pub filter_afade_in: Option<String>,
pub filter_afade_out: Option<String>,
pub filter_apad: Option<String>,
pub filter_volume: Option<String>,
pub filter_split: Option<String>,
}


@ -6,9 +6,12 @@ use sysinfo::{Disks, Networks, System};
pub mod api;
pub mod db;
pub mod macros;
pub mod player;
pub mod sse;
pub mod utils;
use utils::advanced_config::AdvancedConfig;
use utils::args_parse::Args;
lazy_static! {


@ -1,4 +1,12 @@
use std::{collections::HashSet, env, process::exit, sync::Arc};
use std::{
collections::HashSet,
env,
fs::File,
io,
process::exit,
sync::{atomic::AtomicBool, Arc, Mutex},
thread,
};
use actix_files::Files;
use actix_web::{
@ -10,26 +18,43 @@ use actix_web_httpauth::{extractors::bearer::BearerAuth, middleware::HttpAuthent
#[cfg(all(not(debug_assertions), feature = "embed_frontend"))]
use actix_web_static_files::ResourceFiles;
use log::*;
use path_clean::PathClean;
use simplelog::*;
use tokio::sync::Mutex;
use ffplayout_api::{
use ffplayout::{
api::{auth, routes::*},
db::{db_pool, models::LoginUser},
sse::{broadcast::Broadcaster, routes::*, AuthState},
utils::{control::ProcessControl, db_path, init_config, run_args},
db::{
db_pool, handles,
models::{init_globales, UserMeta},
},
player::{
controller::{ChannelController, ChannelManager},
utils::{get_date, is_remote, json_validate::validate_playlist, JsonPlaylist},
},
sse::{broadcast::Broadcaster, routes::*, SseAuthState},
utils::{
args_parse::run_args,
config::get_config,
logging::{init_logging, MailQueue},
playlist::generate_playlist,
},
ARGS,
};
#[cfg(any(debug_assertions, not(feature = "embed_frontend")))]
use ffplayout_api::utils::public_path;
use ffplayout_lib::utils::{init_logging, PlayoutConfig};
use ffplayout::utils::public_path;
#[cfg(all(not(debug_assertions), feature = "embed_frontend"))]
include!(concat!(env!("OUT_DIR"), "/generated.rs"));
fn thread_counter() -> usize {
let available_threads = thread::available_parallelism()
.map(|n| n.get())
.unwrap_or(1);
(available_threads / 2).max(2)
}
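The worker-count rule can be exercised in isolation; this std-only snippet mirrors `thread_counter` above (half the available cores, never fewer than two workers):

```rust
use std::thread;

// Same logic as the thread_counter fn in main.rs: if parallelism cannot
// be queried, fall back to 1 core, then take half with a floor of 2.
fn thread_counter() -> usize {
    let available_threads = thread::available_parallelism()
        .map(|n| n.get())
        .unwrap_or(1);
    (available_threads / 2).max(2)
}

fn main() {
    // On a 1- or 4-core machine this yields 2; on 16 cores it yields 8.
    assert!(thread_counter() >= 2);
    println!("workers: {}", thread_counter());
}
```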
async fn validator(
req: ServiceRequest,
credentials: BearerAuth,
@ -40,7 +65,7 @@ async fn validator(
req.attach(vec![claims.role]);
req.extensions_mut()
.insert(LoginUser::new(claims.id, claims.username));
.insert(UserMeta::new(claims.id, claims.channels));
Ok(req)
}
@ -50,45 +75,67 @@ async fn validator(
#[actix_web::main]
async fn main() -> std::io::Result<()> {
let mut config = PlayoutConfig::new(None, None);
config.mail.recipient = String::new();
config.logging.log_to_file = false;
config.logging.timestamp = false;
let mail_queues = Arc::new(Mutex::new(vec![]));
let logging = init_logging(&config, None, None);
CombinedLogger::init(logging).unwrap();
let pool = db_pool()
.await
.map_err(|e| io::Error::new(io::ErrorKind::Other, e.to_string()))?;
if let Err(c) = run_args().await {
if ARGS.dump_advanced.is_none() && ARGS.dump_config.is_none() {
if let Err(e) = handles::db_migrate(&pool).await {
panic!("{e}");
};
}
if let Err(c) = run_args(&pool).await {
exit(c);
}
let pool = match db_pool().await {
Ok(p) => p,
Err(e) => {
error!("{e}");
exit(1);
}
};
init_globales(&pool).await;
init_logging(mail_queues.clone())?;
let channel_controllers = Arc::new(Mutex::new(ChannelController::new()));
if let Some(conn) = &ARGS.listen {
if db_path().is_err() {
error!("Database is not initialized! Init DB first and add admin user.");
exit(1);
let channels = handles::select_related_channels(&pool, None)
.await
.map_err(|e| io::Error::new(io::ErrorKind::Other, e.to_string()))?;
for channel in channels.iter() {
let config = get_config(&pool, channel.id).await?;
let manager = ChannelManager::new(Some(pool.clone()), channel.clone(), config.clone());
let m_queue = Arc::new(Mutex::new(MailQueue::new(channel.id, config.mail)));
channel_controllers
.lock()
.map_err(|e| io::Error::new(io::ErrorKind::Other, e.to_string()))?
.add(manager.clone());
if let Ok(mut mqs) = mail_queues.lock() {
mqs.push(m_queue.clone());
}
if channel.active {
manager.async_start().await;
}
}
init_config(&pool).await;
let ip_port = conn.split(':').collect::<Vec<&str>>();
let addr = ip_port[0];
let port = ip_port[1].parse::<u16>().unwrap();
let engine_process = web::Data::new(ProcessControl::new());
let auth_state = web::Data::new(AuthState {
uuids: Mutex::new(HashSet::new()),
let controllers = web::Data::from(channel_controllers.clone());
let auth_state = web::Data::new(SseAuthState {
uuids: tokio::sync::Mutex::new(HashSet::new()),
});
let broadcast_data = Broadcaster::create();
let thread_count = thread_counter();
info!("running ffplayout API, listen on http://{conn}");
info!("Running ffplayout API, listen on http://{conn}");
// no 'allow origin' here, give it to the reverse proxy
HttpServer::new(move || {
let queues = mail_queues.clone();
let auth = HttpAuthentication::bearer(validator);
let db_pool = web::Data::new(pool.clone());
// Customize logging format to get IP though proxies.
@ -97,7 +144,8 @@ async fn main() -> std::io::Result<()> {
let mut web_app = App::new()
.app_data(db_pool)
.app_data(engine_process.clone())
.app_data(web::Data::from(queues))
.app_data(controllers.clone())
.app_data(auth_state.clone())
.app_data(web::Data::from(Arc::clone(&broadcast_data)))
.wrap(logger)
@ -110,6 +158,8 @@ async fn main() -> std::io::Result<()> {
.service(get_by_name)
.service(get_users)
.service(remove_user)
.service(get_advanced_config)
.service(update_advanced_config)
.service(get_playout_config)
.service(update_playout_config)
.service(add_preset)
@ -125,8 +175,6 @@ async fn main() -> std::io::Result<()> {
.service(send_text_message)
.service(control_playout)
.service(media_current)
.service(media_next)
.service(media_last)
.service(process_control)
.service(get_playlist)
.service(save_playlist)
@ -184,11 +232,72 @@ async fn main() -> std::io::Result<()> {
web_app
})
.bind((addr, port))?
.workers(thread_count)
.run()
.await
.await?;
} else {
error!("Run ffpapi with listen parameter!");
let channels = ARGS.channels.clone().unwrap_or_else(|| vec![1]);
Ok(())
for (index, channel_id) in channels.iter().enumerate() {
let config = get_config(&pool, *channel_id).await?;
let channel = handles::select_channel(&pool, channel_id).await.unwrap();
let manager = ChannelManager::new(Some(pool.clone()), channel.clone(), config.clone());
if ARGS.foreground {
let m_queue = Arc::new(Mutex::new(MailQueue::new(*channel_id, config.mail)));
channel_controllers
.lock()
.map_err(|e| io::Error::new(io::ErrorKind::Other, e.to_string()))?
.add(manager.clone());
if let Ok(mut mqs) = mail_queues.lock() {
mqs.push(m_queue.clone());
}
manager.foreground_start(index).await;
} else if ARGS.generate.is_some() {
// Run a simple playlist generator and save the result to disk.
if let Err(e) = generate_playlist(manager) {
error!("{e}");
exit(1);
};
} else if ARGS.validate {
let mut playlist_path = config.global.playlist_path.clone();
let start_sec = config.playlist.start_sec.unwrap();
let date = get_date(false, start_sec, false);
if playlist_path.is_dir() || is_remote(&playlist_path.to_string_lossy()) {
let d: Vec<&str> = date.split('-').collect();
playlist_path = playlist_path
.join(d[0])
.join(d[1])
.join(date.clone())
.with_extension("json");
}
let f = File::options()
.read(true)
.write(false)
.open(&playlist_path)?;
let playlist: JsonPlaylist = serde_json::from_reader(f)?;
validate_playlist(
config,
Arc::new(Mutex::new(Vec::new())),
playlist,
Arc::new(AtomicBool::new(false)),
);
} else {
error!("Run ffplayout with parameters! Run ffplayout -h for more information.");
}
}
}
for channel in &channel_controllers.lock().unwrap().channels {
channel.stop_all();
}
Ok(())
}


@ -0,0 +1,410 @@
use std::{
fmt, fs, io,
path::Path,
process::Child,
sync::{
atomic::{AtomicBool, AtomicUsize, Ordering},
Arc, Mutex,
},
thread,
};
#[cfg(not(windows))]
use signal_child::Signalable;
use log::*;
use regex::Regex;
use serde::{Deserialize, Serialize};
use sqlx::{Pool, Sqlite};
use sysinfo::Disks;
use walkdir::WalkDir;
use crate::player::{
output::{player, write_hls},
utils::{folder::fill_filler_list, Media},
};
use crate::utils::{
config::{OutputMode::*, PlayoutConfig},
errors::ProcessError,
};
use crate::ARGS;
use crate::{
db::{handles, models::Channel},
utils::logging::Target,
};
const VERSION: &str = env!("CARGO_PKG_VERSION");
/// Defined process units.
#[derive(Clone, Debug, Default, Copy, Eq, Serialize, Deserialize, PartialEq)]
pub enum ProcessUnit {
#[default]
Decoder,
Encoder,
Ingest,
}
impl fmt::Display for ProcessUnit {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
match *self {
ProcessUnit::Decoder => write!(f, "Decoder"),
ProcessUnit::Encoder => write!(f, "Encoder"),
ProcessUnit::Ingest => write!(f, "Ingest"),
}
}
}
use ProcessUnit::*;
#[derive(Clone, Debug, Default)]
pub struct ChannelManager {
pub db_pool: Option<Pool<Sqlite>>,
pub config: Arc<Mutex<PlayoutConfig>>,
pub channel: Arc<Mutex<Channel>>,
pub decoder: Arc<Mutex<Option<Child>>>,
pub encoder: Arc<Mutex<Option<Child>>>,
pub ingest: Arc<Mutex<Option<Child>>>,
pub ingest_is_running: Arc<AtomicBool>,
pub is_terminated: Arc<AtomicBool>,
pub is_alive: Arc<AtomicBool>,
pub filter_chain: Option<Arc<Mutex<Vec<String>>>>,
pub current_date: Arc<Mutex<String>>,
pub list_init: Arc<AtomicBool>,
pub current_media: Arc<Mutex<Option<Media>>>,
pub current_list: Arc<Mutex<Vec<Media>>>,
pub filler_list: Arc<Mutex<Vec<Media>>>,
pub current_index: Arc<AtomicUsize>,
pub filler_index: Arc<AtomicUsize>,
pub run_count: Arc<AtomicUsize>,
}
impl ChannelManager {
pub fn new(db_pool: Option<Pool<Sqlite>>, channel: Channel, config: PlayoutConfig) -> Self {
Self {
db_pool,
is_alive: Arc::new(AtomicBool::new(false)),
channel: Arc::new(Mutex::new(channel)),
config: Arc::new(Mutex::new(config)),
list_init: Arc::new(AtomicBool::new(true)),
current_media: Arc::new(Mutex::new(None)),
current_list: Arc::new(Mutex::new(vec![Media::new(0, "", false)])),
filler_list: Arc::new(Mutex::new(vec![])),
current_index: Arc::new(AtomicUsize::new(0)),
filler_index: Arc::new(AtomicUsize::new(0)),
run_count: Arc::new(AtomicUsize::new(0)),
..Default::default()
}
}
pub fn update_channel(self, other: &Channel) {
let mut channel = self.channel.lock().unwrap();
channel.name.clone_from(&other.name);
channel.preview_url.clone_from(&other.preview_url);
channel.extra_extensions.clone_from(&other.extra_extensions);
channel.active.clone_from(&other.active);
channel.last_date.clone_from(&other.last_date);
channel.time_shift.clone_from(&other.time_shift);
channel.utc_offset.clone_from(&other.utc_offset);
}
pub fn update_config(&self, new_config: PlayoutConfig) {
let mut config = self.config.lock().unwrap();
*config = new_config;
}
pub async fn async_start(&self) {
if !self.is_alive.load(Ordering::SeqCst) {
self.run_count.fetch_add(1, Ordering::SeqCst);
self.is_alive.store(true, Ordering::SeqCst);
self.is_terminated.store(false, Ordering::SeqCst);
self.list_init.store(true, Ordering::SeqCst);
let pool_clone = self.db_pool.clone().unwrap();
let self_clone = self.clone();
let channel_id = self.channel.lock().unwrap().id;
if let Err(e) = handles::update_player(&pool_clone, channel_id, true).await {
error!("Unable to write player status: {e}");
};
thread::spawn(move || {
let run_count = self_clone.run_count.clone();
if let Err(e) = start_channel(self_clone) {
run_count.fetch_sub(1, Ordering::SeqCst);
error!("{e}");
};
});
}
}
pub async fn foreground_start(&self, index: usize) {
if !self.is_alive.load(Ordering::SeqCst) {
self.run_count.fetch_add(1, Ordering::SeqCst);
self.is_alive.store(true, Ordering::SeqCst);
self.is_terminated.store(false, Ordering::SeqCst);
self.list_init.store(true, Ordering::SeqCst);
let pool_clone = self.db_pool.clone().unwrap();
let self_clone = self.clone();
let channel_id = self.channel.lock().unwrap().id;
if let Err(e) = handles::update_player(&pool_clone, channel_id, true).await {
error!("Unable to write player status: {e}");
};
if index + 1 == ARGS.channels.clone().unwrap_or_default().len() {
let run_count = self_clone.run_count.clone();
tokio::task::spawn_blocking(move || {
if let Err(e) = start_channel(self_clone) {
run_count.fetch_sub(1, Ordering::SeqCst);
error!("{e}");
}
})
.await
.unwrap();
} else {
thread::spawn(move || {
let run_count = self_clone.run_count.clone();
if let Err(e) = start_channel(self_clone) {
run_count.fetch_sub(1, Ordering::SeqCst);
error!("{e}");
};
});
}
}
}
pub fn stop(&self, unit: ProcessUnit) -> Result<(), ProcessError> {
let mut channel = self.channel.lock()?;
match unit {
Decoder => {
if let Some(proc) = self.decoder.lock()?.as_mut() {
#[cfg(not(windows))]
proc.term()
.map_err(|e| ProcessError::Custom(format!("Decoder: {e}")))?;
#[cfg(windows)]
proc.kill()
.map_err(|e| ProcessError::Custom(format!("Decoder: {e}")))?;
}
}
Encoder => {
if let Some(proc) = self.encoder.lock()?.as_mut() {
proc.kill()
.map_err(|e| ProcessError::Custom(format!("Encoder: {e}")))?;
}
}
Ingest => {
if let Some(proc) = self.ingest.lock()?.as_mut() {
proc.kill()
.map_err(|e| ProcessError::Custom(format!("Ingest: {e}")))?;
}
}
}
channel.active = false;
self.wait(unit)?;
Ok(())
}
/// Wait for the process to close properly.
/// This prevents orphaned/zombie processes in the system.
pub fn wait(&self, unit: ProcessUnit) -> Result<(), ProcessError> {
match unit {
Decoder => {
if let Some(proc) = self.decoder.lock().unwrap().as_mut() {
proc.wait()
.map_err(|e| ProcessError::Custom(format!("Decoder: {e}")))?;
}
}
Encoder => {
if let Some(proc) = self.encoder.lock().unwrap().as_mut() {
proc.wait()
.map_err(|e| ProcessError::Custom(format!("Encoder: {e}")))?;
}
}
Ingest => {
if let Some(proc) = self.ingest.lock().unwrap().as_mut() {
proc.wait()
.map_err(|e| ProcessError::Custom(format!("Ingest: {e}")))?;
}
}
}
Ok(())
}
pub async fn async_stop(&self) {
debug!("Stop all child processes");
self.is_terminated.store(true, Ordering::SeqCst);
self.is_alive.store(false, Ordering::SeqCst);
self.ingest_is_running.store(false, Ordering::SeqCst);
self.run_count.fetch_sub(1, Ordering::SeqCst);
let pool = self.db_pool.clone().unwrap();
let channel_id = self.channel.lock().unwrap().id;
if let Err(e) = handles::update_player(&pool, channel_id, false).await {
error!("Unable to write player status: {e}");
};
for unit in [Decoder, Encoder, Ingest] {
if let Err(e) = self.stop(unit) {
if !e.to_string().contains("exited process") {
error!("{e}")
}
}
}
}
/// No matter what is running, terminate them all.
pub fn stop_all(&self) {
debug!("Stop all child processes");
self.is_terminated.store(true, Ordering::SeqCst);
self.ingest_is_running.store(false, Ordering::SeqCst);
self.run_count.fetch_sub(1, Ordering::SeqCst);
if self.is_alive.load(Ordering::SeqCst) {
self.is_alive.store(false, Ordering::SeqCst);
trace!("Playout is alive and processes are terminated");
for unit in [Decoder, Encoder, Ingest] {
if let Err(e) = self.stop(unit) {
if !e.to_string().contains("exited process") {
error!("{e}")
}
}
if let Err(e) = self.wait(unit) {
if !e.to_string().contains("exited process") {
error!("{e}")
}
}
}
}
}
}
#[derive(Clone, Debug, Default)]
pub struct ChannelController {
pub channels: Vec<ChannelManager>,
}
impl ChannelController {
pub fn new() -> Self {
Self { channels: vec![] }
}
pub fn add(&mut self, manager: ChannelManager) {
self.channels.push(manager);
}
pub fn get(&self, id: i32) -> Option<ChannelManager> {
for manager in self.channels.iter() {
if manager.channel.lock().unwrap().id == id {
return Some(manager.clone());
}
}
None
}
pub fn remove(&mut self, channel_id: i32) {
self.channels.retain(|manager| {
let channel = manager.channel.lock().unwrap();
channel.id != channel_id
});
}
pub fn run_count(&self) -> usize {
self.channels
.iter()
.filter(|manager| manager.is_alive.load(Ordering::SeqCst))
.count()
}
}
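`ChannelController::run_count` counts only the managers whose atomic `is_alive` flag is set. A simplified, self-contained sketch of that idea (the `Manager` struct here is an illustrative stand-in for `ChannelManager`):

```rust
use std::sync::{
    atomic::{AtomicBool, Ordering},
    Arc,
};

// Stand-in for ChannelManager: only the liveness flag matters here.
struct Manager {
    is_alive: Arc<AtomicBool>,
}

// Count managers whose `is_alive` flag is currently set.
fn run_count(managers: &[Manager]) -> usize {
    managers
        .iter()
        .filter(|m| m.is_alive.load(Ordering::SeqCst))
        .count()
}

fn main() {
    let managers = vec![
        Manager { is_alive: Arc::new(AtomicBool::new(true)) },
        Manager { is_alive: Arc::new(AtomicBool::new(false)) },
        Manager { is_alive: Arc::new(AtomicBool::new(true)) },
    ];
    assert_eq!(run_count(&managers), 2);
}
```

Because the flag is an `Arc<AtomicBool>`, a playout thread can flip it without taking any lock, and the controller reads a consistent snapshot per manager.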
pub fn start_channel(manager: ChannelManager) -> Result<(), ProcessError> {
let config = manager.config.lock()?.clone();
let mode = config.output.mode.clone();
let filler_list = manager.filler_list.clone();
let channel_id = config.general.channel_id;
drain_hls_path(
&config.global.hls_path,
&config.output.output_cmd.clone().unwrap_or_default(),
channel_id,
)?;
debug!(target: Target::all(), channel = channel_id; "Start ffplayout v{VERSION}, channel: <yellow>{channel_id}</>");
// Fill the filler list; it can also be a single file.
thread::spawn(move || {
fill_filler_list(&config, Some(filler_list));
});
match mode {
// write files/playlist to HLS m3u8 playlist
HLS => write_hls(manager),
// play on desktop or stream to a remote target
_ => player(manager),
}
}
pub fn drain_hls_path(path: &Path, params: &[String], channel_id: i32) -> io::Result<()> {
let disks = Disks::new_with_refreshed_list();
for disk in &disks {
if disk.mount_point().to_string_lossy().len() > 1
&& path.starts_with(disk.mount_point())
&& disk.available_space() < 1073741824
&& path.is_dir()
{
warn!(target: Target::file_mail(), channel = channel_id; "HLS storage space is less than 1 GB, draining TS files...");
delete_ts(path, params)?
}
}
Ok(())
}
fn delete_ts<P: AsRef<Path> + Clone + std::fmt::Debug>(
path: P,
params: &[String],
) -> io::Result<()> {
let ts_file = params
.iter()
.filter(|f| f.to_lowercase().ends_with(".ts") || f.to_lowercase().ends_with(".m3u8"))
.collect::<Vec<&String>>();
for entry in WalkDir::new(path.clone())
.into_iter()
.flat_map(|e| e.ok())
.filter(|f| f.path().is_file())
.filter(|f| paths_match(&ts_file, &f.path().to_string_lossy()))
.map(|p| p.path().to_string_lossy().to_string())
{
fs::remove_file(entry)?;
}
Ok(())
}
fn paths_match(patterns: &Vec<&String>, actual_path: &str) -> bool {
for pattern in patterns {
let pattern_escaped = regex::escape(pattern);
let pattern_regex = pattern_escaped.replace(r"%d", r"\d+");
let re = Regex::new(&pattern_regex).unwrap();
if re.is_match(actual_path) {
return true;
}
}
false
}
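`paths_match` above escapes each output pattern and turns ffmpeg's `%d` segment placeholder into a `\d+` regex before matching real file paths. A simplified std-only sketch of that matching rule, assuming at most one `%d` per pattern (the real code uses the `regex` crate and handles arbitrary patterns):

```rust
// Hypothetical simplified matcher: "%d" in the pattern stands for one
// or more ASCII digits in the path; everything else must match literally.
fn path_matches(pattern: &str, path: &str) -> bool {
    match pattern.split_once("%d") {
        // No placeholder: a plain substring match is enough here.
        None => path.contains(pattern),
        Some((prefix, suffix)) => {
            if let Some(start) = path.find(prefix) {
                let rest = &path[start + prefix.len()..];
                // Count the digit run where "%d" sat in the pattern.
                let digits = rest.chars().take_while(|c| c.is_ascii_digit()).count();
                digits > 0 && rest[digits..].starts_with(suffix)
            } else {
                false
            }
        }
    }
}

fn main() {
    assert!(path_matches("live-%d.ts", "/srv/hls/1/live-0042.ts"));
    assert!(!path_matches("live-%d.ts", "/srv/hls/1/live-.ts"));
    assert!(path_matches("stream.m3u8", "/srv/hls/1/stream.m3u8"));
}
```

This is why `delete_ts` only removes files that actually belong to the channel's own HLS output command, rather than every `.ts` file under the path.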

View File

@ -1,8 +1,10 @@
use log::*;
use regex::Regex;
use simplelog::*;
use crate::utils::logging::Target;
/// Apply custom filters
pub fn filter_node(filter: &str) -> (String, String) {
pub fn filter_node(id: i32, filter: &str) -> (String, String) {
let re = Regex::new(r"^;?(\[[0-9]:[^\[]+\])?|\[[^\[]+\]$").unwrap(); // match start/end link
let mut video_filter = String::new();
let mut audio_filter = String::new();
@ -32,7 +34,7 @@ pub fn filter_node(filter: &str) -> (String, String) {
} else if filter.contains("[c_a_out]") {
audio_filter = re.replace_all(filter, "").to_string();
} else if !filter.is_empty() && filter != "~" {
error!("Custom filter is not well formatted, use correct out link names (\"[c_v_out]\" and/or \"[c_a_out]\"). Filter skipped!")
error!(target: Target::file_mail(), channel = id; "Custom filter is not well formatted, use correct out link names (\"[c_v_out]\" and/or \"[c_a_out]\"). Filter skipped!")
}
(video_filter, audio_filter)
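The `filter_node` logic above routes a custom filter string to the video or audio chain by its out-link label (`[c_v_out]` / `[c_a_out]`) and strips the label before use. A rough std-only sketch of that routing, assuming a single out-link at the end of the string (the real code strips leading input links like `[0:v]` with a regex and also handles filters containing both labels):

```rust
// Hypothetical simplified router for custom filter strings.
fn route_filter(filter: &str) -> (String, String) {
    // Drop the trailing "[...]" out-link label, if present.
    let strip = |f: &str| {
        f.rsplit_once('[')
            .map(|(head, _)| head.to_string())
            .unwrap_or_else(|| f.to_string())
    };

    if filter.contains("[c_v_out]") {
        (strip(filter), String::new())
    } else if filter.contains("[c_a_out]") {
        (String::new(), strip(filter))
    } else {
        // Malformed filter: skipped, matching the error branch above.
        (String::new(), String::new())
    }
}

fn main() {
    let (v, a) = route_filter("scale=512:288[c_v_out]");
    assert_eq!(v, "scale=512:288");
    assert!(a.is_empty());

    let (v, a) = route_filter("volume=0.5[c_a_out]");
    assert!(v.is_empty());
    assert_eq!(a, "volume=0.5");
}
```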

View File

@ -4,18 +4,21 @@ use std::{
sync::{Arc, Mutex},
};
use log::*;
use regex::Regex;
use simplelog::*;
mod custom;
pub mod v_drawtext;
use crate::utils::{
controller::ProcessUnit::*, custom_format, fps_calc, is_close, Media, OutputMode::*,
PlayoutConfig,
use crate::player::{
controller::ProcessUnit::*,
utils::{custom_format, fps_calc, is_close, Media},
};
use super::vec_strings;
use crate::utils::{
config::{OutputMode::*, PlayoutConfig},
logging::Target,
};
use crate::vec_strings;
#[derive(Clone, Debug, Copy, Eq, PartialEq)]
pub enum FilterType {
@ -179,18 +182,14 @@ impl Filters {
impl Default for Filters {
fn default() -> Self {
Self::new(PlayoutConfig::new(None, None), 0)
Self::new(PlayoutConfig::default(), 0)
}
}
fn deinterlace(field_order: &Option<String>, chain: &mut Filters, config: &PlayoutConfig) {
if let Some(order) = field_order {
if order != "progressive" {
let deinterlace = match config
.advanced
.as_ref()
.and_then(|a| a.filters.deinterlace.clone())
{
let deinterlace = match config.advanced.filter.deinterlace.clone() {
Some(deinterlace) => deinterlace,
None => "yadif=0:-1:0".to_string(),
};
@ -206,22 +205,14 @@ fn pad(aspect: f64, chain: &mut Filters, v_stream: &ffprobe::Stream, config: &Pl
if let (Some(w), Some(h)) = (v_stream.width, v_stream.height) {
if w > config.processing.width && aspect > config.processing.aspect {
scale = match config
.advanced
.as_ref()
.and_then(|a| a.filters.pad_scale_w.clone())
{
scale = match config.advanced.filter.pad_scale_w.clone() {
Some(pad_scale_w) => {
custom_format(&format!("{pad_scale_w},"), &[&config.processing.width])
}
None => format!("scale={}:-1,", config.processing.width),
};
} else if h > config.processing.height && aspect < config.processing.aspect {
scale = match config
.advanced
.as_ref()
.and_then(|a| a.filters.pad_scale_h.clone())
{
scale = match config.advanced.filter.pad_scale_h.clone() {
Some(pad_scale_h) => {
custom_format(&format!("{pad_scale_h},"), &[&config.processing.width])
}
@ -230,11 +221,7 @@ fn pad(aspect: f64, chain: &mut Filters, v_stream: &ffprobe::Stream, config: &Pl
}
}
let pad = match config
.advanced
.as_ref()
.and_then(|a| a.filters.pad_video.clone())
{
let pad = match config.advanced.filter.pad_video.clone() {
Some(pad_video) => custom_format(
&format!("{scale}{pad_video}"),
&[
@ -254,7 +241,7 @@ fn pad(aspect: f64, chain: &mut Filters, v_stream: &ffprobe::Stream, config: &Pl
fn fps(fps: f64, chain: &mut Filters, config: &PlayoutConfig) {
if fps != config.processing.fps {
let fps_filter = match config.advanced.as_ref().and_then(|a| a.filters.fps.clone()) {
let fps_filter = match config.advanced.filter.fps.clone() {
Some(fps) => custom_format(&fps, &[&config.processing.fps]),
None => format!("fps={}", config.processing.fps),
};
@ -273,11 +260,7 @@ fn scale(
// width: i64, height: i64
if let (Some(w), Some(h)) = (width, height) {
if w != config.processing.width || h != config.processing.height {
let scale = match config
.advanced
.as_ref()
.and_then(|a| a.filters.scale.clone())
{
let scale = match config.advanced.filter.scale.clone() {
Some(scale) => custom_format(
&scale,
&[&config.processing.width, &config.processing.height],
@ -294,11 +277,7 @@ fn scale(
}
if !is_close(aspect, config.processing.aspect, 0.03) {
let dar = match config
.advanced
.as_ref()
.and_then(|a| a.filters.set_dar.clone())
{
let dar = match config.advanced.filter.set_dar.clone() {
Some(set_dar) => custom_format(&set_dar, &[&config.processing.aspect]),
None => format!("setdar=dar={}", config.processing.aspect),
};
@ -306,11 +285,7 @@ fn scale(
chain.add_filter(&dar, 0, Video);
}
} else {
let scale = match config
.advanced
.as_ref()
.and_then(|a| a.filters.scale.clone())
{
let scale = match config.advanced.filter.scale.clone() {
Some(scale) => custom_format(
&scale,
&[&config.processing.width, &config.processing.height],
@ -322,11 +297,7 @@ fn scale(
};
chain.add_filter(&scale, 0, Video);
let dar = match config
.advanced
.as_ref()
.and_then(|a| a.filters.set_dar.clone())
{
let dar = match config.advanced.filter.set_dar.clone() {
Some(set_dar) => custom_format(&set_dar, &[&config.processing.aspect]),
None => format!("setdar=dar={}", config.processing.aspect),
};
@ -357,18 +328,10 @@ fn fade(
let mut fade_in = format!("{t}fade=in:st=0:d=0.5");
if t == "a" {
if let Some(fade) = config
.advanced
.as_ref()
.and_then(|a| a.filters.afade_in.clone())
{
if let Some(fade) = config.advanced.filter.afade_in.clone() {
fade_in = custom_format(&fade, &[t]);
}
} else if let Some(fade) = config
.advanced
.as_ref()
.and_then(|a| a.filters.fade_in.clone())
{
} else if let Some(fade) = config.advanced.filter.fade_in.clone() {
fade_in = custom_format(&fade, &[t]);
};
@ -379,19 +342,10 @@ fn fade(
let mut fade_out = format!("{t}fade=out:st={}:d=1.0", (node.out - node.seek - 1.0));
if t == "a" {
if let Some(fade) = config
.advanced
.as_ref()
.and_then(|a| a.filters.afade_out.clone())
{
if let Some(fade) = config.advanced.filter.afade_out.clone() {
fade_out = custom_format(&fade, &[node.out - node.seek - 1.0]);
}
} else if let Some(fade) = config
.advanced
.as_ref()
.and_then(|a| a.filters.fade_out.clone())
.clone()
{
} else if let Some(fade) = config.advanced.filter.fade_out.clone() {
fade_out = custom_format(&fade, &[node.out - node.seek - 1.0]);
};
@ -415,11 +369,7 @@ fn overlay(node: &mut Media, chain: &mut Filters, config: &PlayoutConfig) {
);
if node.last_ad {
match config
.advanced
.as_ref()
.and_then(|a| a.filters.overlay_logo_fade_in.clone())
{
match config.advanced.filter.overlay_logo_fade_in.clone() {
Some(fade_in) => logo_chain.push_str(&format!(",{fade_in}")),
None => logo_chain.push_str(",fade=in:st=0:d=1.0:alpha=1"),
};
@ -428,11 +378,7 @@ fn overlay(node: &mut Media, chain: &mut Filters, config: &PlayoutConfig) {
if node.next_ad {
let length = node.out - node.seek - 1.0;
match config
.advanced
.as_ref()
.and_then(|a| a.filters.overlay_logo_fade_out.clone())
{
match config.advanced.filter.overlay_logo_fade_out.clone() {
Some(fade_out) => {
logo_chain.push_str(&custom_format(&format!(",{fade_out}"), &[length]))
}
@ -441,11 +387,7 @@ fn overlay(node: &mut Media, chain: &mut Filters, config: &PlayoutConfig) {
}
if !config.processing.logo_scale.is_empty() {
match &config
.advanced
.as_ref()
.and_then(|a| a.filters.overlay_logo_scale.clone())
{
match &config.advanced.filter.overlay_logo_scale.clone() {
Some(logo_scale) => logo_chain.push_str(&custom_format(
&format!(",{logo_scale}"),
&[&config.processing.logo_scale],
@ -454,11 +396,7 @@ fn overlay(node: &mut Media, chain: &mut Filters, config: &PlayoutConfig) {
}
}
match config
.advanced
.as_ref()
.and_then(|a| a.filters.overlay_logo.clone())
{
match config.advanced.filter.overlay_logo.clone() {
Some(overlay) => {
if !overlay.starts_with(',') {
logo_chain.push(',');
@ -490,11 +428,7 @@ fn extend_video(node: &mut Media, chain: &mut Filters, config: &PlayoutConfig) {
if node.out - node.seek > video_duration - node.seek + 0.1 && node.duration >= node.out {
let duration = (node.out - node.seek) - (video_duration - node.seek);
let tpad = match config
.advanced
.as_ref()
.and_then(|a| a.filters.tpad.clone())
{
let tpad = match config.advanced.filter.tpad.clone() {
Some(pad) => custom_format(&pad, &[duration]),
None => format!("tpad=stop_mode=add:stop_duration={duration}"),
};
@ -512,7 +446,7 @@ fn add_text(
filter_chain: &Option<Arc<Mutex<Vec<String>>>>,
) {
if config.text.add_text
&& (config.text.text_from_filename || config.out.mode == HLS || node.unit == Encoder)
&& (config.text.text_from_filename || config.output.mode == HLS || node.unit == Encoder)
{
let filter = v_drawtext::filter_node(config, Some(node), filter_chain);
@ -521,11 +455,7 @@ fn add_text(
}
fn add_audio(node: &Media, chain: &mut Filters, nr: i32, config: &PlayoutConfig) {
let audio = match config
.advanced
.as_ref()
.and_then(|a| a.filters.aevalsrc.clone())
{
let audio = match config.advanced.filter.aevalsrc.clone() {
Some(aevalsrc) => custom_format(&aevalsrc, &[node.out - node.seek]),
None => format!(
"aevalsrc=0:channel_layout=stereo:duration={}:sample_rate=48000",
@ -547,11 +477,7 @@ fn extend_audio(node: &mut Media, chain: &mut Filters, nr: i32, config: &Playout
{
if node.out - node.seek > audio_duration - node.seek + 0.1 && node.duration >= node.out
{
let apad = match config
.advanced
.as_ref()
.and_then(|a| a.filters.apad.clone())
{
let apad = match config.advanced.filter.apad.clone() {
Some(apad) => custom_format(&apad, &[node.out - node.seek]),
None => format!("apad=whole_dur={}", node.out - node.seek),
};
@ -564,11 +490,7 @@ fn extend_audio(node: &mut Media, chain: &mut Filters, nr: i32, config: &Playout
fn audio_volume(chain: &mut Filters, config: &PlayoutConfig, nr: i32) {
if config.processing.volume != 1.0 {
let volume = match config
.advanced
.as_ref()
.and_then(|a| a.filters.volume.clone())
{
let volume = match config.advanced.filter.volume.clone() {
Some(volume) => custom_format(&volume, &[config.processing.volume]),
None => format!("volume={}", config.processing.volume),
};
@ -610,11 +532,7 @@ pub fn split_filter(
}
}
let split = match config
.advanced
.as_ref()
.and_then(|a| a.filters.split.clone())
{
let split = match config.advanced.filter.split.clone() {
Some(split) => custom_format(&split, &[count.to_string(), out_link.join("")]),
None => format!("split={count}{}", out_link.join("")),
};
@ -626,7 +544,7 @@ pub fn split_filter(
/// Process output filter chain and add new filters to existing ones.
fn process_output_filters(config: &PlayoutConfig, chain: &mut Filters, custom_filter: &str) {
let filter =
if (config.text.add_text && !config.text.text_from_filename) || config.out.mode == HLS {
if (config.text.add_text && !config.text.text_from_filename) || config.output.mode == HLS {
let re_v = Regex::new(r"\[[0:]+[v^\[]+([:0]+)?\]").unwrap(); // match video filter input link
let _re_a = Regex::new(r"\[[0:]+[a^\[]+([:0]+)?\]").unwrap(); // match audio filter input link
let mut cf = custom_filter.to_string();
@ -679,10 +597,10 @@ pub fn filter_chains(
add_text(node, &mut filters, config, filter_chain);
}
if let Some(f) = config.out.output_filter.clone() {
if let Some(f) = config.output.output_filter.clone() {
process_output_filters(config, &mut filters, &f)
} else if config.out.output_count > 1 && !config.processing.audio_only {
split_filter(&mut filters, config.out.output_count, 0, Video, config);
} else if config.output.output_count > 1 && !config.processing.audio_only {
split_filter(&mut filters, config.output.output_count, 0, Video, config);
}
return filters;
@ -722,12 +640,12 @@ pub fn filter_chains(
}
let (proc_vf, proc_af) = if node.unit == Ingest {
custom::filter_node(&config.ingest.custom_filter)
custom::filter_node(config.general.channel_id, &config.ingest.custom_filter)
} else {
custom::filter_node(&config.processing.custom_filter)
custom::filter_node(config.general.channel_id, &config.processing.custom_filter)
};
let (list_vf, list_af) = custom::filter_node(&node.custom_filter);
let (list_vf, list_af) = custom::filter_node(config.general.channel_id, &node.custom_filter);
if !config.processing.copy_video {
custom(&proc_vf, &mut filters, 0, Video);
@ -756,7 +674,7 @@ pub fn filter_chains(
extend_audio(node, &mut filters, i, config);
} else if node.unit == Decoder {
if !node.source.contains("color=c=") {
warn!(
warn!(target: Target::file_mail(), channel = config.general.channel_id;
"Missing audio track (id {i}) from <b><magenta>{}</></b>",
node.source
);
@ -776,11 +694,11 @@ pub fn filter_chains(
custom(&list_af, &mut filters, i, Audio);
}
} else if config.processing.audio_track_index > -1 {
error!("Setting 'audio_track_index' other than '-1' is not allowed in audio copy mode!")
error!(target: Target::file_mail(), channel = config.general.channel_id; "Setting 'audio_track_index' other than '-1' is not allowed in audio copy mode!")
}
if config.out.mode == HLS {
if let Some(f) = config.out.output_filter.clone() {
if config.output.mode == HLS {
if let Some(f) = config.output.output_filter.clone() {
process_output_filters(config, &mut filters, &f)
}
}

View File

@ -6,7 +6,11 @@ use std::{
use regex::Regex;
use crate::utils::{controller::ProcessUnit::*, custom_format, Media, PlayoutConfig};
use crate::player::{
controller::ProcessUnit::*,
utils::{custom_format, Media},
};
use crate::utils::config::PlayoutConfig;
pub fn filter_node(
config: &PlayoutConfig,
@ -44,11 +48,7 @@ pub fn filter_node(
.replace('%', "\\\\\\%")
.replace(':', "\\:");
filter = match &config
.advanced
.clone()
.and_then(|a| a.filters.drawtext_from_file)
{
filter = match &config.advanced.filter.drawtext_from_file {
Some(drawtext) => custom_format(drawtext, &[&escaped_text, &config.text.style, &font]),
None => format!("drawtext=text='{escaped_text}':{}{font}", config.text.style),
};
@ -61,11 +61,7 @@ pub fn filter_node(
}
}
filter = match config
.advanced
.as_ref()
.and_then(|a| a.filters.drawtext_from_zmq.clone())
{
filter = match config.advanced.filter.drawtext_from_zmq.clone() {
Some(drawtext) => custom_format(&drawtext, &[&socket.replace(':', "\\:"), &filter_cmd]),
None => format!(
"zmq=b=tcp\\\\://'{}',drawtext@dyntext={filter_cmd}",

View File

@ -9,15 +9,16 @@ use std::{
time::Duration,
};
use log::*;
use notify::{
event::{CreateKind, ModifyKind, RemoveKind, RenameMode},
EventKind::{Create, Modify, Remove},
RecursiveMode, Watcher,
};
use notify_debouncer_full::new_debouncer;
use simplelog::*;
use ffplayout_lib::utils::{include_file_extension, Media, PlayoutConfig};
use crate::player::utils::{include_file_extension, Media};
use crate::utils::{config::PlayoutConfig, logging::Target};
/// Create a watcher, which monitors file changes.
/// When a change is registered, update the current file list.
@ -27,7 +28,8 @@ pub fn watchman(
is_terminated: Arc<AtomicBool>,
sources: Arc<Mutex<Vec<Media>>>,
) {
let path = Path::new(&config.storage.path);
let id = config.general.channel_id;
let path = Path::new(&config.global.storage_path);
if !path.exists() {
error!("Folder path does not exist: '{path:?}'");
@ -57,7 +59,7 @@ pub fn watchman(
let media = Media::new(index, &new_path.to_string_lossy(), false);
sources.lock().unwrap().push(media);
info!("Create new file: <b><magenta>{new_path:?}</></b>");
info!(target: Target::file_mail(), channel = id; "Create new file: <b><magenta>{new_path:?}</></b>");
}
}
Remove(RemoveKind::File) | Modify(ModifyKind::Name(RenameMode::From)) => {
@ -68,7 +70,7 @@ pub fn watchman(
.lock()
.unwrap()
.retain(|x| x.source != old_path.to_string_lossy());
info!("Remove file: <b><magenta>{old_path:?}</></b>");
info!(target: Target::file_mail(), channel = id; "Remove file: <b><magenta>{old_path:?}</></b>");
}
}
Modify(ModifyKind::Name(RenameMode::Both)) => {
@ -82,16 +84,16 @@ pub fn watchman(
.position(|x| *x.source == old_path.display().to_string()) {
let media = Media::new(index, &new_path.to_string_lossy(), false);
media_list[index] = media;
info!("Move file: <b><magenta>{old_path:?}</></b> to <b><magenta>{new_path:?}</></b>");
info!(target: Target::file_mail(), channel = id; "Move file: <b><magenta>{old_path:?}</></b> to <b><magenta>{new_path:?}</></b>");
} else if include_file_extension(&config, new_path) {
let index = media_list.len();
let media = Media::new(index, &new_path.to_string_lossy(), false);
media_list.push(media);
info!("Create new file: <b><magenta>{new_path:?}</></b>");
info!(target: Target::file_mail(), channel = id; "Create new file: <b><magenta>{new_path:?}</></b>");
}
}
_ => debug!("Not tracked file event: {event:?}")
_ => debug!(target: Target::file_mail(), channel = id; "Not tracked file event: {event:?}")
}),
Err(errors) => errors.iter().for_each(|error| error!("{error:?}")),
}

View File

@ -1,28 +1,33 @@
use std::{
io::{BufRead, BufReader, Error, Read},
process::{exit, ChildStderr, Command, Stdio},
io::{BufRead, BufReader, Read},
process::{ChildStderr, Command, Stdio},
sync::atomic::Ordering,
thread,
};
use crossbeam_channel::Sender;
use simplelog::*;
use log::*;
use crate::utils::{log_line, valid_stream};
use ffplayout_lib::{
utils::{
controller::ProcessUnit::*, test_tcp_port, Media, PlayoutConfig, ProcessControl,
FFMPEG_IGNORE_ERRORS, FFMPEG_UNRECOVERABLE_ERRORS,
use crate::utils::{
config::{PlayoutConfig, FFMPEG_IGNORE_ERRORS, FFMPEG_UNRECOVERABLE_ERRORS},
logging::{log_line, Target},
};
use crate::vec_strings;
use crate::{
player::{
controller::{ChannelManager, ProcessUnit::*},
utils::{test_tcp_port, valid_stream, Media},
},
vec_strings,
utils::errors::ProcessError,
};
fn server_monitor(
id: i32,
level: &str,
ignore: Vec<String>,
buffer: BufReader<ChildStderr>,
proc_ctl: ProcessControl,
) -> Result<(), Error> {
channel_mgr: ChannelManager,
) -> Result<(), ProcessError> {
for line in buffer.lines() {
let line = line?;
@ -33,8 +38,8 @@ fn server_monitor(
}
if line.contains("rtmp") && line.contains("Unexpected stream") && !valid_stream(&line) {
if let Err(e) = proc_ctl.stop(Ingest) {
error!("{e}");
if let Err(e) = channel_mgr.stop(Ingest) {
error!(target: Target::file_mail(), channel = id; "{e}");
};
}
@ -42,7 +47,7 @@ fn server_monitor(
.iter()
.any(|i| line.contains(*i))
{
proc_ctl.stop_all();
channel_mgr.stop_all();
}
}
@ -55,20 +60,19 @@ fn server_monitor(
pub fn ingest_server(
config: PlayoutConfig,
ingest_sender: Sender<(usize, [u8; 65088])>,
proc_control: ProcessControl,
) -> Result<(), Error> {
channel_mgr: ChannelManager,
) -> Result<(), ProcessError> {
let id = config.general.channel_id;
let mut buffer: [u8; 65088] = [0; 65088];
let mut server_cmd = vec_strings!["-hide_banner", "-nostats", "-v", "level+info"];
let stream_input = config.ingest.input_cmd.clone().unwrap();
let mut dummy_media = Media::new(0, "Live Stream", false);
dummy_media.unit = Ingest;
dummy_media.add_filter(&config, &None);
let is_terminated = channel_mgr.is_terminated.clone();
let ingest_is_running = channel_mgr.ingest_is_running.clone();
if let Some(ingest_input_cmd) = config
.advanced
.as_ref()
.and_then(|a| a.ingest.input_cmd.clone())
{
if let Some(ingest_input_cmd) = config.advanced.ingest.input_cmd {
server_cmd.append(&mut ingest_input_cmd.clone());
}
@ -86,22 +90,21 @@ pub fn ingest_server(
let mut is_running;
if let Some(url) = stream_input.iter().find(|s| s.contains("://")) {
if !test_tcp_port(url) {
proc_control.stop_all();
exit(1);
if !test_tcp_port(id, url) {
channel_mgr.stop_all();
}
info!("Start ingest server, listening on: <b><magenta>{url}</></b>",);
info!(target: Target::file_mail(), channel = id; "Start ingest server, listening on: <b><magenta>{url}</></b>",);
};
debug!(
debug!(target: Target::file_mail(), channel = id;
"Server CMD: <bright-blue>\"ffmpeg {}\"</>",
server_cmd.join(" ")
);
while !proc_control.is_terminated.load(Ordering::SeqCst) {
let proc_ctl = proc_control.clone();
let level = config.logging.ingest_level.clone().unwrap();
while !is_terminated.load(Ordering::SeqCst) {
let proc_ctl = channel_mgr.clone();
let level = config.logging.ingest_level.clone();
let ignore = config.logging.ignore_lines.clone();
let mut server_proc = match Command::new("ffmpeg")
.args(server_cmd.clone())
@ -110,7 +113,7 @@ pub fn ingest_server(
.spawn()
{
Err(e) => {
error!("couldn't spawn ingest server: {e}");
error!(target: Target::file_mail(), channel = id; "couldn't spawn ingest server: {e}");
panic!("couldn't spawn ingest server: {e}")
}
Ok(proc) => proc,
@ -118,30 +121,30 @@ pub fn ingest_server(
let mut ingest_reader = BufReader::new(server_proc.stdout.take().unwrap());
let server_err = BufReader::new(server_proc.stderr.take().unwrap());
let error_reader_thread =
thread::spawn(move || server_monitor(&level, ignore, server_err, proc_ctl));
thread::spawn(move || server_monitor(id, &level, ignore, server_err, proc_ctl));
*proc_control.server_term.lock().unwrap() = Some(server_proc);
*channel_mgr.ingest.lock().unwrap() = Some(server_proc);
is_running = false;
loop {
let bytes_len = match ingest_reader.read(&mut buffer[..]) {
Ok(length) => length,
Err(e) => {
debug!("Ingest server read {e:?}");
debug!(target: Target::file_mail(), channel = id; "Ingest server read {e:?}");
break;
}
};
if !is_running {
proc_control.server_is_running.store(true, Ordering::SeqCst);
ingest_is_running.store(true, Ordering::SeqCst);
is_running = true;
}
if bytes_len > 0 {
if let Err(e) = ingest_sender.send((bytes_len, buffer)) {
error!("Ingest server write error: {e:?}");
error!(target: Target::file_mail(), channel = id; "Ingest server write error: {e:?}");
proc_control.is_terminated.store(true, Ordering::SeqCst);
is_terminated.store(true, Ordering::SeqCst);
break;
}
} else {
@ -150,16 +153,14 @@ pub fn ingest_server(
}
drop(ingest_reader);
proc_control
.server_is_running
.store(false, Ordering::SeqCst);
ingest_is_running.store(false, Ordering::SeqCst);
if let Err(e) = proc_control.wait(Ingest) {
error!("{e}")
if let Err(e) = channel_mgr.wait(Ingest) {
error!(target: Target::file_mail(), channel = id; "{e}")
}
if let Err(e) = error_reader_thread.join() {
error!("{e:?}");
error!(target: Target::file_mail(), channel = id; "{e:?}");
};
}

View File

@ -0,0 +1,50 @@
use std::thread;
use log::*;
pub mod folder;
pub mod ingest;
pub mod playlist;
pub use folder::watchman;
pub use ingest::ingest_server;
pub use playlist::CurrentProgram;
use crate::player::{
controller::ChannelManager,
utils::{folder::FolderSource, Media},
};
use crate::utils::{config::ProcessMode::*, logging::Target};
/// Create a source iterator from a playlist or from a folder.
pub fn source_generator(manager: ChannelManager) -> Box<dyn Iterator<Item = Media>> {
let config = manager.config.lock().unwrap().clone();
let id = config.general.channel_id;
let is_terminated = manager.is_terminated.clone();
let current_list = manager.current_list.clone();
match config.processing.mode {
Folder => {
info!(target: Target::file_mail(), channel = id; "Playout in folder mode");
debug!(target: Target::file_mail(), channel = id;
"Monitor folder: <b><magenta>{:?}</></b>",
config.global.storage_path
);
let config_clone = config.clone();
let folder_source = FolderSource::new(&config, manager);
let list_clone = current_list.clone();
// Spawn a thread to monitor folder for file changes.
thread::spawn(move || watchman(config_clone, is_terminated.clone(), list_clone));
Box::new(folder_source) as Box<dyn Iterator<Item = Media>>
}
Playlist => {
info!(target: Target::file_mail(), channel = id; "Playout in playlist mode");
let program = CurrentProgram::new(manager);
Box::new(program) as Box<dyn Iterator<Item = Media>>
}
}
}
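`source_generator` above picks one of two iterator implementations at runtime and erases the concrete type behind `Box<dyn Iterator<Item = Media>>`, so the player loop can pull clips without caring about the mode. A minimal sketch of that dispatch pattern (the enum variants and string sources are illustrative stand-ins, not the real `FolderSource`/`CurrentProgram` types):

```rust
enum ProcessMode {
    Folder,
    Playlist,
}

// Return a trait object so callers see one iterator type for both modes.
fn source_generator(mode: ProcessMode) -> Box<dyn Iterator<Item = String>> {
    match mode {
        // In ffplayout this is a FolderSource watching the storage path.
        ProcessMode::Folder => Box::new(vec!["a.mp4".to_string()].into_iter()),
        // And this is a CurrentProgram reading daily JSON playlists.
        ProcessMode::Playlist => {
            Box::new(std::iter::repeat("filler.mp4".to_string()).take(2))
        }
    }
}

fn main() {
    let clips: Vec<String> = source_generator(ProcessMode::Playlist).collect();
    assert_eq!(clips, vec!["filler.mp4".to_string(), "filler.mp4".to_string()]);
    assert_eq!(source_generator(ProcessMode::Folder).count(), 1);
}
```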

View File

@ -1,21 +1,26 @@
use std::{
fs,
path::Path,
sync::{
atomic::{AtomicBool, Ordering},
Arc,
Arc, Mutex,
},
};
use serde_json::json;
use simplelog::*;
use log::*;
use ffplayout_lib::utils::{
controller::PlayerControl,
gen_dummy, get_delta, is_close, is_remote,
json_serializer::{read_json, set_defaults},
loop_filler, loop_image, modified_time, seek_and_length, time_in_seconds, JsonPlaylist, Media,
MediaProbe, PlayoutConfig, PlayoutStatus, IMAGE_FORMAT,
use crate::db::handles;
use crate::player::{
controller::ChannelManager,
utils::{
gen_dummy, get_delta, is_close, is_remote,
json_serializer::{read_json, set_defaults},
loop_filler, loop_image, modified_time, seek_and_length, time_in_seconds, JsonPlaylist,
Media, MediaProbe,
},
};
use crate::utils::{
config::{PlayoutConfig, IMAGE_FORMAT},
logging::Target,
};
/// Struct for current playlist.
@ -23,38 +28,36 @@ use ffplayout_lib::utils::{
/// Here we prepare the init clip and build an iterator from which we pull our clips.
#[derive(Debug)]
pub struct CurrentProgram {
id: i32,
config: PlayoutConfig,
manager: ChannelManager,
start_sec: f64,
end_sec: f64,
json_playlist: JsonPlaylist,
player_control: PlayerControl,
current_node: Media,
is_terminated: Arc<AtomicBool>,
playout_stat: PlayoutStatus,
last_json_path: Option<String>,
last_node_ad: bool,
}
/// Prepare a playlist iterator.
impl CurrentProgram {
pub fn new(
config: &PlayoutConfig,
playout_stat: PlayoutStatus,
is_terminated: Arc<AtomicBool>,
player_control: &PlayerControl,
) -> Self {
pub fn new(manager: ChannelManager) -> Self {
let config = manager.config.lock().unwrap().clone();
let is_terminated = manager.is_terminated.clone();
Self {
id: config.general.channel_id,
config: config.clone(),
manager,
start_sec: config.playlist.start_sec.unwrap(),
end_sec: config.playlist.length_sec.unwrap(),
json_playlist: JsonPlaylist::new(
"1970-01-01".to_string(),
config.playlist.start_sec.unwrap(),
),
player_control: player_control.clone(),
current_node: Media::new(0, "", false),
is_terminated,
playout_stat,
last_json_path: None,
last_node_ad: false,
}
@ -70,8 +73,8 @@ impl CurrentProgram {
if (Path::new(&path).is_file() || is_remote(&path))
&& self.json_playlist.modified != modified_time(&path)
{
info!("Reload playlist <b><magenta>{path}</></b>");
self.playout_stat.list_init.store(true, Ordering::SeqCst);
info!(target: Target::file_mail(), channel = self.id; "Reload playlist <b><magenta>{path}</></b>");
self.manager.list_init.store(true, Ordering::SeqCst);
get_current = true;
reload = true;
}
@ -82,7 +85,7 @@ impl CurrentProgram {
if get_current {
self.json_playlist = read_json(
&mut self.config,
&self.player_control,
self.manager.current_list.clone(),
self.json_playlist.path.clone(),
self.is_terminated.clone(),
seek,
@ -91,21 +94,30 @@ impl CurrentProgram {
if !reload {
if let Some(file) = &self.json_playlist.path {
info!("Read playlist: <b><magenta>{file}</></b>");
info!(target: Target::file_mail(), channel = self.id; "Read playlist: <b><magenta>{file}</></b>");
}
if *self.playout_stat.date.lock().unwrap() != self.json_playlist.date {
if *self
.manager
.channel
.lock()
.unwrap()
.last_date
.clone()
.unwrap_or_default()
!= self.json_playlist.date
{
self.set_status(self.json_playlist.date.clone());
}
self.playout_stat
self.manager
.current_date
.lock()
.unwrap()
.clone_from(&self.json_playlist.date);
}
self.player_control
self.manager
.current_list
.lock()
.unwrap()
@ -115,8 +127,8 @@ impl CurrentProgram {
trace!("missing playlist");
self.current_node = Media::new(0, "", false);
self.playout_stat.list_init.store(true, Ordering::SeqCst);
self.player_control.current_index.store(0, Ordering::SeqCst);
self.manager.list_init.store(true, Ordering::SeqCst);
self.manager.current_index.store(0, Ordering::SeqCst);
}
}
}
@ -138,14 +150,12 @@ impl CurrentProgram {
let mut next_start =
self.current_node.begin.unwrap_or_default() - self.start_sec + duration + delta;
if node_index > 0
&& node_index == self.player_control.current_list.lock().unwrap().len() - 1
{
if node_index > 0 && node_index == self.manager.current_list.lock().unwrap().len() - 1 {
next_start += self.config.general.stop_threshold;
}
trace!(
"delta: {delta} | total_delta: {total_delta}, index: {node_index} \nnext_start: {next_start} | end_sec: {} | source {}",
"delta: {delta} | total_delta: {total_delta}, index: {node_index} \n next_start: {next_start} | end_sec: {} | source {}",
self.end_sec,
self.current_node.source
);
@ -161,7 +171,7 @@ impl CurrentProgram {
self.json_playlist = read_json(
&mut self.config,
&self.player_control,
self.manager.current_list.clone(),
None,
self.is_terminated.clone(),
false,
@ -169,18 +179,18 @@ impl CurrentProgram {
);
if let Some(file) = &self.json_playlist.path {
info!("Read next playlist: <b><magenta>{file}</></b>");
info!(target: Target::file_mail(), channel = self.id; "Read next playlist: <b><magenta>{file}</></b>");
}
self.playout_stat.list_init.store(false, Ordering::SeqCst);
self.manager.list_init.store(false, Ordering::SeqCst);
self.set_status(self.json_playlist.date.clone());
self.player_control
self.manager
.current_list
.lock()
.unwrap()
.clone_from(&self.json_playlist.program);
self.player_control.current_index.store(0, Ordering::SeqCst);
self.manager.current_index.store(0, Ordering::SeqCst);
} else {
self.load_or_update_playlist(seek)
}
@ -189,35 +199,39 @@ impl CurrentProgram {
}
fn set_status(&mut self, date: String) {
if *self.playout_stat.date.lock().unwrap() != date
&& *self.playout_stat.time_shift.lock().unwrap() != 0.0
if self.manager.channel.lock().unwrap().last_date != Some(date.clone())
&& self.manager.channel.lock().unwrap().time_shift != 0.0
{
info!("Reset playout status");
info!(target: Target::file_mail(), channel = self.id; "Reset playout status");
}
self.playout_stat
.current_date
self.manager.current_date.lock().unwrap().clone_from(&date);
self.manager
.channel
.lock()
.unwrap()
.clone_from(&date);
*self.playout_stat.time_shift.lock().unwrap() = 0.0;
.last_date
.clone_from(&Some(date.clone()));
self.manager.channel.lock().unwrap().time_shift = 0.0;
let db_pool = self.manager.db_pool.clone().unwrap();
if let Err(e) = fs::write(
&self.config.general.stat_file,
serde_json::to_string(&json!({
"time_shift": 0.0,
"date": date,
}))
.unwrap(),
) {
error!("Unable to write status file: {e}");
if let Err(e) = tokio::runtime::Runtime::new()
.unwrap()
.block_on(handles::update_stat(
&db_pool,
self.config.general.channel_id,
date,
0.0,
))
{
error!(target: Target::file_mail(), channel = self.id; "Unable to write status: {e}");
};
}
// Check if last and/or next clip is an advertisement.
fn last_next_ad(&mut self, node: &mut Media) {
let index = self.player_control.current_index.load(Ordering::SeqCst);
let current_list = self.player_control.current_list.lock().unwrap();
let index = self.manager.current_index.load(Ordering::SeqCst);
let current_list = self.manager.current_list.lock().unwrap();
if index + 1 < current_list.len() && &current_list[index + 1].category == "advertisement" {
node.next_ad = true;
@ -246,10 +260,10 @@ impl CurrentProgram {
// On init or reload we need to seek to the current clip.
fn get_current_clip(&mut self) {
let mut time_sec = self.get_current_time();
let shift = *self.playout_stat.time_shift.lock().unwrap();
let shift = self.manager.channel.lock().unwrap().time_shift;
if shift != 0.0 {
info!("Shift playlist start for <yellow>{shift:.3}</> seconds");
info!(target: Target::file_mail(), channel = self.id; "Shift playlist start for <yellow>{shift:.3}</> seconds");
time_sec += shift;
}
@ -260,17 +274,10 @@ impl CurrentProgram {
self.recalculate_begin(true)
}
for (i, item) in self
.player_control
.current_list
.lock()
.unwrap()
.iter()
.enumerate()
{
for (i, item) in self.manager.current_list.lock().unwrap().iter().enumerate() {
if item.begin.unwrap() + item.out - item.seek > time_sec {
self.playout_stat.list_init.store(false, Ordering::SeqCst);
self.player_control.current_index.store(i, Ordering::SeqCst);
self.manager.list_init.store(false, Ordering::SeqCst);
self.manager.current_index.store(i, Ordering::SeqCst);
break;
}
@ -283,10 +290,10 @@ impl CurrentProgram {
self.get_current_clip();
let mut is_filler = false;
if !self.playout_stat.list_init.load(Ordering::SeqCst) {
if !self.manager.list_init.load(Ordering::SeqCst) {
let time_sec = self.get_current_time();
let index = self.player_control.current_index.load(Ordering::SeqCst);
let nodes = self.player_control.current_list.lock().unwrap();
let index = self.manager.current_index.load(Ordering::SeqCst);
let nodes = self.manager.current_list.lock().unwrap();
let last_index = nodes.len() - 1;
// de-instance node to preserve original values in list
@ -298,27 +305,23 @@ impl CurrentProgram {
trace!("Clip from init: {}", node_clone.source);
node_clone.seek += time_sec
- (node_clone.begin.unwrap() - *self.playout_stat.time_shift.lock().unwrap());
- (node_clone.begin.unwrap() - self.manager.channel.lock().unwrap().time_shift);
self.last_next_ad(&mut node_clone);
self.player_control
.current_index
.fetch_add(1, Ordering::SeqCst);
self.manager.current_index.fetch_add(1, Ordering::SeqCst);
self.current_node = handle_list_init(
&self.config,
node_clone,
&self.playout_stat,
&self.player_control,
last_index,
);
self.current_node =
handle_list_init(&self.config, node_clone, &self.manager, last_index);
if self
.current_node
.source
.contains(&self.config.storage.path.to_string_lossy().to_string())
|| self.current_node.source.contains("color=c=#121212")
if self.current_node.source.contains(
&self
.config
.global
.storage_path
.to_string_lossy()
.to_string(),
) || self.current_node.source.contains("color=c=#121212")
{
is_filler = true;
}
@ -329,7 +332,7 @@ impl CurrentProgram {
fn fill_end(&mut self, total_delta: f64) {
// Fill end from playlist
let index = self.player_control.current_index.load(Ordering::SeqCst);
let index = self.manager.current_index.load(Ordering::SeqCst);
let mut media = Media::new(index, "", false);
media.begin = Some(time_in_seconds());
media.duration = total_delta;
@ -337,15 +340,9 @@ impl CurrentProgram {
self.last_next_ad(&mut media);
self.current_node = gen_source(
&self.config,
media,
&self.playout_stat,
&self.player_control,
0,
);
self.current_node = gen_source(&self.config, media, &self.manager, 0);
self.player_control
self.manager
.current_list
.lock()
.unwrap()
@ -353,15 +350,13 @@ impl CurrentProgram {
self.current_node.last_ad = self.last_node_ad;
self.current_node
.add_filter(&self.config, &self.playout_stat.chain);
.add_filter(&self.config, &self.manager.filter_chain);
self.player_control
.current_index
.fetch_add(1, Ordering::SeqCst);
self.manager.current_index.fetch_add(1, Ordering::SeqCst);
}
fn recalculate_begin(&mut self, extend: bool) {
debug!("Infinite playlist reaches end, recalculate clip begins.");
debug!(target: Target::file_mail(), channel = self.id; "Infinite playlist reaches end, recalculate clip begins.");
let mut time_sec = time_in_seconds();
@ -371,7 +366,7 @@ impl CurrentProgram {
self.json_playlist.start_sec = Some(time_sec);
set_defaults(&mut self.json_playlist);
self.player_control
self.manager
.current_list
.lock()
.unwrap()
@ -386,9 +381,9 @@ impl Iterator for CurrentProgram {
fn next(&mut self) -> Option<Self::Item> {
self.last_json_path.clone_from(&self.json_playlist.path);
self.last_node_ad = self.current_node.last_ad;
self.check_for_playlist(self.playout_stat.list_init.load(Ordering::SeqCst));
self.check_for_playlist(self.manager.list_init.load(Ordering::SeqCst));
if self.playout_stat.list_init.load(Ordering::SeqCst) {
if self.manager.list_init.load(Ordering::SeqCst) {
trace!("Init playlist, from next iterator");
let mut init_clip_is_filler = false;
@ -396,7 +391,7 @@ impl Iterator for CurrentProgram {
init_clip_is_filler = self.init_clip();
}
if self.playout_stat.list_init.load(Ordering::SeqCst) && !init_clip_is_filler {
if self.manager.list_init.load(Ordering::SeqCst) && !init_clip_is_filler {
// On init load, the playlist might not be long enough, or clips might be missing,
// so we fill the gap with a dummy.
trace!("Init clip is no filler");
@ -409,7 +404,7 @@ impl Iterator for CurrentProgram {
}
let mut last_index = 0;
let length = self.player_control.current_list.lock().unwrap().len();
let length = self.manager.current_list.lock().unwrap().len();
if length > 0 {
last_index = length - 1;
@ -422,26 +417,20 @@ impl Iterator for CurrentProgram {
self.last_next_ad(&mut media);
self.current_node = gen_source(
&self.config,
media,
&self.playout_stat,
&self.player_control,
last_index,
);
self.current_node = gen_source(&self.config, media, &self.manager, last_index);
}
return Some(self.current_node.clone());
}
if self.player_control.current_index.load(Ordering::SeqCst)
< self.player_control.current_list.lock().unwrap().len()
if self.manager.current_index.load(Ordering::SeqCst)
< self.manager.current_list.lock().unwrap().len()
{
// get next clip from current playlist
let mut is_last = false;
let index = self.player_control.current_index.load(Ordering::SeqCst);
let node_list = self.player_control.current_list.lock().unwrap();
let index = self.manager.current_index.load(Ordering::SeqCst);
let node_list = self.manager.current_list.lock().unwrap();
let mut node = node_list[index].clone();
let last_index = node_list.len() - 1;
@ -453,18 +442,10 @@ impl Iterator for CurrentProgram {
self.last_next_ad(&mut node);
self.current_node = timed_source(
node,
&self.config,
is_last,
&self.playout_stat,
&self.player_control,
last_index,
);
self.current_node =
timed_source(node, &self.config, is_last, &self.manager, last_index);
self.player_control
.current_index
.fetch_add(1, Ordering::SeqCst);
self.manager.current_index.fetch_add(1, Ordering::SeqCst);
Some(self.current_node.clone())
} else {
@ -484,7 +465,7 @@ impl Iterator for CurrentProgram {
}
// Get first clip from next playlist.
let c_list = self.player_control.current_list.lock().unwrap();
let c_list = self.manager.current_list.lock().unwrap();
let mut first_node = c_list[0].clone();
drop(c_list);
@ -493,19 +474,13 @@ impl Iterator for CurrentProgram {
self.recalculate_begin(false)
}
self.player_control.current_index.store(0, Ordering::SeqCst);
self.manager.current_index.store(0, Ordering::SeqCst);
self.last_next_ad(&mut first_node);
first_node.last_ad = self.last_node_ad;
self.current_node = gen_source(
&self.config,
first_node,
&self.playout_stat,
&self.player_control,
0,
);
self.current_node = gen_source(&self.config, first_node, &self.manager, 0);
self.player_control.current_index.store(1, Ordering::SeqCst);
self.manager.current_index.store(1, Ordering::SeqCst);
Some(self.current_node.clone())
}
@ -520,35 +495,40 @@ fn timed_source(
node: Media,
config: &PlayoutConfig,
last: bool,
playout_stat: &PlayoutStatus,
player_control: &PlayerControl,
manager: &ChannelManager,
last_index: usize,
) -> Media {
let id = config.general.channel_id;
let time_shift = manager.channel.lock().unwrap().time_shift;
let current_date = manager.current_date.lock().unwrap().clone();
let last_date = manager.channel.lock().unwrap().last_date.clone();
let (delta, total_delta) = get_delta(config, &node.begin.unwrap());
let mut shifted_delta = delta;
let mut new_node = node.clone();
new_node.process = Some(false);
trace!("Node begin: {}", node.begin.unwrap());
trace!("timed source is last: {last}");
trace!(
"Node - begin: {} | source: {}",
node.begin.unwrap(),
node.source
);
trace!(
"timed source is last: {last} | current_date: {current_date} | last_date: {last_date:?} | time_shift: {time_shift}"
);
if config.playlist.length.contains(':') {
let time_shift = playout_stat.time_shift.lock().unwrap();
if Some(current_date) == last_date && time_shift != 0.0 {
shifted_delta = delta - time_shift;
if *playout_stat.current_date.lock().unwrap() == *playout_stat.date.lock().unwrap()
&& *time_shift != 0.0
{
shifted_delta = delta - *time_shift;
debug!("Delta: <yellow>{shifted_delta:.3}</>, shifted: <yellow>{delta:.3}</>");
debug!(target: Target::file_mail(), channel = id; "Delta: <yellow>{shifted_delta:.3}</>, shifted: <yellow>{delta:.3}</>");
} else {
debug!("Delta: <yellow>{shifted_delta:.3}</>");
debug!(target: Target::file_mail(), channel = id; "Delta: <yellow>{shifted_delta:.3}</>");
}
if config.general.stop_threshold > 0.0
&& shifted_delta.abs() > config.general.stop_threshold
{
error!("Clip begin out of sync for <yellow>{delta:.3}</> seconds.");
error!(target: Target::file_mail(), channel = id; "Clip begin out of sync for <yellow>{delta:.3}</> seconds.");
new_node.cmd = None;
@ -563,26 +543,18 @@ fn timed_source(
{
// when we are in the 24 hour range, get the clip
new_node.process = Some(true);
new_node = gen_source(config, node, playout_stat, player_control, last_index);
new_node = gen_source(config, node, manager, last_index);
} else if total_delta <= 0.0 {
info!("Begin is over play time, skip: {}", node.source);
info!(target: Target::file_mail(), channel = id; "Begin is over play time, skip: {}", node.source);
} else if total_delta < node.duration - node.seek || last {
new_node = handle_list_end(
config,
node,
total_delta,
playout_stat,
player_control,
last_index,
);
new_node = handle_list_end(config, node, total_delta, manager, last_index);
}
new_node
}
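The time-shift handling in `timed_source` above can be sketched in isolation. This is a minimal illustration, not ffplayout's API: `time_shift`, `same_date`, and `stop_threshold` stand in for the values the real code reads from the channel manager and config.

```rust
// Sketch of the delta handling in `timed_source`: when the playlist date
// matches the stored date, an earlier time shift is subtracted from the
// raw delta before the out-of-sync check runs.
fn shifted_delta(delta: f64, time_shift: f64, same_date: bool) -> f64 {
    if same_date && time_shift != 0.0 {
        delta - time_shift
    } else {
        delta
    }
}

// A clip counts as out of sync when the (shifted) delta exceeds the
// configured stop threshold; a threshold of 0.0 disables the check.
fn out_of_sync(shifted: f64, stop_threshold: f64) -> bool {
    stop_threshold > 0.0 && shifted.abs() > stop_threshold
}

fn main() {
    // A clip 5 s late under a 5 s time shift is back in sync.
    assert_eq!(shifted_delta(5.0, 5.0, true), 0.0);
    // On a different date the shift is ignored, so a large delta trips the check.
    assert!(out_of_sync(shifted_delta(120.0, 5.0, false), 90.0));
    println!("ok");
}
```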
fn duplicate_for_seek_and_loop(node: &mut Media, player_control: &PlayerControl) {
warn!("Clip loops and has seek value: duplicate clip to separate loop and seek.");
let mut nodes = player_control.current_list.lock().unwrap();
fn duplicate_for_seek_and_loop(node: &mut Media, current_list: &Arc<Mutex<Vec<Media>>>) {
let mut nodes = current_list.lock().unwrap();
let index = node.index.unwrap_or_default();
let mut node_duplicate = node.clone();
@ -617,15 +589,17 @@ fn duplicate_for_seek_and_loop(node: &mut Media, player_control: &PlayerControl)
pub fn gen_source(
config: &PlayoutConfig,
mut node: Media,
playout_stat: &PlayoutStatus,
player_control: &PlayerControl,
manager: &ChannelManager,
last_index: usize,
) -> Media {
let node_index = node.index.unwrap_or_default();
let mut duration = node.out - node.seek;
if duration < 1.0 {
warn!("Clip is less than 1 second long (<yellow>{duration:.3}</>), adjust length.");
warn!(
target: Target::file_mail(), channel = config.general.channel_id;
"Clip is less than 1 second long (<yellow>{duration:.3}</>), adjust length."
);
duration = 1.2;
@ -658,7 +632,8 @@ pub fn gen_source(
node.cmd = Some(loop_image(&node));
} else {
if node.seek > 0.0 && node.out > node.duration {
duplicate_for_seek_and_loop(&mut node, player_control);
warn!(target: Target::file_mail(), channel = config.general.channel_id; "Clip loops and has seek value: duplicate clip to separate loop and seek.");
duplicate_for_seek_and_loop(&mut node, &manager.current_list);
}
node.cmd = Some(seek_and_length(&mut node));
@ -668,33 +643,35 @@ pub fn gen_source(
// Last index is the index of the last item in the node list.
if node_index < last_index {
error!("Source not found: <b><magenta>{}</></b>", node.source);
error!(target: Target::file_mail(), channel = config.general.channel_id; "Source not found: <b><magenta>{}</></b>", node.source);
}
let mut filler_list = vec![];
let mut fillers = vec![];
match player_control.filler_list.try_lock() {
Ok(list) => filler_list = list.to_vec(),
Err(e) => error!("Lock filler list error: {e}"),
match manager.filler_list.try_lock() {
Ok(list) => fillers = list.to_vec(),
Err(e) => {
error!(target: Target::file_mail(), channel = config.general.channel_id; "Lock filler list error: {e}")
}
}
// Set list_init to true, to stay in sync.
playout_stat.list_init.store(true, Ordering::SeqCst);
manager.list_init.store(true, Ordering::SeqCst);
if config.storage.filler.is_dir() && !filler_list.is_empty() {
let filler_index = player_control.filler_index.fetch_add(1, Ordering::SeqCst);
let mut filler_media = filler_list[filler_index].clone();
if config.storage.filler.is_dir() && !fillers.is_empty() {
let index = manager.filler_index.fetch_add(1, Ordering::SeqCst);
let mut filler_media = fillers[index].clone();
trace!("take filler: {}", filler_media.source);
if filler_index == filler_list.len() - 1 {
if index == fillers.len() - 1 {
// reset index for next round
player_control.filler_index.store(0, Ordering::SeqCst)
manager.filler_index.store(0, Ordering::SeqCst)
}
if filler_media.probe.is_none() {
if let Err(e) = filler_media.add_probe(false) {
error!("{e:?}");
error!(target: Target::file_mail(), channel = config.general.channel_id; "{e:?}");
};
}
@ -752,7 +729,7 @@ pub fn gen_source(
}
Err(e) => {
// Create colored placeholder.
error!("Filler error: {e}");
error!(target: Target::file_mail(), channel = config.general.channel_id; "Filler error: {e}");
let mut dummy_duration = 60.0;
@ -771,12 +748,13 @@ pub fn gen_source(
}
warn!(
target: Target::file_mail(), channel = config.general.channel_id;
"Generate filler with <yellow>{:.2}</> seconds length!",
node.out
);
}
node.add_filter(config, &playout_stat.chain);
node.add_filter(config, &manager.filter_chain.clone());
trace!(
"return gen_source: {}, seek: {}, out: {}",
@ -793,18 +771,17 @@ pub fn gen_source(
fn handle_list_init(
config: &PlayoutConfig,
mut node: Media,
playout_stat: &PlayoutStatus,
player_control: &PlayerControl,
manager: &ChannelManager,
last_index: usize,
) -> Media {
debug!("Playlist init");
debug!(target: Target::file_mail(), channel = config.general.channel_id; "Playlist init");
let (_, total_delta) = get_delta(config, &node.begin.unwrap());
if !config.playlist.infinit && node.out - node.seek > total_delta {
node.out = total_delta + node.seek;
}
gen_source(config, node, playout_stat, player_control, last_index)
gen_source(config, node, manager, last_index)
}
/// when we come to last clip in playlist,
@ -814,17 +791,16 @@ fn handle_list_end(
config: &PlayoutConfig,
mut node: Media,
total_delta: f64,
playout_stat: &PlayoutStatus,
player_control: &PlayerControl,
manager: &ChannelManager,
last_index: usize,
) -> Media {
debug!("Last clip from day");
debug!(target: Target::file_mail(), channel = config.general.channel_id; "Last clip from day");
let mut out = if node.seek > 0.0 {
node.seek + total_delta
} else {
if node.duration > total_delta {
warn!("Adjust clip duration to: <yellow>{total_delta:.2}</>");
warn!(target: Target::file_mail(), channel = config.general.channel_id; "Adjust clip duration to: <yellow>{total_delta:.2}</>");
}
total_delta
@ -839,10 +815,10 @@ fn handle_list_end(
{
node.out = out;
} else {
warn!("Playlist is not long enough: <yellow>{total_delta:.2}</> seconds needed");
warn!(target: Target::file_mail(), channel = config.general.channel_id; "Playlist is not long enough: <yellow>{total_delta:.2}</> seconds needed");
}
node.process = Some(true);
gen_source(config, node, playout_stat, player_control, last_index)
gen_source(config, node, manager, last_index)
}


@ -1,4 +1,5 @@
pub mod controller;
pub mod filter;
pub mod input;
pub mod output;
pub mod rpc;
pub mod utils;


@ -1,8 +1,10 @@
use std::process::{self, Command, Stdio};
use simplelog::*;
use log::*;
use ffplayout_lib::{filter::v_drawtext, utils::PlayoutConfig, vec_strings};
use crate::player::filter::v_drawtext;
use crate::utils::{config::PlayoutConfig, logging::Target};
use crate::vec_strings;
/// Desktop Output
///
@ -12,11 +14,7 @@ pub fn output(config: &PlayoutConfig, log_format: &str) -> process::Child {
let mut enc_cmd = vec_strings!["-hide_banner", "-nostats", "-v", log_format];
if let Some(encoder_input_cmd) = config
.advanced
.as_ref()
.and_then(|a| a.encoder.input_cmd.clone())
{
if let Some(encoder_input_cmd) = &config.advanced.encoder.input_cmd {
enc_cmd.append(&mut encoder_input_cmd.clone());
}
@ -28,7 +26,7 @@ pub fn output(config: &PlayoutConfig, log_format: &str) -> process::Child {
"ffplayout"
]);
if let Some(mut cmd) = config.out.output_cmd.clone() {
if let Some(mut cmd) = config.output.output_cmd.clone() {
if !cmd.iter().any(|i| {
[
"-c",
@ -47,13 +45,13 @@ pub fn output(config: &PlayoutConfig, log_format: &str) -> process::Child {
}) {
enc_cmd.append(&mut cmd);
} else {
warn!("ffplay doesn't support given output parameters, they will be skipped!");
warn!(target: Target::file_mail(), channel = config.general.channel_id; "ffplay doesn't support given output parameters, they will be skipped!");
}
}
if config.text.add_text && !config.text.text_from_filename && !config.processing.audio_only {
if let Some(socket) = config.text.zmq_stream_socket.clone() {
debug!(
debug!(target: Target::file_mail(), channel = config.general.channel_id;
"Using drawtext filter, listening on address: <yellow>{}</>",
socket
);
@ -66,7 +64,7 @@ pub fn output(config: &PlayoutConfig, log_format: &str) -> process::Child {
enc_cmd.append(&mut enc_filter);
debug!(
debug!(target: Target::file_mail(), channel = config.general.channel_id;
"Encoder CMD: <bright-blue>\"ffplay {}\"</>",
enc_cmd.join(" ")
);
@ -78,7 +76,7 @@ pub fn output(config: &PlayoutConfig, log_format: &str) -> process::Child {
.spawn()
{
Err(e) => {
error!("couldn't spawn encoder process: {e}");
error!(target: Target::file_mail(), channel = config.general.channel_id; "couldn't spawn encoder process: {e}");
panic!("couldn't spawn encoder process: {e}")
}
Ok(proc) => proc,


@ -0,0 +1,284 @@
/*
This module writes the compressed stream directly to an HLS (m3u8) playlist,
without pre- and post-processing.
Example config:
out:
output_param: >-
...
-flags +cgop
-f hls
-hls_time 6
-hls_list_size 600
-hls_flags append_list+delete_segments+omit_endlist+program_date_time
-hls_segment_filename /var/www/html/live/stream-%d.ts /var/www/html/live/stream.m3u8
*/
use std::{
io::{BufRead, BufReader},
process::{Command, Stdio},
sync::atomic::Ordering,
thread::{self, sleep},
time::{Duration, SystemTime},
};
use log::*;
use crate::utils::{logging::log_line, task_runner};
use crate::vec_strings;
use crate::{
player::{
controller::{ChannelManager, ProcessUnit::*},
input::source_generator,
utils::{
get_delta, prepare_output_cmd, sec_to_time, stderr_reader, test_tcp_port, valid_stream,
Media,
},
},
utils::{errors::ProcessError, logging::Target},
};
/// Ingest Server for HLS
fn ingest_to_hls_server(manager: ChannelManager) -> Result<(), ProcessError> {
let config = manager.config.lock().unwrap();
let id = config.general.channel_id;
let playlist_init = manager.list_init.clone();
let chain = manager.filter_chain.clone();
let mut server_prefix = vec_strings!["-hide_banner", "-nostats", "-v", "level+info"];
let stream_input = config.ingest.input_cmd.clone().unwrap();
let mut dummy_media = Media::new(0, "Live Stream", false);
dummy_media.unit = Ingest;
let is_terminated = manager.is_terminated.clone();
let ingest_is_running = manager.ingest_is_running.clone();
if let Some(ingest_input_cmd) = &config.advanced.ingest.input_cmd {
server_prefix.append(&mut ingest_input_cmd.clone());
}
server_prefix.append(&mut stream_input.clone());
let mut is_running;
if let Some(url) = stream_input.iter().find(|s| s.contains("://")) {
if !test_tcp_port(id, url) {
manager.stop_all();
}
info!(target: Target::file_mail(), channel = id; "Start ingest server, listening on: <b><magenta>{url}</></b>");
};
drop(config);
loop {
let config = manager.config.lock().unwrap().clone();
dummy_media.add_filter(&config, &chain);
let server_cmd = prepare_output_cmd(&config, server_prefix.clone(), &dummy_media.filter);
debug!(target: Target::file_mail(), channel = id;
"Server CMD: <bright-blue>\"ffmpeg {}\"</>",
server_cmd.join(" ")
);
let proc_ctl = manager.clone();
let mut server_proc = match Command::new("ffmpeg")
.args(server_cmd.clone())
.stderr(Stdio::piped())
.spawn()
{
Err(e) => {
error!(target: Target::file_mail(), channel = id; "couldn't spawn ingest server: {e}");
panic!("couldn't spawn ingest server: {e}");
}
Ok(proc) => proc,
};
let server_err = BufReader::new(server_proc.stderr.take().unwrap());
*manager.ingest.lock().unwrap() = Some(server_proc);
is_running = false;
for line in server_err.lines() {
let line = line?;
if line.contains("rtmp") && line.contains("Unexpected stream") && !valid_stream(&line) {
if let Err(e) = proc_ctl.stop(Ingest) {
error!(target: Target::file_mail(), channel = id; "{e}");
};
}
if !is_running {
ingest_is_running.store(true, Ordering::SeqCst);
playlist_init.store(true, Ordering::SeqCst);
is_running = true;
info!(target: Target::file_mail(), channel = id; "Switch from {} to live ingest", config.processing.mode);
if let Err(e) = manager.stop(Decoder) {
error!(target: Target::file_mail(), channel = id; "{e}");
}
}
log_line(&line, &config.logging.ffmpeg_level);
}
if ingest_is_running.load(Ordering::SeqCst) {
info!(target: Target::file_mail(), channel = id; "Switch from live ingest to {}", config.processing.mode);
}
ingest_is_running.store(false, Ordering::SeqCst);
if let Err(e) = manager.wait(Ingest) {
error!(target: Target::file_mail(), channel = id; "{e}")
}
if is_terminated.load(Ordering::SeqCst) {
break;
}
}
Ok(())
}
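The ingest setup above picks its listen address out of the raw ffmpeg input arguments: the first token containing `://` is treated as the server URL. A minimal sketch of that lookup (`find_listen_url` is a hypothetical helper for illustration, not part of ffplayout's API):

```rust
// Find the first argument that looks like a URL; this mirrors the
// `stream_input.iter().find(|s| s.contains("://"))` call above.
fn find_listen_url(args: &[String]) -> Option<&String> {
    args.iter().find(|s| s.contains("://"))
}

fn main() {
    // A typical RTMP ingest input command, split into arguments.
    let args: Vec<String> = ["-f", "live_flv", "-listen", "1", "-i", "rtmp://127.0.0.1:1936/live/stream"]
        .iter()
        .map(|s| s.to_string())
        .collect();

    assert_eq!(
        find_listen_url(&args).map(String::as_str),
        Some("rtmp://127.0.0.1:1936/live/stream")
    );
    println!("ok");
}
```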
/// HLS Writer
///
/// Write with a single ffmpeg instance directly to an HLS playlist.
pub fn write_hls(manager: ChannelManager) -> Result<(), ProcessError> {
let config = manager.config.lock()?.clone();
let id = config.general.channel_id;
let current_media = manager.current_media.clone();
let is_terminated = manager.is_terminated.clone();
let ff_log_format = format!("level+{}", config.logging.ffmpeg_level.to_lowercase());
let channel_mgr_2 = manager.clone();
let ingest_is_running = manager.ingest_is_running.clone();
let get_source = source_generator(manager.clone());
// spawn a thread for ffmpeg ingest server and create a channel for package sending
if config.ingest.enable {
thread::spawn(move || ingest_to_hls_server(channel_mgr_2));
}
let mut error_count = 0;
for node in get_source {
*current_media.lock().unwrap() = Some(node.clone());
let ignore = config.logging.ignore_lines.clone();
let timer = SystemTime::now();
if is_terminated.load(Ordering::SeqCst) {
break;
}
let mut cmd = match &node.cmd {
Some(cmd) => cmd.clone(),
None => break,
};
if !node.process.unwrap() {
continue;
}
info!(target: Target::file_mail(), channel = id;
"Play for <yellow>{}</>: <b><magenta>{}</></b>",
sec_to_time(node.out - node.seek),
node.source
);
if config.task.enable {
if config.task.path.is_file() {
let channel_mgr_3 = manager.clone();
thread::spawn(move || task_runner::run(channel_mgr_3));
} else {
error!(target: Target::file_mail(), channel = id;
"<bright-blue>{:?}</> executable does not exist!",
config.task.path
);
}
}
let mut dec_prefix = vec_strings!["-hide_banner", "-nostats", "-v", &ff_log_format];
if let Some(decoder_input_cmd) = &config.advanced.decoder.input_cmd {
dec_prefix.append(&mut decoder_input_cmd.clone());
}
let mut read_rate = 1.0;
if let Some(begin) = &node.begin {
let (delta, _) = get_delta(&config, begin);
let duration = node.out - node.seek;
let speed = duration / (duration + delta);
if node.seek == 0.0
&& speed > 0.0
&& speed < 1.3
&& delta < config.general.stop_threshold
{
read_rate = speed;
}
}
dec_prefix.append(&mut vec_strings!["-readrate", read_rate]);
dec_prefix.append(&mut cmd);
let dec_cmd = prepare_output_cmd(&config, dec_prefix, &node.filter);
debug!(target: Target::file_mail(), channel = id;
"HLS writer CMD: <bright-blue>\"ffmpeg {}\"</>",
dec_cmd.join(" ")
);
let mut dec_proc = match Command::new("ffmpeg")
.args(dec_cmd)
.stderr(Stdio::piped())
.spawn()
{
Ok(proc) => proc,
Err(e) => {
error!(target: Target::file_mail(), channel = id; "couldn't spawn ffmpeg process: {e}");
panic!("couldn't spawn ffmpeg process: {e}")
}
};
let dec_err = BufReader::new(dec_proc.stderr.take().unwrap());
*manager.decoder.lock().unwrap() = Some(dec_proc);
if let Err(e) = stderr_reader(dec_err, ignore, Decoder, manager.clone()) {
error!(target: Target::file_mail(), channel = id; "{e:?}")
};
if let Err(e) = manager.wait(Decoder) {
error!(target: Target::file_mail(), channel = id; "{e}");
}
while ingest_is_running.load(Ordering::SeqCst) {
sleep(Duration::from_secs(1));
}
if let Ok(elapsed) = timer.elapsed() {
if elapsed.as_millis() < 300 {
error_count += 1;
if error_count > 10 {
error!(target: Target::file_mail(), channel = id; "Reached fatal error count, terminating channel!");
break;
}
} else {
error_count = 0;
}
}
}
sleep(Duration::from_secs(1));
manager.stop_all();
Ok(())
}
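The read-rate heuristic inside `write_hls` above can be sketched on its own: when a clip starts behind schedule (negative delta), ffmpeg is told to read slightly faster than real time to catch up. A minimal sketch under stated assumptions: `stop_threshold` stands in for `config.general.stop_threshold`, and the delta is passed in directly instead of coming from `get_delta`.

```rust
// Decoder read-rate heuristic: speed = duration / (duration + delta).
// Only applied to un-seeked clips, within sane bounds, and while the
// delta is still below the stop threshold.
fn read_rate(seek: f64, duration: f64, delta: f64, stop_threshold: f64) -> f64 {
    let speed = duration / (duration + delta);

    if seek == 0.0 && speed > 0.0 && speed < 1.3 && delta < stop_threshold {
        speed
    } else {
        1.0
    }
}

fn main() {
    // 2 s behind schedule on a 60 s clip: read roughly 3.4 % faster.
    let r = read_rate(0.0, 60.0, -2.0, 90.0);
    assert!(r > 1.0 && r < 1.1);
    // Seeked clips keep real-time reading.
    assert_eq!(read_rate(10.0, 60.0, -2.0, 90.0), 1.0);
    println!("ok");
}
```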


@ -3,11 +3,11 @@ use std::{
process::{Command, Stdio},
sync::atomic::Ordering,
thread::{self, sleep},
time::Duration,
time::{Duration, SystemTime},
};
use crossbeam_channel::bounded;
use simplelog::*;
use log::*;
mod desktop;
mod hls;
@ -16,14 +16,13 @@ mod stream;
pub use hls::write_hls;
use crate::input::{ingest_server, source_generator};
use crate::utils::task_runner;
use ffplayout_lib::utils::{
sec_to_time, stderr_reader, OutputMode::*, PlayerControl, PlayoutConfig, PlayoutStatus,
ProcessControl, ProcessUnit::*,
use crate::player::{
controller::{ChannelManager, ProcessUnit::*},
input::{ingest_server, source_generator},
utils::{sec_to_time, stderr_reader},
};
use ffplayout_lib::vec_strings;
use crate::utils::{config::OutputMode::*, errors::ProcessError, logging::Target, task_runner};
use crate::vec_strings;
/// Player
///
@ -34,62 +33,63 @@ use ffplayout_lib::vec_strings;
/// for getting live feeds.
/// When a live ingest arrives, it stops the current playback and switches to the live source.
/// When the ingest stops, it switches back to playlist/folder mode.
pub fn player(
config: &PlayoutConfig,
play_control: &PlayerControl,
playout_stat: PlayoutStatus,
proc_control: ProcessControl,
) {
pub fn player(manager: ChannelManager) -> Result<(), ProcessError> {
let config = manager.config.lock()?.clone();
let id = config.general.channel_id;
let config_clone = config.clone();
let ff_log_format = format!("level+{}", config.logging.ffmpeg_level.to_lowercase());
let ignore_enc = config.logging.ignore_lines.clone();
let mut buffer = [0; 65088];
let mut live_on = false;
let playlist_init = playout_stat.list_init.clone();
let play_stat = playout_stat.clone();
let playlist_init = manager.list_init.clone();
let is_terminated = manager.is_terminated.clone();
let ingest_is_running = manager.ingest_is_running.clone();
// get source iterator
let node_sources = source_generator(
config.clone(),
play_control,
playout_stat,
proc_control.is_terminated.clone(),
);
let node_sources = source_generator(manager.clone());
// get ffmpeg output instance
let mut enc_proc = match config.out.mode {
Desktop => desktop::output(config, &ff_log_format),
Null => null::output(config, &ff_log_format),
Stream => stream::output(config, &ff_log_format),
let mut enc_proc = match config.output.mode {
Desktop => desktop::output(&config, &ff_log_format),
Null => null::output(&config, &ff_log_format),
Stream => stream::output(&config, &ff_log_format),
_ => panic!("Output mode doesn't exist!"),
};
let mut enc_writer = BufWriter::new(enc_proc.stdin.take().unwrap());
let enc_err = BufReader::new(enc_proc.stderr.take().unwrap());
*proc_control.encoder_term.lock().unwrap() = Some(enc_proc);
let enc_p_ctl = proc_control.clone();
*manager.encoder.lock().unwrap() = Some(enc_proc);
let enc_p_ctl = manager.clone();
// spawn a thread to log ffmpeg output error messages
let error_encoder_thread =
thread::spawn(move || stderr_reader(enc_err, ignore_enc, Encoder, enc_p_ctl));
let proc_control_c = proc_control.clone();
let channel_mgr_2 = manager.clone();
let mut ingest_receiver = None;
// spawn a thread for ffmpeg ingest server and create a channel for package sending
if config.ingest.enable {
let (ingest_sender, rx) = bounded(96);
ingest_receiver = Some(rx);
thread::spawn(move || ingest_server(config_clone, ingest_sender, proc_control_c));
thread::spawn(move || ingest_server(config_clone, ingest_sender, channel_mgr_2));
}
'source_iter: for node in node_sources {
*play_control.current_media.lock().unwrap() = Some(node.clone());
let ignore_dec = config.logging.ignore_lines.clone();
drop(config);
if proc_control.is_terminated.load(Ordering::SeqCst) {
debug!("Playout is terminated, break out from source loop");
let mut error_count = 0;
'source_iter: for node in node_sources {
let config = manager.config.lock()?.clone();
*manager.current_media.lock().unwrap() = Some(node.clone());
let ignore_dec = config.logging.ignore_lines.clone();
let timer = SystemTime::now();
if is_terminated.load(Ordering::SeqCst) {
debug!(target: Target::file_mail(), channel = id; "Playout is terminated, break out from source loop");
break;
}
@ -111,13 +111,13 @@ pub fn player(
format!(
" ({}/{})",
node.index.unwrap() + 1,
play_control.current_list.lock().unwrap().len()
manager.current_list.lock().unwrap().len()
)
} else {
String::new()
};
info!(
info!(target: Target::file_mail(), channel = id;
"Play for <yellow>{}</>{c_index}: <b><magenta>{} {}</></b>",
sec_to_time(node.out - node.seek),
node.source,
@ -126,16 +126,11 @@ pub fn player(
if config.task.enable {
if config.task.path.is_file() {
let task_config = config.clone();
let task_node = node.clone();
let server_running = proc_control.server_is_running.load(Ordering::SeqCst);
let stat = play_stat.clone();
let channel_mgr_3 = manager.clone();
thread::spawn(move || {
task_runner::run(task_config, task_node, stat, server_running)
});
thread::spawn(move || task_runner::run(channel_mgr_3));
} else {
error!(
error!(target: Target::file_mail(), channel = id;
"<bright-blue>{:?}</> executable does not exist!",
config.task.path
);
@ -144,11 +139,7 @@ pub fn player(
let mut dec_cmd = vec_strings!["-hide_banner", "-nostats", "-v", &ff_log_format];
if let Some(decoder_input_cmd) = config
.advanced
.as_ref()
.and_then(|a| a.decoder.input_cmd.clone())
{
if let Some(decoder_input_cmd) = &config.advanced.decoder.input_cmd {
dec_cmd.append(&mut decoder_input_cmd.clone());
}
@ -163,7 +154,7 @@ pub fn player(
dec_cmd.append(&mut cmd);
}
debug!(
debug!(target: Target::file_mail(), channel = id;
"Decoder CMD: <bright-blue>\"ffmpeg {}\"</>",
dec_cmd.join(" ")
);
@ -177,7 +168,7 @@ pub fn player(
{
Ok(proc) => proc,
Err(e) => {
error!("couldn't spawn decoder process: {e}");
error!(target: Target::file_mail(), channel = id; "couldn't spawn decoder process: {e}");
panic!("couldn't spawn decoder process: {e}")
}
};
@ -185,20 +176,20 @@ pub fn player(
let mut dec_reader = BufReader::new(dec_proc.stdout.take().unwrap());
let dec_err = BufReader::new(dec_proc.stderr.take().unwrap());
*proc_control.decoder_term.lock().unwrap() = Some(dec_proc);
let dec_p_ctl = proc_control.clone();
*manager.clone().decoder.lock().unwrap() = Some(dec_proc);
let channel_mgr_c = manager.clone();
let error_decoder_thread =
thread::spawn(move || stderr_reader(dec_err, ignore_dec, Decoder, dec_p_ctl));
thread::spawn(move || stderr_reader(dec_err, ignore_dec, Decoder, channel_mgr_c));
loop {
// when the ingest server is running, read from it
if proc_control.server_is_running.load(Ordering::SeqCst) {
if ingest_is_running.load(Ordering::SeqCst) {
if !live_on {
info!("Switch from {} to live ingest", config.processing.mode);
info!(target: Target::file_mail(), channel = id; "Switch from {} to live ingest", config.processing.mode);
if let Err(e) = proc_control.stop(Decoder) {
error!("{e}")
if let Err(e) = manager.stop(Decoder) {
error!(target: Target::file_mail(), channel = id; "{e}")
}
live_on = true;
@ -207,7 +198,7 @@ pub fn player(
for rx in ingest_receiver.as_ref().unwrap().try_iter() {
if let Err(e) = enc_writer.write(&rx.1[..rx.0]) {
error!("Error from Ingest: {:?}", e);
error!(target: Target::file_mail(), channel = id; "Error from Ingest: {:?}", e);
break 'source_iter;
};
@ -215,7 +206,7 @@ pub fn player(
// read from decoder instance
} else {
if live_on {
info!("Switch from live ingest to {}", config.processing.mode);
info!(target: Target::file_mail(), channel = id; "Switch from live ingest to {}", config.processing.mode);
live_on = false;
break;
@ -224,7 +215,7 @@ pub fn player(
let dec_bytes_len = match dec_reader.read(&mut buffer[..]) {
Ok(length) => length,
Err(e) => {
error!("Reading error from decoder: {e:?}");
error!(target: Target::file_mail(), channel = id; "Reading error from decoder: {e:?}");
break 'source_iter;
}
@ -232,7 +223,7 @@ pub fn player(
if dec_bytes_len > 0 {
if let Err(e) = enc_writer.write(&buffer[..dec_bytes_len]) {
error!("Encoder write error: {}", e.kind());
error!(target: Target::file_mail(), channel = id; "Encoder write error: {}", e.kind());
break 'source_iter;
};
@ -242,22 +233,37 @@ pub fn player(
}
}
if let Err(e) = proc_control.wait(Decoder) {
error!("{e}")
if let Err(e) = manager.wait(Decoder) {
error!(target: Target::file_mail(), channel = id; "{e}")
}
if let Err(e) = error_decoder_thread.join() {
error!("{e:?}");
error!(target: Target::file_mail(), channel = id; "{e:?}");
};
if let Ok(elapsed) = timer.elapsed() {
if elapsed.as_millis() < 300 {
error_count += 1;
if error_count > 10 {
error!(target: Target::file_mail(), channel = id; "Reached fatal error count, terminating channel!");
break;
}
} else {
error_count = 0;
}
}
}
trace!("Out of source loop");
sleep(Duration::from_secs(1));
proc_control.stop_all();
manager.stop_all();
if let Err(e) = error_encoder_thread.join() {
error!("{e:?}");
error!(target: Target::file_mail(), channel = id; "{e:?}");
};
Ok(())
}


@ -1,28 +1,26 @@
use std::process::{self, Command, Stdio};
use simplelog::*;
use log::*;
use crate::utils::prepare_output_cmd;
use ffplayout_lib::{
utils::{Media, PlayoutConfig, ProcessUnit::*},
vec_strings,
use crate::player::{
controller::ProcessUnit::*,
utils::{prepare_output_cmd, Media},
};
use crate::utils::{config::PlayoutConfig, logging::Target};
use crate::vec_strings;
/// Desktop Output
///
/// Instead of streaming, we run an ffplay instance and play on the desktop.
pub fn output(config: &PlayoutConfig, log_format: &str) -> process::Child {
let mut media = Media::new(0, "", false);
let id = config.general.channel_id;
media.unit = Encoder;
media.add_filter(config, &None);
let mut enc_prefix = vec_strings!["-hide_banner", "-nostats", "-v", log_format];
if let Some(input_cmd) = config
.advanced
.as_ref()
.and_then(|a| a.encoder.input_cmd.clone())
{
if let Some(input_cmd) = &config.advanced.encoder.input_cmd {
enc_prefix.append(&mut input_cmd.clone());
}
@ -30,7 +28,7 @@ pub fn output(config: &PlayoutConfig, log_format: &str) -> process::Child {
let enc_cmd = prepare_output_cmd(config, enc_prefix, &media.filter);
debug!(
debug!(target: Target::file_mail(), channel = id;
"Encoder CMD: <bright-blue>\"ffmpeg {}\"</>",
enc_cmd.join(" ")
);
@ -42,7 +40,7 @@ pub fn output(config: &PlayoutConfig, log_format: &str) -> process::Child {
.spawn()
{
Err(e) => {
error!("couldn't spawn encoder process: {e}");
error!(target: Target::file_mail(), channel = id; "couldn't spawn encoder process: {e}");
panic!("couldn't spawn encoder process: {e}")
}
Ok(proc) => proc,


@ -1,28 +1,26 @@
use std::process::{self, Command, Stdio};
use simplelog::*;
use log::*;
use crate::utils::prepare_output_cmd;
use ffplayout_lib::{
utils::{Media, PlayoutConfig, ProcessUnit::*},
vec_strings,
use crate::player::{
controller::ProcessUnit::*,
utils::{prepare_output_cmd, Media},
};
use crate::utils::{config::PlayoutConfig, logging::Target};
use crate::vec_strings;
/// Streaming Output
///
/// Prepare the ffmpeg command for streaming output
pub fn output(config: &PlayoutConfig, log_format: &str) -> process::Child {
let mut media = Media::new(0, "", false);
let id = config.general.channel_id;
media.unit = Encoder;
media.add_filter(config, &None);
let mut enc_prefix = vec_strings!["-hide_banner", "-nostats", "-v", log_format];
if let Some(input_cmd) = config
.advanced
.as_ref()
.and_then(|a| a.encoder.input_cmd.clone())
{
if let Some(input_cmd) = &config.advanced.encoder.input_cmd {
enc_prefix.append(&mut input_cmd.clone());
}
@ -30,7 +28,7 @@ pub fn output(config: &PlayoutConfig, log_format: &str) -> process::Child {
let enc_cmd = prepare_output_cmd(config, enc_prefix, &media.filter);
debug!(
debug!(target: Target::file_mail(), channel = id;
"Encoder CMD: <bright-blue>\"ffmpeg {}\"</>",
enc_cmd.join(" ")
);
@ -42,7 +40,7 @@ pub fn output(config: &PlayoutConfig, log_format: &str) -> process::Child {
.spawn()
{
Err(e) => {
error!("couldn't spawn encoder process: {e}");
error!(target: Target::file_mail(), channel = id; "couldn't spawn encoder process: {e}");
panic!("couldn't spawn encoder process: {e}")
}
Ok(proc) => proc,


@ -4,46 +4,48 @@ use std::sync::{
};
use lexical_sort::natural_lexical_cmp;
use log::*;
use rand::{seq::SliceRandom, thread_rng};
use simplelog::*;
use walkdir::WalkDir;
use crate::utils::{
controller::PlayerControl, include_file_extension, time_in_seconds, Media, PlayoutConfig,
use crate::player::{
controller::ChannelManager,
utils::{include_file_extension, time_in_seconds, Media, PlayoutConfig},
};
use crate::utils::logging::Target;
/// Folder Sources
///
/// Like the playlist source, we create a folder list here to iterate over.
#[derive(Debug, Clone)]
pub struct FolderSource {
config: PlayoutConfig,
filter_chain: Option<Arc<Mutex<Vec<String>>>>,
pub player_control: PlayerControl,
manager: ChannelManager,
current_node: Media,
}
impl FolderSource {
pub fn new(
config: &PlayoutConfig,
filter_chain: Option<Arc<Mutex<Vec<String>>>>,
player_control: &PlayerControl,
) -> Self {
pub fn new(config: &PlayoutConfig, manager: ChannelManager) -> Self {
let id = config.general.channel_id;
let mut path_list = vec![];
let mut media_list = vec![];
let mut index: usize = 0;
debug!(target: Target::file_mail(), channel = id;
"generate: {:?}, paths: {:?}",
config.general.generate, config.storage.paths
);
if config.general.generate.is_some() && !config.storage.paths.is_empty() {
for path in &config.storage.paths {
path_list.push(path)
}
} else {
path_list.push(&config.storage.path)
path_list.push(&config.global.storage_path)
}
for path in &path_list {
if !path.is_dir() {
error!("Path does not exist: <b><magenta>{path:?}</></b>");
error!(target: Target::file_mail(), channel = id; "Path does not exist: <b><magenta>{path:?}</></b>");
}
for entry in WalkDir::new(path)
@ -58,14 +60,14 @@ impl FolderSource {
}
if media_list.is_empty() {
error!(
error!(target: Target::file_mail(), channel = id;
"no playable files found under: <b><magenta>{:?}</></b>",
path_list
);
}
if config.storage.shuffle {
info!("Shuffle files");
info!(target: Target::file_mail(), channel = id; "Shuffle files");
let mut rng = thread_rng();
media_list.shuffle(&mut rng);
} else {
@ -78,35 +80,26 @@ impl FolderSource {
index += 1;
}
*player_control.current_list.lock().unwrap() = media_list;
*manager.current_list.lock().unwrap() = media_list;
Self {
config: config.clone(),
filter_chain,
player_control: player_control.clone(),
manager,
current_node: Media::new(0, "", false),
}
}
pub fn from_list(
config: &PlayoutConfig,
filter_chain: Option<Arc<Mutex<Vec<String>>>>,
player_control: &PlayerControl,
list: Vec<Media>,
) -> Self {
*player_control.current_list.lock().unwrap() = list;
pub fn from_list(manager: &ChannelManager, list: Vec<Media>) -> Self {
*manager.current_list.lock().unwrap() = list;
Self {
config: config.clone(),
filter_chain,
player_control: player_control.clone(),
manager: manager.clone(),
current_node: Media::new(0, "", false),
}
}
fn shuffle(&mut self) {
let mut rng = thread_rng();
let mut nodes = self.player_control.current_list.lock().unwrap();
let mut nodes = self.manager.current_list.lock().unwrap();
nodes.shuffle(&mut rng);
@ -116,7 +109,7 @@ impl FolderSource {
}
fn sort(&mut self) {
let mut nodes = self.player_control.current_list.lock().unwrap();
let mut nodes = self.manager.current_list.lock().unwrap();
nodes.sort_by(|d1, d2| d1.source.cmp(&d2.source));
@ -131,43 +124,44 @@ impl Iterator for FolderSource {
type Item = Media;
fn next(&mut self) -> Option<Self::Item> {
if self.player_control.current_index.load(Ordering::SeqCst)
< self.player_control.current_list.lock().unwrap().len()
let config = self.manager.config.lock().unwrap().clone();
let id = config.general.id;
if self.manager.current_index.load(Ordering::SeqCst)
< self.manager.current_list.lock().unwrap().len()
{
let i = self.player_control.current_index.load(Ordering::SeqCst);
self.current_node = self.player_control.current_list.lock().unwrap()[i].clone();
let i = self.manager.current_index.load(Ordering::SeqCst);
self.current_node = self.manager.current_list.lock().unwrap()[i].clone();
let _ = self.current_node.add_probe(false).ok();
self.current_node
.add_filter(&self.config, &self.filter_chain);
.add_filter(&config, &self.manager.filter_chain);
self.current_node.begin = Some(time_in_seconds());
self.player_control
.current_index
.fetch_add(1, Ordering::SeqCst);
self.manager.current_index.fetch_add(1, Ordering::SeqCst);
Some(self.current_node.clone())
} else {
if self.config.storage.shuffle {
if self.config.general.generate.is_none() {
info!("Shuffle files");
if config.storage.shuffle {
if config.general.generate.is_none() {
info!(target: Target::file_mail(), channel = id; "Shuffle files");
}
self.shuffle();
} else {
if self.config.general.generate.is_none() {
info!("Sort files");
if config.general.generate.is_none() {
info!(target: Target::file_mail(), channel = id; "Sort files");
}
self.sort();
}
self.current_node = self.player_control.current_list.lock().unwrap()[0].clone();
self.current_node = self.manager.current_list.lock().unwrap()[0].clone();
let _ = self.current_node.add_probe(false).ok();
self.current_node
.add_filter(&self.config, &self.filter_chain);
.add_filter(&config, &self.manager.filter_chain);
self.current_node.begin = Some(time_in_seconds());
self.player_control.current_index.store(1, Ordering::SeqCst);
self.manager.current_index.store(1, Ordering::SeqCst);
Some(self.current_node.clone())
}
@ -176,8 +170,9 @@ impl Iterator for FolderSource {
pub fn fill_filler_list(
config: &PlayoutConfig,
player_control: Option<PlayerControl>,
fillers: Option<Arc<Mutex<Vec<Media>>>>,
) -> Vec<Media> {
let id = config.general.channel_id;
let mut filler_list = vec![];
let filler_path = &config.storage.filler;
@ -191,9 +186,9 @@ pub fn fill_filler_list(
{
let mut media = Media::new(index, &entry.path().to_string_lossy(), false);
if player_control.is_none() {
if fillers.is_none() {
if let Err(e) = media.add_probe(false) {
error!("{e:?}");
error!(target: Target::file_mail(), channel = id; "{e:?}");
};
}
@ -212,22 +207,22 @@ pub fn fill_filler_list(
item.index = Some(index);
}
if let Some(control) = player_control.as_ref() {
control.filler_list.lock().unwrap().clone_from(&filler_list);
if let Some(f) = fillers.as_ref() {
f.lock().unwrap().clone_from(&filler_list);
}
} else if filler_path.is_file() {
let mut media = Media::new(0, &config.storage.filler.to_string_lossy(), false);
if player_control.is_none() {
if fillers.is_none() {
if let Err(e) = media.add_probe(false) {
error!("{e:?}");
error!(target: Target::file_mail(), channel = id; "{e:?}");
};
}
filler_list.push(media);
if let Some(control) = player_control.as_ref() {
control.filler_list.lock().unwrap().clone_from(&filler_list);
if let Some(f) = fillers.as_ref() {
f.lock().unwrap().clone_from(&filler_list);
}
}


@ -6,7 +6,9 @@ use std::{
path::Path,
};
use crate::utils::{json_reader, json_serializer::JsonPlaylist, json_writer, Media, PlayoutConfig};
use crate::player::utils::{
json_reader, json_serializer::JsonPlaylist, json_writer, Media, PlayoutConfig,
};
pub fn import_file(
config: &PlayoutConfig,
@ -26,13 +28,13 @@ pub fn import_file(
program: vec![],
};
let playlist_root = &config.playlist.path;
let playlist_root = &config.global.playlist_path;
if !playlist_root.is_dir() {
return Err(Error::new(
ErrorKind::Other,
format!(
"Playlist folder <b><magenta>{:?}</></b> does not exist!",
config.playlist.path,
config.global.playlist_path,
),
));
}


@ -2,16 +2,17 @@ use serde::{Deserialize, Serialize};
use std::{
fs::File,
path::Path,
sync::{atomic::AtomicBool, Arc},
sync::{atomic::AtomicBool, Arc, Mutex},
thread,
};
use simplelog::*;
use log::*;
use crate::utils::{
get_date, is_remote, modified_time, time_from_header, validate_playlist, Media, PlayerControl,
PlayoutConfig, DUMMY_LEN,
use crate::player::utils::{
get_date, is_remote, json_validate::validate_playlist, modified_time, time_from_header, Media,
PlayoutConfig,
};
use crate::utils::{config::DUMMY_LEN, logging::Target};
/// This is our main playlist object; it holds all necessary information for the current day.
#[derive(Debug, Serialize, Deserialize, Clone)]
@ -91,19 +92,19 @@ pub fn set_defaults(playlist: &mut JsonPlaylist) {
/// which we need to process.
pub fn read_json(
config: &mut PlayoutConfig,
player_control: &PlayerControl,
current_list: Arc<Mutex<Vec<Media>>>,
path: Option<String>,
is_terminated: Arc<AtomicBool>,
seek: bool,
get_next: bool,
) -> JsonPlaylist {
let id = config.general.channel_id;
let config_clone = config.clone();
let control_clone = player_control.clone();
let mut playlist_path = config.playlist.path.clone();
let mut playlist_path = config.global.playlist_path.clone();
let start_sec = config.playlist.start_sec.unwrap();
let date = get_date(seek, start_sec, get_next);
if playlist_path.is_dir() || is_remote(&config.playlist.path.to_string_lossy()) {
if playlist_path.is_dir() || is_remote(&config.global.playlist_path.to_string_lossy()) {
let d: Vec<&str> = date.split('-').collect();
playlist_path = playlist_path
.join(d[0])
@ -130,7 +131,7 @@ pub fn read_json(
let mut playlist: JsonPlaylist = match serde_json::from_str(&body) {
Ok(p) => p,
Err(e) => {
error!("Couldn't read remote JSON playlist. {e:?}");
error!(target: Target::file_mail(), channel = id; "Couldn't read remote JSON playlist. {e:?}");
JsonPlaylist::new(date.clone(), start_sec)
}
};
@ -146,12 +147,7 @@ pub fn read_json(
if !config.general.skip_validation {
thread::spawn(move || {
validate_playlist(
config_clone,
control_clone,
list_clone,
is_terminated,
)
validate_playlist(config_clone, current_list, list_clone, is_terminated)
});
}
@ -172,7 +168,7 @@ pub fn read_json(
let mut playlist: JsonPlaylist = match serde_json::from_reader(f) {
Ok(p) => p,
Err(e) => {
error!("Playlist file not readable! {e}");
error!(target: Target::file_mail(), channel = id; "Playlist file not readable! {e}");
JsonPlaylist::new(date.clone(), start_sec)
}
};
@ -190,7 +186,7 @@ pub fn read_json(
if !config.general.skip_validation {
thread::spawn(move || {
validate_playlist(config_clone, control_clone, list_clone, is_terminated)
validate_playlist(config_clone, current_list, list_clone, is_terminated)
});
}
@ -199,7 +195,7 @@ pub fn read_json(
return playlist;
}
error!("Playlist <b><magenta>{current_file}</></b> does not exist!");
error!(target: Target::file_mail(), channel = id; "Playlist <b><magenta>{current_file}</></b> does not exist!");
JsonPlaylist::new(date, start_sec)
}


@ -3,20 +3,24 @@ use std::{
process::{Command, Stdio},
sync::{
atomic::{AtomicBool, Ordering},
Arc,
Arc, Mutex,
},
time::Instant,
};
use log::*;
use regex::Regex;
use simplelog::*;
use crate::filter::FilterType::Audio;
use crate::utils::{
errors::ProcError, is_close, is_remote, loop_image, sec_to_time, seek_and_length, vec_strings,
JsonPlaylist, Media, OutputMode::Null, PlayerControl, PlayoutConfig, FFMPEG_IGNORE_ERRORS,
IMAGE_FORMAT,
use crate::player::filter::FilterType::Audio;
use crate::player::utils::{
is_close, is_remote, loop_image, sec_to_time, seek_and_length, JsonPlaylist, Media,
};
use crate::utils::{
config::{OutputMode::Null, PlayoutConfig, FFMPEG_IGNORE_ERRORS, IMAGE_FORMAT},
errors::ProcessError,
logging::Target,
};
use crate::vec_strings;
/// Validate a single media file.
///
@ -29,19 +33,16 @@ fn check_media(
pos: usize,
begin: f64,
config: &PlayoutConfig,
) -> Result<(), ProcError> {
) -> Result<(), ProcessError> {
let id = config.general.channel_id;
let mut dec_cmd = vec_strings!["-hide_banner", "-nostats", "-v", "level+info"];
let mut error_list = vec![];
let mut config = config.clone();
config.out.mode = Null;
config.output.mode = Null;
let mut process_length = 0.1;
if let Some(decoder_input_cmd) = config
.advanced
.as_ref()
.and_then(|a| a.decoder.input_cmd.clone())
{
if let Some(decoder_input_cmd) = &config.advanced.decoder.input_cmd {
dec_cmd.append(&mut decoder_input_cmd.clone());
}
@ -129,7 +130,7 @@ fn check_media(
}
if !error_list.is_empty() {
error!(
error!(target: Target::file_mail(), channel = id;
"<bright black>[Validator]</> ffmpeg error on position <yellow>{pos}</> - {}: <b><magenta>{}</></b>: {}",
sec_to_time(begin),
node.source,
@ -140,7 +141,7 @@ fn check_media(
error_list.clear();
if let Err(e) = enc_proc.wait() {
error!("Validation process: {e:?}");
error!(target: Target::file_mail(), channel = id; "Validation process: {e:?}");
}
Ok(())
@ -155,10 +156,11 @@ fn check_media(
/// We run this function in a thread, so it doesn't block the main function.
pub fn validate_playlist(
mut config: PlayoutConfig,
player_control: PlayerControl,
current_list: Arc<Mutex<Vec<Media>>>,
mut playlist: JsonPlaylist,
is_terminated: Arc<AtomicBool>,
) {
let id = config.general.channel_id;
let date = playlist.date;
if config.text.add_text && !config.text.text_from_filename {
@ -171,7 +173,7 @@ pub fn validate_playlist(
length += begin;
debug!("Validate playlist from: <yellow>{date}</>");
debug!(target: Target::file_mail(), channel = id; "Validate playlist from: <yellow>{date}</>");
let timer = Instant::now();
for (index, item) in playlist.program.iter_mut().enumerate() {
@ -184,13 +186,13 @@ pub fn validate_playlist(
if !is_remote(&item.source) {
if item.audio.is_empty() {
if let Err(e) = item.add_probe(false) {
error!(
error!(target: Target::file_mail(), channel = id;
"[Validation] Error on position <yellow>{pos:0>3}</> <yellow>{}</>: {e}",
sec_to_time(begin)
);
}
} else if let Err(e) = item.add_probe(true) {
error!(
error!(target: Target::file_mail(), channel = id;
"[Validation] Error on position <yellow>{pos:0>3}</> <yellow>{}</>: {e}",
sec_to_time(begin)
);
@ -199,14 +201,14 @@ pub fn validate_playlist(
if item.probe.is_some() {
if let Err(e) = check_media(item.clone(), pos, begin, &config) {
error!("{e}");
error!(target: Target::file_mail(), channel = id; "{e}");
} else if config.general.validate {
debug!(
debug!(target: Target::file_mail(), channel = id;
"[Validation] Source at <yellow>{}</>, seems fine: <b><magenta>{}</></b>",
sec_to_time(begin),
item.source
)
} else if let Ok(mut list) = player_control.current_list.try_lock() {
} else if let Ok(mut list) = current_list.try_lock() {
// Filter the matching item in the current playlist, then add the probe to it.
// Also check if the duration differs from the playlist value; log an error if so and adjust the value.
list.iter_mut().filter(|list_item| list_item.source == item.source).for_each(|o| {
@ -218,7 +220,7 @@ pub fn validate_playlist(
let probe_duration = dur.parse().unwrap_or_default();
if !is_close(o.duration, probe_duration, 1.2) {
error!(
error!(target: Target::file_mail(), channel = id;
"[Validation] File duration (at: <yellow>{}</>) differs from playlist value. File duration: <yellow>{}</>, playlist value: <yellow>{}</>, source <b><magenta>{}</></b>",
sec_to_time(o.begin.unwrap_or_default()), sec_to_time(probe_duration), sec_to_time(o.duration), o.source
);
@ -239,20 +241,20 @@ pub fn validate_playlist(
}
if !config.playlist.infinit && length > begin + 1.2 {
error!(
error!(target: Target::file_mail(), channel = id;
"[Validation] Playlist from <yellow>{date}</> not long enough, <yellow>{}</> needed!",
sec_to_time(length - begin),
);
}
if config.general.validate {
info!(
info!(target: Target::file_mail(), channel = id;
"[Validation] Playlist length: <yellow>{}</>",
sec_to_time(begin - config.playlist.start_sec.unwrap())
);
}
debug!(
debug!(target: Target::file_mail(), channel = id;
"Validation done, in <yellow>{:.3?}</>, playlist length: <yellow>{}</> ...",
timer.elapsed(),
sec_to_time(begin - config.playlist.start_sec.unwrap())


@ -1,56 +1,181 @@
use std::{
ffi::OsStr,
fmt,
fs::{self, metadata, File},
fs::{metadata, File},
io::{BufRead, BufReader, Error},
net::TcpListener,
path::{Path, PathBuf},
process::{exit, ChildStderr, Command, Stdio},
str::FromStr,
sync::{Arc, Mutex},
sync::{atomic::Ordering, Arc, Mutex},
};
use chrono::{prelude::*, TimeDelta};
use ffprobe::{ffprobe, Stream as FFStream};
use log::*;
use rand::prelude::*;
use regex::Regex;
use reqwest::header;
use serde::{de::Deserializer, Deserialize, Serialize};
use serde_json::json;
use simplelog::*;
use serde_json::{json, Map, Value};
pub mod advanced_config;
pub mod config;
pub mod controller;
pub mod errors;
pub mod folder;
pub mod generator;
pub mod import;
pub mod json_serializer;
mod json_validate;
mod logging;
pub mod json_validate;
pub use config::{
self as playout_config,
OutputMode::{self, *},
PlayoutConfig,
ProcessMode::{self, *},
Template, DUMMY_LEN, FFMPEG_IGNORE_ERRORS, FFMPEG_UNRECOVERABLE_ERRORS, IMAGE_FORMAT,
};
pub use controller::{
PlayerControl, PlayoutStatus, ProcessControl,
ProcessUnit::{self, *},
};
use errors::ProcError;
pub use generator::generate_playlist;
pub use json_serializer::{read_json, JsonPlaylist};
pub use json_validate::validate_playlist;
pub use logging::{init_logging, send_mail};
use crate::{
use crate::player::{
controller::{
ChannelManager,
ProcessUnit::{self, *},
},
filter::{filter_chains, Filters},
vec_strings,
};
use crate::utils::{
config::{OutputMode::*, PlayoutConfig, FFMPEG_IGNORE_ERRORS, FFMPEG_UNRECOVERABLE_ERRORS},
errors::ProcessError,
logging::Target,
};
pub use json_serializer::{read_json, JsonPlaylist};
use crate::vec_strings;
/// Compare the incoming stream name with the expected name, ignoring a trailing question mark.
pub fn valid_stream(msg: &str) -> bool {
if let Some((unexpected, expected)) = msg.split_once(',') {
let re = Regex::new(r".*Unexpected stream|expecting|[\s]+|\?$").unwrap();
let unexpected = re.replace_all(unexpected, "");
let expected = re.replace_all(expected, "");
if unexpected == expected {
return true;
}
}
false
}
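The comparison above strips boilerplate and whitespace from both sides of the message before comparing. A simplified std-only sketch of the same normalization (the real implementation uses the `regex` crate; `stream_names_match` is a hypothetical helper name, not part of ffplayout):

```rust
/// Normalize both halves of an ffmpeg "Unexpected stream" message and
/// compare them, ignoring whitespace and a trailing '?'.
fn stream_names_match(msg: &str) -> bool {
    fn normalize(s: &str) -> String {
        s.replace("Unexpected stream", "")
            .replace("expecting", "")
            .chars()
            .filter(|c| !c.is_whitespace())
            .collect::<String>()
            .trim_end_matches('?')
            .to_string()
    }

    match msg.split_once(',') {
        Some((unexpected, expected)) => normalize(unexpected) == normalize(expected),
        None => false,
    }
}

fn main() {
    // Same key modulo the question mark: accepted.
    assert!(stream_names_match(
        "Unexpected stream live/key?, expecting live/key"
    ));
    // Different keys: rejected.
    assert!(!stream_names_match(
        "Unexpected stream live/wrong, expecting live/key"
    ));
    println!("ok");
}
```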
/// Prepare output parameters
///
/// Look for multiple outputs and add mappings for them.
pub fn prepare_output_cmd(
config: &PlayoutConfig,
mut cmd: Vec<String>,
filters: &Option<Filters>,
) -> Vec<String> {
let mut output_params = config.output.clone().output_cmd.unwrap();
let mut new_params = vec![];
let mut count = 0;
let re_v = Regex::new(r"\[?0:v(:0)?\]?").unwrap();
if let Some(mut filter) = filters.clone() {
for (i, param) in output_params.iter().enumerate() {
if filter.video_out_link.len() > count && re_v.is_match(param) {
// replace mapping with link from filter struct
new_params.push(filter.video_out_link[count].clone());
} else {
new_params.push(param.clone());
}
// Check if the parameter is an output
if i > 0
&& !param.starts_with('-')
&& !output_params[i - 1].starts_with('-')
&& i < output_params.len() - 1
{
count += 1;
if filter.video_out_link.len() > count
&& !output_params.contains(&"-map".to_string())
{
new_params.append(&mut vec_strings![
"-map",
filter.video_out_link[count].clone()
]);
for i in 0..config.processing.audio_tracks {
new_params.append(&mut vec_strings!["-map", format!("0:a:{i}")]);
}
}
}
}
output_params = new_params;
cmd.append(&mut filter.cmd());
// add mapping at the beginning, if needed
if !filter.map().iter().all(|item| output_params.contains(item))
&& filter.output_chain.is_empty()
&& filter.video_out_link.is_empty()
{
cmd.append(&mut filter.map())
} else if &output_params[0] != "-map" && !filter.video_out_link.is_empty() {
cmd.append(&mut vec_strings!["-map", filter.video_out_link[0].clone()]);
for i in 0..config.processing.audio_tracks {
cmd.append(&mut vec_strings!["-map", format!("0:a:{i}")]);
}
}
}
cmd.append(&mut output_params);
cmd
}
/// Map the media struct to a JSON object.
pub fn get_media_map(media: Media) -> Value {
let mut obj = json!({
"in": media.seek,
"out": media.out,
"duration": media.duration,
"category": media.category,
"source": media.source,
});
if let Some(title) = media.title {
obj.as_object_mut()
.unwrap()
.insert("title".to_string(), Value::String(title));
}
obj
}
/// Prepare a JSON object for the response.
pub fn get_data_map(manager: &ChannelManager) -> Map<String, Value> {
let media = manager
.current_media
.lock()
.unwrap()
.clone()
.unwrap_or(Media::new(0, "", false));
let channel = manager.channel.lock().unwrap().clone();
let config = manager.config.lock().unwrap().processing.clone();
let ingest_is_running = manager.ingest_is_running.load(Ordering::SeqCst);
let mut data_map = Map::new();
let current_time = time_in_seconds();
let shift = channel.time_shift;
let begin = media.begin.unwrap_or(0.0) - shift;
let played_time = current_time - begin;
data_map.insert("index".to_string(), json!(media.index));
data_map.insert("ingest".to_string(), json!(ingest_is_running));
data_map.insert("mode".to_string(), json!(config.mode));
data_map.insert(
"shift".to_string(),
json!((shift * 1000.0).round() / 1000.0),
);
data_map.insert(
"elapsed".to_string(),
json!((played_time * 1000.0).round() / 1000.0),
);
data_map.insert("media".to_string(), get_media_map(media));
data_map
}
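`get_data_map` rounds the shift and elapsed values to millisecond precision with `(x * 1000.0).round() / 1000.0`. That trick in isolation (a standalone sketch; `round_ms` is a hypothetical helper name):

```rust
/// Round a floating-point value to three decimal places (milliseconds),
/// the same pattern used for the "shift" and "elapsed" fields above.
fn round_ms(x: f64) -> f64 {
    (x * 1000.0).round() / 1000.0
}

fn main() {
    assert_eq!(round_ms(12.34567), 12.346);
    assert_eq!(round_ms(5.0), 5.0);
    println!("ok");
}
```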
/// Video clip struct that holds some important states and comments for the current media.
#[derive(Debug, Serialize, Deserialize, Clone)]
@ -250,7 +375,7 @@ pub struct MediaProbe {
}
impl MediaProbe {
pub fn new(input: &str) -> Result<Self, ProcError> {
pub fn new(input: &str) -> Result<Self, ProcessError> {
let probe = ffprobe(input);
let mut a_stream = vec![];
let mut v_stream = vec![];
@ -279,11 +404,11 @@ impl MediaProbe {
}
Err(e) => {
if !Path::new(input).is_file() && !is_remote(input) {
Err(ProcError::Custom(format!(
Err(ProcessError::Custom(format!(
"File <b><magenta>{input}</></b> does not exist!"
)))
} else {
Err(ProcError::Ffprobe(e))
Err(ProcessError::Ffprobe(e))
}
}
}
@ -319,34 +444,6 @@ pub fn json_writer(path: &PathBuf, data: JsonPlaylist) -> Result<(), Error> {
Ok(())
}
/// Write the current status to the status file in the temp folder.
///
/// The status file is initialized in the main function and mostly modified in the RPC server.
pub fn write_status(config: &PlayoutConfig, date: &str, shift: f64) {
let data = json!({
"time_shift": shift,
"date": date,
});
match serde_json::to_string(&data) {
Ok(status) => {
if let Err(e) = fs::write(&config.general.stat_file, status) {
error!(
"Unable to write to status file <b><magenta>{}</></b>: {e}",
config.general.stat_file
)
};
}
Err(e) => error!("Serialize status data failed: {e}"),
};
}
// pub fn get_timestamp() -> i32 {
// let local: DateTime<Local> = time_now();
// local.timestamp_millis() as i32
// }
/// Get current time in seconds.
pub fn time_in_seconds() -> f64 {
let local: DateTime<Local> = time_now();
@ -639,9 +736,9 @@ pub fn include_file_extension(config: &PlayoutConfig, file_path: &Path) -> bool
}
}
if config.out.mode == HLS {
if config.output.mode == HLS {
if let Some(ts_path) = config
.out
.output
.output_cmd
.clone()
.unwrap_or_else(|| vec![String::new()])
@ -656,7 +753,7 @@ pub fn include_file_extension(config: &PlayoutConfig, file_path: &Path) -> bool
}
if let Some(m3u8_path) = config
.out
.output
.output_cmd
.clone()
.unwrap_or_else(|| vec![String::new()])
@ -680,8 +777,9 @@ pub fn stderr_reader(
buffer: BufReader<ChildStderr>,
ignore: Vec<String>,
suffix: ProcessUnit,
proc_control: ProcessControl,
) -> Result<(), Error> {
manager: ChannelManager,
) -> Result<(), ProcessError> {
let id = manager.channel.lock().unwrap().id;
for line in buffer.lines() {
let line = line?;
@ -692,17 +790,17 @@ pub fn stderr_reader(
}
if line.contains("[info]") {
info!(
info!(target: Target::file_mail(), channel = id;
"<bright black>[{suffix}]</> {}",
line.replace("[info] ", "")
)
} else if line.contains("[warning]") {
warn!(
warn!(target: Target::file_mail(), channel = id;
"<bright black>[{suffix}]</> {}",
line.replace("[warning] ", "")
)
} else if line.contains("[error]") || line.contains("[fatal]") {
error!(
error!(target: Target::file_mail(), channel = id;
"<bright black>[{suffix}]</> {}",
line.replace("[error] ", "").replace("[fatal] ", "")
);
@ -713,8 +811,7 @@ pub fn stderr_reader(
|| (line.contains("No such file or directory")
&& !line.contains("failed to delete old segment"))
{
proc_control.stop_all();
exit(1);
manager.stop_all();
}
}
}
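`stderr_reader` routes each ffmpeg stderr line to the matching log level by its bracketed tag. A minimal std-only sketch of that classification step (`classify` is a hypothetical helper, not the ffplayout function):

```rust
/// Map an ffmpeg stderr line to a log level by its "[level]" tag,
/// mirroring the branching in stderr_reader.
fn classify(line: &str) -> &'static str {
    if line.contains("[info]") {
        "info"
    } else if line.contains("[warning]") {
        "warn"
    } else if line.contains("[error]") || line.contains("[fatal]") {
        "error"
    } else {
        "other"
    }
}

fn main() {
    assert_eq!(classify("[info] Opening input"), "info");
    assert_eq!(classify("[warning] deprecated option"), "warn");
    assert_eq!(classify("[fatal] cannot open file"), "error");
    println!("ok");
}
```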
@ -741,6 +838,7 @@ fn is_in_system(name: &str) -> Result<(), String> {
}
fn ffmpeg_filter_and_libs(config: &mut PlayoutConfig) -> Result<(), String> {
let id = config.general.channel_id;
let ignore_flags = [
"--enable-gpl",
"--enable-version3",
@ -800,7 +898,7 @@ fn ffmpeg_filter_and_libs(config: &mut PlayoutConfig) -> Result<(), String> {
}
if let Err(e) = ff_proc.wait() {
error!("{e}")
error!(target: Target::file_mail(), channel = id; "{e}")
};
Ok(())
@ -813,14 +911,14 @@ pub fn validate_ffmpeg(config: &mut PlayoutConfig) -> Result<(), String> {
is_in_system("ffmpeg")?;
is_in_system("ffprobe")?;
if config.out.mode == Desktop {
if config.output.mode == Desktop {
is_in_system("ffplay")?;
}
ffmpeg_filter_and_libs(config)?;
if config
.out
.output
.output_cmd
.as_ref()
.unwrap()
@ -841,7 +939,7 @@ pub fn validate_ffmpeg(config: &mut PlayoutConfig) -> Result<(), String> {
}
if config
.out
.output
.output_cmd
.as_ref()
.unwrap()
@ -872,7 +970,7 @@ pub fn free_tcp_socket(exclude_socket: String) -> Option<String> {
}
/// Check if a TCP port is free.
pub fn test_tcp_port(url: &str) -> bool {
pub fn test_tcp_port(id: i32, url: &str) -> bool {
let re = Regex::new(r"^[\w]+\://").unwrap();
let mut addr = url.to_string();
@ -891,19 +989,19 @@ pub fn test_tcp_port(url: &str) -> bool {
}
};
error!("Address <b><magenta>{url}</></b> already in use!");
error!(target: Target::file_mail(), channel = id; "Address <b><magenta>{url}</></b> already in use!");
false
}
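The port check above strips the URL scheme with a regex (`^[\w]+\://`) before binding. A std-only sketch of the same idea, with hypothetical helper names (`strip_scheme`, `is_port_free` are illustrations, not part of ffplayout):

```rust
use std::net::TcpListener;

// Hypothetical helper mirroring the scheme-stripping that test_tcp_port
// performs with a regex, using only the standard library.
fn strip_scheme(url: &str) -> &str {
    url.split_once("://").map_or(url, |(_, rest)| rest)
}

// An address is considered free when a listener can bind to it,
// analogous to the check in test_tcp_port.
fn is_port_free(addr: &str) -> bool {
    TcpListener::bind(addr).is_ok()
}

fn main() {
    assert_eq!(strip_scheme("rtmp://127.0.0.1:1936/live"), "127.0.0.1:1936/live");
    assert_eq!(strip_scheme("127.0.0.1:8787"), "127.0.0.1:8787");
    // Port 0 asks the OS for any free port, so binding succeeds.
    assert!(is_port_free("127.0.0.1:0"));
    println!("ok");
}
```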
/// Generate a vector with dates, from given range.
pub fn get_date_range(date_range: &[String]) -> Vec<String> {
pub fn get_date_range(id: i32, date_range: &[String]) -> Vec<String> {
let mut range = vec![];
let start = match NaiveDate::parse_from_str(&date_range[0], "%Y-%m-%d") {
Ok(s) => s,
Err(_) => {
error!("date format error in: <yellow>{:?}</>", date_range[0]);
error!(target: Target::file_mail(), channel = id; "date format error in: <yellow>{:?}</>", date_range[0]);
exit(1);
}
};
@ -911,7 +1009,7 @@ pub fn get_date_range(date_range: &[String]) -> Vec<String> {
let end = match NaiveDate::parse_from_str(&date_range[2], "%Y-%m-%d") {
Ok(e) => e,
Err(_) => {
error!("date format error in: <yellow>{:?}</>", date_range[2]);
error!(target: Target::file_mail(), channel = id; "date format error in: <yellow>{:?}</>", date_range[2]);
exit(1);
}
};

View File

@ -1,4 +1,7 @@
use std::{sync::Arc, time::Duration};
use std::{
sync::{atomic::Ordering, Arc},
time::Duration,
};
use actix_web::{rt::time::interval, web};
use actix_web_lab::{
@ -6,31 +9,24 @@ use actix_web_lab::{
util::InfallibleStream,
};
use ffplayout_lib::utils::PlayoutConfig;
use parking_lot::Mutex;
use tokio::sync::mpsc;
use tokio_stream::wrappers::ReceiverStream;
use crate::utils::{control::media_info, system};
use crate::player::{controller::ChannelManager, utils::get_data_map};
use crate::utils::system;
#[derive(Debug, Clone)]
struct Client {
_channel: i32,
config: PlayoutConfig,
manager: ChannelManager,
endpoint: String,
sender: mpsc::Sender<sse::Event>,
}
impl Client {
fn new(
_channel: i32,
config: PlayoutConfig,
endpoint: String,
sender: mpsc::Sender<sse::Event>,
) -> Self {
fn new(manager: ChannelManager, endpoint: String, sender: mpsc::Sender<sse::Event>) -> Self {
Self {
_channel,
config,
manager,
endpoint,
sender,
}
@ -103,8 +99,7 @@ impl Broadcaster {
/// Registers client with broadcaster, returning an SSE response body.
pub async fn new_client(
&self,
channel: i32,
config: PlayoutConfig,
manager: ChannelManager,
endpoint: String,
) -> Sse<InfallibleStream<ReceiverStream<sse::Event>>> {
let (tx, rx) = mpsc::channel(10);
@ -114,7 +109,7 @@ impl Broadcaster {
self.inner
.lock()
.clients
.push(Client::new(channel, config, endpoint, tx));
.push(Client::new(manager, endpoint, tx));
Sse::from_infallible_receiver(rx)
}
@ -124,23 +119,22 @@ impl Broadcaster {
let clients = self.inner.lock().clients.clone();
for client in clients.iter().filter(|client| client.endpoint == "playout") {
match media_info(&client.config, "current".into()).await {
Ok(res) => {
let _ = client
.sender
.send(
sse::Data::new(res.text().await.unwrap_or_else(|_| "Success".into()))
.into(),
)
.await;
}
Err(_) => {
let _ = client
.sender
.send(sse::Data::new("not running").into())
.await;
}
};
let media_map = get_data_map(&client.manager);
if client.manager.is_alive.load(Ordering::SeqCst) {
let _ = client
.sender
.send(
sse::Data::new(serde_json::to_string(&media_map).unwrap_or_default())
.into(),
)
.await;
} else {
let _ = client
.sender
.send(sse::Data::new("not running").into())
.await;
}
}
}
@ -150,7 +144,8 @@ impl Broadcaster {
for client in clients {
if &client.endpoint == "system" {
if let Ok(stat) = web::block(move || system::stat(client.config.clone())).await {
let config = client.manager.config.lock().unwrap().clone();
if let Ok(stat) = web::block(move || system::stat(config.clone())).await {
let stat_string = stat.to_string();
let _ = client.sender.send(sse::Data::new(stat_string).into()).await;
};

View File

@ -32,7 +32,7 @@ impl Default for UuidData {
}
}
pub struct AuthState {
pub struct SseAuthState {
pub uuids: Mutex<HashSet<UuidData>>,
}

View File

@ -1,11 +1,14 @@
use std::sync::Mutex;
use actix_web::{get, post, web, Responder};
use actix_web_grants::proc_macro::protect;
use serde::{Deserialize, Serialize};
use sqlx::{Pool, Sqlite};
use super::{check_uuid, prune_uuids, AuthState, UuidData};
use super::{check_uuid, prune_uuids, SseAuthState, UuidData};
use crate::db::models::Role;
use crate::player::controller::ChannelController;
use crate::sse::broadcast::Broadcaster;
use crate::utils::{errors::ServiceError, playout_config, Role};
use crate::utils::errors::ServiceError;
#[derive(Deserialize, Serialize)]
struct User {
@ -26,8 +29,11 @@ impl User {
/// curl -X GET 'http://127.0.0.1:8787/api/generate-uuid' -H 'Authorization: Bearer <TOKEN>'
/// ```
#[post("/generate-uuid")]
#[protect(any("Role::Admin", "Role::User"), ty = "Role")]
async fn generate_uuid(data: web::Data<AuthState>) -> Result<impl Responder, ServiceError> {
#[protect(
any("Role::GlobalAdmin", "Role::ChannelAdmin", "Role::User"),
ty = "Role"
)]
async fn generate_uuid(data: web::Data<SseAuthState>) -> Result<impl Responder, ServiceError> {
let mut uuids = data.uuids.lock().await;
let new_uuid = UuidData::new();
let user_auth = User::new(String::new(), new_uuid.uuid.to_string());
@ -46,7 +52,7 @@ async fn generate_uuid(data: web::Data<AuthState>) -> Result<impl Responder, Ser
/// ```
#[get("/validate")]
async fn validate_uuid(
data: web::Data<AuthState>,
data: web::Data<SseAuthState>,
user: web::Query<User>,
) -> Result<impl Responder, ServiceError> {
let mut uuids = data.uuids.lock().await;
@ -62,21 +68,21 @@ async fn validate_uuid(
/// ```BASH
/// curl -X GET 'http://127.0.0.1:8787/data/event/1?endpoint=system&uuid=f2f8c29b-712a-48c5-8919-b535d3a05a3a'
/// ```
#[get("/event/{channel}")]
#[get("/event/{id}")]
async fn event_stream(
pool: web::Data<Pool<Sqlite>>,
broadcaster: web::Data<Broadcaster>,
data: web::Data<AuthState>,
data: web::Data<SseAuthState>,
id: web::Path<i32>,
user: web::Query<User>,
controllers: web::Data<Mutex<ChannelController>>,
) -> Result<impl Responder, ServiceError> {
let mut uuids = data.uuids.lock().await;
check_uuid(&mut uuids, user.uuid.as_str())?;
let (config, _) = playout_config(&pool.clone().into_inner(), &id).await?;
let manager = controllers.lock().unwrap().get(*id).unwrap();
Ok(broadcaster
.new_client(*id, config, user.endpoint.clone())
.new_client(manager.clone(), user.endpoint.clone())
.await)
}

View File

@ -0,0 +1,273 @@
use std::path::Path;
use serde::{Deserialize, Serialize};
use serde_with::{serde_as, NoneAsEmptyString};
use shlex::split;
use sqlx::{Pool, Sqlite};
use tokio::io::AsyncReadExt;
use crate::db::{handles, models::AdvancedConfiguration};
use crate::utils::ServiceError;
#[derive(Debug, Default, Serialize, Deserialize, Clone)]
pub struct AdvancedConfig {
pub decoder: DecoderConfig,
pub encoder: EncoderConfig,
pub filter: FilterConfig,
pub ingest: IngestConfig,
}
#[serde_as]
#[derive(Debug, Default, Serialize, Deserialize, Clone)]
pub struct DecoderConfig {
#[serde_as(as = "NoneAsEmptyString")]
pub input_param: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub output_param: Option<String>,
#[serde(skip_serializing, skip_deserializing)]
pub input_cmd: Option<Vec<String>>,
#[serde(skip_serializing, skip_deserializing)]
pub output_cmd: Option<Vec<String>>,
}
#[serde_as]
#[derive(Debug, Default, Serialize, Deserialize, Clone)]
pub struct EncoderConfig {
#[serde_as(as = "NoneAsEmptyString")]
pub input_param: Option<String>,
#[serde(skip_serializing, skip_deserializing)]
pub input_cmd: Option<Vec<String>>,
}
#[serde_as]
#[derive(Debug, Default, Serialize, Deserialize, Clone)]
pub struct IngestConfig {
#[serde_as(as = "NoneAsEmptyString")]
pub input_param: Option<String>,
#[serde(skip_serializing, skip_deserializing)]
pub input_cmd: Option<Vec<String>>,
}
#[serde_as]
#[derive(Debug, Default, Serialize, Deserialize, Clone)]
pub struct FilterConfig {
#[serde_as(as = "NoneAsEmptyString")]
pub deinterlace: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub pad_scale_w: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub pad_scale_h: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub pad_video: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub fps: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub scale: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub set_dar: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub fade_in: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub fade_out: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub overlay_logo_scale: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub overlay_logo_fade_in: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub overlay_logo_fade_out: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub overlay_logo: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub tpad: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub drawtext_from_file: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub drawtext_from_zmq: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub aevalsrc: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub afade_in: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub afade_out: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub apad: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub volume: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub split: Option<String>,
}
impl AdvancedConfig {
pub fn new(config: AdvancedConfiguration) -> Self {
Self {
decoder: DecoderConfig {
input_param: config.decoder_input_param.clone(),
output_param: config.decoder_output_param.clone(),
input_cmd: match config.decoder_input_param {
Some(input_param) => split(&input_param),
None => None,
},
output_cmd: match config.decoder_output_param {
Some(output_param) => split(&output_param),
None => None,
},
},
encoder: EncoderConfig {
input_param: config.encoder_input_param.clone(),
input_cmd: match config.encoder_input_param {
Some(input_param) => split(&input_param),
None => None,
},
},
filter: FilterConfig {
deinterlace: config.filter_deinterlace,
pad_scale_w: config.filter_pad_scale_w,
pad_scale_h: config.filter_pad_scale_h,
pad_video: config.filter_pad_video,
fps: config.filter_fps,
scale: config.filter_scale,
set_dar: config.filter_set_dar,
fade_in: config.filter_fade_in,
fade_out: config.filter_fade_out,
overlay_logo_scale: config.filter_overlay_logo_scale,
overlay_logo_fade_in: config.filter_overlay_logo_fade_in,
overlay_logo_fade_out: config.filter_overlay_logo_fade_out,
overlay_logo: config.filter_overlay_logo,
tpad: config.filter_tpad,
drawtext_from_file: config.filter_drawtext_from_file,
drawtext_from_zmq: config.filter_drawtext_from_zmq,
aevalsrc: config.filter_aevalsrc,
afade_in: config.filter_afade_in,
afade_out: config.filter_afade_out,
apad: config.filter_apad,
volume: config.filter_volume,
split: config.filter_split,
},
ingest: IngestConfig {
input_param: config.ingest_input_param.clone(),
input_cmd: match config.ingest_input_param {
Some(input_param) => split(&input_param),
None => None,
},
},
}
}
pub async fn dump(pool: &Pool<Sqlite>, id: i32) -> Result<(), ServiceError> {
let config = Self::new(handles::select_advanced_configuration(pool, id).await?);
let f_keys = [
"deinterlace",
"pad_scale_w",
"pad_scale_h",
"pad_video",
"fps",
"scale",
"set_dar",
"fade_in",
"fade_out",
"overlay_logo_scale",
"overlay_logo_fade_in",
"overlay_logo_fade_out",
"overlay_logo",
"tpad",
"drawtext_from_file",
"drawtext_from_zmq",
"aevalsrc",
"afade_in",
"afade_out",
"apad",
"volume",
"split",
];
let toml_string = toml_edit::ser::to_string_pretty(&config)?;
let mut doc = toml_string.parse::<toml_edit::DocumentMut>()?;
if let Some(decoder) = doc.get_mut("decoder").and_then(|o| o.as_table_mut()) {
decoder
.decor_mut()
.set_prefix("# Changing these settings is for advanced users only!\n# There will be no support or guarantee that it will be stable after changing them.\n\n");
}
if let Some(output_param) = doc
.get_mut("decoder")
.and_then(|d| d.get_mut("output_param"))
.and_then(|o| o.as_value_mut())
{
output_param
.decor_mut()
.set_suffix(" # is also applied to the ingest instance.");
}
if let Some(filter) = doc.get_mut("filter") {
for key in &f_keys {
if let Some(item) = filter.get_mut(*key).and_then(|o| o.as_value_mut()) {
match *key {
"deinterlace" => item.decor_mut().set_suffix(" # yadif=0:-1:0"),
"pad_scale_w" => item.decor_mut().set_suffix(" # scale={}:-1"),
"pad_scale_h" => item.decor_mut().set_suffix(" # scale=-1:{}"),
"pad_video" => item.decor_mut().set_suffix(
" # pad=max(iw\\,ih*({0}/{1})):ow/({0}/{1}):(ow-iw)/2:(oh-ih)/2",
),
"fps" => item.decor_mut().set_suffix(" # fps={}"),
"scale" => item.decor_mut().set_suffix(" # scale={}:{}"),
"set_dar" => item.decor_mut().set_suffix(" # setdar=dar={}"),
"fade_in" => item.decor_mut().set_suffix(" # fade=in:st=0:d=0.5"),
"fade_out" => item.decor_mut().set_suffix(" # fade=out:st={}:d=1.0"),
"overlay_logo_scale" => item.decor_mut().set_suffix(" # scale={}"),
"overlay_logo_fade_in" => {
item.decor_mut().set_suffix(" # fade=in:st=0:d=1.0:alpha=1")
}
"overlay_logo_fade_out" => item
.decor_mut()
.set_suffix(" # fade=out:st={}:d=1.0:alpha=1"),
"overlay_logo" => item
.decor_mut()
.set_suffix(" # null[l];[v][l]overlay={}:shortest=1"),
"tpad" => item
.decor_mut()
.set_suffix(" # tpad=stop_mode=add:stop_duration={}"),
"drawtext_from_file" => {
item.decor_mut().set_suffix(" # drawtext=text='{}':{}{}")
}
"drawtext_from_zmq" => item
.decor_mut()
.set_suffix(" # zmq=b=tcp\\\\://'{}',drawtext@dyntext={}"),
"aevalsrc" => item.decor_mut().set_suffix(
" # aevalsrc=0:channel_layout=stereo:duration={}:sample_rate=48000",
),
"afade_in" => item.decor_mut().set_suffix(" # afade=in:st=0:d=0.5"),
"afade_out" => item.decor_mut().set_suffix(" # afade=out:st={}:d=1.0"),
"apad" => item.decor_mut().set_suffix(" # apad=whole_dur={}"),
"volume" => item.decor_mut().set_suffix(" # volume={}"),
"split" => item.decor_mut().set_suffix(" # split={}{}"),
_ => (),
}
}
}
};
tokio::fs::write(&format!("advanced_{id}.toml"), doc.to_string()).await?;
Ok(())
}
pub async fn import(pool: &Pool<Sqlite>, import: Vec<String>) -> Result<(), ServiceError> {
let id = import[0].parse::<i32>()?;
let path = Path::new(&import[1]);
if path.is_file() {
let mut file = tokio::fs::File::open(path).await?;
let mut contents = String::new();
file.read_to_string(&mut contents).await?;
let config: Self = toml_edit::de::from_str(&contents)
    .map_err(|e| ServiceError::BadRequest(e.to_string()))?;
handles::update_advanced_configuration(pool, id, config).await?;
} else {
return Err(ServiceError::BadRequest("Path does not exist!".to_string()));
}
Ok(())
}
}
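`AdvancedConfig::new` above maps every optional parameter string to an optional argument vector via `shlex::split`. A minimal std-only sketch of that `Option<String>` → `Option<Vec<String>>` mapping, approximating `shlex::split` with `split_whitespace` (which, unlike shlex, ignores quoting):

```rust
// Sketch of the mapping used in AdvancedConfig::new. The real code uses the
// shlex crate; split_whitespace here is only a quoting-unaware stand-in.
fn split_params(param: Option<&str>) -> Option<Vec<String>> {
    param.map(|p| p.split_whitespace().map(String::from).collect())
}

fn main() {
    assert_eq!(
        split_params(Some("-hwaccel cuda -c:v h264_cuvid")),
        Some(vec![
            "-hwaccel".to_string(),
            "cuda".to_string(),
            "-c:v".to_string(),
            "h264_cuvid".to_string(),
        ])
    );
    assert_eq!(split_params(None), None);
    println!("ok");
}
```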

View File

@ -0,0 +1,437 @@
use std::{
io::{stdin, stdout, Write},
path::PathBuf,
process::exit,
};
use clap::Parser;
use rpassword::read_password;
use sqlx::{Pool, Sqlite};
use crate::db::{
handles::{self, insert_user},
models::{Channel, GlobalSettings, User},
};
use crate::utils::{
advanced_config::AdvancedConfig,
config::{OutputMode, PlayoutConfig},
};
use crate::ARGS;
#[derive(Parser, Debug, Clone)]
#[clap(version,
about = "ffplayout - 24/7 broadcasting solution",
long_about = None)]
pub struct Args {
#[clap(
short,
long,
help = "Initialize defaults: global admin, paths, settings, etc."
)]
pub init: bool,
#[clap(short, long, help = "Add a global admin user")]
pub add: bool,
#[clap(long, env, help = "path to database file")]
pub db: Option<PathBuf>,
#[clap(
short,
long,
env,
help = "Channels by ids to process (for foreground, etc.)",
num_args = 1..,
)]
pub channels: Option<Vec<i32>>,
#[clap(long, env, help = "Run playout without webserver and frontend.")]
pub foreground: bool,
#[clap(
long,
help = "Dump advanced channel configuration to advanced_{channel}.toml"
)]
pub dump_advanced: Option<i32>,
#[clap(long, help = "Dump channel configuration to ffplayout_{channel}.toml")]
pub dump_config: Option<i32>,
#[clap(
long,
help = "Import advanced channel configuration from file. Input must be `{channel id} {path to toml}`",
num_args = 2
)]
pub import_advanced: Option<Vec<String>>,
#[clap(
long,
help = "Import channel configuration from file. Input must be `{channel id} {path to toml}`",
num_args = 2
)]
pub import_config: Option<Vec<String>>,
#[clap(long, help = "List available channel ids")]
pub list_channels: bool,
#[clap(long, env, help = "path to public files")]
pub public: Option<PathBuf>,
#[clap(short, env, long, help = "Listen on IP:PORT, like: 127.0.0.1:8787")]
pub listen: Option<String>,
#[clap(short, long, help = "Play folder content")]
pub folder: Option<PathBuf>,
#[clap(
short,
long,
help = "Generate playlist for dates, like: 2022-01-01 - 2022-01-10",
name = "YYYY-MM-DD",
num_args = 1..,
)]
pub generate: Option<Vec<String>>,
#[clap(long, help = "Optional folder path list for playlist generations", num_args = 1..)]
pub gen_paths: Option<Vec<PathBuf>>,
#[clap(long, env, help = "Keep log file for given days")]
pub log_backup_count: Option<usize>,
#[clap(
long,
env,
help = "Override logging level: trace, debug, println, warn, eprintln"
)]
pub log_level: Option<String>,
#[clap(long, env, help = "Logging path")]
pub log_path: Option<PathBuf>,
#[clap(long, env, help = "Log to console")]
pub log_to_console: bool,
#[clap(long, env, help = "HLS output path")]
pub hls_path: Option<PathBuf>,
#[clap(long, env, help = "Playlist root path")]
pub playlist_path: Option<PathBuf>,
#[clap(long, env, help = "Storage root path")]
pub storage_path: Option<PathBuf>,
#[clap(long, env, help = "Share storage across channels")]
pub shared_storage: bool,
#[clap(short, long, help = "Create admin user")]
pub username: Option<String>,
#[clap(short, long, help = "Admin mail address")]
pub mail: Option<String>,
#[clap(short, long, help = "Admin password")]
pub password: Option<String>,
#[clap(long, help = "Path to playlist, or playlist root folder.")]
pub playlist: Option<PathBuf>,
#[clap(
short,
long,
help = "Start time in 'hh:mm:ss', 'now' to start with the first clip"
)]
pub start: Option<String>,
#[clap(short = 'T', long, help = "JSON Template file for generating playlist")]
pub template: Option<PathBuf>,
#[clap(short, long, help = "Set output mode: desktop, hls, null, stream")]
pub output: Option<OutputMode>,
#[clap(short, long, help = "Set audio volume")]
pub volume: Option<f64>,
#[clap(long, help = "Skip validation process")]
pub skip_validation: bool,
#[clap(long, help = "Only validate given playlist")]
pub validate: bool,
}
fn global_user(args: &mut Args) {
let mut user = String::new();
let mut mail = String::new();
print!("Global admin: ");
stdout().flush().unwrap();
stdin()
.read_line(&mut user)
.expect("Did not enter a correct name?");
args.username = Some(user.trim().to_string());
print!("Password: ");
stdout().flush().unwrap();
let password = read_password();
args.password = password.ok();
print!("Mail: ");
stdout().flush().unwrap();
stdin()
.read_line(&mut mail)
.expect("Did not enter a correct name?");
args.mail = Some(mail.trim().to_string());
}
pub async fn run_args(pool: &Pool<Sqlite>) -> Result<(), i32> {
let channels = handles::select_related_channels(pool, None).await;
let mut args = ARGS.clone();
if args.init {
let check_user = handles::select_users(pool).await;
let mut storage = String::new();
let mut playlist = String::new();
let mut logging = String::new();
let mut hls = String::new();
let mut shared_store = String::new();
let mut global = GlobalSettings {
id: 0,
secret: None,
hls_path: String::new(),
playlist_path: String::new(),
storage_path: String::new(),
logging_path: String::new(),
shared_storage: false,
};
if check_user.unwrap_or_default().is_empty() {
global_user(&mut args);
}
print!("Storage path [/var/lib/ffplayout/tv-media]: ");
stdout().flush().unwrap();
stdin()
.read_line(&mut storage)
.expect("Did not enter a correct path?");
if storage.trim().is_empty() {
global.storage_path = "/var/lib/ffplayout/tv-media".to_string();
} else {
global.storage_path = storage
.trim()
.trim_matches(|c| c == '"' || c == '\'')
.to_string();
}
print!("Playlist path [/var/lib/ffplayout/playlists]: ");
stdout().flush().unwrap();
stdin()
.read_line(&mut playlist)
.expect("Did not enter a correct path?");
if playlist.trim().is_empty() {
global.playlist_path = "/var/lib/ffplayout/playlists".to_string();
} else {
global.playlist_path = playlist
.trim()
.trim_matches(|c| c == '"' || c == '\'')
.to_string();
}
print!("Logging path [/var/log/ffplayout]: ");
stdout().flush().unwrap();
stdin()
.read_line(&mut logging)
.expect("Did not enter a correct path?");
if logging.trim().is_empty() {
global.logging_path = "/var/log/ffplayout".to_string();
} else {
global.logging_path = logging
.trim()
.trim_matches(|c| c == '"' || c == '\'')
.to_string();
}
print!("HLS path [/usr/share/ffplayout/public]: ");
stdout().flush().unwrap();
stdin()
.read_line(&mut hls)
.expect("Did not enter a correct path?");
if hls.trim().is_empty() {
global.hls_path = "/usr/share/ffplayout/public".to_string();
} else {
global.hls_path = hls
.trim()
.trim_matches(|c| c == '"' || c == '\'')
.to_string();
}
print!("Shared storage [Y/n]: ");
stdout().flush().unwrap();
stdin()
.read_line(&mut shared_store)
.expect("Did not enter a correct path?");
global.shared_storage = shared_store.trim().to_lowercase().starts_with('y');
if let Err(e) = handles::update_global(pool, global.clone()).await {
eprintln!("{e}");
return Err(1);
};
if !global.shared_storage {
let mut channel = handles::select_channel(pool, &1).await.unwrap();
channel.preview_url = "http://127.0.0.1:8787/1/stream.m3u8".to_string();
handles::update_channel(pool, 1, channel).await.unwrap();
};
println!("Set global settings...");
}
if args.add {
global_user(&mut args);
}
if let Some(username) = args.username {
if args.mail.is_none() || args.password.is_none() {
eprintln!("Mail/password missing!");
return Err(1);
}
let user = User {
id: 0,
mail: Some(args.mail.unwrap()),
username: username.clone(),
password: args.password.unwrap(),
role_id: Some(1),
channel_ids: Some(
channels
.unwrap_or(vec![Channel::default()])
.iter()
.map(|c| c.id)
.collect(),
),
token: None,
};
if let Err(e) = insert_user(pool, user).await {
eprintln!("{e}");
return Err(1);
};
println!("Global admin user \"{username}\" created...");
return Err(0);
}
if ARGS.list_channels {
match channels {
Ok(channels) => {
let chl = channels
.iter()
.map(|c| (c.id, c.name.clone()))
.collect::<Vec<(i32, String)>>();
println!(
"Available channels:\n{}",
chl.iter()
.map(|(i, t)| format!(" {i}: '{t}'"))
.collect::<Vec<String>>()
.join("\n")
);
return Err(0);
}
Err(e) => {
eprintln!("List channels: {e}");
exit(1);
}
}
}
if let Some(id) = ARGS.dump_config {
match PlayoutConfig::dump(pool, id).await {
Ok(_) => {
println!("Dump config to: ffplayout_{id}.toml");
exit(0);
}
Err(e) => {
eprintln!("Dump config: {e}");
exit(1);
}
};
}
if let Some(id) = ARGS.dump_advanced {
match AdvancedConfig::dump(pool, id).await {
Ok(_) => {
println!("Dump config to: advanced_{id}.toml");
exit(0);
}
Err(e) => {
eprintln!("Dump config: {e}");
exit(1);
}
};
}
if let Some(import) = &ARGS.import_config {
match PlayoutConfig::import(pool, import.clone()).await {
Ok(_) => {
println!("Import config done...");
exit(0);
}
Err(e) => {
eprintln!("{e}");
exit(1);
}
};
}
if let Some(import) = &ARGS.import_advanced {
match AdvancedConfig::import(pool, import.clone()).await {
Ok(_) => {
println!("Import config done...");
exit(0);
}
Err(e) => {
eprintln!("{e}");
exit(1);
}
};
}
Ok(())
}
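`run_args` above repeats the same prompt handling four times (storage, playlist, logging, HLS): trim the input, strip surrounding quotes, fall back to a default when empty. A hypothetical helper (`or_default` is an illustration, not part of ffplayout) factoring that pattern:

```rust
// Hypothetical helper for the repeated prompt handling in run_args: trim the
// line read from stdin, strip surrounding quotes, use the default when empty.
fn or_default(input: &str, default: &str) -> String {
    let trimmed = input.trim().trim_matches(|c| c == '"' || c == '\'');
    if trimmed.is_empty() {
        default.to_string()
    } else {
        trimmed.to_string()
    }
}

fn main() {
    // Empty input (user just pressed enter) keeps the suggested default.
    assert_eq!(or_default("\n", "/var/lib/ffplayout/tv-media"), "/var/lib/ffplayout/tv-media");
    // Quoted paths are unquoted, matching the trim_matches calls above.
    assert_eq!(or_default("'/data/media'\n", "/var/lib/ffplayout/tv-media"), "/data/media");
    println!("ok");
}
```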

View File

@ -0,0 +1,102 @@
use std::{
io,
path::Path,
sync::{Arc, Mutex},
};
use log::*;
use sqlx::{Pool, Sqlite};
use super::logging::MailQueue;
use crate::db::{handles, models::Channel};
use crate::player::controller::{ChannelController, ChannelManager};
use crate::utils::{config::PlayoutConfig, errors::ServiceError};
async fn map_global_admins(conn: &Pool<Sqlite>) -> Result<(), ServiceError> {
let channels = handles::select_related_channels(conn, None).await?;
let admins = handles::select_global_admins(conn).await?;
for admin in admins {
if let Err(e) =
handles::insert_user_channel(conn, admin.id, channels.iter().map(|c| c.id).collect())
.await
{
error!("Update global admin: {e}");
};
}
Ok(())
}
fn preview_url(url: &str, id: i32) -> String {
let url_path = Path::new(url);
if let Some(parent) = url_path.parent() {
if let Some(filename) = url_path.file_name() {
let new_path = parent.join(id.to_string()).join(filename);
if let Some(new_url) = new_path.to_str() {
return new_url.to_string();
}
}
}
url.to_string()
}
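Since `preview_url` only uses `std::path`, it can be exercised directly; the sketch below repeats its logic to show how the channel id is inserted between the parent path and the file name:

```rust
use std::path::Path;

// Same logic as preview_url above: insert the channel id before the file
// name, falling back to the original string when the URL cannot be split.
fn preview_url(url: &str, id: i32) -> String {
    let url_path = Path::new(url);
    if let (Some(parent), Some(filename)) = (url_path.parent(), url_path.file_name()) {
        if let Some(new_url) = parent.join(id.to_string()).join(filename).to_str() {
            return new_url.to_string();
        }
    }
    url.to_string()
}

fn main() {
    assert_eq!(
        preview_url("http://127.0.0.1:8787/stream.m3u8", 1),
        "http://127.0.0.1:8787/1/stream.m3u8"
    );
    println!("ok");
}
```

This matches the default preview URL `http://127.0.0.1:8787/1/stream.m3u8` written for channel 1 in `run_args`.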
pub async fn create_channel(
conn: &Pool<Sqlite>,
controllers: Arc<Mutex<ChannelController>>,
queue: Arc<Mutex<Vec<Arc<Mutex<MailQueue>>>>>,
target_channel: Channel,
) -> Result<Channel, ServiceError> {
let mut channel = handles::insert_channel(conn, target_channel).await?;
channel.preview_url = preview_url(&channel.preview_url, channel.id);
handles::update_channel(conn, channel.id, channel.clone()).await?;
let output_param = format!("-c:v libx264 -crf 23 -x264-params keyint=50:min-keyint=25:scenecut=-1 -maxrate 1300k -bufsize 2600k -preset faster -tune zerolatency -profile:v Main -level 3.1 -c:a aac -ar 44100 -b:a 128k -flags +cgop -f hls -hls_time 6 -hls_list_size 600 -hls_flags append_list+delete_segments+omit_endlist -hls_segment_filename {0}/stream-%d.ts {0}/stream.m3u8", channel.id);
handles::insert_advanced_configuration(conn, channel.id).await?;
handles::insert_configuration(conn, channel.id, output_param).await?;
let config = PlayoutConfig::new(conn, channel.id).await;
let m_queue = Arc::new(Mutex::new(MailQueue::new(channel.id, config.mail.clone())));
let manager = ChannelManager::new(Some(conn.clone()), channel.clone(), config);
controllers
.lock()
.map_err(|e| io::Error::new(io::ErrorKind::Other, e.to_string()))?
.add(manager);
if let Ok(mut mqs) = queue.lock() {
mqs.push(m_queue.clone());
}
map_global_admins(conn).await?;
Ok(channel)
}
pub async fn delete_channel(
conn: &Pool<Sqlite>,
id: i32,
controllers: Arc<Mutex<ChannelController>>,
queue: Arc<Mutex<Vec<Arc<Mutex<MailQueue>>>>>,
) -> Result<(), ServiceError> {
let channel = handles::select_channel(conn, &id).await?;
handles::delete_channel(conn, &channel.id).await?;
controllers
.lock()
.map_err(|e| io::Error::new(io::ErrorKind::Other, e.to_string()))?
.remove(id);
if let Ok(mut mqs) = queue.lock() {
mqs.retain(|q| q.lock().unwrap().id != id);
}
map_global_admins(conn).await?;
Ok(())
}
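`delete_channel` drops a channel's mail queue from the shared `Arc<Mutex<Vec<Arc<Mutex<MailQueue>>>>>` with `retain`. A std-only sketch with a stub `MailQueue` (reduced to the `id` field; the real struct also carries the mail config and queued messages):

```rust
use std::sync::{Arc, Mutex};

// Stub holding only the field the removal logic inspects.
struct MailQueue {
    id: i32,
}

// Mirrors the retain-by-id removal in delete_channel.
fn remove_queue(queue: &Arc<Mutex<Vec<Arc<Mutex<MailQueue>>>>>, id: i32) {
    if let Ok(mut mqs) = queue.lock() {
        mqs.retain(|q| q.lock().unwrap().id != id);
    }
}

fn main() {
    let queue = Arc::new(Mutex::new(vec![
        Arc::new(Mutex::new(MailQueue { id: 1 })),
        Arc::new(Mutex::new(MailQueue { id: 2 })),
    ]));
    remove_queue(&queue, 1);
    let remaining: Vec<i32> = queue
        .lock()
        .unwrap()
        .iter()
        .map(|q| q.lock().unwrap().id)
        .collect();
    assert_eq!(remaining, vec![2]);
    println!("ok");
}
```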

View File

@ -0,0 +1,864 @@
use std::{
fmt, io,
path::{Path, PathBuf},
str::FromStr,
};
use chrono::NaiveTime;
use flexi_logger::Level;
use serde::{Deserialize, Serialize};
use shlex::split;
use sqlx::{Pool, Sqlite};
use tokio::{fs, io::AsyncReadExt};
use crate::db::{handles, models};
use crate::utils::{files::norm_abs_path, free_tcp_socket, time_to_sec};
use crate::vec_strings;
use crate::AdvancedConfig;
use crate::ARGS;
use super::errors::ServiceError;
pub const DUMMY_LEN: f64 = 60.0;
pub const IMAGE_FORMAT: [&str; 21] = [
"bmp", "dds", "dpx", "exr", "gif", "hdr", "j2k", "jpg", "jpeg", "pcx", "pfm", "pgm", "phm",
"png", "psd", "ppm", "sgi", "svg", "tga", "tif", "webp",
];
// Some well-known errors can be safely ignored
pub const FFMPEG_IGNORE_ERRORS: [&str; 12] = [
"ac-tex damaged",
"codec s302m, is muxed as a private data stream",
"corrupt decoded frame in stream",
"corrupt input packet in stream",
"end mismatch left",
"Packet corrupt",
"Referenced QT chapter track not found",
"skipped MB in I-frame at",
"Thread message queue blocking",
"timestamp discontinuity",
"Warning MVs not available",
"frame size not set",
];
pub const FFMPEG_UNRECOVERABLE_ERRORS: [&str; 5] = [
"Address already in use",
"Invalid argument",
"Numerical result",
"Error initializing complex filters",
"Error while decoding stream #0:0: Invalid data found when processing input",
];
#[derive(Debug, Clone, Eq, PartialEq, Deserialize, Serialize)]
#[serde(rename_all = "lowercase")]
pub enum OutputMode {
Desktop,
HLS,
Null,
Stream,
}
impl OutputMode {
fn new(s: &str) -> Self {
match s {
"desktop" => Self::Desktop,
"null" => Self::Null,
"stream" => Self::Stream,
_ => Self::HLS,
}
}
}
impl Default for OutputMode {
fn default() -> Self {
Self::HLS
}
}
impl FromStr for OutputMode {
type Err = String;
fn from_str(input: &str) -> Result<Self, Self::Err> {
match input {
"desktop" => Ok(Self::Desktop),
"hls" => Ok(Self::HLS),
"null" => Ok(Self::Null),
"stream" => Ok(Self::Stream),
_ => Err("Use 'desktop', 'hls', 'null' or 'stream'".to_string()),
}
}
}
impl fmt::Display for OutputMode {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
match *self {
OutputMode::Desktop => write!(f, "desktop"),
OutputMode::HLS => write!(f, "hls"),
OutputMode::Null => write!(f, "null"),
OutputMode::Stream => write!(f, "stream"),
}
}
}
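The `FromStr` and `Display` implementations above are inverses of each other, which is what lets clap parse `--output` values and the config round-trip through serialization. A condensed, self-contained copy demonstrating the round trip (variant naming slightly simplified here):

```rust
use std::{fmt, str::FromStr};

// Condensed copy of OutputMode with the same FromStr/Display pair,
// to show that parsing and formatting round-trip.
#[derive(Debug, PartialEq)]
enum OutputMode {
    Desktop,
    Hls,
    Null,
    Stream,
}

impl FromStr for OutputMode {
    type Err = String;
    fn from_str(input: &str) -> Result<Self, Self::Err> {
        match input {
            "desktop" => Ok(Self::Desktop),
            "hls" => Ok(Self::Hls),
            "null" => Ok(Self::Null),
            "stream" => Ok(Self::Stream),
            _ => Err("Use 'desktop', 'hls', 'null' or 'stream'".to_string()),
        }
    }
}

impl fmt::Display for OutputMode {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        let s = match self {
            Self::Desktop => "desktop",
            Self::Hls => "hls",
            Self::Null => "null",
            Self::Stream => "stream",
        };
        write!(f, "{s}")
    }
}

fn main() {
    for name in ["desktop", "hls", "null", "stream"] {
        let mode: OutputMode = name.parse().unwrap();
        assert_eq!(mode.to_string(), name);
    }
    assert!("rtmp".parse::<OutputMode>().is_err());
    println!("ok");
}
```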
#[derive(Debug, Default, Clone, Serialize, Deserialize, Eq, PartialEq)]
#[serde(rename_all = "lowercase")]
pub enum ProcessMode {
Folder,
#[default]
Playlist,
}
impl ProcessMode {
fn new(s: &str) -> Self {
match s {
"folder" => Self::Folder,
_ => Self::Playlist,
}
}
}
impl fmt::Display for ProcessMode {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
match *self {
ProcessMode::Folder => write!(f, "folder"),
ProcessMode::Playlist => write!(f, "playlist"),
}
}
}
impl FromStr for ProcessMode {
type Err = String;
fn from_str(input: &str) -> Result<Self, Self::Err> {
match input {
"folder" => Ok(Self::Folder),
"playlist" => Ok(Self::Playlist),
_ => Err("Use 'folder' or 'playlist'".to_string()),
}
}
}
#[derive(Clone, Debug, Default, Deserialize, Serialize)]
pub struct Template {
pub sources: Vec<Source>,
}
#[derive(Clone, Debug, Default, Deserialize, Serialize)]
pub struct Source {
pub start: NaiveTime,
pub duration: NaiveTime,
pub shuffle: bool,
pub paths: Vec<PathBuf>,
}
/// Global Config
///
/// This is initialized once when ffplayout starts and is used globally throughout the whole program.
#[derive(Debug, Default, Clone, Deserialize, Serialize)]
pub struct PlayoutConfig {
#[serde(skip_serializing, skip_deserializing)]
pub global: Global,
#[serde(skip_serializing, skip_deserializing)]
pub advanced: AdvancedConfig,
pub general: General,
pub mail: Mail,
pub logging: Logging,
pub processing: Processing,
pub ingest: Ingest,
pub playlist: Playlist,
pub storage: Storage,
pub text: Text,
pub task: Task,
#[serde(alias = "out")]
pub output: Output,
}
#[derive(Debug, Default, Clone, Deserialize, Serialize)]
pub struct Global {
pub hls_path: PathBuf,
pub playlist_path: PathBuf,
pub storage_path: PathBuf,
pub logging_path: PathBuf,
pub shared_storage: bool,
}
impl Global {
pub fn new(config: &models::GlobalSettings) -> Self {
Self {
hls_path: PathBuf::from(config.hls_path.clone()),
playlist_path: PathBuf::from(config.playlist_path.clone()),
storage_path: PathBuf::from(config.storage_path.clone()),
logging_path: PathBuf::from(config.logging_path.clone()),
shared_storage: config.shared_storage,
}
}
}
#[derive(Debug, Default, Clone, Deserialize, Serialize)]
pub struct General {
pub help_text: String,
#[serde(skip_serializing, skip_deserializing)]
pub id: i32,
#[serde(skip_serializing, skip_deserializing)]
pub channel_id: i32,
pub stop_threshold: f64,
#[serde(skip_serializing, skip_deserializing)]
pub generate: Option<Vec<String>>,
#[serde(skip_serializing, skip_deserializing)]
pub ffmpeg_filters: Vec<String>,
#[serde(skip_serializing, skip_deserializing)]
pub ffmpeg_libs: Vec<String>,
#[serde(skip_serializing, skip_deserializing)]
pub template: Option<Template>,
#[serde(skip_serializing, skip_deserializing)]
pub skip_validation: bool,
#[serde(skip_serializing, skip_deserializing)]
pub validate: bool,
}
impl General {
fn new(config: &models::Configuration) -> Self {
Self {
help_text: config.general_help.clone(),
id: config.id,
channel_id: config.channel_id,
stop_threshold: config.general_stop_threshold,
generate: None,
ffmpeg_filters: vec![],
ffmpeg_libs: vec![],
template: None,
skip_validation: false,
validate: false,
}
}
}
#[derive(Debug, Clone, Deserialize, Serialize)]
pub struct Mail {
pub help_text: String,
pub subject: String,
pub smtp_server: String,
pub starttls: bool,
pub sender_addr: String,
pub sender_pass: String,
pub recipient: String,
pub mail_level: Level,
pub interval: i64,
}
impl Mail {
fn new(config: &models::Configuration) -> Self {
Self {
help_text: config.mail_help.clone(),
subject: config.mail_subject.clone(),
smtp_server: config.mail_smtp.clone(),
starttls: config.mail_starttls,
sender_addr: config.mail_addr.clone(),
sender_pass: config.mail_pass.clone(),
recipient: config.mail_recipient.clone(),
mail_level: string_to_log_level(config.mail_level.clone()),
interval: config.mail_interval,
}
}
}
impl Default for Mail {
fn default() -> Self {
Mail {
help_text: String::default(),
subject: String::default(),
smtp_server: String::default(),
starttls: bool::default(),
sender_addr: String::default(),
sender_pass: String::default(),
recipient: String::default(),
mail_level: Level::Debug,
interval: i64::default(),
}
}
}
#[derive(Debug, Default, Clone, Deserialize, Serialize)]
pub struct Logging {
pub help_text: String,
pub ffmpeg_level: String,
pub ingest_level: String,
pub detect_silence: bool,
pub ignore_lines: Vec<String>,
}
impl Logging {
fn new(config: &models::Configuration) -> Self {
Self {
help_text: config.logging_help.clone(),
ffmpeg_level: config.logging_ffmpeg_level.clone(),
ingest_level: config.logging_ingest_level.clone(),
detect_silence: config.logging_detect_silence,
ignore_lines: config
.logging_ignore
.split(';')
.map(|s| s.to_string())
.collect(),
}
}
}
#[derive(Debug, Default, Clone, Deserialize, Serialize)]
pub struct Processing {
pub help_text: String,
pub mode: ProcessMode,
pub audio_only: bool,
pub copy_audio: bool,
pub copy_video: bool,
pub width: i64,
pub height: i64,
pub aspect: f64,
pub fps: f64,
pub add_logo: bool,
pub logo: String,
pub logo_scale: String,
pub logo_opacity: f64,
pub logo_position: String,
pub audio_tracks: i32,
#[serde(default = "default_track_index")]
pub audio_track_index: i32,
pub audio_channels: u8,
pub volume: f64,
pub custom_filter: String,
#[serde(skip_serializing, skip_deserializing)]
pub cmd: Option<Vec<String>>,
}
impl Processing {
fn new(config: &models::Configuration) -> Self {
Self {
help_text: config.processing_help.clone(),
mode: ProcessMode::new(&config.processing_mode.clone()),
audio_only: config.processing_audio_only,
audio_track_index: config.processing_audio_track_index,
copy_audio: config.processing_copy_audio,
copy_video: config.processing_copy_video,
width: config.processing_width,
height: config.processing_height,
aspect: config.processing_aspect,
fps: config.processing_fps,
add_logo: config.processing_add_logo,
logo: config.processing_logo.clone(),
logo_scale: config.processing_logo_scale.clone(),
logo_opacity: config.processing_logo_opacity,
logo_position: config.processing_logo_position.clone(),
audio_tracks: config.processing_audio_tracks,
audio_channels: config.processing_audio_channels,
volume: config.processing_volume,
custom_filter: config.processing_filter.clone(),
cmd: None,
}
}
}
#[derive(Debug, Default, Clone, Deserialize, Serialize)]
pub struct Ingest {
pub help_text: String,
pub enable: bool,
pub input_param: String,
pub custom_filter: String,
#[serde(skip_serializing, skip_deserializing)]
pub input_cmd: Option<Vec<String>>,
}
impl Ingest {
fn new(config: &models::Configuration) -> Self {
Self {
help_text: config.ingest_help.clone(),
enable: config.ingest_enable,
input_param: config.ingest_param.clone(),
custom_filter: config.ingest_filter.clone(),
input_cmd: None,
}
}
}
#[derive(Debug, Default, Clone, Deserialize, Serialize)]
pub struct Playlist {
pub help_text: String,
pub day_start: String,
#[serde(skip_serializing, skip_deserializing)]
pub start_sec: Option<f64>,
pub length: String,
#[serde(skip_serializing, skip_deserializing)]
pub length_sec: Option<f64>,
pub infinit: bool,
}
impl Playlist {
fn new(config: &models::Configuration) -> Self {
Self {
help_text: config.playlist_help.clone(),
day_start: config.playlist_day_start.clone(),
start_sec: None,
length: config.playlist_length.clone(),
length_sec: None,
infinit: config.playlist_infinit,
}
}
}
#[derive(Debug, Default, Clone, Deserialize, Serialize)]
pub struct Storage {
pub help_text: String,
#[serde(skip_serializing, skip_deserializing)]
pub paths: Vec<PathBuf>,
pub filler: PathBuf,
pub extensions: Vec<String>,
pub shuffle: bool,
}
impl Storage {
fn new(config: &models::Configuration) -> Self {
Self {
help_text: config.storage_help.clone(),
paths: vec![],
filler: PathBuf::from(config.storage_filler.clone()),
extensions: config
.storage_extensions
.split(';')
.map(|s| s.to_string())
.collect(),
shuffle: config.storage_shuffle,
}
}
}
#[derive(Debug, Default, Clone, Deserialize, Serialize)]
pub struct Text {
pub help_text: String,
pub add_text: bool,
#[serde(skip_serializing, skip_deserializing)]
pub node_pos: Option<usize>,
#[serde(skip_serializing, skip_deserializing)]
pub zmq_stream_socket: Option<String>,
#[serde(skip_serializing, skip_deserializing)]
pub zmq_server_socket: Option<String>,
pub fontfile: String,
pub text_from_filename: bool,
pub style: String,
pub regex: String,
}
impl Text {
fn new(config: &models::Configuration) -> Self {
Self {
help_text: config.text_help.clone(),
add_text: config.text_add,
node_pos: None,
zmq_stream_socket: None,
zmq_server_socket: None,
fontfile: config.text_font.clone(),
text_from_filename: config.text_from_filename,
style: config.text_style.clone(),
regex: config.text_regex.clone(),
}
}
}
#[derive(Debug, Default, Clone, Deserialize, Serialize)]
pub struct Task {
pub help_text: String,
pub enable: bool,
pub path: PathBuf,
}
impl Task {
fn new(config: &models::Configuration) -> Self {
Self {
help_text: config.task_help.clone(),
enable: config.task_enable,
path: PathBuf::from(config.task_path.clone()),
}
}
}
#[derive(Debug, Default, Clone, Deserialize, Serialize)]
pub struct Output {
pub help_text: String,
pub mode: OutputMode,
pub output_param: String,
#[serde(skip_serializing, skip_deserializing)]
pub output_count: usize,
#[serde(skip_serializing, skip_deserializing)]
pub output_filter: Option<String>,
#[serde(skip_serializing, skip_deserializing)]
pub output_cmd: Option<Vec<String>>,
}
impl Output {
fn new(config: &models::Configuration) -> Self {
Self {
help_text: config.output_help.clone(),
mode: OutputMode::new(&config.output_mode),
output_param: config.output_param.clone(),
output_count: 0,
output_filter: None,
output_cmd: None,
}
}
}
pub fn string_to_log_level(l: String) -> Level {
match l.to_lowercase().as_str() {
"error" => Level::Error,
"info" => Level::Info,
"trace" => Level::Trace,
"warning" => Level::Warn,
_ => Level::Debug,
}
}
pub fn string_to_processing_mode(l: String) -> ProcessMode {
match l.to_lowercase().as_str() {
"playlist" => ProcessMode::Playlist,
"folder" => ProcessMode::Folder,
_ => ProcessMode::Playlist,
}
}
pub fn string_to_output_mode(l: String) -> OutputMode {
match l.to_lowercase().as_str() {
"desktop" => OutputMode::Desktop,
"hls" => OutputMode::HLS,
"null" => OutputMode::Null,
"stream" => OutputMode::Stream,
_ => OutputMode::HLS,
}
}
fn default_track_index() -> i32 {
-1
}
// fn default_tracks() -> i32 {
// 1
// }
// fn default_channels() -> u8 {
// 2
// }
impl PlayoutConfig {
pub async fn new(pool: &Pool<Sqlite>, channel_id: i32) -> Self {
let global = handles::select_global(pool)
.await
.expect("Can't read globals");
let config = handles::select_configuration(pool, channel_id)
.await
.expect("Can't read config");
let adv_config = handles::select_advanced_configuration(pool, channel_id)
.await
.expect("Can't read advanced config");
let mut global = Global::new(&global);
let advanced = AdvancedConfig::new(adv_config);
let general = General::new(&config);
let mail = Mail::new(&config);
let logging = Logging::new(&config);
let mut processing = Processing::new(&config);
let mut ingest = Ingest::new(&config);
let mut playlist = Playlist::new(&config);
let mut storage = Storage::new(&config);
let mut text = Text::new(&config);
let task = Task::new(&config);
let mut output = Output::new(&config);
if !global.shared_storage {
global.storage_path = global.storage_path.join(channel_id.to_string());
}
if !global.storage_path.is_dir() {
tokio::fs::create_dir_all(&global.storage_path)
.await
.expect("Can't create storage folder");
}
if channel_id > 1 || !global.shared_storage {
global.playlist_path = global.playlist_path.join(channel_id.to_string());
global.hls_path = global.hls_path.join(channel_id.to_string());
}
if !global.playlist_path.is_dir() {
tokio::fs::create_dir_all(&global.playlist_path)
.await
.expect("Can't create playlist folder");
}
let (filler_path, _, _) = norm_abs_path(&global.storage_path, &config.storage_filler)
.expect("Can't get filler path");
storage.filler = filler_path;
playlist.start_sec = Some(time_to_sec(&playlist.day_start));
if playlist.length.contains(':') {
playlist.length_sec = Some(time_to_sec(&playlist.length));
} else {
playlist.length_sec = Some(86400.0);
}
if processing.add_logo && !Path::new(&processing.logo).is_file() {
processing.add_logo = false;
}
if processing.audio_tracks < 1 {
processing.audio_tracks = 1
}
let mut process_cmd = vec_strings![];
if processing.audio_only {
process_cmd.append(&mut vec_strings!["-vn"]);
} else if processing.copy_video {
process_cmd.append(&mut vec_strings!["-c:v", "copy"]);
} else if let Some(decoder_cmd) = &advanced.decoder.output_cmd {
process_cmd.append(&mut decoder_cmd.clone());
} else {
let bitrate = format!("{}k", processing.width * processing.height / 16);
let buff_size = format!("{}k", (processing.width * processing.height / 16) / 2);
process_cmd.append(&mut vec_strings![
"-pix_fmt",
"yuv420p",
"-r",
&processing.fps,
"-c:v",
"mpeg2video",
"-g",
"1",
"-b:v",
&bitrate,
"-minrate",
&bitrate,
"-maxrate",
&bitrate,
"-bufsize",
&buff_size
]);
}
if processing.copy_audio {
process_cmd.append(&mut vec_strings!["-c:a", "copy"]);
} else if advanced.decoder.output_cmd.is_none() {
process_cmd.append(&mut pre_audio_codec(
&processing.custom_filter,
&ingest.custom_filter,
processing.audio_channels,
));
}
process_cmd.append(&mut vec_strings!["-f", "mpegts", "-"]);
processing.cmd = Some(process_cmd);
ingest.input_cmd = split(ingest.input_param.as_str());
output.output_count = 1;
output.output_filter = None;
if output.mode == OutputMode::Null {
output.output_cmd = Some(vec_strings!["-f", "null", "-"]);
} else if let Some(mut cmd) = split(output.output_param.as_str()) {
// get output count according to the var_stream_map value, or by counting output parameters
if let Some(i) = cmd.clone().iter().position(|m| m == "-var_stream_map") {
output.output_count = cmd[i + 1].split_whitespace().count();
} else {
output.output_count = cmd
.iter()
.enumerate()
.filter(|(i, p)| i > &0 && !p.starts_with('-') && !cmd[i - 1].starts_with('-'))
.count();
}
if let Some(i) = cmd.clone().iter().position(|r| r == "-filter_complex") {
output.output_filter = Some(cmd[i + 1].clone());
cmd.remove(i);
cmd.remove(i);
}
for item in cmd.iter_mut() {
if item.ends_with(".ts") || (item.ends_with(".m3u8") && item != "master.m3u8") {
if let Ok((hls_path, _, _)) = norm_abs_path(&global.hls_path, item) {
let parent = hls_path.parent().expect("HLS parent path");
if !parent.is_dir() {
fs::create_dir_all(parent).await.expect("Create HLS path");
}
item.clone_from(&hls_path.to_string_lossy().to_string());
};
}
}
output.output_cmd = Some(cmd);
}
// When the text overlay is enabled without text_from_filename, also start the
// ZMQ/RPC server so text messages can be received from it.
if text.add_text && !text.text_from_filename {
text.zmq_stream_socket = free_tcp_socket(String::new());
text.zmq_server_socket =
free_tcp_socket(text.zmq_stream_socket.clone().unwrap_or_default());
text.node_pos = Some(2);
} else {
text.zmq_stream_socket = None;
text.zmq_server_socket = None;
text.node_pos = None;
}
Self {
global,
advanced,
general,
mail,
logging,
processing,
ingest,
playlist,
storage,
text,
task,
output,
}
}
pub async fn dump(pool: &Pool<Sqlite>, id: i32) -> Result<(), ServiceError> {
let mut config = Self::new(pool, id).await;
config.storage.filler.clone_from(
&config
.storage
.filler
.strip_prefix(config.global.storage_path.clone())
.unwrap_or(&config.storage.filler)
.to_path_buf(),
);
let toml_string = toml_edit::ser::to_string_pretty(&config)?;
tokio::fs::write(&format!("ffplayout_{id}.toml"), toml_string).await?;
Ok(())
}
pub async fn import(pool: &Pool<Sqlite>, import: Vec<String>) -> Result<(), ServiceError> {
let id = import[0].parse::<i32>()?;
let path = Path::new(&import[1]);
if path.is_file() {
let mut file = tokio::fs::File::open(path).await?;
let mut contents = String::new();
file.read_to_string(&mut contents).await?;
let config: PlayoutConfig = toml_edit::de::from_str(&contents)
.map_err(|e| ServiceError::BadRequest(e.to_string()))?;
handles::update_configuration(pool, id, config).await?;
} else {
return Err(ServiceError::BadRequest("Path does not exist!".to_string()));
}
Ok(())
}
}
// impl Default for PlayoutConfig {
// fn default() -> Self {
// Self::new(1)
// }
// }
/// When custom_filter contains a loudnorm filter, use a different audio encoder:
/// s302m has higher quality, but is experimental
/// and does not work well together with the loudnorm filter.
fn pre_audio_codec(proc_filter: &str, ingest_filter: &str, channel_count: u8) -> Vec<String> {
let mut codec = vec_strings![
"-c:a",
"s302m",
"-strict",
"-2",
"-sample_fmt",
"s16",
"-ar",
"48000",
"-ac",
channel_count
];
if proc_filter.contains("loudnorm") || ingest_filter.contains("loudnorm") {
codec = vec_strings![
"-c:a",
"mp2",
"-b:a",
"384k",
"-ar",
"48000",
"-ac",
channel_count
];
}
codec
}
/// Read command line arguments, and override the config with them.
pub async fn get_config(pool: &Pool<Sqlite>, channel_id: i32) -> Result<PlayoutConfig, io::Error> {
let mut config = PlayoutConfig::new(pool, channel_id).await;
let args = ARGS.clone();
config.general.generate = args.generate;
config.general.validate = args.validate;
config.general.skip_validation = args.skip_validation;
if let Some(template_file) = args.template {
let mut f = fs::File::options()
.read(true)
.write(false)
.open(template_file)
.await?;
let mut buffer = Vec::new();
f.read_to_end(&mut buffer).await?;
let mut template: Template = serde_json::from_slice(&buffer)?;
template.sources.sort_by(|d1, d2| d1.start.cmp(&d2.start));
config.general.template = Some(template);
}
if let Some(paths) = args.gen_paths {
config.storage.paths = paths;
}
if let Some(playlist) = args.playlist {
config.global.playlist_path = playlist;
}
if let Some(folder) = args.folder {
config.global.storage_path = folder;
config.processing.mode = ProcessMode::Folder;
}
if let Some(start) = args.start {
config.playlist.day_start.clone_from(&start);
config.playlist.start_sec = Some(time_to_sec(&start));
}
if let Some(output) = args.output {
config.output.mode = output;
if config.output.mode == OutputMode::Null {
config.output.output_count = 1;
config.output.output_filter = None;
config.output.output_cmd = Some(vec_strings!["-f", "null", "-"]);
}
}
if let Some(volume) = args.volume {
config.processing.volume = volume;
}
Ok(config)
}
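The output-count logic in `PlayoutConfig::new` above (prefer the `-var_stream_map` value, otherwise count positional parameters that are not flag values) can be sketched as a standalone, testable helper. `count_outputs` is a hypothetical name for illustration, not part of the crate:

```rust
/// Hypothetical helper mirroring the output-count logic above: prefer the
/// `-var_stream_map` value (one variant stream per whitespace-separated
/// group), otherwise count positional parameters that are not flag values.
fn count_outputs(cmd: &[String]) -> usize {
    if let Some(i) = cmd.iter().position(|p| p == "-var_stream_map") {
        // e.g. "v:0,a:0 v:1,a:1" describes two variant streams
        return cmd[i + 1].split_whitespace().count();
    }
    // A token counts as an output when neither it nor the token before it
    // starts with '-' (i.e. it is not a flag or a flag's value).
    cmd.iter()
        .enumerate()
        .filter(|(i, p)| *i > 0 && !p.starts_with('-') && !cmd[*i - 1].starts_with('-'))
        .count()
}

fn main() {
    let hls: Vec<String> = ["-f", "hls", "-var_stream_map", "v:0,a:0 v:1,a:1", "out_%v.m3u8"]
        .iter()
        .map(|s| s.to_string())
        .collect();
    assert_eq!(count_outputs(&hls), 2);

    let streams: Vec<String> =
        ["-c:v", "copy", "-f", "flv", "rtmp://a/live", "-f", "flv", "rtmp://b/live"]
            .iter()
            .map(|s| s.to_string())
            .collect();
    assert_eq!(count_outputs(&streams), 2);
}
```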

View File

@ -0,0 +1,259 @@
use std::{error::Error, fmt, str::FromStr, sync::atomic::Ordering};
use log::*;
use serde::{Deserialize, Serialize};
use serde_json::{json, Map, Value};
use sqlx::{Pool, Sqlite};
use zeromq::{Socket, SocketRecv, SocketSend, ZmqMessage};
use crate::db::handles;
use crate::player::{
controller::{ChannelManager, ProcessUnit::*},
utils::{get_delta, get_media_map},
};
use crate::utils::{config::OutputMode::*, errors::ServiceError, TextFilter};
#[derive(Debug, Deserialize, Serialize, Clone)]
struct TextParams {
control: String,
message: TextFilter,
}
#[derive(Debug, Deserialize, Serialize, Clone)]
pub struct ControlParams {
pub control: String,
}
#[derive(Debug, Deserialize, Serialize, Clone)]
struct MediaParams {
media: String,
}
#[derive(Debug, Serialize, Deserialize, Clone, Eq, PartialEq)]
#[serde(rename_all = "snake_case")]
pub enum ProcessCtl {
Status,
Start,
Stop,
Restart,
}
impl FromStr for ProcessCtl {
type Err = String;
fn from_str(input: &str) -> Result<Self, Self::Err> {
match input.to_lowercase().as_str() {
"status" => Ok(Self::Status),
"start" => Ok(Self::Start),
"stop" => Ok(Self::Stop),
"restart" => Ok(Self::Restart),
_ => Err(format!("Command '{input}' not found!")),
}
}
}
impl fmt::Display for ProcessCtl {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
match *self {
Self::Status => write!(f, "status"),
Self::Start => write!(f, "start"),
Self::Stop => write!(f, "stop"),
Self::Restart => write!(f, "restart"),
}
}
}
#[derive(Debug, Deserialize, Serialize, Clone)]
pub struct Process {
pub command: ProcessCtl,
}
async fn zmq_send(msg: &str, socket_addr: &str) -> Result<String, Box<dyn Error>> {
let mut socket = zeromq::ReqSocket::new();
socket.connect(&format!("tcp://{socket_addr}")).await?;
socket.send(msg.into()).await?;
let repl: ZmqMessage = socket.recv().await?;
let response = String::from_utf8(repl.into_vec()[0].to_vec())?;
Ok(response)
}
pub async fn send_message(
manager: ChannelManager,
message: TextFilter,
) -> Result<Map<String, Value>, ServiceError> {
let filter = message.to_string();
let mut data_map = Map::new();
let config = manager.config.lock().unwrap().clone();
if config.text.zmq_stream_socket.is_some() {
if let Some(clips_filter) = manager.filter_chain.clone() {
*clips_filter.lock().unwrap() = vec![filter.clone()];
}
if config.output.mode == HLS {
if manager.ingest_is_running.load(Ordering::SeqCst) {
let filter_server = format!("drawtext@dyntext reinit {filter}");
if let Ok(reply) = zmq_send(
&filter_server,
&config.text.zmq_server_socket.clone().unwrap(),
)
.await
{
data_map.insert("message".to_string(), json!(reply));
return Ok(data_map);
};
} else if let Err(e) = manager.stop(Ingest) {
error!("Ingest {e:?}")
}
}
if config.output.mode != HLS || !manager.ingest_is_running.load(Ordering::SeqCst) {
let filter_stream = format!("drawtext@dyntext reinit {filter}");
if let Ok(reply) = zmq_send(
&filter_stream,
&config.text.zmq_stream_socket.clone().unwrap(),
)
.await
{
data_map.insert("message".to_string(), json!(reply));
return Ok(data_map);
};
}
}
Err(ServiceError::ServiceUnavailable(
"text message missing!".to_string(),
))
}
pub async fn control_state(
conn: &Pool<Sqlite>,
manager: ChannelManager,
command: &str,
) -> Result<Map<String, Value>, ServiceError> {
let config = manager.config.lock().unwrap().clone();
let current_date = manager.current_date.lock().unwrap().clone();
let current_list = manager.current_list.lock().unwrap().clone();
let mut date = manager.current_date.lock().unwrap().clone();
let index = manager.current_index.load(Ordering::SeqCst);
match command {
"back" => {
if index > 1 && current_list.len() > 1 {
if let Some(proc) = manager.decoder.lock().unwrap().as_mut() {
if let Err(e) = proc.kill() {
error!("Decoder {e:?}")
};
if let Err(e) = proc.wait() {
error!("Decoder {e:?}")
};
} else {
return Err(ServiceError::InternalServerError);
}
info!("Move to last clip");
let mut data_map = Map::new();
let mut media = current_list[index - 2].clone();
manager.current_index.fetch_sub(2, Ordering::SeqCst);
if let Err(e) = media.add_probe(false) {
error!("{e:?}");
};
let (delta, _) = get_delta(&config, &media.begin.unwrap_or(0.0));
manager.channel.lock().unwrap().time_shift = delta;
date.clone_from(&current_date);
handles::update_stat(conn, config.general.channel_id, current_date, delta).await?;
data_map.insert("operation".to_string(), json!("move_to_last"));
data_map.insert("shifted_seconds".to_string(), json!(delta));
data_map.insert("media".to_string(), get_media_map(media));
return Ok(data_map);
}
}
"next" => {
if index < current_list.len() {
if let Some(proc) = manager.decoder.lock().unwrap().as_mut() {
if let Err(e) = proc.kill() {
error!("Decoder {e:?}")
};
if let Err(e) = proc.wait() {
error!("Decoder {e:?}")
};
} else {
return Err(ServiceError::InternalServerError);
}
info!("Move to next clip");
let mut data_map = Map::new();
let mut media = current_list[index].clone();
if let Err(e) = media.add_probe(false) {
error!("{e:?}");
};
let (delta, _) = get_delta(&config, &media.begin.unwrap_or(0.0));
manager.channel.lock().unwrap().time_shift = delta;
date.clone_from(&current_date);
handles::update_stat(conn, config.general.channel_id, current_date, delta).await?;
data_map.insert("operation".to_string(), json!("move_to_next"));
data_map.insert("shifted_seconds".to_string(), json!(delta));
data_map.insert("media".to_string(), get_media_map(media));
return Ok(data_map);
}
}
"reset" => {
if let Some(proc) = manager.decoder.lock().unwrap().as_mut() {
if let Err(e) = proc.kill() {
error!("Decoder {e:?}")
};
if let Err(e) = proc.wait() {
error!("Decoder {e:?}")
};
} else {
return Err(ServiceError::InternalServerError);
}
info!("Reset playout to original state");
let mut data_map = Map::new();
manager.channel.lock().unwrap().time_shift = 0.0;
date.clone_from(&current_date);
manager.list_init.store(true, Ordering::SeqCst);
handles::update_stat(conn, config.general.channel_id, current_date, 0.0).await?;
data_map.insert("operation".to_string(), json!("reset_playout_state"));
return Ok(data_map);
}
"stop_all" => {
manager.stop_all();
let mut data_map = Map::new();
data_map.insert("message".to_string(), json!("Stop playout!"));
return Ok(data_map);
}
_ => {
return Err(ServiceError::ServiceUnavailable(
"Command not found!".to_string(),
))
}
}
Ok(Map::new())
}
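The text update in `send_message` above talks to ffmpeg's zmq filter, whose command protocol is `<filter_instance> <command> <argument>`; the drawtext instance is addressed as `drawtext@dyntext`. A minimal sketch of the command string being built; `drawtext_reinit` is a hypothetical helper name for illustration:

```rust
// Hypothetical helper: builds the command that send_message above passes to
// zmq_send. ffmpeg's zmq filter expects "<target> <command> <arg>", and the
// drawtext instance in the playout filter chain is named "dyntext".
fn drawtext_reinit(filter: &str) -> String {
    format!("drawtext@dyntext reinit {filter}")
}

fn main() {
    let cmd = drawtext_reinit("text='Breaking News':fontsize=24");
    assert_eq!(cmd, "drawtext@dyntext reinit text='Breaking News':fontsize=24");
}
```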

View File

@ -1,5 +1,8 @@
use std::io;
use actix_web::{error::ResponseError, Error, HttpResponse};
use derive_more::Display;
use ffprobe::FfProbeError;
#[derive(Debug, Display)]
pub enum ServiceError {
@ -74,6 +77,12 @@ impl From<std::num::ParseIntError> for ServiceError {
}
}
impl From<jsonwebtoken::errors::Error> for ServiceError {
fn from(err: jsonwebtoken::errors::Error) -> ServiceError {
ServiceError::Unauthorized(err.to_string())
}
}
impl From<actix_web::error::BlockingError> for ServiceError {
fn from(err: actix_web::error::BlockingError) -> ServiceError {
ServiceError::BadRequest(err.to_string())
@ -98,8 +107,90 @@ impl From<toml_edit::ser::Error> for ServiceError {
}
}
impl From<toml_edit::TomlError> for ServiceError {
fn from(err: toml_edit::TomlError) -> ServiceError {
ServiceError::BadRequest(err.to_string())
}
}
impl From<uuid::Error> for ServiceError {
fn from(err: uuid::Error) -> ServiceError {
ServiceError::BadRequest(err.to_string())
}
}
impl From<serde_json::Error> for ServiceError {
fn from(err: serde_json::Error) -> ServiceError {
ServiceError::BadRequest(err.to_string())
}
}
#[derive(Debug, Display)]
pub enum ProcessError {
#[display(fmt = "Failed to spawn ffmpeg/ffprobe. {}", _0)]
CommandSpawn(io::Error),
#[display(fmt = "{}", _0)]
Custom(String),
#[display(fmt = "IO error: {}", _0)]
IO(io::Error),
#[display(fmt = "{}", _0)]
Ffprobe(FfProbeError),
#[display(fmt = "Regex compile error {}", _0)]
Regex(String),
#[display(fmt = "Thread error {}", _0)]
Thread(String),
}
impl From<std::io::Error> for ProcessError {
fn from(err: std::io::Error) -> ProcessError {
ProcessError::IO(err)
}
}
impl From<FfProbeError> for ProcessError {
fn from(err: FfProbeError) -> Self {
Self::Ffprobe(err)
}
}
impl From<lettre::address::AddressError> for ProcessError {
fn from(err: lettre::address::AddressError) -> ProcessError {
ProcessError::Custom(err.to_string())
}
}
impl From<lettre::transport::smtp::Error> for ProcessError {
fn from(err: lettre::transport::smtp::Error) -> ProcessError {
ProcessError::Custom(err.to_string())
}
}
impl From<lettre::error::Error> for ProcessError {
fn from(err: lettre::error::Error) -> ProcessError {
ProcessError::Custom(err.to_string())
}
}
impl<T> From<std::sync::PoisonError<T>> for ProcessError {
fn from(err: std::sync::PoisonError<T>) -> ProcessError {
ProcessError::Custom(err.to_string())
}
}
impl From<regex::Error> for ProcessError {
fn from(err: regex::Error) -> Self {
Self::Regex(err.to_string())
}
}
impl From<serde_json::Error> for ProcessError {
fn from(err: serde_json::Error) -> Self {
Self::Custom(err.to_string())
}
}
impl From<Box<dyn std::any::Any + std::marker::Send>> for ProcessError {
fn from(err: Box<dyn std::any::Any + std::marker::Send>) -> Self {
Self::Thread(format!("{err:?}"))
}
}

View File

@ -6,18 +6,17 @@ use std::{
use actix_multipart::Multipart;
use actix_web::{web, HttpResponse};
use futures_util::TryStreamExt as _;
use lazy_static::lazy_static;
use lexical_sort::{natural_lexical_cmp, PathSort};
use rand::{distributions::Alphanumeric, Rng};
use relative_path::RelativePath;
use serde::{Deserialize, Serialize};
use sqlx::{Pool, Sqlite};
use tokio::fs;
use simplelog::*;
use log::*;
use crate::utils::{errors::ServiceError, playout_config};
use ffplayout_lib::utils::{file_extension, MediaProbe};
use crate::db::models::Channel;
use crate::player::utils::{file_extension, MediaProbe};
use crate::utils::{config::PlayoutConfig, errors::ServiceError};
#[derive(Debug, Deserialize, Serialize, Clone)]
pub struct PathObject {
@ -55,23 +54,6 @@ pub struct VideoFile {
duration: f64,
}
lazy_static! {
pub static ref HOME_DIR: String = home::home_dir()
.unwrap_or("/home/h1wl3n2og".into()) // any random not existing folder
.as_os_str()
.to_string_lossy()
.to_string();
}
const FOLDER_WHITELIST: &[&str; 6] = &[
"/media",
"/mnt",
"/playlists",
"/tv-media",
"/usr/share/ffplayout",
"/var/lib/ffplayout",
];
/// Normalize absolute path
///
/// This function ensures that it is not possible to break out of root_path.
@ -111,14 +93,6 @@ pub fn norm_abs_path(
let path = &root_path.join(&source_relative);
if !FOLDER_WHITELIST.iter().any(|f| path.starts_with(f))
&& !path.starts_with(HOME_DIR.to_string())
{
return Err(ServiceError::Forbidden(
"Access forbidden: Folder cannot be opened.".to_string(),
));
}
Ok((path.to_path_buf(), path_suffix, source_relative))
}
@ -128,26 +102,26 @@ pub fn norm_abs_path(
/// Input should be a relative path segment, but when it is an absolute path, the norm_abs_path function
/// ensures that the user cannot break out of the storage path given in the config.
pub async fn browser(
conn: &Pool<Sqlite>,
id: i32,
config: &PlayoutConfig,
channel: &Channel,
path_obj: &PathObject,
) -> Result<PathObject, ServiceError> {
let (config, channel) = playout_config(conn, &id).await?;
let mut channel_extensions = channel
.extra_extensions
.split(',')
.map(|e| e.to_string())
.collect::<Vec<String>>();
let mut parent_folders = vec![];
let mut extensions = config.storage.extensions;
let mut extensions = config.storage.extensions.clone();
extensions.append(&mut channel_extensions);
let (path, parent, path_component) = norm_abs_path(&config.storage.path, &path_obj.source)?;
let (path, parent, path_component) =
norm_abs_path(&config.global.storage_path, &path_obj.source)?;
let parent_path = if !path_component.is_empty() {
path.parent().unwrap()
} else {
&config.storage.path
&config.global.storage_path
};
let mut obj = PathObject::new(path_component, Some(parent));
@ -235,12 +209,10 @@ pub async fn browser(
}
pub async fn create_directory(
conn: &Pool<Sqlite>,
id: i32,
config: &PlayoutConfig,
path_obj: &PathObject,
) -> Result<HttpResponse, ServiceError> {
let (config, _) = playout_config(conn, &id).await?;
let (path, _, _) = norm_abs_path(&config.storage.path, &path_obj.source)?;
let (path, _, _) = norm_abs_path(&config.global.storage_path, &path_obj.source)?;
if let Err(e) = fs::create_dir_all(&path).await {
return Err(ServiceError::BadRequest(e.to_string()));
@ -306,13 +278,11 @@ async fn rename(source: &PathBuf, target: &PathBuf) -> Result<MoveObject, Servic
}
pub async fn rename_file(
conn: &Pool<Sqlite>,
id: i32,
config: &PlayoutConfig,
move_object: &MoveObject,
) -> Result<MoveObject, ServiceError> {
let (config, _) = playout_config(conn, &id).await?;
let (source_path, _, _) = norm_abs_path(&config.storage.path, &move_object.source)?;
let (mut target_path, _, _) = norm_abs_path(&config.storage.path, &move_object.target)?;
let (source_path, _, _) = norm_abs_path(&config.global.storage_path, &move_object.source)?;
let (mut target_path, _, _) = norm_abs_path(&config.global.storage_path, &move_object.target)?;
if !source_path.exists() {
return Err(ServiceError::BadRequest("Source file does not exist!".into()));
@ -341,12 +311,10 @@ pub async fn rename_file(
}
pub async fn remove_file_or_folder(
conn: &Pool<Sqlite>,
id: i32,
config: &PlayoutConfig,
source_path: &str,
) -> Result<(), ServiceError> {
let (config, _) = playout_config(conn, &id).await?;
let (source, _, _) = norm_abs_path(&config.storage.path, source_path)?;
let (source, _, _) = norm_abs_path(&config.global.storage_path, source_path)?;
if !source.exists() {
return Err(ServiceError::BadRequest("Source does not exist!".into()));
@ -377,9 +345,8 @@ pub async fn remove_file_or_folder(
Err(ServiceError::InternalServerError)
}
async fn valid_path(conn: &Pool<Sqlite>, id: i32, path: &str) -> Result<PathBuf, ServiceError> {
let (config, _) = playout_config(conn, &id).await?;
let (test_path, _, _) = norm_abs_path(&config.storage.path, path)?;
async fn valid_path(config: &PlayoutConfig, path: &str) -> Result<PathBuf, ServiceError> {
let (test_path, _, _) = norm_abs_path(&config.global.storage_path, path)?;
if !test_path.is_dir() {
return Err(ServiceError::BadRequest("Target folder does not exist!".into()));
@ -389,8 +356,7 @@ async fn valid_path(conn: &Pool<Sqlite>, id: i32, path: &str) -> Result<PathBuf,
}
pub async fn upload(
conn: &Pool<Sqlite>,
id: i32,
config: &PlayoutConfig,
_size: u64,
mut payload: Multipart,
path: &Path,
@ -411,7 +377,7 @@ pub async fn upload(
let filepath = if abs_path {
path.to_path_buf()
} else {
valid_path(conn, id, &path.to_string_lossy())
valid_path(config, &path.to_string_lossy())
.await?
.join(filename)
};

View File

@ -7,19 +7,27 @@
use std::{
fs::{create_dir_all, write},
io::Error,
process::exit,
};
use chrono::Timelike;
use lexical_sort::{natural_lexical_cmp, StringSort};
use log::*;
use rand::{seq::SliceRandom, thread_rng, Rng};
use simplelog::*;
use walkdir::WalkDir;
use super::{folder::FolderSource, PlayerControl};
use crate::player::{
controller::ChannelManager,
utils::{
folder::{fill_filler_list, FolderSource},
gen_dummy, get_date_range, include_file_extension,
json_serializer::JsonPlaylist,
sum_durations, Media,
},
};
use crate::utils::{
folder::fill_filler_list, gen_dummy, get_date_range, include_file_extension,
json_serializer::JsonPlaylist, sum_durations, time_to_sec, Media, PlayoutConfig, Template,
config::{PlayoutConfig, Template},
logging::Target,
time_to_sec,
};
pub fn random_list(clip_list: Vec<Media>, total_length: f64) -> Vec<Media> {
@ -126,12 +134,13 @@ pub fn filler_list(config: &PlayoutConfig, total_length: f64) -> Vec<Media> {
pub fn generate_from_template(
config: &PlayoutConfig,
player_control: &PlayerControl,
manager: &ChannelManager,
template: Template,
) -> FolderSource {
let mut media_list = vec![];
let mut rng = thread_rng();
let mut index: usize = 0;
let id = config.general.channel_id;
for source in template.sources {
let mut source_list = vec![];
@ -139,7 +148,7 @@ pub fn generate_from_template(
+ (source.duration.minute() as f64 * 60.0)
+ source.duration.second() as f64;
debug!("Generating playlist block with <yellow>{duration:.2}</> seconds length");
debug!(target: Target::all(), channel = id; "Generating playlist block with <yellow>{duration:.2}</> seconds length");
for path in source.paths {
debug!("Search files in <b><magenta>{path:?}</></b>");
@ -187,14 +196,15 @@ pub fn generate_from_template(
index += 1;
}
FolderSource::from_list(config, None, player_control, media_list)
FolderSource::from_list(manager, media_list)
}
/// Generate playlists
pub fn generate_playlist(
config: &PlayoutConfig,
channel_name: Option<String>,
) -> Result<Vec<JsonPlaylist>, Error> {
pub fn playlist_generator(manager: &ChannelManager) -> Result<Vec<JsonPlaylist>, Error> {
let config = manager.config.lock().unwrap().clone();
let id = config.general.channel_id;
let channel_name = manager.channel.lock().unwrap().name.clone();
let total_length = match config.playlist.length_sec {
Some(length) => length,
None => {
@ -205,24 +215,17 @@ pub fn generate_playlist(
}
}
};
let player_control = PlayerControl::new();
let playlist_root = &config.playlist.path;
let playlist_root = &config.global.playlist_path;
let mut playlists = vec![];
let mut date_range = vec![];
let mut from_template = false;
let channel = match channel_name {
Some(name) => name,
None => "Channel 1".to_string(),
};
if !playlist_root.is_dir() {
error!(
target: Target::all(), channel = id;
"Playlist folder <b><magenta>{:?}</></b> not exists!",
config.playlist.path
config.global.playlist_path
);
exit(1);
}
if let Some(range) = config.general.generate.clone() {
@ -230,19 +233,19 @@ pub fn generate_playlist(
}
if date_range.contains(&"-".to_string()) && date_range.len() == 3 {
date_range = get_date_range(&date_range)
date_range = get_date_range(id, &date_range)
}
// gives an iterator with infinite length
let folder_iter = if let Some(template) = &config.general.template {
from_template = true;
generate_from_template(config, &player_control, template.clone())
generate_from_template(&config, manager, template.clone())
} else {
FolderSource::new(config, None, &player_control)
FolderSource::new(&config, manager.clone())
};
let list_length = player_control.current_list.lock().unwrap().len();
let list_length = manager.current_list.lock().unwrap().len();
for date in date_range {
let d: Vec<&str> = date.split('-').collect();
@ -257,6 +260,7 @@ pub fn generate_playlist(
if playlist_file.is_file() {
warn!(
target: Target::all(), channel = id;
"Playlist exists, skip: <b><magenta>{}</></b>",
playlist_file.display()
);
@ -265,12 +269,13 @@ pub fn generate_playlist(
}
info!(
target: Target::all(), channel = id;
"Generate playlist: <b><magenta>{}</></b>",
playlist_file.display()
);
let mut playlist = JsonPlaylist {
channel: channel.clone(),
channel: channel_name.clone(),
date,
path: None,
start_sec: None,
@ -280,7 +285,7 @@ pub fn generate_playlist(
};
if from_template {
let media_list = player_control.current_list.lock().unwrap();
let media_list = manager.current_list.lock().unwrap();
playlist.program = media_list.to_vec();
} else {
for item in folder_iter.clone() {
@ -301,7 +306,7 @@ pub fn generate_playlist(
if config.playlist.length_sec.unwrap() > list_duration {
let time_left = config.playlist.length_sec.unwrap() - list_duration;
let mut fillers = filler_list(config, time_left);
let mut fillers = filler_list(&config, time_left);
playlist.program.append(&mut fillers);
}


@ -0,0 +1,480 @@
use std::{
collections::{hash_map, HashMap},
env,
io::{self, ErrorKind, Write},
path::PathBuf,
sync::{Arc, Mutex},
time::Duration,
};
use actix_web::rt::time::interval;
use flexi_logger::{
writers::{FileLogWriter, LogWriter},
Age, Cleanup, Criterion, DeferredNow, FileSpec, Level, LogSpecification, Logger, Naming,
};
use lettre::{
message::header, transport::smtp::authentication::Credentials, AsyncSmtpTransport,
AsyncTransport, Message, Tokio1Executor,
};
use log::{kv::Value, *};
use paris::formatter::colorize_string;
use super::ARGS;
use crate::db::models::GlobalSettings;
use crate::utils::{config::Mail, errors::ProcessError, round_to_nearest_ten};
#[derive(Debug)]
pub struct Target;
impl Target {
pub fn all() -> &'static str {
"{file,mail,_Default}"
}
pub fn console() -> &'static str {
"{console}"
}
pub fn file() -> &'static str {
"{file}"
}
pub fn mail() -> &'static str {
"{mail}"
}
pub fn file_mail() -> &'static str {
"{file,mail}"
}
}
pub struct LogConsole;
impl LogWriter for LogConsole {
fn write(&self, now: &mut DeferredNow, record: &Record<'_>) -> std::io::Result<()> {
console_formatter(&mut std::io::stderr(), now, record)?;
println!();
Ok(())
}
fn flush(&self) -> std::io::Result<()> {
Ok(())
}
}
struct MultiFileLogger {
log_path: PathBuf,
writers: Arc<Mutex<HashMap<i32, Arc<Mutex<FileLogWriter>>>>>,
}
impl MultiFileLogger {
pub fn new(log_path: PathBuf) -> Self {
MultiFileLogger {
log_path,
writers: Arc::new(Mutex::new(HashMap::new())),
}
}
fn get_writer(&self, channel: i32) -> io::Result<Arc<Mutex<FileLogWriter>>> {
let mut writers = self.writers.lock().unwrap();
if let hash_map::Entry::Vacant(e) = writers.entry(channel) {
let writer = FileLogWriter::builder(
FileSpec::default()
.suppress_timestamp()
.directory(&self.log_path)
.basename("ffplayout")
.discriminant(channel.to_string()),
)
.format(file_formatter)
.append()
.rotate(
Criterion::Age(Age::Day),
Naming::TimestampsCustomFormat {
current_infix: Some(""),
format: "%Y-%m-%d",
},
Cleanup::KeepLogFiles(ARGS.log_backup_count.unwrap_or(14)),
)
.try_build()
.map_err(|e| io::Error::new(io::ErrorKind::Other, e.to_string()))?;
e.insert(Arc::new(Mutex::new(writer)));
}
Ok(writers.get(&channel).unwrap().clone())
}
}
impl LogWriter for MultiFileLogger {
fn write(&self, now: &mut DeferredNow, record: &Record) -> io::Result<()> {
let channel = i32::try_from(
record
.key_values()
.get("channel".into())
.unwrap_or(Value::null())
.to_i64()
.unwrap_or(0),
)
.unwrap_or(0);
let writer = self.get_writer(channel);
writer?.lock().unwrap().write(now, record)
}
fn flush(&self) -> io::Result<()> {
let writers = self.writers.lock().unwrap();
for writer in writers.values() {
writer.lock().unwrap().flush()?;
}
Ok(())
}
}
pub struct LogMailer {
pub mail_queues: Arc<Mutex<Vec<Arc<Mutex<MailQueue>>>>>,
}
impl LogMailer {
pub fn new(mail_queues: Arc<Mutex<Vec<Arc<Mutex<MailQueue>>>>>) -> Self {
Self { mail_queues }
}
}
impl LogWriter for LogMailer {
fn write(&self, now: &mut DeferredNow, record: &Record<'_>) -> std::io::Result<()> {
let id = i32::try_from(
record
.key_values()
.get("channel".into())
.unwrap_or(Value::null())
.to_i64()
.unwrap_or(0),
)
.unwrap_or(0);
let mut queues = self.mail_queues.lock().unwrap_or_else(|poisoned| {
error!("Queues mutex was poisoned");
poisoned.into_inner()
});
for queue in queues.iter_mut() {
let mut q_lock = queue.lock().unwrap_or_else(|poisoned| {
error!("Queue mutex was poisoned");
poisoned.into_inner()
});
if q_lock.id == id && q_lock.level_eq(record.level()) {
q_lock.push(format!(
"[{}] [{:>5}] {}",
now.now().format("%Y-%m-%d %H:%M:%S"),
record.level(),
record.args()
));
break;
}
}
Ok(())
}
fn flush(&self) -> std::io::Result<()> {
Ok(())
}
}
#[derive(Clone, Debug)]
pub struct MailQueue {
pub id: i32,
pub config: Mail,
pub lines: Vec<String>,
}
impl MailQueue {
pub fn new(id: i32, config: Mail) -> Self {
Self {
id,
config,
lines: vec![],
}
}
pub fn level_eq(&self, level: Level) -> bool {
match level {
Level::Error => self.config.mail_level == Level::Error,
Level::Warn => matches!(self.config.mail_level, Level::Warn | Level::Error),
Level::Info => matches!(
self.config.mail_level,
Level::Info | Level::Warn | Level::Error
),
_ => false,
}
}
pub fn update(&mut self, config: Mail) {
self.config = config;
}
pub fn clear(&mut self) {
self.lines.clear();
}
pub fn push(&mut self, line: String) {
self.lines.push(line);
}
fn text(&self) -> String {
self.lines.join("\n")
}
fn is_empty(&self) -> bool {
self.lines.is_empty()
}
}
fn console_formatter(w: &mut dyn Write, _now: &mut DeferredNow, record: &Record) -> io::Result<()> {
match record.level() {
Level::Debug => write!(
w,
"{}",
colorize_string(format!("<bright-blue>[DEBUG]</> {}", record.args()))
),
Level::Error => write!(
w,
"{}",
colorize_string(format!("<bright-red>[ERROR]</> {}", record.args()))
),
Level::Info => write!(
w,
"{}",
colorize_string(format!("<bright-green>[ INFO]</> {}", record.args()))
),
Level::Trace => write!(
w,
"{}",
colorize_string(format!(
"<bright-yellow>[TRACE]</> {}:{} {}",
record.file().unwrap_or_default(),
record.line().unwrap_or_default(),
record.args()
))
),
Level::Warn => write!(
w,
"{}",
colorize_string(format!("<yellow>[ WARN]</> {}", record.args()))
),
}
}
fn file_formatter(
w: &mut dyn Write,
now: &mut DeferredNow,
record: &Record,
) -> std::io::Result<()> {
write!(
w,
"[{}] [{:>5}] {}",
now.now().format("%Y-%m-%d %H:%M:%S%.6f"),
record.level(),
record.args()
)
}
pub fn log_file_path() -> PathBuf {
let config = GlobalSettings::global();
let mut log_path = ARGS
.log_path
.clone()
.unwrap_or(PathBuf::from(&config.logging_path));
if !log_path.is_dir() {
log_path = env::current_dir().unwrap();
}
log_path
}
fn file_logger() -> Box<dyn LogWriter> {
if ARGS.log_to_console {
Box::new(LogConsole)
} else {
Box::new(MultiFileLogger::new(log_file_path()))
}
}
/// Send log messages to the mail recipient(s)
pub async fn send_mail(config: &Mail, msg: String) -> Result<(), ProcessError> {
let recipient = config
.recipient
.split_terminator([',', ';', ' '])
.filter(|s| s.contains('@'))
.map(|s| s.trim())
.collect::<Vec<&str>>();
let mut message = Message::builder()
.from(config.sender_addr.parse()?)
.subject(&config.subject)
.header(header::ContentType::TEXT_PLAIN);
for r in recipient {
message = message.to(r.parse()?);
}
let mail = message.body(msg)?;
let credentials = Credentials::new(config.sender_addr.clone(), config.sender_pass.clone());
let mut transporter =
AsyncSmtpTransport::<Tokio1Executor>::relay(config.smtp_server.clone().as_str());
if config.starttls {
transporter = AsyncSmtpTransport::<Tokio1Executor>::starttls_relay(
config.smtp_server.clone().as_str(),
);
}
let mailer = transporter?.credentials(credentials).build();
// Send the mail
mailer.send(mail).await?;
Ok(())
}
/// Basic Mail Queue
///
/// Check every given number of seconds for messages and send them.
pub fn mail_queue(mail_queues: Arc<Mutex<Vec<Arc<Mutex<MailQueue>>>>>) {
actix_web::rt::spawn(async move {
let sec = 10;
let mut interval = interval(Duration::from_secs(sec));
let mut counter = 0;
loop {
interval.tick().await;
let mut tasks = vec![];
// Reset the counter after one day
if counter >= 86400 {
counter = 0;
} else {
counter += sec;
}
{
let mut queues = match mail_queues.lock() {
Ok(l) => l,
Err(e) => {
error!("Failed to lock mail_queues {e}");
continue;
}
};
// Process mail queues and send emails
for queue in queues.iter_mut() {
let interval = round_to_nearest_ten(counter as i64);
let mut q_lock = queue.lock().unwrap_or_else(|poisoned| {
error!("Queue mutex was poisoned");
poisoned.into_inner()
});
let expire = round_to_nearest_ten(q_lock.config.interval);
if interval % expire == 0 && !q_lock.is_empty() {
if q_lock.config.recipient.contains('@') {
tasks.push((q_lock.config.clone(), q_lock.text().clone(), q_lock.id));
}
// Clear the messages after sending the email
q_lock.clear();
}
}
}
for (config, text, id) in tasks {
if let Err(e) = send_mail(&config, text).await {
error!(target: "{file}", channel = id; "Failed to send mail: {e}");
}
}
}
});
}
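The flush condition above compares the ten-second-rounded counter against the rounded per-queue interval. Extracted as a pure function for illustration (`should_flush` is a hypothetical name, and it assumes intervals of at least 10 seconds so the rounded expiry is never zero):

```rust
// Copy of the rounding helper used by the mail queue loop.
fn round_to_nearest_ten(num: i64) -> i64 {
    if num % 10 >= 5 {
        ((num / 10) + 1) * 10
    } else {
        (num / 10) * 10
    }
}

// Hypothetical extraction of the flush cadence; assumes `interval` >= 10
// so the modulus below can never be zero.
fn should_flush(counter: i64, interval: i64) -> bool {
    round_to_nearest_ten(counter) % round_to_nearest_ten(interval) == 0
}

fn main() {
    assert!(should_flush(30, 30)); // on the boundary: send
    assert!(!should_flush(20, 30)); // between boundaries: wait
    assert!(should_flush(60, 30)); // every full multiple: send again
}
```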
/// Initialize our logging, to have:
///
/// - console logger
/// - file logger
/// - mail logger
pub fn init_logging(mail_queues: Arc<Mutex<Vec<Arc<Mutex<MailQueue>>>>>) -> io::Result<()> {
let log_level = match ARGS
.log_level
.clone()
.unwrap_or("debug".to_string())
.to_lowercase()
.as_str()
{
"debug" => LevelFilter::Debug,
"error" => LevelFilter::Error,
"info" => LevelFilter::Info,
"trace" => LevelFilter::Trace,
"warn" => LevelFilter::Warn,
"off" => LevelFilter::Off,
_ => LevelFilter::Debug,
};
mail_queue(mail_queues.clone());
// Build the initial log specification
let mut builder = LogSpecification::builder();
builder
.default(log_level)
.module("actix", LevelFilter::Info)
.module("actix_files", LevelFilter::Info)
.module("actix_web", LevelFilter::Info)
.module("actix_web_service", LevelFilter::Error)
.module("hyper", LevelFilter::Error)
.module("flexi_logger", LevelFilter::Error)
.module("libc", LevelFilter::Error)
.module("log", LevelFilter::Error)
.module("mio", LevelFilter::Error)
.module("neli", LevelFilter::Error)
.module("reqwest", LevelFilter::Error)
.module("rpc", LevelFilter::Error)
.module("rustls", LevelFilter::Error)
.module("serial_test", LevelFilter::Error)
.module("sqlx", LevelFilter::Error)
.module("tokio", LevelFilter::Error);
Logger::with(builder.build())
.format(console_formatter)
.log_to_stderr()
.add_writer("file", file_logger())
.add_writer("mail", Box::new(LogMailer::new(mail_queues)))
.start()
.map_err(|e| io::Error::new(ErrorKind::Other, e.to_string()))?;
Ok(())
}
/// Format ingest and HLS logging output
pub fn log_line(line: &str, level: &str) {
if line.contains("[info]") && level.to_lowercase() == "info" {
info!("<bright black>[Server]</> {}", line.replace("[info] ", ""))
} else if line.contains("[warning]")
&& (level.to_lowercase() == "warning" || level.to_lowercase() == "info")
{
warn!(
"<bright black>[Server]</> {}",
line.replace("[warning] ", "")
)
} else if line.contains("[error]")
&& !line.contains("Input/output error")
&& !line.contains("Broken pipe")
{
error!("<bright black>[Server]</> {}", line.replace("[error] ", ""));
} else if line.contains("[fatal]") {
error!("<bright black>[Server]</> {}", line.replace("[fatal] ", ""))
}
}
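`log_line` above routes raw server output by substring matching. The same decision table as a pure function (a hypothetical `route` helper, ignoring the `level` gate of the original for brevity; `None` means the line is dropped):

```rust
// Sketch of the server-log routing above: map an ffmpeg log line to the
// level it should be re-logged with. I/O errors from client disconnects
// ("Input/output error", "Broken pipe") are intentionally swallowed.
fn route(line: &str) -> Option<&'static str> {
    if line.contains("[info]") {
        Some("info")
    } else if line.contains("[warning]") {
        Some("warn")
    } else if line.contains("[error]")
        && !line.contains("Input/output error")
        && !line.contains("Broken pipe")
    {
        Some("error")
    } else if line.contains("[fatal]") {
        Some("error")
    } else {
        None
    }
}

fn main() {
    assert_eq!(route("[info] listening"), Some("info"));
    assert_eq!(route("[error] Broken pipe"), None);
    assert_eq!(route("[fatal] crash"), Some("error"));
}
```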

ffplayout/src/utils/mod.rs Normal file

@ -0,0 +1,298 @@
use std::{
env, fmt,
net::TcpListener,
path::{Path, PathBuf},
};
use chrono::{format::ParseErrorKind, prelude::*};
use faccess::PathExt;
use log::*;
use path_clean::PathClean;
use rand::Rng;
use regex::Regex;
use tokio::fs;
use serde::{
de::{self, Visitor},
Deserialize, Deserializer, Serialize,
};
pub mod advanced_config;
pub mod args_parse;
pub mod channels;
pub mod config;
pub mod control;
pub mod errors;
pub mod files;
pub mod generator;
pub mod logging;
pub mod playlist;
pub mod system;
pub mod task_runner;
use crate::player::utils::time_to_sec;
use crate::utils::{errors::ServiceError, logging::log_file_path};
use crate::ARGS;
#[derive(Clone, Debug, Default, Deserialize, Serialize)]
pub struct TextFilter {
pub text: Option<String>,
#[serde(default, deserialize_with = "deserialize_number_or_string")]
pub x: Option<String>,
#[serde(default, deserialize_with = "deserialize_number_or_string")]
pub y: Option<String>,
#[serde(default, deserialize_with = "deserialize_number_or_string")]
pub fontsize: Option<String>,
#[serde(default, deserialize_with = "deserialize_number_or_string")]
pub line_spacing: Option<String>,
pub fontcolor: Option<String>,
#[serde(default, deserialize_with = "deserialize_number_or_string")]
pub alpha: Option<String>,
#[serde(default, deserialize_with = "deserialize_number_or_string")]
pub r#box: Option<String>,
pub boxcolor: Option<String>,
#[serde(default, deserialize_with = "deserialize_number_or_string")]
pub boxborderw: Option<String>,
}
/// Deserialize number or string
pub fn deserialize_number_or_string<'de, D>(deserializer: D) -> Result<Option<String>, D::Error>
where
D: serde::Deserializer<'de>,
{
struct StringOrNumberVisitor;
impl<'de> Visitor<'de> for StringOrNumberVisitor {
type Value = Option<String>;
fn expecting(&self, formatter: &mut std::fmt::Formatter) -> std::fmt::Result {
formatter.write_str("a string or a number")
}
fn visit_str<E: de::Error>(self, value: &str) -> Result<Self::Value, E> {
let re = Regex::new(r"0,([0-9]+)").unwrap();
let clean_string = re.replace_all(value, "0.$1").to_string();
Ok(Some(clean_string))
}
fn visit_u64<E: de::Error>(self, value: u64) -> Result<Self::Value, E> {
Ok(Some(value.to_string()))
}
fn visit_i64<E: de::Error>(self, value: i64) -> Result<Self::Value, E> {
Ok(Some(value.to_string()))
}
fn visit_f64<E: de::Error>(self, value: f64) -> Result<Self::Value, E> {
Ok(Some(value.to_string()))
}
}
deserializer.deserialize_any(StringOrNumberVisitor)
}
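The visitor above normalizes locale-style decimal commas (`0,75` → `0.75`) before passing the value on. A standalone sketch of that normalization without the `regex` crate (the `normalize` helper is hypothetical and assumes ASCII input, which drawtext parameters are):

```rust
// Byte-wise sketch of the "0,<digits>" -> "0.<digits>" rewrite performed by
// the regex in `deserialize_number_or_string` above.
fn normalize(value: &str) -> String {
    let b = value.as_bytes();
    let mut out = String::with_capacity(value.len());
    for (i, &c) in b.iter().enumerate() {
        // Replace a comma only when it sits between a '0' and another digit.
        if c == b','
            && i > 0
            && b[i - 1] == b'0'
            && i + 1 < b.len()
            && b[i + 1].is_ascii_digit()
        {
            out.push('.');
        } else {
            out.push(c as char);
        }
    }
    out
}

fn main() {
    assert_eq!(normalize("0,75"), "0.75"); // locale comma becomes a dot
    assert_eq!(normalize("w-text_w-50"), "w-text_w-50"); // expressions pass through
}
```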
impl fmt::Display for TextFilter {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
let escaped_text = self
.text
.clone()
.unwrap_or_default()
.replace('\'', "'\\\\\\''")
.replace('\\', "\\\\\\\\")
.replace('%', "\\\\\\%")
.replace(':', "\\:");
let mut s = format!("text='{escaped_text}'");
if let Some(v) = &self.x {
if !v.is_empty() {
s.push_str(&format!(":x='{v}'"));
}
}
if let Some(v) = &self.y {
if !v.is_empty() {
s.push_str(&format!(":y='{v}'"));
}
}
if let Some(v) = &self.fontsize {
if !v.is_empty() {
s.push_str(&format!(":fontsize={v}"));
}
}
if let Some(v) = &self.line_spacing {
if !v.is_empty() {
s.push_str(&format!(":line_spacing={v}"));
}
}
if let Some(v) = &self.fontcolor {
if !v.is_empty() {
s.push_str(&format!(":fontcolor={v}"));
}
}
if let Some(v) = &self.alpha {
if !v.is_empty() {
s.push_str(&format!(":alpha='{v}'"));
}
}
if let Some(v) = &self.r#box {
if !v.is_empty() {
s.push_str(&format!(":box={v}"));
}
}
if let Some(v) = &self.boxcolor {
if !v.is_empty() {
s.push_str(&format!(":boxcolor={v}"));
}
}
if let Some(v) = &self.boxborderw {
if !v.is_empty() {
s.push_str(&format!(":boxborderw={v}"));
}
}
write!(f, "{s}")
}
}
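The `Display` impl above concatenates only the populated, non-empty fields into a single drawtext argument string. A reduced, hypothetical two-field version showing the same append pattern:

```rust
// Two-field sketch of the drawtext string assembly: each optional field
// appends a ":key=value" segment only when it is present and non-empty.
fn drawtext(text: &str, fontsize: Option<&str>, fontcolor: Option<&str>) -> String {
    let mut s = format!("text='{text}'");
    if let Some(v) = fontsize.filter(|v| !v.is_empty()) {
        s.push_str(&format!(":fontsize={v}"));
    }
    if let Some(v) = fontcolor.filter(|v| !v.is_empty()) {
        s.push_str(&format!(":fontcolor={v}"));
    }
    s
}

fn main() {
    assert_eq!(
        drawtext("News", Some("24"), Some("white")),
        "text='News':fontsize=24:fontcolor=white"
    );
    // Empty values are skipped, not emitted as ":fontsize=".
    assert_eq!(drawtext("News", Some(""), None), "text='News'");
}
```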
pub fn db_path() -> Result<&'static str, Box<dyn std::error::Error>> {
if let Some(path) = ARGS.db.clone() {
let absolute_path = if path.is_absolute() {
path
} else {
env::current_dir()?.join(path)
}
.clean();
if let Some(abs_path) = absolute_path.parent() {
if abs_path.writable() {
return Ok(Box::leak(
absolute_path.to_string_lossy().to_string().into_boxed_str(),
));
}
error!("Given database path is not writable!");
}
}
let sys_path = Path::new("/usr/share/ffplayout/db");
let mut db_path = "./ffplayout.db";
if sys_path.is_dir() && !sys_path.writable() {
error!("Path {} is not writable!", sys_path.display());
}
if sys_path.is_dir() && sys_path.writable() {
db_path = "/usr/share/ffplayout/db/ffplayout.db";
} else if Path::new("./assets").is_dir() {
db_path = "./assets/ffplayout.db";
}
Ok(db_path)
}
pub fn public_path() -> PathBuf {
let path = PathBuf::from("./ffplayout-frontend/.output/public/");
if cfg!(debug_assertions) && path.is_dir() {
return path;
}
let path = PathBuf::from("/usr/share/ffplayout/public/");
if path.is_dir() {
return path;
}
PathBuf::from("./public/")
}
pub async fn read_log_file(channel_id: &i32, date: &str) -> Result<String, ServiceError> {
let date_str = if date.is_empty() {
"".to_string()
} else {
format!("_{date}")
};
let log_path = log_file_path().join(format!("ffplayout_{channel_id}{date_str}.log"));
let file_size = fs::metadata(&log_path).await?.len() as f64;
let log_content = if file_size > 5000000.0 {
error!("Log file to big: {}", sizeof_fmt(file_size));
format!("The log file is larger ({}) than the hard limit of 5MB, the probability is very high that something is wrong with the playout. Check this on the server with `less {log_path:?}`.", sizeof_fmt(file_size))
} else {
fs::read_to_string(log_path).await?
};
Ok(log_content)
}
/// Get a human-readable file size
pub fn sizeof_fmt(mut num: f64) -> String {
let suffix = 'B';
for unit in ["", "Ki", "Mi", "Gi", "Ti", "Pi", "Ei", "Zi"] {
if num.abs() < 1024.0 {
return format!("{num:.1}{unit}{suffix}");
}
num /= 1024.0;
}
format!("{num:.1}Yi{suffix}")
}
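Since `sizeof_fmt` is self-contained, it can be checked directly; this is a copy of the function above with a few spot checks (binary units, 1024 per step):

```rust
// Copy of `sizeof_fmt` above: divide by 1024 until the value fits a unit.
fn sizeof_fmt(mut num: f64) -> String {
    let suffix = 'B';
    for unit in ["", "Ki", "Mi", "Gi", "Ti", "Pi", "Ei", "Zi"] {
        if num.abs() < 1024.0 {
            return format!("{num:.1}{unit}{suffix}");
        }
        num /= 1024.0;
    }
    format!("{num:.1}Yi{suffix}")
}

fn main() {
    assert_eq!(sizeof_fmt(512.0), "512.0B");
    assert_eq!(sizeof_fmt(2048.0), "2.0KiB");
    // The 5 MB log limit above renders as binary megabytes:
    assert_eq!(sizeof_fmt(5_000_000.0), "4.8MiB");
}
```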
pub fn local_utc_offset() -> i32 {
let mut offset = Local::now().format("%:z").to_string();
let operator = offset.remove(0);
let mut utc_offset = 0;
if let Some((r, f)) = offset.split_once(':') {
utc_offset = r.parse::<i32>().unwrap_or(0) * 60 + f.parse::<i32>().unwrap_or(0);
if operator == '-' && utc_offset > 0 {
utc_offset = -utc_offset;
}
}
utc_offset
}
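`local_utc_offset` formats the live local time with `%:z` and parses the result back into minutes. The parsing step can be exercised in isolation on fixed strings (`parse_offset` is an extracted sketch, not a function in the source):

```rust
// Extracted sketch of the "%:z" parsing above, fed fixed strings instead of
// `Local::now()`: "+HH:MM" -> minutes east of UTC, "-HH:MM" -> negative.
fn parse_offset(mut offset: String) -> i32 {
    let operator = offset.remove(0);
    let mut utc_offset = 0;
    if let Some((h, m)) = offset.split_once(':') {
        utc_offset = h.parse::<i32>().unwrap_or(0) * 60 + m.parse::<i32>().unwrap_or(0);
        if operator == '-' && utc_offset > 0 {
            utc_offset = -utc_offset;
        }
    }
    utc_offset
}

fn main() {
    assert_eq!(parse_offset("+02:00".to_string()), 120);
    assert_eq!(parse_offset("-05:30".to_string()), -330);
    assert_eq!(parse_offset("+00:00".to_string()), 0);
}
```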
pub fn naive_date_time_from_str<'de, D>(deserializer: D) -> Result<NaiveDateTime, D::Error>
where
D: Deserializer<'de>,
{
let s: String = Deserialize::deserialize(deserializer)?;
match NaiveDateTime::parse_from_str(&s, "%Y-%m-%dT%H:%M:%S") {
Ok(date_time) => Ok(date_time),
Err(e) => {
if e.kind() == ParseErrorKind::TooShort {
NaiveDateTime::parse_from_str(&format!("{s}T00:00:00"), "%Y-%m-%dT%H:%M:%S")
.map_err(de::Error::custom)
} else {
NaiveDateTime::parse_from_str(&s, "%Y-%m-%dT%H:%M:%S%#z").map_err(de::Error::custom)
}
}
}
}
/// Get a free TCP socket
pub fn free_tcp_socket(exclude_socket: String) -> Option<String> {
for _ in 0..100 {
let port = rand::thread_rng().gen_range(45321..54268);
let socket = format!("127.0.0.1:{port}");
if socket != exclude_socket && TcpListener::bind(("127.0.0.1", port)).is_ok() {
return Some(socket);
}
}
None
}
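A crate-free sketch of the probe above: it scans a small fixed range instead of 100 random picks (an assumption for illustration only; the bind-and-check behavior is otherwise the same):

```rust
use std::net::TcpListener;

// Sketch of the free-port search above, minus the `rand` crate: try to bind
// each candidate port and return the first one that is free and not excluded.
fn free_tcp_socket(exclude_socket: &str) -> Option<String> {
    for port in 45321..45421u16 {
        let socket = format!("127.0.0.1:{port}");
        if socket != exclude_socket && TcpListener::bind(("127.0.0.1", port)).is_ok() {
            return Some(socket);
        }
    }
    None
}

fn main() {
    // Any returned socket must be a loopback address in the scanned range.
    let found = free_tcp_socket("127.0.0.1:0");
    assert!(found.map_or(true, |s| s.starts_with("127.0.0.1:")));
}
```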
pub fn round_to_nearest_ten(num: i64) -> i64 {
if num % 10 >= 5 {
((num / 10) + 1) * 10
} else {
(num / 10) * 10
}
}


@ -1,22 +1,21 @@
use std::{fs, path::PathBuf};
use simplelog::*;
use sqlx::{Pool, Sqlite};
use log::*;
use crate::utils::{errors::ServiceError, files::norm_abs_path, playout_config};
use ffplayout_lib::utils::{
generate_playlist as playlist_generator, json_reader, json_writer, JsonPlaylist, PlayoutConfig,
use crate::player::controller::ChannelManager;
use crate::player::utils::{json_reader, json_writer, JsonPlaylist};
use crate::utils::{
config::PlayoutConfig, errors::ServiceError, files::norm_abs_path,
generator::playlist_generator,
};
pub async fn read_playlist(
conn: &Pool<Sqlite>,
id: i32,
config: &PlayoutConfig,
date: String,
) -> Result<JsonPlaylist, ServiceError> {
let (config, _) = playout_config(conn, &id).await?;
let (path, _, _) = norm_abs_path(&config.playlist.path, "")?;
let mut playlist_path = path;
let d: Vec<&str> = date.split('-').collect();
let mut playlist_path = config.global.playlist_path.clone();
playlist_path = playlist_path
.join(d[0])
.join(d[1])
@ -30,14 +29,12 @@ pub async fn read_playlist(
}
pub async fn write_playlist(
conn: &Pool<Sqlite>,
id: i32,
config: &PlayoutConfig,
json_data: JsonPlaylist,
) -> Result<String, ServiceError> {
let (config, _) = playout_config(conn, &id).await?;
let date = json_data.date.clone();
let mut playlist_path = config.playlist.path;
let d: Vec<&str> = date.split('-').collect();
let mut playlist_path = config.global.playlist_path.clone();
if !playlist_path
.extension()
@ -87,17 +84,16 @@ pub async fn write_playlist(
Err(ServiceError::InternalServerError)
}
pub async fn generate_playlist(
mut config: PlayoutConfig,
channel: String,
) -> Result<JsonPlaylist, ServiceError> {
pub fn generate_playlist(manager: ChannelManager) -> Result<JsonPlaylist, ServiceError> {
let mut config = manager.config.lock().unwrap();
if let Some(mut template) = config.general.template.take() {
for source in template.sources.iter_mut() {
let mut paths = vec![];
for path in &source.paths {
let (safe_path, _, _) =
norm_abs_path(&config.storage.path, &path.to_string_lossy())?;
norm_abs_path(&config.global.storage_path, &path.to_string_lossy())?;
paths.push(safe_path);
}
@ -107,7 +103,9 @@ pub async fn generate_playlist(
config.general.template = Some(template);
}
match playlist_generator(&config, Some(channel)) {
drop(config);
match playlist_generator(&manager) {
Ok(playlists) => {
if !playlists.is_empty() {
Ok(playlists[0].clone())
@ -124,14 +122,10 @@ pub async fn generate_playlist(
}
}
pub async fn delete_playlist(
conn: &Pool<Sqlite>,
id: i32,
date: &str,
) -> Result<String, ServiceError> {
let (config, _) = playout_config(conn, &id).await?;
let mut playlist_path = PathBuf::from(&config.playlist.path);
pub async fn delete_playlist(config: &PlayoutConfig, date: &str) -> Result<String, ServiceError> {
let d: Vec<&str> = date.split('-').collect();
let mut playlist_path = PathBuf::from(&config.global.playlist_path);
playlist_path = playlist_path
.join(d[0])
.join(d[1])


@ -4,8 +4,8 @@ use local_ip_address::list_afinet_netifas;
use serde::Serialize;
use sysinfo::System;
use crate::utils::config::PlayoutConfig;
use crate::{DISKS, NETWORKS, SYS};
use ffplayout_lib::utils::PlayoutConfig;
const IGNORE_INTERFACES: [&str; 7] = ["docker", "lxdbr", "tab", "tun", "virbr", "veth", "vnet"];
@ -118,7 +118,7 @@ pub fn stat(config: PlayoutConfig) -> SystemStat {
for disk in &*disks {
if disk.mount_point().to_string_lossy().len() > 1
&& config.storage.path.starts_with(disk.mount_point())
&& config.global.storage_path.starts_with(disk.mount_point())
{
storage.path = disk.name().to_string_lossy().to_string();
storage.total = disk.total_space();


@ -0,0 +1,27 @@
use std::process::Command;
use log::*;
use crate::player::utils::get_data_map;
use crate::player::controller::ChannelManager;
pub fn run(manager: ChannelManager) {
let task_path = manager.config.lock().unwrap().task.path.clone();
let obj = serde_json::to_string(&get_data_map(&manager)).unwrap();
trace!("Run task: {obj}");
match Command::new(task_path).arg(obj).spawn() {
Ok(mut c) => {
let status = c.wait().expect("Error in waiting for the task process!");
if !status.success() {
error!("Process stops with error.");
}
}
Err(e) => {
error!("Couldn't spawn task runner: {e}")
}
}
}

frontend Submodule

@ -0,0 +1 @@
Subproject commit e8fd6f65bf255c55c1ca509f9fe59c2c7875c394


@ -1,40 +0,0 @@
[package]
name = "ffplayout-lib"
description = "Library for ffplayout"
readme = "README.md"
version.workspace = true
license.workspace = true
authors.workspace = true
repository.workspace = true
edition.workspace = true
[dependencies]
chrono = { version = "0.4", default-features = false, features = ["clock", "serde", "std"] }
crossbeam-channel = "0.5"
derive_more = "0.99"
ffprobe = "0.4"
file-rotate = "0.7"
home = "0.5"
lazy_static = "1.4"
lettre = { version = "0.11", features = ["builder", "rustls-tls", "smtp-transport"], default-features = false }
lexical-sort = "0.3"
log = "0.4"
num-traits = "0.2"
rand = "0.8"
regex = "1"
reqwest = { version = "0.12", default-features = false, features = ["blocking", "json", "rustls-tls"] }
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
serde_with = "3.8"
shlex = "1.1"
simplelog = { version = "0.12", features = ["paris"] }
time = { version = "0.3", features = ["formatting", "macros"] }
toml_edit = {version ="0.22", features = ["serde"]}
walkdir = "2"
[target."cfg(windows)".dependencies.winapi]
version = "0.3"
features = ["shlobj", "std", "winerror"]
[target.'cfg(not(target_arch = "windows"))'.dependencies]
signal-child = "1"


@ -1,4 +0,0 @@
**ffplayout-lib**
================
This folder only contains helper functions which are used in multiple apps.


@ -1,8 +0,0 @@
extern crate log;
extern crate simplelog;
pub mod filter;
pub mod macros;
pub mod utils;
use utils::advanced_config::AdvancedConfig;


@ -1,130 +0,0 @@
use std::{fs::File, io::Read, path::PathBuf};
use serde::{Deserialize, Serialize};
use serde_with::{serde_as, NoneAsEmptyString};
use shlex::split;
#[derive(Debug, Default, Serialize, Deserialize, Clone)]
pub struct AdvancedConfig {
pub decoder: DecoderConfig,
pub encoder: EncoderConfig,
pub filters: Filters,
pub ingest: IngestConfig,
}
#[serde_as]
#[derive(Debug, Default, Serialize, Deserialize, Clone)]
pub struct DecoderConfig {
#[serde_as(as = "NoneAsEmptyString")]
pub input_param: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub output_param: Option<String>,
#[serde(skip_serializing, skip_deserializing)]
pub input_cmd: Option<Vec<String>>,
#[serde(skip_serializing, skip_deserializing)]
pub output_cmd: Option<Vec<String>>,
}
#[serde_as]
#[derive(Debug, Default, Serialize, Deserialize, Clone)]
pub struct EncoderConfig {
#[serde_as(as = "NoneAsEmptyString")]
pub input_param: Option<String>,
#[serde(skip_serializing, skip_deserializing)]
pub input_cmd: Option<Vec<String>>,
}
#[serde_as]
#[derive(Debug, Default, Serialize, Deserialize, Clone)]
pub struct IngestConfig {
#[serde_as(as = "NoneAsEmptyString")]
pub input_param: Option<String>,
#[serde(skip_serializing, skip_deserializing)]
pub input_cmd: Option<Vec<String>>,
}
#[serde_as]
#[derive(Debug, Default, Serialize, Deserialize, Clone)]
pub struct Filters {
#[serde_as(as = "NoneAsEmptyString")]
pub deinterlace: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub pad_scale_w: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub pad_scale_h: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub pad_video: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub fps: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub scale: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub set_dar: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub fade_in: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub fade_out: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub overlay_logo_scale: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub overlay_logo_fade_in: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub overlay_logo_fade_out: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub overlay_logo: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub tpad: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub drawtext_from_file: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub drawtext_from_zmq: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub aevalsrc: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub afade_in: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub afade_out: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub apad: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub volume: Option<String>,
#[serde_as(as = "NoneAsEmptyString")]
pub split: Option<String>,
}
impl AdvancedConfig {
pub fn new(cfg_path: PathBuf) -> Self {
let mut config: AdvancedConfig = Default::default();
if let Ok(mut file) = File::open(cfg_path) {
let mut contents = String::new();
if let Err(e) = file.read_to_string(&mut contents) {
eprintln!("Read advanced config file: {e}")
};
match toml_edit::de::from_str(&contents) {
Ok(tm) => config = tm,
Err(e) => eprintln!("Serialize advanced config file: {e}"),
};
if let Some(input_parm) = &config.decoder.input_param {
config.decoder.input_cmd = split(input_parm);
}
if let Some(output_param) = &config.decoder.output_param {
config.decoder.output_cmd = split(output_param);
}
if let Some(input_param) = &config.encoder.input_param {
config.encoder.input_cmd = split(input_param);
}
if let Some(input_param) = &config.ingest.input_param {
config.ingest.input_cmd = split(input_param);
}
};
config
}
}


@ -1,585 +0,0 @@
use std::{
env, fmt,
fs::File,
io::Read,
path::{Path, PathBuf},
process,
str::FromStr,
};
use chrono::NaiveTime;
use log::LevelFilter;
use serde::{de, Deserialize, Deserializer, Serialize, Serializer};
use shlex::split;
use crate::AdvancedConfig;
use super::vec_strings;
use crate::utils::{free_tcp_socket, time_to_sec, OutputMode::*};
pub const DUMMY_LEN: f64 = 60.0;
pub const IMAGE_FORMAT: [&str; 21] = [
"bmp", "dds", "dpx", "exr", "gif", "hdr", "j2k", "jpg", "jpeg", "pcx", "pfm", "pgm", "phm",
"png", "psd", "ppm", "sgi", "svg", "tga", "tif", "webp",
];
// Some well-known errors can be safely ignored
pub const FFMPEG_IGNORE_ERRORS: [&str; 11] = [
"ac-tex damaged",
"codec s302m, is muxed as a private data stream",
"corrupt decoded frame in stream",
"corrupt input packet in stream",
"end mismatch left",
"Packet corrupt",
"Referenced QT chapter track not found",
"skipped MB in I-frame at",
"Thread message queue blocking",
"Warning MVs not available",
"frame size not set",
];
pub const FFMPEG_UNRECOVERABLE_ERRORS: [&str; 5] = [
"Address already in use",
"Invalid argument",
"Numerical result",
"Error initializing complex filters",
"Error while decoding stream #0:0: Invalid data found when processing input",
];
#[derive(Debug, Serialize, Deserialize, Clone, Eq, PartialEq)]
#[serde(rename_all = "lowercase")]
pub enum OutputMode {
Desktop,
HLS,
Null,
Stream,
}
impl FromStr for OutputMode {
type Err = String;
fn from_str(input: &str) -> Result<Self, Self::Err> {
match input {
"desktop" => Ok(Self::Desktop),
"hls" => Ok(Self::HLS),
"null" => Ok(Self::Null),
"stream" => Ok(Self::Stream),
_ => Err("Use 'desktop', 'hls', 'null' or 'stream'".to_string()),
}
}
}
#[derive(Debug, Serialize, Deserialize, Clone, Eq, PartialEq)]
#[serde(rename_all = "lowercase")]
pub enum ProcessMode {
Folder,
Playlist,
}
impl fmt::Display for ProcessMode {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
match *self {
ProcessMode::Folder => write!(f, "folder"),
ProcessMode::Playlist => write!(f, "playlist"),
}
}
}
impl FromStr for ProcessMode {
type Err = String;
fn from_str(input: &str) -> Result<Self, Self::Err> {
match input {
"folder" => Ok(Self::Folder),
"playlist" => Ok(Self::Playlist),
_ => Err("Use 'folder' or 'playlist'".to_string()),
}
}
}
pub fn string_to_log_level<'de, D>(deserializer: D) -> Result<LevelFilter, D::Error>
where
D: Deserializer<'de>,
{
let s: String = Deserialize::deserialize(deserializer)?;
match s.to_lowercase().as_str() {
"debug" => Ok(LevelFilter::Debug),
"error" => Ok(LevelFilter::Error),
"info" => Ok(LevelFilter::Info),
"trace" => Ok(LevelFilter::Trace),
"warning" => Ok(LevelFilter::Warn),
"off" => Ok(LevelFilter::Off),
_ => Err(de::Error::custom("Error level not exists!")),
}
}
fn log_level_to_string<S>(l: &LevelFilter, s: S) -> Result<S::Ok, S::Error>
where
S: Serializer,
{
match l {
LevelFilter::Debug => s.serialize_str("DEBUG"),
LevelFilter::Error => s.serialize_str("ERROR"),
LevelFilter::Info => s.serialize_str("INFO"),
LevelFilter::Trace => s.serialize_str("TRACE"),
LevelFilter::Warn => s.serialize_str("WARNING"),
LevelFilter::Off => s.serialize_str("OFF"),
}
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Template {
pub sources: Vec<Source>,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Source {
pub start: NaiveTime,
pub duration: NaiveTime,
pub shuffle: bool,
pub paths: Vec<PathBuf>,
}
/// Global Config
///
/// This is initialized once when ffplayout starts and is used globally throughout the whole program.
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct PlayoutConfig {
#[serde(default, skip_serializing, skip_deserializing)]
pub advanced: Option<AdvancedConfig>,
pub general: General,
pub rpc_server: RpcServer,
pub mail: Mail,
pub logging: Logging,
pub processing: Processing,
pub ingest: Ingest,
pub playlist: Playlist,
pub storage: Storage,
pub text: Text,
#[serde(default)]
pub task: Task,
pub out: Out,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct General {
pub help_text: String,
pub stop_threshold: f64,
#[serde(default, skip_serializing, skip_deserializing)]
pub config_path: String,
#[serde(default)]
pub stat_file: String,
#[serde(skip_serializing, skip_deserializing)]
pub generate: Option<Vec<String>>,
#[serde(skip_serializing, skip_deserializing)]
pub ffmpeg_filters: Vec<String>,
#[serde(skip_serializing, skip_deserializing)]
pub ffmpeg_libs: Vec<String>,
#[serde(skip_serializing, skip_deserializing)]
pub template: Option<Template>,
#[serde(default, skip_serializing, skip_deserializing)]
pub skip_validation: bool,
#[serde(default, skip_serializing, skip_deserializing)]
pub validate: bool,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct RpcServer {
pub help_text: String,
pub enable: bool,
pub address: String,
pub authorization: String,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Mail {
pub help_text: String,
pub subject: String,
pub smtp_server: String,
pub starttls: bool,
pub sender_addr: String,
pub sender_pass: String,
pub recipient: String,
pub mail_level: String,
pub interval: u64,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Logging {
pub help_text: String,
pub log_to_file: bool,
pub backup_count: usize,
pub local_time: bool,
pub timestamp: bool,
#[serde(alias = "log_path")]
pub path: PathBuf,
#[serde(
alias = "log_level",
serialize_with = "log_level_to_string",
deserialize_with = "string_to_log_level"
)]
pub level: LevelFilter,
pub ffmpeg_level: String,
pub ingest_level: Option<String>,
#[serde(default)]
pub detect_silence: bool,
#[serde(default)]
pub ignore_lines: Vec<String>,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Processing {
pub help_text: String,
pub mode: ProcessMode,
#[serde(default)]
pub audio_only: bool,
#[serde(default = "default_track_index")]
pub audio_track_index: i32,
#[serde(default)]
pub copy_audio: bool,
#[serde(default)]
pub copy_video: bool,
pub width: i64,
pub height: i64,
pub aspect: f64,
pub fps: f64,
pub add_logo: bool,
pub logo: String,
pub logo_scale: String,
pub logo_opacity: f32,
pub logo_position: String,
#[serde(default = "default_tracks")]
pub audio_tracks: i32,
#[serde(default = "default_channels")]
pub audio_channels: u8,
pub volume: f64,
#[serde(default)]
pub custom_filter: String,
#[serde(skip_serializing, skip_deserializing)]
pub cmd: Option<Vec<String>>,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Ingest {
pub help_text: String,
pub enable: bool,
input_param: String,
#[serde(default)]
pub custom_filter: String,
#[serde(skip_serializing, skip_deserializing)]
pub input_cmd: Option<Vec<String>>,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Playlist {
pub help_text: String,
pub path: PathBuf,
pub day_start: String,
#[serde(skip_serializing, skip_deserializing)]
pub start_sec: Option<f64>,
pub length: String,
#[serde(skip_serializing, skip_deserializing)]
pub length_sec: Option<f64>,
pub infinit: bool,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Storage {
pub help_text: String,
pub path: PathBuf,
#[serde(skip_serializing, skip_deserializing)]
pub paths: Vec<PathBuf>,
#[serde(alias = "filler_clip")]
pub filler: PathBuf,
pub extensions: Vec<String>,
pub shuffle: bool,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Text {
pub help_text: String,
pub add_text: bool,
#[serde(skip_serializing, skip_deserializing)]
pub node_pos: Option<usize>,
#[serde(skip_serializing, skip_deserializing)]
pub zmq_stream_socket: Option<String>,
#[serde(skip_serializing, skip_deserializing)]
pub zmq_server_socket: Option<String>,
pub fontfile: String,
pub text_from_filename: bool,
pub style: String,
pub regex: String,
}
#[derive(Debug, Default, Serialize, Deserialize, Clone)]
pub struct Task {
pub help_text: String,
pub enable: bool,
pub path: PathBuf,
}
#[derive(Debug, Serialize, Deserialize, Clone)]
pub struct Out {
pub help_text: String,
pub mode: OutputMode,
pub output_param: String,
#[serde(skip_serializing, skip_deserializing)]
pub output_count: usize,
#[serde(skip_serializing, skip_deserializing)]
pub output_filter: Option<String>,
#[serde(skip_serializing, skip_deserializing)]
pub output_cmd: Option<Vec<String>>,
}
fn default_track_index() -> i32 {
-1
}
fn default_tracks() -> i32 {
1
}
fn default_channels() -> u8 {
2
}
impl PlayoutConfig {
/// Read config from TOML file, and set some extra config values.
pub fn new(cfg_path: Option<PathBuf>, advanced_path: Option<PathBuf>) -> Self {
let mut config_path = PathBuf::from("/etc/ffplayout/ffplayout.toml");
if let Some(cfg) = cfg_path {
config_path = cfg;
}
if !config_path.is_file() {
if Path::new("./assets/ffplayout.toml").is_file() {
config_path = PathBuf::from("./assets/ffplayout.toml")
} else if let Some(p) = env::current_exe().ok().as_ref().and_then(|op| op.parent()) {
config_path = p.join("ffplayout.toml")
};
}
let mut file = match File::open(&config_path) {
Ok(file) => file,
Err(_) => {
eprintln!(
"ffplayout.toml not found!\nPut \"ffplayout.toml\" in \"/etc/ffplayout/\" or beside the executable!"
);
process::exit(1);
}
};
let mut contents = String::new();
if let Err(e) = file.read_to_string(&mut contents) {
eprintln!("Read config file: {e}")
};
let mut config: PlayoutConfig = toml_edit::de::from_str(&contents).unwrap();
if let Some(adv_path) = advanced_path {
config.advanced = Some(AdvancedConfig::new(adv_path))
}
config.general.generate = None;
config.general.config_path = config_path.to_string_lossy().to_string();
config.general.stat_file = home::home_dir()
.unwrap_or_else(env::temp_dir)
.join(if config.general.stat_file.is_empty() {
".ffp_status"
} else {
&config.general.stat_file
})
.display()
.to_string();
if config.logging.ingest_level.is_none() {
config.logging.ingest_level = Some(config.logging.ffmpeg_level.clone())
}
config.playlist.start_sec = Some(time_to_sec(&config.playlist.day_start));
if config.playlist.length.contains(':') {
config.playlist.length_sec = Some(time_to_sec(&config.playlist.length));
} else {
config.playlist.length_sec = Some(86400.0);
}
if config.processing.add_logo && !Path::new(&config.processing.logo).is_file() {
config.processing.add_logo = false;
}
config.processing.logo_scale = config
.processing
.logo_scale
.trim_end_matches('~')
.to_string();
if config.processing.audio_tracks < 1 {
config.processing.audio_tracks = 1
}
let mut process_cmd = vec_strings![];
let advanced_output_cmd = config
.advanced
.as_ref()
.and_then(|a| a.decoder.output_cmd.clone());
if config.processing.audio_only {
process_cmd.append(&mut vec_strings!["-vn"]);
} else if config.processing.copy_video {
process_cmd.append(&mut vec_strings!["-c:v", "copy"]);
} else if let Some(decoder_cmd) = &advanced_output_cmd {
process_cmd.append(&mut decoder_cmd.clone());
} else {
let bitrate = format!(
"{}k",
config.processing.width * config.processing.height / 16
);
let buff_size = format!(
"{}k",
(config.processing.width * config.processing.height / 16) / 2
);
process_cmd.append(&mut vec_strings![
"-pix_fmt",
"yuv420p",
"-r",
&config.processing.fps,
"-c:v",
"mpeg2video",
"-g",
"1",
"-b:v",
&bitrate,
"-minrate",
&bitrate,
"-maxrate",
&bitrate,
"-bufsize",
&buff_size
]);
}
if config.processing.copy_audio {
process_cmd.append(&mut vec_strings!["-c:a", "copy"]);
} else if advanced_output_cmd.is_none() {
process_cmd.append(&mut pre_audio_codec(
&config.processing.custom_filter,
&config.ingest.custom_filter,
config.processing.audio_channels,
));
}
process_cmd.append(&mut vec_strings!["-f", "mpegts", "-"]);
config.processing.cmd = Some(process_cmd);
config.ingest.input_cmd = split(config.ingest.input_param.as_str());
config.out.output_count = 1;
config.out.output_filter = None;
if config.out.mode == Null {
config.out.output_cmd = Some(vec_strings!["-f", "null", "-"]);
} else if let Some(mut cmd) = split(config.out.output_param.as_str()) {
// get output count according to the var_stream_map value, or by counting output parameters
if let Some(i) = cmd.clone().iter().position(|m| m == "-var_stream_map") {
config.out.output_count = cmd[i + 1].split_whitespace().count();
} else {
config.out.output_count = cmd
.iter()
.enumerate()
.filter(|(i, p)| i > &0 && !p.starts_with('-') && !cmd[i - 1].starts_with('-'))
.count();
}
if let Some(i) = cmd.clone().iter().position(|r| r == "-filter_complex") {
config.out.output_filter = Some(cmd[i + 1].clone());
cmd.remove(i);
cmd.remove(i);
}
config.out.output_cmd = Some(cmd);
}
// when text overlay is enabled without text_from_filename, also turn the RPC server on,
// so we can receive text messages from it
if config.text.add_text && !config.text.text_from_filename {
config.rpc_server.enable = true;
config.text.zmq_stream_socket = free_tcp_socket(String::new());
config.text.zmq_server_socket =
free_tcp_socket(config.text.zmq_stream_socket.clone().unwrap_or_default());
config.text.node_pos = Some(2);
} else {
config.text.zmq_stream_socket = None;
config.text.zmq_server_socket = None;
config.text.node_pos = None;
}
config
}
}
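The output-count branch in `PlayoutConfig::new` above uses a heuristic: a parameter is an output file when it does not start with `-` and the token before it is not a flag either (flags are assumed to consume the following token as their value). A standalone sketch of that heuristic:

```rust
// Hypothetical helper, not part of ffplayout: counts probable output files
// in an ffmpeg parameter string using the same filter as the code above.
fn count_outputs(params: &str) -> usize {
    let cmd: Vec<&str> = params.split_whitespace().collect();
    cmd.iter()
        .enumerate()
        .filter(|(i, p)| *i > 0 && !p.starts_with('-') && !cmd[i - 1].starts_with('-'))
        .count()
}

fn main() {
    // two encoder outputs -> counted as 2
    assert_eq!(
        count_outputs("-c:v libx264 out1.mp4 -c:v libx264 out2.mp4"),
        2
    );
    assert_eq!(count_outputs("-c:v libx264 out.mp4"), 1);
}
```

Note the stated assumption: a value-less flag directly before an output file (e.g. `-an out.mp4`) would make that output uncounted, which is why `-var_stream_map` is checked first in the real code.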
impl Default for PlayoutConfig {
fn default() -> Self {
Self::new(None, None)
}
}
/// When custom_filter contains the loudnorm filter, use a different audio encoder:
/// s302m has higher quality, but is experimental
/// and does not work well together with the loudnorm filter.
fn pre_audio_codec(proc_filter: &str, ingest_filter: &str, channel_count: u8) -> Vec<String> {
let mut codec = vec_strings![
"-c:a",
"s302m",
"-strict",
"-2",
"-sample_fmt",
"s16",
"-ar",
"48000",
"-ac",
channel_count
];
if proc_filter.contains("loudnorm") || ingest_filter.contains("loudnorm") {
codec = vec_strings![
"-c:a",
"mp2",
"-b:a",
"384k",
"-ar",
"48000",
"-ac",
channel_count
];
}
codec
}
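The branch logic of `pre_audio_codec` above reduces to a single decision, sketched here as a standalone, hypothetical helper (the real function emits the full ffmpeg argument list):

```rust
// Hypothetical reduction of pre_audio_codec: s302m is the default
// intermediate audio codec, but any loudnorm usage switches to mp2.
fn pick_audio_codec(proc_filter: &str, ingest_filter: &str) -> &'static str {
    if proc_filter.contains("loudnorm") || ingest_filter.contains("loudnorm") {
        "mp2"
    } else {
        "s302m"
    }
}

fn main() {
    assert_eq!(pick_audio_codec("loudnorm=I=-18", ""), "mp2");
    assert_eq!(pick_audio_codec("", ""), "s302m");
}
```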


@ -1,226 +0,0 @@
use std::{
fmt,
process::Child,
sync::{
atomic::{AtomicBool, AtomicUsize, Ordering},
Arc, Mutex,
},
};
#[cfg(not(windows))]
use signal_child::Signalable;
use serde::{Deserialize, Serialize};
use simplelog::*;
use crate::utils::Media;
/// Defined process units.
#[derive(Clone, Debug, Default, Copy, Eq, Serialize, Deserialize, PartialEq)]
pub enum ProcessUnit {
#[default]
Decoder,
Encoder,
Ingest,
}
impl fmt::Display for ProcessUnit {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
match *self {
ProcessUnit::Decoder => write!(f, "Decoder"),
ProcessUnit::Encoder => write!(f, "Encoder"),
ProcessUnit::Ingest => write!(f, "Ingest"),
}
}
}
use ProcessUnit::*;
/// Process Controller
///
/// Here we store some global state about what is running and which processes are alive.
/// This is needed for process termination, skipping the clip decoder, etc.
#[derive(Clone)]
pub struct ProcessControl {
pub decoder_term: Arc<Mutex<Option<Child>>>,
pub encoder_term: Arc<Mutex<Option<Child>>>,
pub server_term: Arc<Mutex<Option<Child>>>,
pub server_is_running: Arc<AtomicBool>,
pub is_terminated: Arc<AtomicBool>,
pub is_alive: Arc<AtomicBool>,
}
impl ProcessControl {
pub fn new() -> Self {
Self {
decoder_term: Arc::new(Mutex::new(None)),
encoder_term: Arc::new(Mutex::new(None)),
server_term: Arc::new(Mutex::new(None)),
server_is_running: Arc::new(AtomicBool::new(false)),
is_terminated: Arc::new(AtomicBool::new(false)),
is_alive: Arc::new(AtomicBool::new(true)),
}
}
}
impl Default for ProcessControl {
fn default() -> Self {
Self::new()
}
}
impl ProcessControl {
pub fn stop(&self, unit: ProcessUnit) -> Result<(), String> {
match unit {
Decoder => {
if let Some(proc) = self.decoder_term.lock().unwrap().as_mut() {
#[cfg(not(windows))]
if let Err(e) = proc.term() {
return Err(format!("Decoder {e:?}"));
}
#[cfg(windows)]
if let Err(e) = proc.kill() {
return Err(format!("Decoder {e:?}"));
}
}
}
Encoder => {
if let Some(proc) = self.encoder_term.lock().unwrap().as_mut() {
if let Err(e) = proc.kill() {
return Err(format!("Encoder {e:?}"));
};
}
}
Ingest => {
if let Some(proc) = self.server_term.lock().unwrap().as_mut() {
if let Err(e) = proc.kill() {
return Err(format!("Ingest server {e:?}"));
};
}
}
}
self.wait(unit)?;
Ok(())
}
/// Wait for the process to close properly.
/// This prevents orphaned/zombie processes in the system.
pub fn wait(&self, unit: ProcessUnit) -> Result<(), String> {
match unit {
Decoder => {
if let Some(proc) = self.decoder_term.lock().unwrap().as_mut() {
if let Err(e) = proc.wait() {
return Err(format!("Decoder {e:?}"));
};
}
}
Encoder => {
if let Some(proc) = self.encoder_term.lock().unwrap().as_mut() {
if let Err(e) = proc.wait() {
return Err(format!("Encoder {e:?}"));
};
}
}
Ingest => {
if let Some(proc) = self.server_term.lock().unwrap().as_mut() {
if let Err(e) = proc.wait() {
return Err(format!("Ingest server {e:?}"));
};
}
}
}
Ok(())
}
/// No matter what is running, terminate them all.
pub fn stop_all(&self) {
debug!("Stop all child processes");
self.is_terminated.store(true, Ordering::SeqCst);
self.server_is_running.store(false, Ordering::SeqCst);
if self.is_alive.load(Ordering::SeqCst) {
self.is_alive.store(false, Ordering::SeqCst);
trace!("Playout is alive and processes are terminated");
for unit in [Decoder, Encoder, Ingest] {
if let Err(e) = self.stop(unit) {
if !e.contains("exited process") {
error!("{e}")
}
}
if let Err(e) = self.wait(unit) {
if !e.contains("exited process") {
error!("{e}")
}
}
}
}
}
}
// impl Drop for ProcessControl {
// fn drop(&mut self) {
// self.stop_all()
// }
// }
/// Global player control, to get info about the current clip, etc.
#[derive(Clone, Debug)]
pub struct PlayerControl {
pub current_media: Arc<Mutex<Option<Media>>>,
pub current_list: Arc<Mutex<Vec<Media>>>,
pub filler_list: Arc<Mutex<Vec<Media>>>,
pub current_index: Arc<AtomicUsize>,
pub filler_index: Arc<AtomicUsize>,
}
impl PlayerControl {
pub fn new() -> Self {
Self {
current_media: Arc::new(Mutex::new(None)),
current_list: Arc::new(Mutex::new(vec![Media::new(0, "", false)])),
filler_list: Arc::new(Mutex::new(vec![])),
current_index: Arc::new(AtomicUsize::new(0)),
filler_index: Arc::new(AtomicUsize::new(0)),
}
}
}
impl Default for PlayerControl {
fn default() -> Self {
Self::new()
}
}
/// Global playout control, for moving clips forward/backward, or resetting playlist/state.
#[derive(Clone, Debug)]
pub struct PlayoutStatus {
pub chain: Option<Arc<Mutex<Vec<String>>>>,
pub current_date: Arc<Mutex<String>>,
pub date: Arc<Mutex<String>>,
pub list_init: Arc<AtomicBool>,
pub time_shift: Arc<Mutex<f64>>,
}
impl PlayoutStatus {
pub fn new() -> Self {
Self {
chain: None,
current_date: Arc::new(Mutex::new(String::new())),
date: Arc::new(Mutex::new(String::new())),
list_init: Arc::new(AtomicBool::new(true)),
time_shift: Arc::new(Mutex::new(0.0)),
}
}
}
impl Default for PlayoutStatus {
fn default() -> Self {
Self::new()
}
}
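`ProcessControl`, `PlayerControl`, and `PlayoutStatus` all share the same pattern: state wrapped in `Arc<AtomicBool>`/`Arc<Mutex<…>>` so that clones handed to other threads observe one shared value. A minimal, hypothetical sketch of that pattern (not the real structs):

```rust
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::Arc;
use std::thread;

// Hypothetical helper: spin until the shared termination flag flips.
fn wait_for_termination(flag: Arc<AtomicBool>) -> &'static str {
    while !flag.load(Ordering::SeqCst) {
        thread::yield_now();
    }
    "terminated"
}

fn main() {
    let is_terminated = Arc::new(AtomicBool::new(false));
    let worker = {
        // The clone points at the same AtomicBool as the original Arc.
        let flag = is_terminated.clone();
        thread::spawn(move || wait_for_termination(flag))
    };
    is_terminated.store(true, Ordering::SeqCst);
    assert_eq!(worker.join().unwrap(), "terminated");
}
```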


@ -1,56 +0,0 @@
use std::io;
use derive_more::Display;
use ffprobe::FfProbeError;
#[derive(Debug, Display)]
pub enum ProcError {
#[display(fmt = "Failed to spawn ffmpeg/ffprobe. {}", _0)]
CommandSpawn(io::Error),
#[display(fmt = "IO Error {}", _0)]
IO(io::Error),
#[display(fmt = "{}", _0)]
Custom(String),
#[display(fmt = "{}", _0)]
Ffprobe(FfProbeError),
#[display(fmt = "Regex compile error {}", _0)]
Regex(String),
#[display(fmt = "Thread error {}", _0)]
Thread(String),
}
impl From<std::io::Error> for ProcError {
fn from(err: std::io::Error) -> Self {
Self::CommandSpawn(err)
}
}
impl From<FfProbeError> for ProcError {
fn from(err: FfProbeError) -> Self {
Self::Ffprobe(err)
}
}
impl From<regex::Error> for ProcError {
fn from(err: regex::Error) -> Self {
Self::Regex(err.to_string())
}
}
impl From<log::SetLoggerError> for ProcError {
fn from(err: log::SetLoggerError) -> Self {
Self::Custom(err.to_string())
}
}
impl From<serde_json::Error> for ProcError {
fn from(err: serde_json::Error) -> Self {
Self::Custom(err.to_string())
}
}
impl From<Box<dyn std::any::Any + std::marker::Send>> for ProcError {
fn from(err: Box<dyn std::any::Any + std::marker::Send>) -> Self {
Self::Thread(format!("{err:?}"))
}
}


@ -1,281 +0,0 @@
extern crate log;
extern crate simplelog;
use std::{
path::PathBuf,
sync::{atomic::Ordering, Arc, Mutex},
thread::{self, sleep},
time::Duration,
};
use chrono::prelude::*;
use file_rotate::{
compression::Compression,
suffix::{AppendTimestamp, DateFrom, FileLimit},
ContentLimit, FileRotate, TimeFrequency,
};
use lettre::{
message::header, transport::smtp::authentication::Credentials, Message, SmtpTransport,
Transport,
};
use log::{Level, LevelFilter, Log, Metadata, Record};
use regex::Regex;
use simplelog::*;
use crate::utils::{PlayoutConfig, ProcessControl};
/// Send log messages to the mail recipient(s).
pub fn send_mail(cfg: &PlayoutConfig, msg: String) {
let recipient = cfg
.mail
.recipient
.split_terminator([',', ';', ' '])
.filter(|s| s.contains('@'))
.map(|s| s.trim())
.collect::<Vec<&str>>();
let mut message = Message::builder()
.from(cfg.mail.sender_addr.parse().unwrap())
.subject(&cfg.mail.subject)
.header(header::ContentType::TEXT_PLAIN);
for r in recipient {
message = message.to(r.parse().unwrap());
}
if let Ok(mail) = message.body(clean_string(&msg)) {
let credentials =
Credentials::new(cfg.mail.sender_addr.clone(), cfg.mail.sender_pass.clone());
let mut transporter = SmtpTransport::relay(cfg.mail.smtp_server.clone().as_str());
if cfg.mail.starttls {
transporter = SmtpTransport::starttls_relay(cfg.mail.smtp_server.clone().as_str());
}
let mailer = transporter.unwrap().credentials(credentials).build();
// Send the mail
if let Err(e) = mailer.send(&mail) {
error!("Could not send mail: {e}");
}
} else {
error!("Mail Message failed!");
}
}
/// Basic Mail Queue
///
/// Check every given interval for messages and send them.
fn mail_queue(
cfg: PlayoutConfig,
proc_ctl: ProcessControl,
messages: Arc<Mutex<Vec<String>>>,
interval: u64,
) {
while !proc_ctl.is_terminated.load(Ordering::SeqCst) {
let mut msg = messages.lock().unwrap();
if msg.len() > 0 {
send_mail(&cfg, msg.join("\n"));
msg.clear();
}
drop(msg);
sleep(Duration::from_secs(interval));
}
}
/// Self-made mail log struct, to extend simplelog.
pub struct LogMailer {
level: LevelFilter,
pub config: Config,
messages: Arc<Mutex<Vec<String>>>,
last_messages: Arc<Mutex<Vec<String>>>,
}
impl LogMailer {
pub fn new(
log_level: LevelFilter,
config: Config,
messages: Arc<Mutex<Vec<String>>>,
) -> Box<LogMailer> {
Box::new(LogMailer {
level: log_level,
config,
messages,
last_messages: Arc::new(Mutex::new(vec![String::new()])),
})
}
}
impl Log for LogMailer {
fn enabled(&self, metadata: &Metadata<'_>) -> bool {
metadata.level() <= self.level
}
fn log(&self, record: &Record<'_>) {
if self.enabled(record.metadata()) {
let rec = record.args().to_string();
let mut last_msgs = self.last_messages.lock().unwrap();
// put the message in the mail queue only when it differs from the last message;
// this is done to prevent spamming the mailbox.
// Also ignore errors from the lettre mail module, because they prevent the program from closing.
if !last_msgs.contains(&rec) && !rec.contains("lettre") {
if last_msgs.len() > 2 {
last_msgs.clear()
}
last_msgs.push(rec.clone());
let local: DateTime<Local> = Local::now();
let time_stamp = local.format("[%Y-%m-%d %H:%M:%S%.3f]");
let level = record.level().to_string().to_uppercase();
let full_line = format!("{time_stamp} [{level: >5}] {rec}");
self.messages.lock().unwrap().push(full_line);
}
}
}
fn flush(&self) {}
}
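The duplicate suppression inside `LogMailer::log` above can be sketched on its own: a message enters the queue only if it differs from recently seen messages, and the dedup memory is capped to stay small. A standalone, hypothetical helper (not the real method, which also formats timestamps and levels):

```rust
// Hypothetical reduction of the dedup logic in LogMailer::log.
fn push_if_new(last: &mut Vec<String>, queue: &mut Vec<String>, msg: &str) {
    if !last.iter().any(|m| m == msg) {
        // cap the dedup memory, mirroring `last_msgs.len() > 2` above
        if last.len() > 2 {
            last.clear();
        }
        last.push(msg.to_string());
        queue.push(msg.to_string());
    }
}

fn main() {
    let mut last: Vec<String> = vec![];
    let mut queue: Vec<String> = vec![];
    push_if_new(&mut last, &mut queue, "missing clip");
    push_if_new(&mut last, &mut queue, "missing clip"); // duplicate, suppressed
    assert_eq!(queue.len(), 1);
}
```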
impl SharedLogger for LogMailer {
fn level(&self) -> LevelFilter {
self.level
}
fn config(&self) -> Option<&Config> {
Some(&self.config)
}
fn as_log(self: Box<Self>) -> Box<dyn Log> {
Box::new(*self)
}
}
/// Workaround to remove color information from log
fn clean_string(text: &str) -> String {
let regex = Regex::new(r"\x1b\[[0-9;]*[mGKF]").unwrap();
regex.replace_all(text, "").to_string()
}
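`clean_string` above strips ANSI color sequences with the regex `\x1b\[[0-9;]*[mGKF]`. A dependency-free sketch of the same idea, scanning for `ESC [` and skipping to the final command letter (a simplification: it assumes well-formed sequences ending in one of `m`, `G`, `K`, `F`):

```rust
// Hypothetical regex-free variant of clean_string, for illustration only.
fn strip_ansi(text: &str) -> String {
    let mut out = String::new();
    let mut chars = text.chars().peekable();
    while let Some(c) = chars.next() {
        if c == '\x1b' && chars.peek() == Some(&'[') {
            chars.next(); // consume '['
            // consume parameter bytes up to and including the final letter
            for t in chars.by_ref() {
                if matches!(t, 'm' | 'G' | 'K' | 'F') {
                    break;
                }
            }
        } else {
            out.push(c);
        }
    }
    out
}

fn main() {
    assert_eq!(strip_ansi("\x1b[31mERROR\x1b[0m ok"), "ERROR ok");
    assert_eq!(strip_ansi("plain"), "plain");
}
```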
/// Initialize our logging, to have:
///
/// - console logger
/// - file logger
/// - mail logger
pub fn init_logging(
config: &PlayoutConfig,
proc_ctl: Option<ProcessControl>,
messages: Option<Arc<Mutex<Vec<String>>>>,
) -> Vec<Box<dyn SharedLogger>> {
let config_clone = config.clone();
let app_config = config.logging.clone();
let mut time_level = LevelFilter::Off;
let mut app_logger: Vec<Box<dyn SharedLogger>> = vec![];
if app_config.timestamp {
time_level = LevelFilter::Error;
}
let mut log_config = ConfigBuilder::new()
.set_thread_level(LevelFilter::Off)
.set_target_level(LevelFilter::Off)
.add_filter_ignore_str("hyper")
.add_filter_ignore_str("libc")
.add_filter_ignore_str("neli")
.add_filter_ignore_str("reqwest")
.add_filter_ignore_str("rpc")
.add_filter_ignore_str("rustls")
.add_filter_ignore_str("serial_test")
.add_filter_ignore_str("sqlx")
.add_filter_ignore_str("tiny_http")
.set_level_padding(LevelPadding::Left)
.set_time_level(time_level)
.clone();
if app_config.local_time {
log_config = match log_config.set_time_offset_to_local() {
Ok(local) => local.clone(),
Err(_) => log_config,
};
};
if app_config.log_to_file && app_config.path.exists() {
let file_config = log_config
.clone()
.set_time_format_custom(format_description!(
"[[[year]-[month]-[day] [hour]:[minute]:[second].[subsecond digits:5]]"
))
.build();
let mut log_path = PathBuf::from("logs/ffplayout.log");
if app_config.path.is_dir() {
log_path = app_config.path.join("ffplayout.log");
} else if app_config.path.is_file() {
log_path = app_config.path
} else {
eprintln!("Logging path does not exist!")
}
let log_file = FileRotate::new(
log_path,
AppendTimestamp::with_format(
"%Y-%m-%d",
FileLimit::MaxFiles(app_config.backup_count),
DateFrom::DateYesterday,
),
ContentLimit::Time(TimeFrequency::Daily),
Compression::None,
#[cfg(unix)]
None,
);
app_logger.push(WriteLogger::new(app_config.level, file_config, log_file));
} else {
let term_config = log_config
.clone()
.set_level_color(Level::Trace, Some(Color::Ansi256(11)))
.set_level_color(Level::Debug, Some(Color::Ansi256(12)))
.set_level_color(Level::Info, Some(Color::Ansi256(10)))
.set_level_color(Level::Warn, Some(Color::Ansi256(208)))
.set_level_color(Level::Error, Some(Color::Ansi256(9)))
.set_time_format_custom(format_description!(
"\x1b[[30;1m[[[year]-[month]-[day] [hour]:[minute]:[second].[subsecond digits:5]]\x1b[[0m"
))
.build();
app_logger.push(TermLogger::new(
app_config.level,
term_config,
TerminalMode::Mixed,
ColorChoice::Auto,
));
}
// set the mail logger only when a recipient is set in the config
if config.mail.recipient.contains('@') && config.mail.recipient.contains('.') {
let messages_clone = messages.clone().unwrap();
let interval = config.mail.interval;
thread::spawn(move || {
mail_queue(config_clone, proc_ctl.unwrap(), messages_clone, interval)
});
let mail_config = log_config.build();
let filter = match config.mail.mail_level.to_lowercase().as_str() {
"info" => LevelFilter::Info,
"warning" => LevelFilter::Warn,
_ => LevelFilter::Error,
};
app_logger.push(LogMailer::new(filter, mail_config, messages.unwrap()));
}
app_logger
}


@ -0,0 +1,273 @@
-- Add migration script here
PRAGMA foreign_keys = ON;
CREATE TABLE
global (
id INTEGER PRIMARY KEY AUTOINCREMENT,
secret TEXT NOT NULL,
hls_path TEXT NOT NULL DEFAULT "/usr/share/ffplayout/public",
logging_path TEXT NOT NULL DEFAULT "/var/log/ffplayout",
playlist_path TEXT NOT NULL DEFAULT "/var/lib/ffplayout/playlists",
storage_path TEXT NOT NULL DEFAULT "/var/lib/ffplayout/tv-media",
shared_storage INTEGER NOT NULL DEFAULT 1,
UNIQUE (secret)
);
CREATE TABLE
roles (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT NOT NULL,
UNIQUE (name)
);
CREATE TABLE
channels (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT NOT NULL,
preview_url TEXT NOT NULL,
extra_extensions TEXT NOT NULL DEFAULT 'jpg,jpeg,png',
active INTEGER NOT NULL DEFAULT 0,
last_date TEXT,
time_shift REAL NOT NULL DEFAULT 0
);
CREATE TABLE
presets (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT NOT NULL,
text TEXT NOT NULL,
x TEXT NOT NULL,
y TEXT NOT NULL,
fontsize TEXT NOT NULL,
line_spacing TEXT NOT NULL,
fontcolor TEXT NOT NULL,
box TEXT NOT NULL,
boxcolor TEXT NOT NULL,
boxborderw TEXT NOT NULL,
alpha TEXT NOT NULL,
channel_id INTEGER NOT NULL DEFAULT 1,
FOREIGN KEY (channel_id) REFERENCES channels (id) ON UPDATE CASCADE ON DELETE CASCADE,
UNIQUE (name)
);
CREATE TABLE
user (
id INTEGER PRIMARY KEY AUTOINCREMENT,
mail TEXT NOT NULL,
username TEXT NOT NULL,
password TEXT NOT NULL,
role_id INTEGER NOT NULL DEFAULT 3,
FOREIGN KEY (role_id) REFERENCES roles (id) ON UPDATE SET NULL ON DELETE SET DEFAULT,
UNIQUE (mail, username)
);
CREATE TABLE
user_channels (
id INTEGER PRIMARY KEY,
channel_id INTEGER NOT NULL,
user_id INTEGER NOT NULL,
FOREIGN KEY (channel_id) REFERENCES channels (id) ON UPDATE CASCADE ON DELETE CASCADE,
FOREIGN KEY (user_id) REFERENCES user (id) ON UPDATE CASCADE ON DELETE CASCADE
);
CREATE UNIQUE INDEX IF NOT EXISTS idx_user_channels_unique ON user_channels (channel_id, user_id);
CREATE TABLE
configurations (
id INTEGER PRIMARY KEY AUTOINCREMENT,
channel_id INTEGER NOT NULL DEFAULT 1,
general_help TEXT NOT NULL DEFAULT "Sometimes it can happen that a file is corrupt but still playable. This can produce a streaming error over all following files. The only way in this case is to stop ffplayout and start it again. Here we only say when it stops; the starting process is in your hands. The best way is a systemd service on Linux.\n'stop_threshold' stops ffplayout if it is out of sync in time above this value. A number below 3 can cause unexpected errors.",
general_stop_threshold REAL NOT NULL DEFAULT 11.0,
mail_help TEXT NOT NULL DEFAULT "Send error messages to an email address, like missing playlist; invalid json format; missing clip path. Leave the recipient blank if you don't need this.\n'mail_level' can be INFO, WARNING or ERROR.\n'interval' means seconds until a new mail will be sent; the value must be in increments of 10.",
mail_subject TEXT NOT NULL DEFAULT "Playout Error",
mail_smtp TEXT NOT NULL DEFAULT "mail.example.org",
mail_addr TEXT NOT NULL DEFAULT "ffplayout@example.org",
mail_pass TEXT NOT NULL DEFAULT "",
mail_recipient TEXT NOT NULL DEFAULT "",
mail_starttls INTEGER NOT NULL DEFAULT 0,
mail_level TEXT NOT NULL DEFAULT "ERROR",
mail_interval INTEGER NOT NULL DEFAULT 120,
logging_help TEXT NOT NULL DEFAULT "If 'log_to_file' is true, log to file; when false, log to console.\nSetting 'local_time' to false will set log timestamps to UTC. Path to /var/log/ only if you run this program as a daemon.\n'level' can be DEBUG, INFO, WARNING, ERROR.\n'ffmpeg_level/ingest_level' can be INFO, WARNING, ERROR.\n'detect_silence' logs an error message if the audio line is silent for 15 seconds during the validation process.\n'ignore_lines' makes logging ignore strings that contain the matched lines; in the frontend this is a semicolon-separated list.",
logging_ffmpeg_level TEXT NOT NULL DEFAULT "ERROR",
logging_ingest_level TEXT NOT NULL DEFAULT "ERROR",
logging_detect_silence INTEGER NOT NULL DEFAULT 0,
logging_ignore TEXT NOT NULL DEFAULT "P sub_mb_type 4 out of range at;error while decoding MB;negative number of zero coeffs at;out of range intra chroma pred mode;non-existing SPS 0 referenced in buffering period",
processing_help TEXT NOT NULL DEFAULT "Default processing for all clips, so they are unique. Mode can be playlist or folder.\n'aspect' must be a float number. 'logo' is only used if the path exists; the path is relative to your storage folder.\n'logo_scale' scales the logo to the target size; leave it blank when no scaling is needed. The format is 'width:height', for example '100:-1' for proportional scaling. With 'logo_opacity' the logo can become transparent.\nWith 'audio_tracks' it is possible to configure how many audio tracks should be processed.\n'audio_channels' can be used if the audio has more channels than stereo.\nWith 'logo_position' in the format 'x:y' you set the logo position.\nWith 'custom_filter' it is possible to apply further filters. The filter outputs should end with [c_v_out] for video filters, and [c_a_out] for audio filters.",
processing_mode TEXT NOT NULL DEFAULT "playlist",
processing_audio_only INTEGER NOT NULL DEFAULT 0,
processing_copy_audio INTEGER NOT NULL DEFAULT 0,
processing_copy_video INTEGER NOT NULL DEFAULT 0,
processing_width INTEGER NOT NULL DEFAULT 1280,
processing_height INTEGER NOT NULL DEFAULT 720,
processing_aspect REAL NOT NULL DEFAULT 1.778,
processing_fps REAL NOT NULL DEFAULT 25.0,
processing_add_logo INTEGER NOT NULL DEFAULT 1,
processing_logo TEXT NOT NULL DEFAULT "graphics/logo.png",
processing_logo_scale TEXT NOT NULL DEFAULT "",
processing_logo_opacity REAL NOT NULL DEFAULT 0.7,
processing_logo_position TEXT NOT NULL DEFAULT "W-w-12:12",
processing_audio_tracks INTEGER NOT NULL DEFAULT 1,
processing_audio_track_index INTEGER NOT NULL DEFAULT -1,
processing_audio_channels INTEGER NOT NULL DEFAULT 2,
processing_volume REAL NOT NULL DEFAULT 1.0,
processing_filter TEXT NOT NULL DEFAULT "",
ingest_help TEXT NOT NULL DEFAULT "Run a server for an ingest stream. This stream will override the normal streaming until it is done. There is only a very simple authentication mechanism, which checks if the stream name is correct.\n'custom_filter' can be used in the same way as the one in the processing section.",
ingest_enable INTEGER NOT NULL DEFAULT 0,
ingest_param TEXT NOT NULL DEFAULT "-f live_flv -listen 1 -i rtmp://127.0.0.1:1936/live/stream",
ingest_filter TEXT NOT NULL DEFAULT "",
playlist_help TEXT NOT NULL DEFAULT "'path' can be a path to a single file, or a directory. For a directory put only the root folder, for example '/playlists'; subdirectories are read by the program. Subdirectories need this structure '/playlists/2018/01'.\n'day_start' means at which time the playlist should start; leave day_start blank when the playlist should always start at the beginning. 'length' represents the target length of the playlist; when blank, the real length is not considered.\n'infinit: true' works with a single playlist file and loops it infinitely.",
playlist_day_start TEXT NOT NULL DEFAULT "05:59:25",
playlist_length TEXT NOT NULL DEFAULT "24:00:00",
playlist_infinit INTEGER NOT NULL DEFAULT 0,
storage_help TEXT NOT NULL DEFAULT "'filler' is played instead of a missing file, or to fill the end to reach 24 hours. It can be a file or folder and will loop when necessary.\n'extensions' searches only for files with these extensions. Set 'shuffle' to 'true' to pick files randomly.",
storage_filler TEXT NOT NULL DEFAULT "filler/filler.mp4",
storage_extensions TEXT NOT NULL DEFAULT "mp4;mkv;webm",
storage_shuffle INTEGER NOT NULL DEFAULT 1,
text_help TEXT NOT NULL DEFAULT "Overlay text in combination with libzmq for remote text manipulation. fontfile is a relative path to your storage folder.\n'text_from_filename' activates the extraction of text from the filename. With 'style' you can define the drawtext parameters like position, color, etc. Posting text over the API will override this. With 'regex' you can format file names to get a title from them.",
text_add INTEGER NOT NULL DEFAULT 1,
text_from_filename INTEGER NOT NULL DEFAULT 0,
text_font TEXT NOT NULL DEFAULT "fonts/DejaVuSans.ttf",
text_style TEXT NOT NULL DEFAULT "x=(w-tw)/2:y=(h-line_h)*0.9:fontsize=24:fontcolor=#ffffff:box=1:boxcolor=#000000:boxborderw=4",
text_regex TEXT NOT NULL DEFAULT "^.+[/\\](.*)(.mp4|.mkv|.webm)$",
task_help TEXT NOT NULL DEFAULT "Run an external program with a given media object. The media object is in JSON format and contains all the information about the current clip. The external program can be a script or a binary, but it should only run for a short time.",
task_enable INTEGER NOT NULL DEFAULT 0,
task_path TEXT NOT NULL DEFAULT "",
output_help TEXT NOT NULL DEFAULT "The final playout compression. Adjust the settings to your needs. 'mode' has the options 'desktop', 'hls', 'null', 'stream'. Use 'stream' and adjust the 'output_param:' settings when you want to stream to an RTMP/RTSP/SRT/... server.\nIn production, don't serve the HLS playlist with ffplayout; use nginx or another web server!",
output_mode TEXT NOT NULL DEFAULT "hls",
output_param TEXT NOT NULL DEFAULT "-c:v libx264 -crf 23 -x264-params keyint=50:min-keyint=25:scenecut=-1 -maxrate 1300k -bufsize 2600k -preset faster -tune zerolatency -profile:v Main -level 3.1 -c:a aac -ar 44100 -b:a 128k -flags +cgop -f hls -hls_time 6 -hls_list_size 600 -hls_flags append_list+delete_segments+omit_endlist -hls_segment_filename live/stream-%d.ts live/stream.m3u8",
FOREIGN KEY (channel_id) REFERENCES channels (id) ON UPDATE CASCADE ON DELETE CASCADE
);
CREATE TABLE
advanced_configurations (
id INTEGER PRIMARY KEY AUTOINCREMENT,
channel_id INTEGER NOT NULL DEFAULT 1,
decoder_input_param TEXT,
decoder_output_param TEXT,
encoder_input_param TEXT,
ingest_input_param TEXT,
filter_deinterlace TEXT,
filter_pad_scale_w TEXT,
filter_pad_scale_h TEXT,
filter_pad_video TEXT,
filter_fps TEXT,
filter_scale TEXT,
filter_set_dar TEXT,
filter_fade_in TEXT,
filter_fade_out TEXT,
filter_overlay_logo_scale TEXT,
filter_overlay_logo_fade_in TEXT,
filter_overlay_logo_fade_out TEXT,
filter_overlay_logo TEXT,
filter_tpad TEXT,
filter_drawtext_from_file TEXT,
filter_drawtext_from_zmq TEXT,
filter_aevalsrc TEXT,
filter_afade_in TEXT,
filter_afade_out TEXT,
filter_apad TEXT,
filter_volume TEXT,
filter_split TEXT,
FOREIGN KEY (channel_id) REFERENCES channels (id) ON UPDATE CASCADE ON DELETE CASCADE
);
-------------------------------------------------------------------------------
-- set defaults
INSERT INTO
roles (name)
VALUES
('global_admin'),
('channel_admin'),
('user'),
('guest');
INSERT INTO
channels (name, preview_url, extra_extensions, active)
VALUES
(
'Channel 1',
'http://127.0.0.1:8787/live/1/stream.m3u8',
'jpg,jpeg,png',
0
);
INSERT INTO
presets (
name,
text,
x,
y,
fontsize,
line_spacing,
fontcolor,
box,
boxcolor,
boxborderw,
alpha,
channel_id
)
VALUES
(
'Default',
'Welcome to ffplayout messenger!',
'(w-text_w)/2',
'(h-text_h)/2',
'24',
'4',
'#ffffff@0xff',
'0',
'#000000@0x80',
'4',
'1.0',
'1'
),
(
'Empty Text',
'',
'0',
'0',
'24',
'4',
'#000000',
'0',
'#000000',
'0',
'0',
'1'
),
(
'Bottom Text fade in',
'The upcoming event will be delayed by a few minutes.',
'(w-text_w)/2',
'(h-line_h)*0.9',
'24',
'4',
'#ffffff',
'1',
'#000000@0x80',
'4',
'ifnot(ld(1),st(1,t));if(lt(t,ld(1)+1),0,if(lt(t,ld(1)+2),(t-(ld(1)+1))/1,if(lt(t,ld(1)+8),1,if(lt(t,ld(1)+9),(1-(t-(ld(1)+8)))/1,0))))',
'1'
),
(
'Scrolling Text',
'We have a very important announcement to make.',
'ifnot(ld(1),st(1,t));if(lt(t,ld(1)+1),w+4,w-w/12*mod(t-ld(1),12*(w+tw)/w))',
'(h-line_h)*0.9',
'24',
'4',
'#ffffff',
'1',
'#000000@0x80',
'4',
'1.0',
'1'
);
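The alpha value of the 'Bottom Text fade in' preset is an ffmpeg drawtext expression: `ifnot(ld(1),st(1,t))` latches the first evaluation time into variable 1, then the nested `if` chain keeps alpha at 0 for one second, fades it in over the next second, holds it at 1 until second 8, and fades it back out until second 9. A sketch of the same piecewise function in plain Python (the helper name is illustrative, not part of ffplayout):

```python
def fade_alpha(t: float, t0: float = 0.0) -> float:
    """Piecewise alpha curve mirroring the drawtext expression, with ld(1) == t0."""
    if t < t0 + 1:                   # invisible for the first second
        return 0.0
    if t < t0 + 2:                   # fade in: 0 -> 1 over one second
        return t - (t0 + 1)
    if t < t0 + 8:                   # fully visible
        return 1.0
    if t < t0 + 9:                   # fade out: 1 -> 0 over one second
        return 1.0 - (t - (t0 + 8))
    return 0.0                       # invisible again

print(fade_alpha(1.5))  # 0.5, halfway through the fade-in
```

The 'Scrolling Text' preset uses the same `st`/`ld` latching trick for its x position, restarting the scroll every 12 seconds.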
INSERT INTO
configurations DEFAULT
VALUES;
INSERT INTO
advanced_configurations DEFAULT
VALUES;
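The two `INSERT INTO … DEFAULT VALUES` statements seed exactly one row per configuration table using nothing but the column defaults declared above. A minimal sketch of the pattern with a trimmed copy of the schema (Python stdlib `sqlite3`, in-memory database):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE advanced_configurations (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        channel_id INTEGER NOT NULL DEFAULT 1,
        filter_fps TEXT,
        filter_scale TEXT
    );
    -- no column list, no values: every column falls back to its default
    INSERT INTO advanced_configurations DEFAULT VALUES;
""")
row = con.execute(
    "SELECT id, channel_id, filter_fps FROM advanced_configurations"
).fetchone()
print(row)  # (1, 1, None)
```

Columns without an explicit `DEFAULT`, like the filter overrides here, simply come back as `NULL`, which is how the engine distinguishes "use the built-in filter" from a user override.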

View File

@ -3,7 +3,7 @@
source $(dirname "$0")/man_create.sh
target=$1
if [ ! -f 'ffplayout-frontend/package.json' ]; then
if [ ! -f 'frontend/package.json' ]; then
git submodule update --init
fi
@ -34,10 +34,9 @@ for target in "${targets[@]}"; do
cross build --release --target=$target
cp ./target/${target}/release/ffpapi.exe .
cp ./target/${target}/release/ffplayout.exe .
zip -r "ffplayout-v${version}_${target}.zip" assets docker docs LICENSE README.md CHANGELOG.md ffplayout.exe ffpapi.exe -x *.db -x *.db-shm -x *.db-wal -x '11-ffplayout' -x *.service
rm -f ffplayout.exe ffpapi.exe
zip -r "ffplayout-v${version}_${target}.zip" assets docker docs LICENSE README.md CHANGELOG.md ffplayout.exe -x *.db -x *.db-shm -x *.db-wal -x *.service
rm -f ffplayout.exe
else
if [[ -f "ffplayout-v${version}_${target}.tar.gz" ]]; then
rm -f "ffplayout-v${version}_${target}.tar.gz"
@ -45,20 +44,17 @@ for target in "${targets[@]}"; do
cross build --release --target=$target
cp ./target/${target}/release/ffpapi .
cp ./target/${target}/release/ffplayout .
tar -czvf "ffplayout-v${version}_${target}.tar.gz" --exclude='*.db' --exclude='*.db-shm' --exclude='*.db-wal' assets docker docs LICENSE README.md CHANGELOG.md ffplayout ffpapi
rm -f ffplayout ffpapi
tar --transform 's/\.\/target\/.*\///g' -czvf "ffplayout-v${version}_${target}.tar.gz" --exclude='*.db' --exclude='*.db-shm' --exclude='*.db-wal' assets docker docs LICENSE README.md CHANGELOG.md ./target/${target}/release/ffplayout
fi
echo ""
done
if [[ "${#targets[@]}" == "5" ]] || [[ $targets == "x86_64-unknown-linux-musl" ]]; then
cargo deb --no-build --target=x86_64-unknown-linux-musl -p ffplayout --manifest-path=ffplayout-engine/Cargo.toml -o ffplayout_${version}-1_amd64.deb
cargo generate-rpm --payload-compress none --target=x86_64-unknown-linux-musl -p ffplayout-engine -o ffplayout-${version}-1.x86_64.rpm
cargo deb --no-build --target=x86_64-unknown-linux-musl -p ffplayout --manifest-path=ffplayout/Cargo.toml -o ffplayout_${version}-1_amd64.deb
cargo generate-rpm --payload-compress none --target=x86_64-unknown-linux-musl -p ffplayout -o ffplayout-${version}-1.x86_64.rpm
fi
if [[ "${#targets[@]}" == "5" ]] || [[ $targets == "aarch64-unknown-linux-gnu" ]]; then
cargo deb --no-build --target=aarch64-unknown-linux-gnu --variant=arm64 -p ffplayout --manifest-path=ffplayout-engine/Cargo.toml -o ffplayout_${version}-1_arm64.deb
cargo deb --no-build --target=aarch64-unknown-linux-gnu --variant=arm64 -p ffplayout --manifest-path=ffplayout/Cargo.toml -o ffplayout_${version}-1_arm64.deb
fi
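The `tar --transform` call above rewrites archive member names on the fly, so the release binary lands at the archive root instead of under `./target/<triple>/release/`. The same path rewrite, sketched with Python's `re` module (illustrative only; the build script itself relies on GNU tar's sed-style transform):

```python
import re

def strip_target_prefix(member: str) -> str:
    # Equivalent of the sed expression s/\.\/target\/.*\///g used by tar:
    # drop everything from "./target/" through the last "/".
    return re.sub(r"\./target/.*/", "", member)

print(strip_target_prefix("./target/x86_64-unknown-linux-musl/release/ffplayout"))
# ffplayout
print(strip_target_prefix("docs/README.md"))  # non-matching paths pass through
```

Members like `assets` or `docs` don't match the pattern and keep their original paths in the archive.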

Some files were not shown because too many files have changed in this diff