
pinchflat's Introduction

Pinchflat logo by @hernandito

Your next YouTube media manager


What it does

Pinchflat is a self-hosted app for downloading YouTube content built using yt-dlp. It's designed to be lightweight, self-contained, and easy to use. You set up rules for how to download content from YouTube channels or playlists and it'll do the rest, periodically checking for new content. It's perfect for people who want to download content for use with a media center app (Plex, Jellyfin, Kodi) or for those who want to archive media!

While you can download individual videos, Pinchflat is best suited for downloading content from channels or playlists. It's also not meant for consuming content in-app - Pinchflat downloads content to disk where you can then watch it with a media center app or VLC.

If it doesn't work for your use case, please make a feature request! You can also check out these great alternatives: Tube Archivist, ytdl-sub, and TubeSync

Features

  • Self-contained - just one Docker container with no external dependencies
  • Powerful naming system so content is stored where and how you want it
  • Easy-to-use web interface with presets to get you started right away
  • First-class support for media center apps like Plex, Jellyfin, and Kodi
  • Supports serving RSS feeds to your favourite podcast app (docs)
  • Automatically downloads new content from channels and playlists
    • Uses a novel approach to download new content more quickly than other apps
  • Supports downloading audio content
  • Custom rules for handling YouTube Shorts and livestreams
  • Apprise support for notifications
  • Allows automatically redownloading new media after a set period
    • This can help improve the download quality of new content or improve SponsorBlock tags
  • Optionally automatically delete old content (docs)
  • Advanced options like setting cutoff dates and filtering by title
  • Reliable hands-off operation
  • Can pass cookies to YouTube to download your private playlists (docs)
  • Sponsorblock integration
  • [Advanced] allows custom yt-dlp options (docs)
  • [Advanced] supports running custom scripts after downloading/deleting media (alpha - docs)

Screenshots

Pinchflat screenshot

Pinchflat screenshot

Installation

Unraid

Simply search for Pinchflat in the Community Apps store!

Portainer

Important

See the note below about storing config on a network file share. It's preferred to store the config on a local disk if at all possible.

Docker Compose file:

version: '3'
services:
  pinchflat:
    image: ghcr.io/kieraneglin/pinchflat:latest
    environment:
      # Set the timezone to your local timezone
      - TZ=America/New_York
    ports:
      - '8945:8945'
    volumes:
      - /host/path/to/config:/config
      - /host/path/to/downloads:/downloads

Docker

  1. Create two directories on your host machine: one for storing config and one for storing downloaded media. Make sure they're both writable by the user running the Docker container.
  2. Prepare the docker image in one of the two ways below:
    • From GHCR: docker pull ghcr.io/kieraneglin/pinchflat:latest
      • NOTE: also available on Docker Hub at keglin/pinchflat:latest
    • Building locally: docker build . --file docker/selfhosted.Dockerfile -t ghcr.io/kieraneglin/pinchflat:latest
  3. Run the container:
# Be sure to replace /host/path/to/config and /host/path/to/downloads below with
# the paths to the directories you created in step 1
# Be sure to replace America/New_York with your local timezone
docker run \
  -e TZ=America/New_York \
  -p 8945:8945 \
  -v /host/path/to/config:/config \
  -v /host/path/to/downloads:/downloads \
  ghcr.io/kieraneglin/pinchflat:latest

IMPORTANT: File permissions

You must ensure the host directories you've mounted are writable by the user running the Docker container. If you get a permission error, follow the steps it suggests. See #106 for more.
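
For example, one possible way to do this on the host, as a minimal sketch (assuming the container runs as UID/GID 1000 and using the host paths from the examples above; adjust both to match your setup):

# Hypothetical values - use the UID/GID that will run the container and your real host paths
sudo chown -R 1000:1000 /host/path/to/config /host/path/to/downloads
sudo chmod -R u+rwX /host/path/to/config /host/path/to/downloads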

Important

It's not recommended to run the container as root. Doing so can create permission issues if other apps need to work with the downloaded media.

Tip

If you need to run any command as root, you can run su from the container's shell as there is no password set for the root user.
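
For example, a quick sketch (assuming your container is named pinchflat; check docker ps for the actual name):

# Hypothetical container name - substitute yours
docker exec -it pinchflat sh
su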

ADVANCED: Storing Pinchflat config directory on a network share

As pointed out in #137, SQLite doesn't like being run in WAL mode on network shares. If Pinchflat's config directory is stored on a network share, you can disable WAL mode by setting the JOURNAL_MODE environment variable to delete. This makes Pinchflat use rollback journal mode, which is less performant but should work on network shares.
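
For example, in the Docker Compose file above this is a one-line addition (a sketch; only needed when the config directory lives on a network share):

    environment:
      - TZ=America/New_York
      - JOURNAL_MODE=delete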

Caution

Changing this setting from WAL to delete on an existing Pinchflat instance could, conceivably, result in data loss. Only change this setting if you know what you're doing, why this is important, and are okay with possible data loss or DB corruption. Back up your database first!

If you change this setting and it works well for you, please leave a comment on #137! Doubly so if it does not work well.

Environment variables

| Name | Required? | Default | Notes |
| --- | --- | --- | --- |
| TZ | No | UTC | Must follow IANA TZ format |
| LOG_LEVEL | No | debug | Can be set to info, but debug is strongly recommended |
| BASIC_AUTH_USERNAME | No | (none) | See authentication docs |
| BASIC_AUTH_PASSWORD | No | (none) | See authentication docs |
| EXPOSE_FEED_ENDPOINTS | No | (none) | See RSS feed docs |
| JOURNAL_MODE | No | wal | Set to delete if your config directory is stored on a network share (not recommended) |
| TZ_DATA_DIR | No | /etc/elixir_tzdata_data | The container path where the timezone database is stored |
| BASE_ROUTE_PATH | No | / | The base path for route generation; useful when running behind certain reverse proxies |
| YT_DLP_WORKER_CONCURRENCY | No | 2 | The number of concurrent yt-dlp workers per queue. Set to 1 if you're getting IP limited; otherwise don't touch it |
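
As a rough sketch, several of these can be combined with the docker run example from earlier (the username and password values are placeholders):

docker run \
  -e TZ=America/New_York \
  -e LOG_LEVEL=debug \
  -e BASIC_AUTH_USERNAME=admin \
  -e BASIC_AUTH_PASSWORD=changeme \
  -p 8945:8945 \
  -v /host/path/to/config:/config \
  -v /host/path/to/downloads:/downloads \
  ghcr.io/kieraneglin/pinchflat:latest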

EFF donations

Prior to 2024-05-10, a portion of all donations was given to the Electronic Frontier Foundation. Now, the app doesn't accept donations that go to me personally and instead directs you straight to the EFF. Here are some people who have generously donated.

The EFF defends your online liberties and backed youtube-dl when it was taken down.

Pre-release disclaimer

This is pre-release software and anything can break at any time. I make no guarantees about the stability of this software, forward-compatibility of updates, or integrity (both related to and independent of Pinchflat). Essentially, use at your own risk and expect rough edges for now.

License

See LICENSE file

pinchflat's People

Contributors

breakid, drewstopherlee, kieraneglin


pinchflat's Issues

Output template option output 'NA' for playlist option

I am new to this and trying to set up a profile for a playlist, but all of the playlist options below only return 'NA'. Is there a different way to output the playlist name, playlist ID, and playlist index?

/{{ playlist }} [{{ playlist_id}}]/{{ playlist_index }} - {{ title }} [{{ id }}].{{ ext }}

Bug: Thumbnails not embedding properly in Emby

Sometimes the app will only embed a screenshot of the YouTube episode and not the video thumbnail. See the attached image. Sometimes it won't embed an image at all. Graser has thumbs for all his videos, so I'm not sure what is going on here. Let me know if you need anything else from me for troubleshooting.

chrome_RxI83B8ZrF

chrome_DNcnugBxsQ

Here are the Pinchflat options enabled.

chrome_T8iAD2Uzqb

[Request] Automatically group videos with related titles

For example, the channel OverSimplified has the videos "WW2 - OverSimplified (Part 1)" and "WW2 - OverSimplified (Part 2)". I would like a way for videos that share the same title, except for some indication of being separate parts of an ongoing series (typically something like the above, or simply a number), to automatically be sorted together into a folder named after the common part of their titles, minus certain special characters. So in the above case, the folder name would be "WW2 - OverSimplified".

Combined with #174, the folder name would instead just be "WW2".

FR: Host notification when media is downloaded

Some way of knowing something has been downloaded would be nice. I know there are ways for a docker container to be configured to send notify-send events to a server. But it could even be as simple as a text file in /downloads getting updated with the name of the media that was just downloaded.

I've experimented in the past using inotifywait to try to determine when something got downloaded (such as for tubesync), but watching for YouTube video downloads is particularly annoying since they often get downloaded in several pieces and then merged together, resulting in multiple inotify events being triggered.
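
For reference, a minimal sketch of the inotifywait approach mentioned above, watching only for files that finish being written or are moved into place (which sidesteps some of the partial-download noise; the path is a placeholder):

inotifywait -m -r -e close_write -e moved_to --format '%w%f' /downloads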

Doubts about resource consumption

Hi, I've been using the program for a few days now and I've noticed intensive CPU and RAM usage, the downward spike at 4:00 AM is due to the automatic update of the docker image.

screen

Maybe I'm adding too many YouTube channels to monitor? Should I add fewer?

screen#4

I see continuous prompts, as if it's updating them all constantly, even though I selected daily updates in the settings.
I'll leave a few more pictures; I can post logs if needed.
Pinchflat is the purple line:

screen#2

screen#3

screen#1

Option to download only new - not whole channel/playlist?

This looks super clean and was easy to get setup. The only thing I was wishing for was an option to choose only NEW videos from a channel vs all.

If I had to add anything else it would be the option to specify a time for the index also. For example, checking/downloading at a specific time.

Very awesome project, high hopes for you!

Plex Media Center Server integration

Plex Media Center Server integration similar to Radarr/Sonarr where once integrated, new downloaded files trigger a refresh for Plex
#enhancement

UnRAID Docker isn't exposing any ports

image

The screenshot above shows the UnRaid Docker section. For some reason, the port mappings are not showing up for Pinchflat like how they show up for the rest of the Docker containers (see the Plex Docker container as an example).

When I select the WebUI option from the Docker container, I get the following: about:blank#blocked

Manually typing in the IP Address:Port works though.

I am not sure if the two issues are related.

Thanks!

[Request] - Refine video titles

I'd like to request tools to do two things to video titles:

  1. Remove a fixed part of titles from a particular source. For example, videos from the channel "OverSimplified" have "- OverSimplified" as part of every title, which I'd like to remove.

  2. Remove "fluff" from titles. This is probably harder to implement. It involves automatically detecting parts of titles that aren't actually relevant to understanding what the video is about. This likely includes hashtags, the channel's own name, and some other things like information in parentheses.

FR: The ability to import downloaded media and subscriptions from apps like tubesync or YoutubeDL

Until Pinchflat came along, I, like many others, used apps like TubeSync or youtubedl-material to locally archive YouTube playlists or channels. TubeSync specifically has similar tasks set up for automatically downloading media from a channel or playlist source, as well as managing the media. It would be really useful if Pinchflat could import existing tasks and sources directly from TubeSync, and also support directory scanning (reading NFO and JSON files, as well as directory names), similar to how Sonarr can import shows already on a drive.

[FR] Move files after download similar to Sonarr

To prevent Plex or other programs from trying to index or access a file during download, allow for downloading to a temporary /Downloads directory. This should prevent excess CPU cycles from multiple programs executing simultaneously.

Once done, then move file to the intended directory for organization. Plex will pick up the file for scanning once moved provided it is set to monitor folder changes.

This should help the program work in tandem with Sonarr for organization.

Request: improve episode compatibility with media center apps (Plex, Jellyfin, Kodi, etc)

Many media clients support reading additional metadata stored as an NFO. If possible, we should create a compatible NFO from the metadata we get during download. See #68 for more.

Additionally, enforcing a proper file structure is a huge part of it. I'll have to look into that.

Will also want to look into adding support for things like banners at the source-level, but that's covered by #73. This is just concerned with episode-level improvements

Proposed Logo Designs

Hi,

I took the liberty of creating a few logo/icon variations. The one you currently use feels too generic/plain.

I don't know the genesis of the name "Pinchflat", but I tried to do something with it. Same with the current teal color.

I created two main variations: one using the teal color, and one with a more direct relation to the YouTube logo. I am leaning towards the teal option. The red feels too close to YouTube.

I will explain each variation:
The "v.A" down arrow versoiin = a play on the YouTube play button but rotated to imply more of a download concept. The cookie bite is to make it less generic.

The "v.B" is plainly the letter P sideways. Rotated to make is not look like a letter, and also is is facing "down" as in download.

The "v.C" is for indicating you are collecting individual's YouTube channels. From what I see in your program, you are downloading the content of a per channel basis.
The "v.D" option is similar to "A" without the cookie bite, and it is still implying both download and the play button idea. This is my least favorite as I think it comes to close to looking like a lock.

Versions E to G are all variations of the same idea. The use of a paper clip to organize things together. A paper clip "pinches" papers together. And the down arrow implies download.

I am pasting the images below. I have done them at around 1200px each, but I think they need to be seen at a lower resolution for proper evaluation.

image

Please let me know if you like them. Please do not feel obliged to implement. As someone in the creative field, I understand getting ideas rejected.

I do have a favorite... but I will not influence.

Please let me know, and we can expand adding the word "Pindash" for a full logo.

Thanks.

Using NFS shares

Hello again.

I was experimenting with Pinchflat and it looks promising.

I have a machine for downloading media and another that serves as a media center.

I'm afraid to try it, so I'd better ask: would it technically be possible to mount /downloads as an NFS share to download the media directly into my media center, or will I have issues?

Enhancement: Improve tab UI behaviour

The tabbed UI for sources (and the other models, to a lesser extent) needs some love. A few items, in no particular order:

  • Reloading the page should keep you on the same tab (added in #161)
  • Navigating back/forward should respect tab click history (added in #161)
  • Tabs that show records should have pagination to ensure all records can be displayed (added in #190)
  • Tabs that show records should have sensible ordering
  • (Stretch goal) Tabs should live update, if applicable
    • Edit: won't get to live updating now - will create a new ticket if that ever comes around

File permissions error on app boot

Hi, I just tried to run this software via Docker but I get an error with some DB table, I think.
The Docker command I launch is this:

docker create --name CTX_pinchflat --restart unless-stopped \
  -e PUID=1000 -e PGID=1000 -e TZ=Europe/Rome \
  --network net -h pinchflat -p 8945:8945 \
  -v /share_docker/pf_config:/config \
  --mount type=bind,source=/mnt/misc/TEMP_YT/,target=/downloads \
  --log-driver syslog --log-opt tag=DOCKER_pinchflat --log-opt syslog-address=udp://IP_LOCAL:514 \
  keglin/pinchflat:latest

Below is the error I get:

!! - Edited by admin - !!

This was a database error that was actually a file permissions error. 
The error was improved in #114 and the original error here is no longer relevant

Do I need to fix something? Thanks!

Request: support source-related file downloads (banners, avatars, NFOs)

See #68

Identifying the root directory of a source is a little harder than it seems, since we're very flexible about where/how you store your files. Multiple sources/profiles can overlap partially or totally in their file structure, and directories can be nested arbitrarily deep.

I'm hoping to attempt to discern the root directory somehow but it may require user input. We'll see!

REVISITED - Compatibility w/ Media Players (Kodi, Plex)

Hi.

Following up my earlier feature request. I have had a chance to review the changes you have made.

Earlier you mentioned making Pinchflat (PF) platform agnostic, and not focused on individual applications/media players. My comments below keep this in mind. You mentioned users being able to create custom scripts. I am not sure at what point in PF's process such a script would be executed, or how it can be invoked.

PF organizes downloads in unRAID /mnt/user/Media/YouTube/ChannelName/IndividualVideo. In each video folder it adds the downloaded files... mp4, nfo, and xxx-thumb.jpg.

My Kodi is picking up the folder structure correctly. I see 1) a list of channels. In each channel, I see 2) the list of all videos. Going into each video, I see 3) what should be the info for the video.

When I go into 1) it would be great if one could see all the Channel banners. I am not sure if these can be downloaded via the YT scraping method. I am not anticipating having a ton of channels. I can manually download the banner, and copy it to the root of the channel folder and rename it properly (banner.jpg for Kodi). But automating this in a custom script would be great.

When I go into 2), we see the list of videos. It would be great if there was an NFO with the channel's info. I know you are scraping the NFO for the video (more on this later), but the info/description for the channel itself would be great. Once in 2), we have the opportunity to see the thumbnail image for each video. I simply copied the xxx-thumb.jpg to a file called folder.jpg. This showed the thumbnails beautifully.

When I go into 3), I see the video alone and the xxx-thumb.jpg is properly displayed. I am not sure if it is reading the jpg file, or if it's displaying the embedded thumbnail. Nothing to do here. However, I should be seeing the NFO info here, but Kodi is not reading it properly. A quick look at the NFO compared to a stock Kodi NFO shows the main items are named the same... I tried copying regular NFOs from both movies and episodes, and none displayed the data. I think for now this is a Kodi issue.

As I mentioned above, I am not sure how custom scripts would operate as PF is doing its scraping/downloading. Should there be an entry in each Media Profile... something that injects a script command at the proper step that, for example, copies the xx-thumb.jpg to a folder.jpg file?

On a separate note, when PF is downloading a channel, there are also playlists in that channel (the playlist videos are included in the channel's videos as individual files). I think placing the individual playlist videos into a separate folder would be nice, something like:

- YouTube
    - Channel
        - video one
        - video two
        - Playlist A
            - PlaylistA-video1
            - PlaylistA-video2
        - video three
        - Playlist B
            - PlaylistB-video1
This would make watching the playlists nicely organized.

After all my long-winded explanations above, to summarize:

  • Download channel banner and channel-wide NFO
  • Ability to inject a command, like copying xxx-thumb to folder.jpg
  • Figure out why Kodi does not process NFOs
  • Separate playlists into their own subdirectories

You should create a support thread on the unRAID forum. If you are OK with it, I can post the above into that thread. This way Pinchflat gets more exposure/traction. For example, I am not a Plex/Jellyfin expert; people on the forum can chime in on how they would customize it to work with Plex.

Thank you!

FR: Redownload new videos after delay

Hello, really enjoying using pinchflat docker on unraid so far.

Just looking for some more info regarding how the download format is being selected.
From what I understand of download_option_builder.ex, it seems to already favour avc and m4a codecs, which is what I want.
But will an option be added to select the preferred codecs?
Seems your comments may already hint at adding a means for custom arguments.

Second, I am currently using YoutubeDL-Material and it has a feature that checks new uploads a day later, to allow enough time for higher-quality versions to be processed.
Screenshot 2024-03-25 225647
Could a delay option be added to the media profiles?
So that once a new video is detected it waits X time before fetching it.
Could also be handy if you plan on implementing sponsorblock segment removal in the future.

Docker Compose for Portainer?

Hello! Super interested in the project; it looks way simpler than what I've been using currently.

I'm pretty new to self hosting and I use Portainer to deploy services, with docker compose. I was looking over how to set up Pinchflat and wasn't sure how to do that like I normally would. It's possible that I'm missing how to go about it, but if not, is there a chance that a compose file could be made? I'm not sure how difficult that is, and if it's not of interest to the project, that's okay!

I use a NUC for a server, if that matters.

Thanks!

Support for other sites (Twitch, Nebula, etc)

Issue Statement

Pinchflat currently does not officially support any site other than YouTube and that isn't likely to change any time soon. That said, yt-dlp supports many sites and there's a fair chance Pinchflat would work with those sites - especially the most popular ones.

This closed issue will stand as a repository of known issues and workarounds for downloading media from other sites. It's almost certain that Pinchflat won't work 100% as expected with non-YouTube sites BUT if there's an issue that's an easy change and would also be applicable to YouTube, it may get fixed. Just remember that the focus is YouTube first and foremost and it's more like a happy accident if Pinchflat works with other sites.

Known Limitations

  • Fast indexing only works with YouTube - make sure you disable it for any non-YouTube sources
  • Some exclusively audio sources don't download

Workarounds

  • As a happy side effect of #225, iheartradio podcasts appear to work!
    • The {{ artist_name }} template option doesn't seem to work so you should use {{ source_custom_name }} or similar

Preferred resolution not downloading

I have a media profile set up for 4K, but videos are coming down in 480p most of the time. I checked some of these videos in the browser, and they are indeed 4K or 1080. Not sure which version I'm using (don't see it on the browser page). Using Docker on unRAID.

Remove items from "Pending Media"

I had to reboot my VM in the middle of downloads and it seems to have broken the app. There have been 2 downloads stuck in Pending Media with no way to remove them.

image

Plex and NFO files

Not sure if this is a Plex issue or Plex configuration issue...

  1. Add a new source to Media Profile for Plex, using Output path template of:
{{ source_custom_name }}/Season {{ season_from_date }}/{{ season_episode_from_date }} - {{ title }}.{{ ext }}

Enabled the following:

  • Download of Subtitles
  • Download of Autogenerated Subtitles
  • Download of Thumbnail
  • Embed Thumbail
  • Download Metadata
  • Embed Metadata
  • Download NFO data
  • Download Series Images
  1. Wait for source downloads to complete
  2. In Plex I made a Library type of "TV Show" (I assume this is correct) and select "Scan Library Files", wait for it to complete.
  • Channel is divided into seasons based on year as expected
  • Poster & Background image is set correctly
  • Each episode has proper icon shown

However, nothing else is populated within Plex, such as no channel summary, no episode title or summary, etc. I assume this is a short-coming of Plex itself not processing the respective *.NFO files??

Under Plex > Settings > Agents > Shows > Personal Media Shows both Personal Media Shows and Local Media Assets are enabled.

Is there something else needed to get Plex to process the *.NFO files per channel/episode?

Source Name display inconsistent

On sources page, "friendly" name of Source shows correct on main Sources list. Once you view the source, the breadcrumb is labeled "Source #$".

#enhancement

Request: Add media cap for media profile

Please consider adding an option to only download the last N videos from a channel/media source url. There are some channels I would like to add, but they have way too many older videos that I do not care to download.

Below is how I did this manually with yt-dlp. It works with channels or playlists:
--playlist-end TOTALCOUNT
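
For example, a full invocation might look like this (the channel URL and count are placeholders):

yt-dlp --playlist-end 10 "https://www.youtube.com/@SomeChannel/videos"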

Other nice to have functionality:

  • Selectable date range
  • max-filesize

Triage: Database file present but not writeable

Opening a new issue from #137, since the locked-database issue ended up turning into this one.

I did delete the contents of the config folder and ended up with this error, which looks very similar to the one I was getting before deleting it:

To fix the first issue, run "mix ecto.create" for the desired MIX_ENV.
To address the second, you can run "mix ecto.drop" followed by
"mix ecto.create", both for the desired MIX_ENV. Alternatively you may
configure Ecto to use another table and/or repository for managing
migrations:
    config :pinchflat, Pinchflat.Repo,
      migration_source: "some_other_table_for_schema_migrations",
      migration_repo: AnotherRepoForSchemaMigrations
The full error report is shown below.
** (Exqlite.Error) Database busy
CREATE TABLE IF NOT EXISTS "schema_migrations" ("version" INTEGER PRIMARY KEY, "inserted_at" TEXT)
    (ecto_sql 3.11.1) lib/ecto/adapters/sql.ex:1054: Ecto.Adapters.SQL.raise_sql_call_error/1
    (elixir 1.16.2) lib/enum.ex:1700: Enum."-map/2-lists^map/1-1-"/2
    (ecto_sql 3.11.1) lib/ecto/adapters/sql.ex:1161: Ecto.Adapters.SQL.execute_ddl/4
    (ecto_sql 3.11.1) lib/ecto/migrator.ex:755: Ecto.Migrator.verbose_schema_migration/3
    (ecto_sql 3.11.1) lib/ecto/migrator.ex:563: Ecto.Migrator.lock_for_migrations/4
    (ecto_sql 3.11.1) lib/ecto/migrator.ex:432: Ecto.Migrator.run/4
    (ecto_sql 3.11.1) lib/ecto/migrator.ex:170: Ecto.Migrator.with_repo/3
    nofile:1: (file)
13:47:05.353 [notice] Application pinchflat exited: Pinchflat.Application.start(:normal, []) returned an error: shutdown: failed to start child: Pinchflat.Boot.PreJobStartupTasks
    ** (EXIT) an exception was raised:
        ** (DBConnection.ConnectionError) connection not available and request was dropped from queue after 2000ms. This means requests are coming in and your connection pool cannot serve them fast enough. You can address this by:
  1. Ensuring your database is available and that you can connect to it
  2. Tracking down slow queries and making sure they are running fast enough
  3. Increasing the pool_size (although this increases resource consumption)
  4. Allowing requests to wait longer by increasing :queue_target and :queue_interval
See DBConnection.start_link/2 for more information
            (ecto_sqlite3 0.15.1) lib/ecto/adapters/sqlite3/connection.ex:47: Ecto.Adapters.SQLite3.Connection.prepare_execute/5
            (ecto_sql 3.11.1) lib/ecto/adapters/sql.ex:960: Ecto.Adapters.SQL.execute!/5
            (ecto_sql 3.11.1) lib/ecto/adapters/sql.ex:952: Ecto.Adapters.SQL.execute/6
            (pinchflat 0.1.5) lib/pinchflat/boot/pre_job_startup_tasks.ex:48: Pinchflat.Boot.PreJobStartupTasks.reset_executing_jobs/0
            (pinchflat 0.1.5) lib/pinchflat/boot/pre_job_startup_tasks.ex:34: Pinchflat.Boot.PreJobStartupTasks.init/1
            (stdlib 5.2) gen_server.erl:980: :gen_server.init_it/2
            (stdlib 5.2) gen_server.erl:935: :gen_server.init_it/6
            (stdlib 5.2) proc_lib.erl:241: :proc_lib.init_p_do_apply/3
Kernel pid terminated (application_controller) ("{application_start_failure,pinchflat,{{shutdown,{failed_to_start_child,'Elixir.Pinchflat.Boot.PreJobStartupTasks',{#{message => <<\"connection not available and request was dropped from queue after 2000ms. This means requests are coming in and your connection pool cannot serve them fast enough. You can address this by:\n\n  1. Ensuring your database is available and that you can connect to it\n  2. Tracking down slow queries and making sure they are running fast enough\n  3. Increasing the pool_size (although this increases resource consumption)\n  4. Allowing requests to wait longer by increasing :queue_target and :queue_interval\n\nSee DBConnection.start_link/2 for more information\n\">>,reason => queue_timeout,'__struct__' => 'Elixir.DBConnection.ConnectionError','__exception__' => true,severity => error},[{'Elixir.Ecto.Adapters.SQLite3.Connection',prepare_execute,5,[{file,\"lib/ecto/adapters/sqlite3/connection.ex\"},{line,47},{error_info,#{module => 'Elixi
Crash dump is being written to: erl_crash.dump...
13:47:07.930 [info] Checking permissions for /config
13:47:07.939 [info] Permissions OK
13:47:07.939 [info] Checking permissions for /downloads
13:47:07.945 [info] Permissions OK
13:47:07.945 [info] Checking permissions for /tmp/pinchflat/data
13:47:07.945 [info] Permissions OK
13:47:07.945 [info] Checking permissions for /config/extras
13:47:07.951 [info] Permissions OK
13:47:07.951 [info] Checking permissions for /config/metadata
13:47:07.958 [info] Permissions OK
13:47:12.326 [error] Could not create schema migrations table. This error usually happens due to the following:
  * The database does not exist
  * The "schema_migrations" table, which Ecto uses for managing
    migrations, was defined by another library
  * There is a deadlock while migrating (such as using concurrent
    indexes with a migration_lock)
To fix the first issue, run "mix ecto.create" for the desired MIX_ENV.
To address the second, you can run "mix ecto.drop" followed by
"mix ecto.create", both for the desired MIX_ENV. Alternatively you may
configure Ecto to use another table and/or repository for managing
migrations:
    config :pinchflat, Pinchflat.Repo,
      migration_source: "some_other_table_for_schema_migrations",
      migration_repo: AnotherRepoForSchemaMigrations
The full error report is shown below.
** (Exqlite.Error) Database busy
CREATE TABLE IF NOT EXISTS "schema_migrations" ("version" INTEGER PRIMARY KEY, "inserted_at" TEXT)
    (ecto_sql 3.11.1) lib/ecto/adapters/sql.ex:1054: Ecto.Adapters.SQL.raise_sql_call_error/1
    (elixir 1.16.2) lib/enum.ex:1700: Enum."-map/2-lists^map/1-1-"/2
    (ecto_sql 3.11.1) lib/ecto/adapters/sql.ex:1161: Ecto.Adapters.SQL.execute_ddl/4
    (ecto_sql 3.11.1) lib/ecto/migrator.ex:755: Ecto.Migrator.verbose_schema_migration/3
    (ecto_sql 3.11.1) lib/ecto/migrator.ex:563: Ecto.Migrator.lock_for_migrations/4
    (ecto_sql 3.11.1) lib/ecto/migrator.ex:432: Ecto.Migrator.run/4
    (ecto_sql 3.11.1) lib/ecto/migrator.ex:170: Ecto.Migrator.with_repo/3
    nofile:1: (file)
13:47:15.048 [notice] Application pinchflat exited: Pinchflat.Application.start(:normal, []) returned an error: shutdown: failed to start child: Pinchflat.Boot.PreJobStartupTasks
    ** (EXIT) an exception was raised:
        ** (Exqlite.Error) no such table: oban_jobs
UPDATE "oban_jobs" AS o0 SET "state" = ? WHERE (o0."state" = 'executing')
            (ecto_sql 3.11.1) lib/ecto/adapters/sql.ex:1054: Ecto.Adapters.SQL.raise_sql_call_error/1
            (ecto_sql 3.11.1) lib/ecto/adapters/sql.ex:952: Ecto.Adapters.SQL.execute/6
            (pinchflat 0.1.5) lib/pinchflat/boot/pre_job_startup_tasks.ex:48: Pinchflat.Boot.PreJobStartupTasks.reset_executing_jobs/0
            (pinchflat 0.1.5) lib/pinchflat/boot/pre_job_startup_tasks.ex:34: Pinchflat.Boot.PreJobStartupTasks.init/1
            (stdlib 5.2) gen_server.erl:980: :gen_server.init_it/2
            (stdlib 5.2) gen_server.erl:935: :gen_server.init_it/6
            (stdlib 5.2) proc_lib.erl:241: :proc_lib.init_p_do_apply/3
{removed_failing_handler,default}
Logger - error: 

Editing a source

On the main Sources page, clicking a source could enter view mode, instead of having the view button at the end of the row. Once inside source view mode, an edit source button feels natural.

#enhancement
#feature request

Docker container does not start on systems without IPv6

When I try to start the Docker container I get the following error:

docker run -p 8945:8945 -u 1000:1000 -v ./config:/config -v /mnt/raid/media/videos:/downloads ghcr.io/kieraneglin/pinchflat:latest
08:32:03.085 [info] Checking permissions for /config
08:32:03.087 [info] Permissions OK
08:32:03.087 [info] Checking permissions for /downloads
08:32:03.087 [info] Permissions OK
08:32:03.087 [info] Checking permissions for /downloads
08:32:03.087 [info] Permissions OK
08:32:03.087 [info] Checking permissions for /tmp/pinchflat/data
08:32:03.087 [info] Permissions OK
08:32:03.087 [info] Checking permissions for /config/extras
08:32:03.087 [info] Permissions OK
08:32:03.087 [info] Checking permissions for /config/metadata
08:32:03.087 [info] Permissions OK
08:32:03.375 [info] Migrations already up
08:32:04.023 [info] Reset 0 executing jobs
08:32:04.035 [error] Failed to start Ranch listener PinchflatWeb.Endpoint.HTTP in :ranch_tcp:listen([cacerts: :..., key: :..., cert: :..., ip: {0, 0, 0, 0, 0, 0, 0, 0}, port: 8945]) for reason :eafnosupport (address family not supported by protocol family)

08:32:04.036 [notice] Application pinchflat exited: Pinchflat.Application.start(:normal, []) returned an error: shutdown: failed to start child: PinchflatWeb.Endpoint
    ** (EXIT) shutdown: failed to start child: {:ranch_listener_sup, PinchflatWeb.Endpoint.HTTP}
        ** (EXIT) shutdown: failed to start child: :ranch_acceptors_sup
            ** (EXIT) {:listen_error, PinchflatWeb.Endpoint.HTTP, :eafnosupport}
Kernel pid terminated (application_controller) ("{application_start_failure,pinchflat,{{shutdown,{failed_to_start_child,'Elixir.PinchflatWeb.Endpoint',{shutdown,{failed_to_start_child,{ranch_listener_sup,'Elixir.PinchflatWeb.Endpoint.HTTP'},{shutdown,{failed_to_start_child,ranch_acceptors_sup,{listen_error,'Elixir.PinchflatWeb.Endpoint.HTTP',eafnosupport}}}}}}},{'Elixir.Pinchflat.Application',start,[normal,[]]}}}")

I think the issue is that the Docker container is trying to listen on an IPv6 address, but my system intentionally does not support IPv6, only IPv4.

Usually, the Docker images I use for other apps allow overriding the default listening address and port by providing either an environment variable or a command-line argument. Does Pinchflat have such an option?

Feedback: Podcast RSS Feed

Hi,

Positives:

  1. I'm able to access the RSS feed over the Tailscale.
  2. Channel art is there.
  3. Fast!!!

Negatives:

  1. The podcast doesn't show the duration of unplayed episodes.

image

vs Podsync

image

Request: Better Compatibility w/ Kodi, Plex, and Jellyfin

I think only a few small things are needed to improve compatibility w/ Kodi and Plex (not 100%).

I am not exactly sure what can be scraped by Pinchflat when running. From what I can surmise, the below looks doable.

On my server I have a folder named "YouTube" (/mnt/user/Media/YouTube). This is the folder mapped to the Docker container. For the testing below, I have only done one channel. In my case it's:
https://www.youtube.com/@BaumgartnerRestoration

Is it possible to get an image for the channel itself? On the channel's home page, I see a banner-looking image. I downloaded this image, named it banner.jpg, and placed it in the root folder for the one source. In Kodi, I added a source and told it that it contains TV shows, but excluded them from scanning AND set it to scrape local information. I had to give it a TV category so I can get "banner view" in my skin.

This is what it looks like when I go to my YouTube source:

image

When scraping and downloading each video, you are saving a .webp image as a thumbnail of the video. Can you please create a JPEG copy of this file and name it folder.jpg? This way, when browsing the selected channel in Kodi, you get the image below. I converted the .webp image into a folder.jpg file.

image
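
As a sketch of one possible way to do the conversion described above (assuming ffmpeg is available on the host; the input filename is a placeholder):

ffmpeg -i "video-thumb.webp" folder.jpg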

I am not sure what other info can be scraped, but it would be nice to get additional info and store it in Kodi-compatible .nfo files.

I think Plex can use the same images... as this is what my media library contains, and I do have Plex for my family members.

I think this has the potential to be the next *arr application.... pincharr.

Please let us know your thoughts.

Thank you!

[Bug] Orphaned files on 'Delete Source + Files'

Expected behaviour: When choosing Delete Source + Files from source info page, all files are deleted and download queue is cancelled.

Current behaviour: Choosing Delete Source + Files while a playlist is being downloaded results in orphaned files in the target path that were in the queue(s).

FR: Add a setting to retain "X" number of episodes

This is such an amazing app.

I have a use case where it would be ideal to save a certain number of episodes, or only episodes newer than X days.

While I can do this in Plex, it is not very granular, and it would be great to handle it on the Pinchflat end.

I have multiple channels where I prefer to watch locally, but I don't need to keep all of the old, outdated episodes.

It would also be great if we could flag certain episodes to keep forever (overriding this rule for those episodes).

Thanks for considering

Trouble with Jellyfin accepted filepath creation

I've followed these instructions, but I can't seem to get the YouTube videos I'm downloading to organize nicely within Jellyfin. They appear, but the folder structure isn't being respected.


When looking at TV shows I have, the structure is:

For the show's collection images:

[Show]\[name].png

[name] is backdrop, folder, and logo

For a season's poster:

[Show]\Season ##\folder.png

For a season's episode images:

[Show]\Season ##\metadata\[Show] - S##E## - [EpisodeTitle].png

For the episodes:

[Show]\Season ##\[Show] - S##E## - [EpisodeTitle].mkv

In contrast, when looking at the videos downloaded by Pinchflat, the structure is:

For the show's collection images:

[Show]\[name].png

[name] is banner, fanart, and poster

For a season's poster:

N/A

For a season's episode images:

[Show]\Season ####\[Show] [S####E####] [EpisodeTitle].jpg

For the episodes:

[Show]\Season ####\[Show] [S####E####] [EpisodeTitle].mp4

I'm not sure if I'm doing anything wrong or if Jellyfin is just particularly picky and Pinchflat needs some sort of update to accommodate. In either case, help would be appreciated. Thank you.

Shorts aren't always detected correctly

In some cases, shorts aren't returned with /shorts/ in their URL, causing shorts detection to fail.

I'm already working on this in #58, but I wanted to put something here for visibility 🤙🏻

Feature Request: delete old media after a certain time frame

In regards of this topic #60

Time-Window Management: The system should automatically track the upload dates of videos. Only videos uploaded within the last two weeks from the current date should be eligible for download and storage.

Automatic Updates: The container must periodically check the specified YouTube channel for new video uploads. Any new content that fits within the two-week window should be downloaded automatically.

Storage Optimization: As new videos are downloaded, videos that fall outside of the two-week window should be automatically deleted from the storage. This ensures that the storage does not become overwhelmed with outdated content.

Configurability: Users should be able to configure the specific start and end dates for the two-week window, allowing for flexibility in how the content is refreshed.

FR: Allow setting frame-ancestors header for adding Pinchflat to a dashboard like Organizr

An option either in-app or during Docker config to set the CSP frame-ancestors header would allow Pinchflat to be added to a network services organizer dashboard for simplified access.

Setting the CSP frame-ancestors header should override the current X-Frame-Options header and retain click-jacking security.

I'm only familiar with this from an Nginx approach rather than Elixir, but this seemed like a possible solution.
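
For reference, a rough sketch of that Nginx approach when reverse-proxying Pinchflat (the dashboard hostname is a placeholder and this is not an officially supported configuration):

location / {
    proxy_pass http://127.0.0.1:8945;
    # Drop the X-Frame-Options header from the upstream response
    proxy_hide_header X-Frame-Options;
    # Allow embedding only from the dashboard's origin
    add_header Content-Security-Policy "frame-ancestors 'self' https://organizr.example.com" always;
}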

FR: Allow "ignoring" media items to prevent redownloading

Maybe a non-standard request... I am not one to sit on downloaded media from YT. Once I watch it, I don't want to store it (at least not most of it). I have a channel I am testing this with, and I noticed that once I delete an episode after watching, it just redownloads anything it doesn't "see" in my folder from the date I specified.

Is there a possibility of building in a mechanism to NOT redownload? Perhaps an option to auto-increment the cutoff date to the date of the most recently downloaded media, as a way to not look back past that date and redownload purged media. As it stands, I'm assuming I will have to manually set a new cutoff for the back-date and purge periodically.

Thanks, this app has been amazing so far otherwise!

Issues with SQLite and network shares

Hi 👋🏼

I saw that you originally had Postgres and removed it in favor of SQLite, which makes sense so people using your app have no external deps. However, I am curious if you are interested in adding support for multiple databases? At least being able to support SQLite and Postgres.

Using Postgres gives more flexibility than SQLite, which has a long and sordid past, especially when it comes to storing the database on a network filesystem.

Sonarr/Sonarr#1886
https://www.sqlite.org/useovernet.html

Provide alternate image registry to dockerhub

Hi 👋🏼 Awesome project here...

Docker Hub has aggressive pull rate limits; it would be awesome if you could push the image to the GitHub Container Registry (GHCR) too.

Docker Hub was one of the best options years ago... GHCR is completely free for open-source projects. It's also nice in that it integrates into your repo under the Packages section. Take a look at my project here and look at the packages on the right side.

Thanks!

Pinchflat suddenly stopped

Hi, I noticed that my Pinchflat Docker container stopped today. I'm using the latest Unraid template.

12:51:26.091 request_id=F8Cc0nMgAESLQGYAAriB [info] Sent 200 in 2ms
12:52:26.095 request_id=F8Cc4GvI-52zTdsAArnB [info] GET /
12:52:26.097 request_id=F8Cc4GvI-52zTdsAArnB [info] Sent 200 in 1ms
12:52:38.778 [info] {"args":{},"id":50,"meta":{},"system_time":1711540358778059672,"max_attempts":20,"queue":"local_metadata","worker":"Pinchflat.Boot.DataBackfillWorker","source":"oban","event":"job:start","attempt":1,"tags":["media_item","media_metadata","local_metadata","data_backfill"]}
12:52:38.778 [info] Running data backfill worker
12:52:38.779 [info] {"args":{},"id":50,"meta":{},"state":"success","max_attempts":20,"queue":"local_metadata","worker":"Pinchflat.Boot.DataBackfillWorker","source":"oban","duration":1034,"event":"job:stop","queue_time":748701,"attempt":1,"tags":["media_item","media_metadata","local_metadata","data_backfill"]}
12:53:01.804 [info] {"args":{"id":2},"id":13,"meta":{},"system_time":1711540381804919871,"max_attempts":20,"queue":"media_collection_indexing","worker":"Pinchflat.SlowIndexing.MediaCollectionIndexingWorker","source":"oban","event":"job:start","attempt":1,"tags":["media_source","media_collection_indexing"]}
12:53:01.805 [info] [yt-dlp] called with: https://www.youtube.com/@myportowcy001 --simulate --skip-download --ignore-no-formats-error --print-to-file %(.{id,title,was_live,webpage_url,description,aspect_ratio,duration,upload_date})j /tmp/pinchflat/data/3770a6f0cfaad57db525b537397014aaad4b99ee95febe0c508fc2bb00c42ea1.json
12:53:26.102 request_id=F8Cc7mR-unIAbxIAArxB [info] GET /
12:53:26.105 request_id=F8Cc7mR-unIAbxIAArxB [info] Sent 200 in 2ms
12:54:26.110 request_id=F8Cc_F05OZqV6nUAAr7B [info] GET /
12:54:26.112 request_id=F8Cc_F05OZqV6nUAAr7B [info] Sent 200 in 1ms
12:55:26.116 request_id=F8CdClXevC8rPwYAAsGB [info] GET /
12:55:26.118 request_id=F8CdClXevC8rPwYAAsGB [info] Sent 200 in 2ms
12:56:26.123 request_id=F8CdGE6RWw9zR00AAsQB [info] GET /
12:56:26.125 request_id=F8CdGE6RWw9zR00AAsQB [info] Sent 200 in 1ms
12:57:26.130 request_id=F8CdJkdE9TU1xvAAAscB [info] GET /
12:57:26.132 request_id=F8CdJkdE9TU1xvAAAscB [info] Sent 200 in 1ms
12:58:26.136 request_id=F8CdND_okZWEJecAAslB [info] GET /
12:58:26.138 request_id=F8CdND_okZWEJecAAslB [info] Sent 200 in 1ms
12:59:26.143 request_id=F8CdQjiQgzfaJloAAsxB [info] GET /
12:59:26.148 request_id=F8CdQjiQgzfaJloAAsxB [info] Sent 200 in 5ms
13:00:26.154 request_id=F8CdUDGHirv5vskAAs8B [info] GET /
13:00:26.156 request_id=F8CdUDGHirv5vskAAs8B [info] Sent 200 in 2ms
13:00:31.632 [notice] SIGTERM received - shutting down

Regards.

Triage: Possible memory leak

First off, really loving the application.

I left Pinchflat to do its thing overnight, as it all seems pretty good to go running on Nomad on my homelab. In checking out the metrics this morning, I noticed the memory kept creeping up, then maxed out, then shot back down. Any idea what's going on there?

image

This is a lot of what my logs are full of:

22:01:47.192 [error] Exqlite.Connection (#PID<0.1977.0>) disconnected: ** (DBConnection.ConnectionError) client #PID<0.2017.0> timed out because it queued and checked out the connection for longer than 15000ms

#PID<0.2017.0> was at location:

    (exqlite 0.19.0) lib/exqlite/sqlite3.ex:102: Exqlite.Sqlite3.multi_step/3
    (exqlite 0.19.0) lib/exqlite/sqlite3.ex:140: Exqlite.Sqlite3.try_fetch_all/3
    (exqlite 0.19.0) lib/exqlite/sqlite3.ex:134: Exqlite.Sqlite3.fetch_all/3
    (exqlite 0.19.0) lib/exqlite/connection.ex:672: Exqlite.Connection.get_rows/2
    (exqlite 0.19.0) lib/exqlite/connection.ex:618: Exqlite.Connection.execute/4
    (db_connection 2.6.0) lib/db_connection/holder.ex:354: DBConnection.Holder.holder_apply/4
    (db_connection 2.6.0) lib/db_connection.ex:1512: DBConnection.run_execute/5

22:35:32.832 [info] {"args":{"id":12},"id":370,"meta":{},"system_time":1711406132831790727,"max_attempts":20,"queue":"media_collection_indexing","worker":"Pinchflat.SlowIndexing.MediaCollectionIndexingWorker","source":"oban","event":"job:start","attempt":12,"tags":["media_source","media_collection_indexing"]}
22:35:32.859 [info] [yt-dlp] called with: https://www.youtube.com/@SesameStreet --simulate --skip-download --ignore-no-formats-error --print-to-file %(.{id,title,was_live,webpage_url,description,aspect_ratio,duration,upload_date})j /tmp/pinchflat/data/c86265ac80577f6e5ad68b6cc10462abe2b6b9181f10f218dbed3e0ab2d004f4.json
22:37:55.131 [error] Exqlite.Connection (#PID<0.1977.0>) disconnected: ** (DBConnection.ConnectionError) client #PID<0.2017.0> timed out because it queued and checked out the connection for longer than 15000ms
