tubeup's Introduction

Tubeup - a multi-VOD service to Archive.org uploader

tubeup uses yt-dlp to download a YouTube video (or a video from any other provider supported by yt-dlp), and then uploads it with all of its metadata to the Internet Archive using the internetarchive Python module.

It was designed by the Bibliotheca Anonoma to archive single videos, playlists (see the warning below about uploading anything larger than single videos), or whole accounts to the Internet Archive.

Prerequisites

We strongly recommend running this script on Linux or some other POSIX system (such as macOS), preferably on a rented VPS and not your personal machine or phone.

Recommended system specifications:

  • Linux VPS with Python 3.8 or higher and pipx installed
  • 2 GB of RAM and 100 GB of storage, or much more for anything other than single short video mirroring. If your OS drive is too small, symlink the download directory to something larger (see the sketch after this list).
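
For illustration, a minimal sketch of relocating tubeup's working directory onto a bigger disk. The mount point /mnt/bigdisk is a placeholder, and ~/.tubeup is the download directory older tubeup versions use (the logs further down this page show ~/.tubeup/downloads); check where your version actually writes before doing this:

   # Hypothetical larger disk mounted at /mnt/bigdisk
   mkdir -p /mnt/bigdisk/tubeup
   rmdir ~/.tubeup 2>/dev/null      # only succeeds if the directory is empty
   ln -s /mnt/bigdisk/tubeup ~/.tubeup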

Setup and Installation

  1. Install ffmpeg, pipx (packaged as python3-pipx on Debian/Ubuntu, or python-pipx on Arch), and git.
    To install ffmpeg on Ubuntu, enable the Universe repository.

For Debian/Ubuntu:

   sudo apt install ffmpeg python3-pipx git

Then run:

pipx ensurepath
  2. Use pipx to install the required Python packages. Python 3.8 or newer is required (the latest Python is preferred).
   pipx install tubeup --include-deps
  3. If you don't already have an Internet Archive account, register for one to give the script upload privileges.

  4. Configure internetarchive with your Internet Archive account.

   ia configure

You will be prompted for your login credentials for the Internet Archive account you use.

Once configured to upload, you're ready to go.

  5. Start archiving a video by running the script on a URL (or multiple URLs) supported by yt-dlp; for YouTube, this includes account URLs and playlist URLs. An example invocation appears at the end of this section.
   tubeup <url>
  6. Each archived video gets its own Archive.org item. Check out what you've uploaded at

    http://archive.org/details/@YOURUSERNAME.

Periodically, before running, upgrade tubeup and its dependencies by running:

   pipx upgrade-all
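
For example, to archive several URLs in one run (the video and playlist IDs below are placeholders, not real content):

   tubeup "https://www.youtube.com/watch?v=VIDEOID" \
          "https://www.youtube.com/playlist?list=PLAYLISTID"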

Docker

Dockerized tubeup is provided by etnguyen03/docker-tubeup; usage instructions are provided in that repository.

Windows Setup

  1. Install WSL2, pick a distribution of your choice. Ubuntu is popular and well-supported.
  2. Use Windows Terminal by Microsoft to interact with the WSL2 instance.
  3. Fully update the Linux installation with your package manager of choice: sudo apt update ; sudo apt upgrade
  4. Install pipx, ffmpeg, and git.
  5. Install tubeup following the Linux setup steps above (pipx install tubeup --include-deps) and configure internetarchive for your Archive.org account. The commands are collected in the sketch after this list.
  6. Periodically update your Linux packages and Python packages.
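
Putting the WSL2 steps together in one place (a restatement of the commands already documented in the Linux section above):

   sudo apt update && sudo apt upgrade
   sudo apt install ffmpeg python3-pipx git
   pipx ensurepath        # then restart the shell so pipx-managed commands are on PATH
   pipx install tubeup --include-deps
   ia configure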

Usage

Usage:
  tubeup <url>... [--username <user>] [--password <pass>]
                  [--metadata=<key:value>...]
                  [--cookies=<filename>]
                  [--proxy <prox>]
                  [--quiet] [--debug]
                  [--use-download-archive]
                  [--output <output>]
                  [--ignore-existing-item]
  tubeup -h | --help
  tubeup --version
Arguments:
  <url>                         yt-dlp compatible URL to download.
                                Check yt-dlp documentation for a list
                                of compatible websites.
  --metadata=<key:value>        Custom metadata to add to the archive.org
                                item.
Options:
  -h --help                    Show this screen.
  --proxy <prox>               Use a proxy while uploading.
  -u --username <user>         Provide a username, for sites like Nico Nico Douga.
  -p --password <pass>         Provide a password, for sites like Nico Nico Douga.
  -a --use-download-archive    Record the video URL to the download archive.
                               This will download only videos not listed in
                               the archive file, and record the IDs of all
                               downloaded videos in it.
  -q --quiet                   Just print errors.
  -d --debug                   Print all logs to stdout.
  -o --output <output>         yt-dlp output template.
  -i --ignore-existing-item    Don't check if an item already exists on archive.org.
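
For example, a hedged invocation combining several of the documented options (the cookies file and video ID are placeholders):

   tubeup --use-download-archive \
          --cookies=cookies.txt \
          --quiet \
          "https://www.youtube.com/watch?v=VIDEOID"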

Metadata

You can specify custom metadata with the --metadata flag. For example, tubeup uploads your video to the Community Video collection by default; to upload to a different collection instead, pass --metadata:

   tubeup --metadata=collection:opensource_audio <url>

Any arbitrary metadata can be added to the item, with a few exceptions. You can learn more in the archive.org metadata documentation.
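
The --metadata flag can be repeated to set several fields at once. A sketch (the subject value and URL are purely illustrative):

   tubeup --metadata=collection:opensource_audio \
          --metadata=subject:podcasts \
          "https://example.com/some/audio/page"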

Collections

Archive.org users can upload to four open collections: Community Video (opensource_movies, tubeup's default), Community Audio (opensource_audio), Community Texts (opensource), and Community Software (open_source_software).

Note that care should be taken when uploading entire channels. Read the Internet Archive's guidance on creating collections, and contact the collections staff if you're uploading a channel or multiple channels on one subject (gaming or horticulture, for example). Internet Archive collections staff will either create a collection for you or merge already-uploaded items, matched by YouTube uploader name, into a new collection.

Dumping entire channels into Community Video is abusive and may get your account locked. Talk to the Internet Archive admins before doing large uploads; it's better to ask for guidance or help first than to run afoul of the rules.

If you do not own a collection, you will need to be added as an admin of that collection in order to upload to it. Talk to the collection owner or staff if you need assistance with this.

Troubleshooting

  • Some videos are copyright-blocked in certain countries. Use the --proxy option (or a torrenting/privacy VPN) to route around this; Sweden and Germany are good countries for bypassing geo-restrictions. See the example after this list.
  • Upload taking forever? Getting S3 throttling on upload? Tubeup has been specifically tailored to wait as long as possible before failing, and we've never seen an S3 outage that outlasted the extremely long wait times set in Tubeup.
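
A hedged example of the --proxy option from the usage section above (the proxy address is a placeholder; the exact proxy schemes accepted depend on yt-dlp and internetarchive):

   tubeup --proxy http://PROXYHOST:8080 "https://www.youtube.com/watch?v=VIDEOID"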

A note on live videos

Do not use Tubeup to archive live YouTube (or any other site's) video streams. We will not and cannot fix it; it's not our problem, and any solutions are unpalatable, since they would add more code complexity to maintain on top of having to disable live chat for one extractor only for live video.

Major Credits (in no particular order)

  • emijrp who wrote the original youtube2internetarchive.py in 2012
  • Matt Hazinski who forked emijrp's work in 2015 with numerous improvements of his own.
  • Antonizoon for switching the script to library calls rather than functioning as an external script, and many small improvements.
  • Small PRs from various people, both in and out of BibAnon.
  • vxbinaca for stabilizing downloads/uploads in yt-dlp/internetarchive library calls, cleansing item output, subtitles collection, and numerous small improvements over time.
  • mrpapersonic for adding logic to check if an item already exists in the Internet Archive and if so skips ingestion.
  • Jake Johnson of the Internet Archive for adding variable collections as a flag, switching Tubeup from a script to a PyPI package, ISO-compliant item dates, fixing what others couldn't, and many improvements.
  • Refeed for re-basing the code to OOP, turning Tubeup itself into a library, adding download and upload progress bars, and squashing bugs.

License (GPLv3)

Copyright (C) 2024 Bibliotheca Anonoma

This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

You should have received a copy of the GNU General Public License along with this program. If not, see http://www.gnu.org/licenses/.

tubeup's People

Contributors

antonizoon, brandongalbraith, captainhook, chandraprakash9029, coloradohusky, darklinkxxxx, emijrp, failedshack, jjjake, johtso, matthazinski, midgleyc, mrpapersonic, nemobis, puigru, refeed, rjw1, svrnwnsch, upintheairsheep, vxbinaca, windows81, yzqzss


tubeup's Issues

Add "version" prefix if current archive is low quality

Many YouTube video items are archived in low quality because they were archived using an old version of tubeup. Could a feature be added that detects whether the current archive is high quality and, if it isn't, uses a versioned item name (e.g. "youtube-[id].1", "youtube-[id].2")?

I am assuming that the current tubeup uses "bestvideo+bestaudio" or equivalent, as that seems to be the way to get the highest quality download.

Requires manually installing version 0.4 of jsonpatch

I just followed the installation instructions on OS X, and got an error when I tried running tubeup that seems to have amounted to it expecting strictly version 0.4 of jsonpatch. I uninstalled jsonpatch and reinstalled it as 0.4 and... well, it moved on to a different error, but that's a matter for another thread...

Interestingly, it seems like it had originally installed jsonpatch 0.4 in the first place, but updated it to 1.16 when I ran the "periodically upgrade tubeup and its dependencies" line, so if the requirement isn't going to change, it might be worth taking jsonpatch out of the list of things to update in that line.

Upload URL should use https

Please change

Upload URL: http://archive.org/details/youtube-xxxxxxxx

to

Upload URL: https://archive.org/details/youtube-xxxxxxxx

Files are downloaded, but not uploaded

Since the latest updates to tubeup, archiving is not working for YouTube channels or playlists; it only works for one URL at a time. A whole channel can't be archived: tubeup downloads everything but doesn't even start uploading. The same problem also happens when trying to archive vid.me - tubeup downloads everything and doesn't upload. Periscope and Facebook URLs have the same issue.

root@s:~# tubeup https://www.facebook.com/.../videos/.../
[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2017.11.26
[debug] Python version 3.4.3 - Linux-3.4.0+-x86_64-with-Ubuntu-14.04-trusty
[debug] exe versions: avconv 9.20-6, avprobe 9.20-6, ffmpeg 3.3.3, ffprobe 3.3.3
[debug] Proxy map: {}
[download] 100.0% of 150.66MiB at  4.01MiB/s ETA 00:00
{'downloaded_bytes': 157978090, 'filename': '../mnt/f/youtube/tubeup/downloads/....mp4', 'total_bytes': 157978090, '_total_bytes_str': '150.66MiB', '_elapsed_str': '00:37', 'elapsed': 37.57568573951721, 'status': 'finished'}
Downloaded ../mnt/f/youtube/tubeup/downloads/....mp4
[download] 100.0% of 19.91MiB at  2.60MiB/s ETA 00:00
{'downloaded_bytes': 20880946, 'filename': '../mnt/f/youtube/tubeup/downloads/....m4a', 'total_bytes': 20880946, '_total_bytes_str': '19.91MiB', '_elapsed_str': '00:07', 'elapsed': 7.6777238845825195, 'status': 'finished'}
Downloaded ../mnt/f/youtube/tubeup/downloads/....m4a

root@s:~# tubeup https://www.pscp.tv/w/...
[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2017.11.26
[debug] Python version 3.4.3 - Linux-3.4.0+-x86_64-with-Ubuntu-14.04-trusty
[debug] exe versions: avconv 9.20-6, avprobe 9.20-6, ffmpeg 3.3.3, ffprobe 3.3.3
[debug] Proxy map: {}
[download] 100.0% of ~66.01MiB at 746.25KiB/s ETA 00:00
{'total_bytes': 69218404, '_elapsed_str': '01:08', 'filename': '../mnt/f/youtube/tubeup/downloads/....mp4', '_total_bytes_str': '66MiB', 'elapsed': 68.55479526519775, 'downloaded_bytes': 69218404, 'status': 'finished'}
Downloaded ../mnt/f/youtube/tubeup/downloads/........mp4

root@s:~# tubeup https://www.pscp.tv/w/...
[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2017.11.26
[debug] Python version 3.4.3 - Linux-3.4.0+-x86_64-with-Ubuntu-14.04-trusty
[debug] exe versions: avconv 9.20-6, avprobe 9.20-6, ffmpeg 3.3.3, ffprobe 3.3.3
[debug] Proxy map: {}
root@s:~#

Just ran an update for components (sudo -H pip3 install -U pip tubeup) but the problem is still occurring.

Downloaded ../mnt/f/youtube/tubeup/downloads/....mp4
root@s:~# tubeup https://www.pscp...
[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2017.12.02
[debug] Python version 3.4.3 - Linux-3.4.0+-x86_64-with-Ubuntu-14.04-trusty
[debug] exe versions: avconv 9.20-6, avprobe 9.20-6, ffmpeg 3.3.3, ffprobe 3.3.3
[debug] Proxy map: {}

{'filename': '../mnt/f/youtube/tubeup/downloads/....mp4', '_total_bytes_str': '66.01MiB', 'total_bytes': 69218404, 'status': 'finished'}
Downloaded ../mnt/f/youtube/tubeup/downloads/....mp4
root@s:~#

Archive.org - So many items are 'not found'

I've noticed so many videos I randomly click on are not found.

The item you have requested had an error:
Item cannot be found.
which prevents us from displaying this page.

If you would like to report this problem as an error report, you may do so here.

For example, some random Japanese video; IDK what it even is, but it was mirrored from niconico.

[screenshot]

https://archive.org/details/niconico-sm26635762

I clicked on many (90% Molyneux videos in the screenshots) and they are all 'not found'.
[screenshot]

Also many videos of Jordan B. Peterson are gone.

But the fact is that so much of what I and many others have uploaded is lost. Is this an archive.org internal problem, or do they remove items on purpose now (because of tags..)?

They seem to be digging their own grave.

[screenshot]

Add Support for Youku (youku.com)

Hi, when I try to upload Youku videos, it gives me an error with the upload_date. Here's the complete output:

BLANK@BLANK-550P5C-550P7C:~⟫ tubeup http://v.youku.com/v_show/id_XMzc3OTgzMzQw.html
[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2017.02.24.1
[debug] Python version 3.5.2 - Linux-4.4.0-62-generic-x86_64-with-Ubuntu-16.04-xenial
[debug] exe versions: avconv 2.8.11-0ubuntu0.16.04.1, avprobe 2.8.11-0ubuntu0.16.04.1, ffmpeg 2.8.11-0ubuntu0.16.04.1, ffprobe 2.8.11-0ubuntu0.16.04.1, rtmpdump 2.4
[debug] Proxy map: {}
[debug] Public IP address: 2601:600:9201:8030:4c3:a3e3:c362:986c
{'total_bytes': 10357238, 'downloaded_bytes': 10357238, 'filename': '/home/BLANK/.tubeup/downloads/5_niconico_iQUe_GBA-XMzc3OTgzMzQw_part1.flv', '_total_bytes_str': '9.88MiB', 'elapsed': 2.124525547027588, '_elapsed_str': '00:02', 'status': 'finished'}
:: Downloaded: /home/BLANK/.tubeup/downloads/5_niconico_iQUe_GBA-XMzc3OTgzMzQw_part1.flv...
{'total_bytes': 10283148, 'downloaded_bytes': 10283148, 'filename': '/home/BLANK/.tubeup/downloads/5_niconico_iQUe_GBA-XMzc3OTgzMzQw_part2.flv', '_total_bytes_str': '9.81MiB', 'elapsed': 3.3654489517211914, '_elapsed_str': '00:03', 'status': 'finished'}
:: Downloaded: /home/BLANK/.tubeup/downloads/5_niconico_iQUe_GBA-XMzc3OTgzMzQw_part2.flv...
:: Uploading /home/BLANK/.tubeup/downloads/5_niconico_iQUe_GBA-XMzc3OTgzMzQw_part1...
Traceback (most recent call last):
File "/usr/local/bin/tubeup", line 9, in
load_entry_point('tubeup==0.0.4', 'console_scripts', 'tubeup')()
File "/usr/local/lib/python3.5/dist-packages/tubeup/main.py", line 272, in main
identifier, meta = upload_ia(video, custom_meta=md)
File "/usr/local/lib/python3.5/dist-packages/tubeup/main.py", line 161, in upload_ia
d = datetime.strptime(vid_meta['upload_date'], '%Y%m%d')
KeyError: 'upload_date'

Suggestion: Option to archive geo-blocked video/channel (youtube-dl feature)

Currently when trying to archive a channel it shows

ERROR: Youtube said: This channel is not available in your country.

But yt-dl has an option to bypass this:

https://github.com/rg3/youtube-dl#geo-restriction

--geo-bypass  Bypass geographic restriction via faking X-Forwarded-For HTTP header (experimental)

It would be a good addition to this script if we could add this flag at the end or somewhere when launching the tubeup command, because proxies rarely work for me, so I have used VPNs in the past to bypass YouTube geo-censorship.

Twitter Video Upload Doesn't Work

The output is long but is listed below:

~$ tubeup  https://twitter.com/GreenDay/status/910545872206700544 
[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2017.10.01
[debug] Python version 3.5.2 - Linux-4.10.0-35-generic-i686-with-LinuxMint-18.2-sonya
[debug] exe versions: ffmpeg 2.8.11-0ubuntu0.16.04.1, ffprobe 2.8.11-0ubuntu0.16.04.1
[debug] Proxy map: {}
[debug] Public IP address: 2601:600:9880:390:818a:d092:d4b2:18e
ffmpeg version 2.8.11-0ubuntu0.16.04.1 Copyright (c) 2000-2017 the FFmpeg developers
  built with gcc 5.4.0 (Ubuntu 5.4.0-6ubuntu1~16.04.4) 20160609
  configuration: --prefix=/usr --extra-version=0ubuntu0.16.04.1 --build-suffix=-ffmpeg --toolchain=hardened --libdir=/usr/lib/i386-linux-gnu --incdir=/usr/include/i386-linux-gnu --cc=cc --cxx=g++ --enable-gpl --enable-shared --disable-stripping --disable-decoder=libopenjpeg --disable-decoder=libschroedinger --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzvbi --enable-openal --enable-opengl --enable-x11grab --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-frei0r --enable-libx264 --enable-libopencv --disable-i686
  WARNING: library configuration mismatch
  avcodec     configuration: --prefix=/usr --extra-version=0ubuntu0.16.04.1 --build-suffix=-ffmpeg --toolchain=hardened --libdir=/usr/lib/i386-linux-gnu --incdir=/usr/include/i386-linux-gnu --cc=cc --cxx=g++ --enable-gpl --enable-shared --disable-stripping --disable-decoder=libopenjpeg --disable-decoder=libschroedinger --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzvbi --enable-openal --enable-opengl --enable-x11grab --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-frei0r --enable-libx264 --enable-libopencv --disable-i686 --enable-version3 --disable-doc --disable-programs --disable-avdevice --disable-avfilter --disable-avformat --disable-avresample --disable-postproc --disable-swscale --enable-libopencore_amrnb --enable-libopencore_amrwb --enable-libvo_aacenc --enable-libvo_amrwbenc
  libavutil      54. 31.100 / 54. 31.100
  libavcodec     56. 60.100 / 56. 60.100
  libavformat    56. 40.101 / 56. 40.101
  libavdevice    56.  4.100 / 56.  4.100
  libavfilter     5. 40.101 /  5. 40.101
  libavresample   2.  1.  0 /  2.  1.  0
  libswscale      3.  1.101 /  3.  1.101
  libswresample   1.  2.101 /  1.  2.101
  libpostproc    53.  3.100 / 53.  3.100
[hls,applehttp @ 0x82c99a0] HLS request for url 'https://video.twimg.com/amplify_video/910541641295024128/vid/0/3000/720x720/JlgHY8tqYPhoFq5U.ts', offset 0, playlist 0
Input #0, hls,applehttp, from 'https://video.twimg.com/amplify_video/910541641295024128/pl/720x720/t47yOs5lhHTKUeBh.m3u8':
  Duration: 00:00:47.80, start: 0.083411, bitrate: 0 kb/s
  Program 0 
    Metadata:
      variant_bitrate : 0
    Stream #0:0: Video: h264 (High), 4 reference frames ([27][0][0][0] / 0x001B), yuv420p, 720x720, 23.98 fps, 23.98 tbr, 90k tbn, 47.95 tbc
    Stream #0:1: Audio: aac (LC) ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 131 kb/s
[mp4 @ 0x89c8880] Codec for stream 0 does not use global headers but container format requires global headers
[mp4 @ 0x89c8880] Codec for stream 1 does not use global headers but container format requires global headers
Output #0, mp4, to 'file:/home/USER/.tubeup/downloads/Green_Day_-_Support_@GlblCtzn_+_enter_to_go_VIP_at_one_of_our_upcoming_shows_in_South_America-910545872206700544.mp4.part':
  Metadata:
    encoder         : Lavf56.40.101
    Stream #0:0: Video: h264, 1 reference frame ([33][0][0][0] / 0x0021), yuv420p, 720x720 (0x0), q=2-31, 23.98 fps, 23.98 tbr, 90k tbn, 90k tbc
    Stream #0:1: Audio: aac ([64][0][0][0] / 0x0040), 48000 Hz, stereo, 131 kb/s
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
  Stream #0:1 -> #0:1 (copy)
Press [q] to stop, [?] for help
[hls,applehttp @ 0x82c99a0] HLS request for url 'https://video.twimg.com/amplify_video/910541641295024128/vid/3000/6000/720x720/qIFQaitii_YHqMod.ts', offset 0, playlist 0
frame=   72 fps=0.0 q=-1.0 size=      88kB time=00:00:03.00 bitrate= 239.0kbits/[mp4 @ 0x89c8880] Non-monotonous DTS in output stream 0:0; previous: 259008, current: 255255; changing to 259009. This may result in incorrect timestamps in the output file.
[mp4 @ 0x89c8880] Non-monotonous DTS in output stream 0:0; previous: 259009, current: 259009; changing to 259010. This may result in incorrect timestamps in the output file.
[mp4 @ 0x89c8880] Non-monotonous DTS in output stream 0:1; previous: 143360, current: 140380; changing to 143361. This may result in incorrect timestamps in the output file.
[mp4 @ 0x89c8880] Non-monotonous DTS in output stream 0:1; previous: 143361, current: 141404; changing to 143362. This may result in incorrect timestamps in the output file.
[mp4 @ 0x89c8880] Non-monotonous DTS in output stream 0:1; previous: 143362, current: 142428; changing to 143363. This may result in incorrect timestamps in the output file.
[hls,applehttp @ 0x82c99a0] HLS request for url 'https://video.twimg.com/amplify_video/910541641295024128/vid/6000/9000/720x720/wymTxyER4S2KE4TG.ts', offset 0, playlist 0
frame=  144 fps=127 q=-1.0 size=     193kB time=00:00:05.93 bitrate= 266.2kbits/[hls,applehttp @ 0x82c99a0] HLS request for url 'https://video.twimg.com/amplify_video/910541641295024128/vid/9000/12000/720x720/Suktx8aayROR8tsH.ts', offset 0, playlist 0
[hls,applehttp @ 0x82c99a0] HLS request for url 'https://video.twimg.com/amplify_video/910541641295024128/vid/12000/15000/720x720/yJjTN5JIiu8OIjkc.ts', offset 0, playlist 0
frame=  288 fps=170 q=-1.0 size=     427kB time=00:00:11.92 bitrate= 293.0kbits/[hls,applehttp @ 0x82c99a0] HLS request for url 'https://video.twimg.com/amplify_video/910541641295024128/vid/15000/18000/720x720/vmWDw9xnXVE0R1ac.ts', offset 0, playlist 0
[hls,applehttp @ 0x82c99a0] HLS request for url 'https://video.twimg.com/amplify_video/910541641295024128/vid/18000/21000/720x720/FgWNli2K3FOftEaS.ts', offset 0, playlist 0
frame=  432 fps=191 q=-1.0 size=     644kB time=00:00:17.92 bitrate= 294.3kbits/[hls,applehttp @ 0x82c99a0] HLS request for url 'https://video.twimg.com/amplify_video/910541641295024128/vid/21000/24000/720x720/XZ5J0HC9IiZUEW3d.ts', offset 0, playlist 0
frame=  504 fps=171 q=-1.0 size=     752kB time=00:00:20.92 bitrate= 294.2kbits/[hls,applehttp @ 0x82c99a0] HLS request for url 'https://video.twimg.com/amplify_video/910541641295024128/vid/24000/27000/720x720/kcEAb91fWJTqLF_Q.ts', offset 0, playlist 0
frame=  576 fps=162 q=-1.0 size=     861kB time=00:00:23.91 bitrate= 295.0kbits/[hls,applehttp @ 0x82c99a0] HLS request for url 'https://video.twimg.com/amplify_video/910541641295024128/vid/27000/30000/720x720/hQ9bKqqA3fwruVLn.ts', offset 0, playlist 0
[hls,applehttp @ 0x82c99a0] HLS request for url 'https://video.twimg.com/amplify_video/910541641295024128/vid/30000/33000/720x720/4IAJObLEwTstlKRo.ts', offset 0, playlist 0
frame=  720 fps=163 q=-1.0 size=    1217kB time=00:00:29.93 bitrate= 333.1kbits/[hls,applehttp @ 0x82c99a0] HLS request for url 'https://video.twimg.com/amplify_video/910541641295024128/vid/33000/36000/720x720/D0awuOXv2YNsylX5.ts', offset 0, playlist 0
frame=  792 fps=157 q=-1.0 size=    1341kB time=00:00:32.91 bitrate= 333.7kbits/[hls,applehttp @ 0x82c99a0] HLS request for url 'https://video.twimg.com/amplify_video/910541641295024128/vid/36000/39000/720x720/ViQbTurnYoWJUscD.ts', offset 0, playlist 0
[hls,applehttp @ 0x82c99a0] HLS request for url 'https://video.twimg.com/amplify_video/910541641295024128/vid/39000/42000/720x720/JXwGc5gXSMBlzTeL.ts', offset 0, playlist 0
frame=  936 fps=166 q=-1.0 size=    1586kB time=00:00:38.93 bitrate= 333.7kbits/[hls,applehttp @ 0x82c99a0] HLS request for url 'https://video.twimg.com/amplify_video/910541641295024128/vid/42000/45000/720x720/ndGoSMEFJpRDUJB2.ts', offset 0, playlist 0
frame= 1007 fps=159 q=-1.0 size=    1675kB time=00:00:41.92 bitrate= 327.2kbits/[hls,applehttp @ 0x82c99a0] HLS request for url 'https://video.twimg.com/amplify_video/910541641295024128/vid/45000/47798/720x720/4pWgA4P2AO7JO2X_.ts', offset 0, playlist 0
frame= 1079 fps=158 q=-1.0 size=    1765kB time=00:00:44.92 bitrate= 321.8kbits/No more output streams to write to, finishing.
frame= 1146 fps=166 q=-1.0 Lsize=    1909kB time=00:00:47.72 bitrate= 327.6kbits/s    
video:1121kB audio:750kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 2.003915%
Input file #0 (https://video.twimg.com/amplify_video/910541641295024128/pl/720x720/t47yOs5lhHTKUeBh.m3u8):
  Input stream #0:0 (video): 1146 packets read (1147727 bytes); 
  Input stream #0:1 (audio): 2241 packets read (784009 bytes); 
  Total: 3387 packets (1931736 bytes) demuxed
Output file #0 (file:/home/USER/.tubeup/downloads/Green_Day_-_Support_@GlblCtzn_+_enter_to_go_VIP_at_one_of_our_upcoming_shows_in_South_America-910545872206700544.mp4.part):
  Output stream #0:0 (video): 1146 packets muxed (1147727 bytes); 
  Output stream #0:1 (audio): 2241 packets muxed (768322 bytes); 
  Total: 3387 packets (1916049 bytes) muxed
{'status': 'finished', '_total_bytes_str': '1.86MiB', 'filename': '/home/USER/.tubeup/downloads/Green_Day_-_Support_@GlblCtzn_+_enter_to_go_VIP_at_one_of_our_upcoming_shows_in_South_America-910545872206700544.mp4', 'total_bytes': 1954445, 'downloaded_bytes': 1954445}
:: Downloaded: /home/USER/.tubeup/downloads/Green_Day_-_Support_@GlblCtzn_+_enter_to_go_VIP_at_one_of_our_upcoming_shows_in_South_America-910545872206700544.mp4...
:: Uploading /home/USER/.tubeup/downloads/Green_Day_-_Support_@GlblCtzn_+_enter_to_go_VIP_at_one_of_our_upcoming_shows_in_South_America-910545872206700544...
2017-10-05 22:04:42,900 - internetarchive.item - ERROR -  error uploading Green_Day_-_Support_@GlblCtzn_+_enter_to_go_VIP_at_one_of_our_upcoming_shows_in_South_America-910545872206700544.info.json to twitter:card-910545872206700544, The specified bucket is not valid. - Bucket names should be valid archive identifiers; try someting matching this regular expression: ^[a-zA-Z0-9][a-zA-Z0-9_.-]{4,100}$ (or, if you are making unusual identifiers, this user may lack the special permission to do so)
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/internetarchive/item.py", line 703, in upload_file
    response.raise_for_status()
  File "/usr/local/lib/python3.5/dist-packages/requests/models.py", line 935, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://s3.us.archive.org/twitter:card-910545872206700544/Green_Day_-_Support_%40GlblCtzn_%2B_enter_to_go_VIP_at_one_of_our_upcoming_shows_in_South_America-910545872206700544.info.json

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/tubeup", line 9, in <module>
    load_entry_point('tubeup==0.0.11', 'console_scripts', 'tubeup')()
  File "/usr/local/lib/python3.5/dist-packages/tubeup/__main__.py", line 298, in main
    identifier, meta = upload_ia(video, custom_meta=md)
  File "/usr/local/lib/python3.5/dist-packages/tubeup/__main__.py", line 226, in upload_ia
    item.upload(vid_files, metadata=meta, retries=9001, request_kwargs=dict(timeout=9001), delete=True)
  File "/usr/local/lib/python3.5/dist-packages/internetarchive/item.py", line 831, in upload
    request_kwargs=request_kwargs)
  File "/usr/local/lib/python3.5/dist-packages/internetarchive/item.py", line 723, in upload_file
    raise type(exc)(error_msg, response=exc.response, request=exc.request)
requests.exceptions.HTTPError:  error uploading Green_Day_-_Support_@GlblCtzn_+_enter_to_go_VIP_at_one_of_our_upcoming_shows_in_South_America-910545872206700544.info.json to twitter:card-910545872206700544, The specified bucket is not valid. - Bucket names should be valid archive identifiers; try someting matching this regular expression: ^[a-zA-Z0-9][a-zA-Z0-9_.-]{4,100}$ (or, if you are making unusual identifiers, this user may lack the special permission to do so)

Another example:

~$ tubeup  https://twitter.com/GreenDay/status/907260960263602177
[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2017.10.01
[debug] Python version 3.5.2 - Linux-4.10.0-35-generic-i686-with-LinuxMint-18.2-sonya
[debug] exe versions: ffmpeg 2.8.11-0ubuntu0.16.04.1, ffprobe 2.8.11-0ubuntu0.16.04.1
[debug] Proxy map: {}
[debug] Public IP address: 2601:600:9880:390:818a:d092:d4b2:18e
ffmpeg version 2.8.11-0ubuntu0.16.04.1 Copyright (c) 2000-2017 the FFmpeg developers
  built with gcc 5.4.0 (Ubuntu 5.4.0-6ubuntu1~16.04.4) 20160609
  configuration: --prefix=/usr --extra-version=0ubuntu0.16.04.1 --build-suffix=-ffmpeg --toolchain=hardened --libdir=/usr/lib/i386-linux-gnu --incdir=/usr/include/i386-linux-gnu --cc=cc --cxx=g++ --enable-gpl --enable-shared --disable-stripping --disable-decoder=libopenjpeg --disable-decoder=libschroedinger --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzvbi --enable-openal --enable-opengl --enable-x11grab --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-frei0r --enable-libx264 --enable-libopencv --disable-i686
  WARNING: library configuration mismatch
  avcodec     configuration: --prefix=/usr --extra-version=0ubuntu0.16.04.1 --build-suffix=-ffmpeg --toolchain=hardened --libdir=/usr/lib/i386-linux-gnu --incdir=/usr/include/i386-linux-gnu --cc=cc --cxx=g++ --enable-gpl --enable-shared --disable-stripping --disable-decoder=libopenjpeg --disable-decoder=libschroedinger --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmodplug --enable-libmp3lame --enable-libopenjpeg --enable-libopus --enable-libpulse --enable-librtmp --enable-libschroedinger --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxvid --enable-libzvbi --enable-openal --enable-opengl --enable-x11grab --enable-libdc1394 --enable-libiec61883 --enable-libzmq --enable-frei0r --enable-libx264 --enable-libopencv --disable-i686 --enable-version3 --disable-doc --disable-programs --disable-avdevice --disable-avfilter --disable-avformat --disable-avresample --disable-postproc --disable-swscale --enable-libopencore_amrnb --enable-libopencore_amrwb --enable-libvo_aacenc --enable-libvo_amrwbenc
  libavutil      54. 31.100 / 54. 31.100
  libavcodec     56. 60.100 / 56. 60.100
  libavformat    56. 40.101 / 56. 40.101
  libavdevice    56.  4.100 / 56.  4.100
  libavfilter     5. 40.101 /  5. 40.101
  libavresample   2.  1.  0 /  2.  1.  0
  libswscale      3.  1.101 /  3.  1.101
  libswresample   1.  2.101 /  1.  2.101
  libpostproc    53.  3.100 / 53.  3.100
[hls,applehttp @ 0x96e49a0] HLS request for url 'https://video.twimg.com/ext_tw_video/907253385765769217/pu/vid/0/3000/1280x720/z2ZgVLDqoVoz7cS-.ts', offset 0, playlist 0
[h264 @ 0x9a0d260] Current profile doesn't provide more RBSP data in PPS, skipping
Input #0, hls,applehttp, from 'https://video.twimg.com/ext_tw_video/907253385765769217/pu/pl/1280x720/zWqmxJ2sVJhH3OlR.m3u8':
  Duration: 00:00:15.65, start: 0.000000, bitrate: 0 kb/s
  Program 0 
    Metadata:
      variant_bitrate : 0
    Stream #0:0: Video: h264 (Main), 2 reference frames ([27][0][0][0] / 0x001B), yuv420p, 1280x720, 29.97 fps, 29.97 tbr, 90k tbn, 59.94 tbc
    Stream #0:1: Audio: aac (LC) ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 119 kb/s
[mp4 @ 0x9e04800] Codec for stream 0 does not use global headers but container format requires global headers
[mp4 @ 0x9e04800] Codec for stream 1 does not use global headers but container format requires global headers
Output #0, mp4, to 'file:/home/USER/.tubeup/downloads/Green_Day_-_The_stars_at_night_are_big_and_bright_thanks_Texas-907260960263602177.mp4.part':
  Metadata:
    encoder         : Lavf56.40.101
    Stream #0:0: Video: h264, 1 reference frame ([33][0][0][0] / 0x0021), yuv420p, 1280x720 (0x0), q=2-31, 29.97 fps, 29.97 tbr, 90k tbn, 90k tbc
    Stream #0:1: Audio: aac ([64][0][0][0] / 0x0040), 48000 Hz, stereo, 119 kb/s
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
  Stream #0:1 -> #0:1 (copy)
Press [q] to stop, [?] for help
[NULL @ 0x9a0d260] Current profile doesn't provide more RBSP data in PPS, skipping
    Last message repeated 1 times
[hls,applehttp @ 0x96e49a0] HLS request for url 'https://video.twimg.com/ext_tw_video/907253385765769217/pu/vid/3000/6000/1280x720/67Lp6HDWDtaxl6uH.ts', offset 0, playlist 0
[NULL @ 0x9a0d260] Current profile doesn't provide more RBSP data in PPS, skipping
    Last message repeated 2 times
[hls,applehttp @ 0x96e49a0] HLS request for url 'https://video.twimg.com/ext_tw_video/907253385765769217/pu/vid/6000/9000/1280x720/vqnt49Ch2qqqPXMH.ts', offset 0, playlist 0
frame=  180 fps=123 q=-1.0 size=    1024kB time=00:00:06.01 bitrate=1394.9kbits/[NULL @ 0x9a0d260] Current profile doesn't provide more RBSP data in PPS, skipping
    Last message repeated 2 times
[hls,applehttp @ 0x96e49a0] HLS request for url 'https://video.twimg.com/ext_tw_video/907253385765769217/pu/vid/9000/12000/1280x720/4uditm3_EPTAJ71B.ts', offset 0, playlist 0
frame=  270 fps=125 q=-1.0 size=    1366kB time=00:00:09.00 bitrate=1243.1kbits/[NULL @ 0x9a0d260] Current profile doesn't provide more RBSP data in PPS, skipping
    Last message repeated 2 times
[hls,applehttp @ 0x96e49a0] HLS request for url 'https://video.twimg.com/ext_tw_video/907253385765769217/pu/vid/12000/15649/1280x720/CxMs09-uwFOyeUAn.ts', offset 0, playlist 0
frame=  360 fps=123 q=-1.0 size=    1958kB time=00:00:12.01 bitrate=1335.7kbits/[NULL @ 0x9a0d260] Current profile doesn't provide more RBSP data in PPS, skipping
    Last message repeated 1 times
frame=  410 fps=119 q=-1.0 size=    2338kB time=00:00:13.63 bitrate=1404.7kbits/[NULL @ 0x9a0d260] Current profile doesn't provide more RBSP data in PPS, skipping
    Last message repeated 1 times
No more output streams to write to, finishing.
frame=  469 fps=135 q=-1.0 Lsize=    2736kB time=00:00:15.65 bitrate=1431.6kbits/s    
video:2477kB audio:245kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.534781%
Input file #0 (https://video.twimg.com/ext_tw_video/907253385765769217/pu/pl/1280x720/zWqmxJ2sVJhH3OlR.m3u8):
  Input stream #0:0 (video): 469 packets read (2536536 bytes); 
  Input stream #0:1 (audio): 734 packets read (255722 bytes); 
  Total: 1203 packets (2792258 bytes) demuxed
Output file #0 (file:/home/USER/.tubeup/downloads/Green_Day_-_The_stars_at_night_are_big_and_bright_thanks_Texas-907260960263602177.mp4.part):
  Output stream #0:0 (video): 469 packets muxed (2536536 bytes); 
  Output stream #0:1 (audio): 734 packets muxed (250584 bytes); 
  Total: 1203 packets (2787120 bytes) muxed
{'_total_bytes_str': '2.67MiB', 'total_bytes': 2802025, 'downloaded_bytes': 2802025, 'status': 'finished', 'filename': '/home/USER/.tubeup/downloads/Green_Day_-_The_stars_at_night_are_big_and_bright_thanks_Texas-907260960263602177.mp4'}
:: Downloaded: /home/USER/.tubeup/downloads/Green_Day_-_The_stars_at_night_are_big_and_bright_thanks_Texas-907260960263602177.mp4...
:: Uploading /home/USER/.tubeup/downloads/Green_Day_-_The_stars_at_night_are_big_and_bright_thanks_Texas-907260960263602177...
2017-10-05 22:06:46,530 - internetarchive.item - ERROR -  error uploading Green_Day_-_The_stars_at_night_are_big_and_bright_thanks_Texas-907260960263602177.description to twitter:card-907260960263602177, The specified bucket is not valid. - Bucket names should be valid archive identifiers; try someting matching this regular expression: ^[a-zA-Z0-9][a-zA-Z0-9_.-]{4,100}$ (or, if you are making unusual identifiers, this user may lack the special permission to do so)
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/internetarchive/item.py", line 703, in upload_file
    response.raise_for_status()
  File "/usr/local/lib/python3.5/dist-packages/requests/models.py", line 935, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://s3.us.archive.org/twitter:card-907260960263602177/Green_Day_-_The_stars_at_night_are_big_and_bright_thanks_Texas-907260960263602177.description

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/tubeup", line 9, in <module>
    load_entry_point('tubeup==0.0.11', 'console_scripts', 'tubeup')()
  File "/usr/local/lib/python3.5/dist-packages/tubeup/__main__.py", line 298, in main
    identifier, meta = upload_ia(video, custom_meta=md)
  File "/usr/local/lib/python3.5/dist-packages/tubeup/__main__.py", line 226, in upload_ia
    item.upload(vid_files, metadata=meta, retries=9001, request_kwargs=dict(timeout=9001), delete=True)
  File "/usr/local/lib/python3.5/dist-packages/internetarchive/item.py", line 831, in upload
    request_kwargs=request_kwargs)
  File "/usr/local/lib/python3.5/dist-packages/internetarchive/item.py", line 723, in upload_file
    raise type(exc)(error_msg, response=exc.response, request=exc.request)
requests.exceptions.HTTPError:  error uploading Green_Day_-_The_stars_at_night_are_big_and_bright_thanks_Texas-907260960263602177.description to twitter:card-907260960263602177, The specified bucket is not valid. - Bucket names should be valid archive identifiers; try someting matching this regular expression: ^[a-zA-Z0-9][a-zA-Z0-9_.-]{4,100}$ (or, if you are making unusual identifiers, this user may lack the special permission to do so)

Null date metadata causing uploads to fail

tubeup https://www.facebook.com/327319634141163/videos/362921167247676/

[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2016.07.22
[debug] Python version 3.5.2 - Linux-4.4.0-31-generic-x86_64-with-Ubuntu-16.04-xenial
[debug] exe versions: avconv 2.8.6-1ubuntu2, avprobe 2.8.6-1ubuntu2, ffmpeg 2.8.6-1ubuntu2, ffprobe 2.8.6-1ubuntu2, rtmpdump 2.4
[debug] Proxy map: {}
[debug] Public IP address: [redacted]
{'_total_bytes_str': '4.45MiB', '_elapsed_str': '00:09', 'total_bytes': 4666441, 'status': 'finished', 'filename': '/home/vxbinaca/.tubeup/downloads/Facebook_video_362921167247676-362921167247676.mp4', 'elapsed': 9.106133937835693, 'downloaded_bytes': 4666441}
:: Downloaded: /home/vxbinaca/.tubeup/downloads/Facebook_video_362921167247676-362921167247676.mp4...
:: Uploading /home/vxbinaca/.tubeup/downloads/Facebook_video_362921167247676-362921167247676...
Traceback (most recent call last):
  File "/usr/local/bin/tubeup", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.5/dist-packages/tubeup/__main__.py", line 252, in main
    identifier, meta = upload_ia(video, custom_meta=md)
  File "/usr/local/lib/python3.5/dist-packages/tubeup/__main__.py", line 141, in upload_ia
    d = datetime.strptime(vid_meta['upload_date'], '%Y%m%d')
KeyError: 'upload_date'

This is on our end, because the download completes. However, Facebook + youtube-dl is a little funky sometimes... a lot of the time, but this is the first failure I've had at the upload stage. Most other problems are just metadata ending up in the title of the item.

Edit: The metadata;

{"uploader": null, "extractor": "facebook", "display_id": "362921167247676", "format": "progressive_hd_src_no_ratelimit - unknown", "webpage_url_basename": "362921167247676", "format_id": "progressive_hd_src_no_ratelimit", "extractor_key": "Facebook", "playlist": null, "webpage_url": "https://www.facebook.com/327319634141163/videos/362921167247676/", "fulltitle": "Facebook video #362921167247676", "http_headers": {"Accept-Language": "en-us,en;q=0.5", "User-Agent": "Mozilla/5.0 (X11; Linux x86_64; rv:10.0) Gecko/20150101 Firefox/47.0 (Chrome)", "Accept-Encoding": "gzip, deflate", "Accept-Charset": "ISO-8859-1,utf-8;q=0.7,*;q=0.7", "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"}, "formats": [{"format_id": "progressive_sd_src", "ext": "mp4", "http_headers": {"Accept-Language": "en-us,en;q=0.5", "User-Agent": "Mozilla/5.0 (X11; Linux x86_64; rv:10.0) Gecko/20150101 Firefox/47.0 (Chrome)", "Accept-Encoding": "gzip, deflate", "Accept-Charset": "ISO-8859-1,utf-8;q=0.7,*;q=0.7", "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"}, "preference": -10, "protocol": "https", "format": "progressive_sd_src - unknown", "url": "https://video-mia1-1.xx.fbcdn.net/v/t42.1790-2/11194096_362921810580945_1666410974_n.mp4?efg=eyJybHIiOjM0NSwicmxhIjo1MTIsInZlbmNvZGVfdGFnIjoicmVzXzQyNl9jcmZfMjNfbWFpbl8zLjBfc2QifQ%3D%3D&rl=345&vabr=192&oh=a63103da7c38aa5aa862dbc17406cda4&oe=5793473A"}, {"format_id": "progressive_sd_src_no_ratelimit", "ext": "mp4", "http_headers": {"Accept-Language": "en-us,en;q=0.5", "User-Agent": "Mozilla/5.0 (X11; Linux x86_64; rv:10.0) Gecko/20150101 Firefox/47.0 (Chrome)", "Accept-Encoding": "gzip, deflate", "Accept-Charset": "ISO-8859-1,utf-8;q=0.7,*;q=0.7", "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"}, "preference": -10, "protocol": "https", "format": "progressive_sd_src_no_ratelimit - unknown", "url": "https://video-mia1-1.xx.fbcdn.net/v/t42.1790-2/11194096_362921810580945_1666410974_n.mp4?efg=eyJ2ZW5jb2RlX3RhZyI6InJlc180MjZfY3JmXzIzX21haW5fMy4wX3NkIn0%3D&oh=a63103da7c38aa5aa862dbc17406cda4&oe=5793473A"}, {"format_id": "progressive_hd_src", "ext": "mp4", "http_headers": {"Accept-Language": "en-us,en;q=0.5", "User-Agent": "Mozilla/5.0 (X11; Linux x86_64; rv:10.0) Gecko/20150101 Firefox/47.0 (Chrome)", "Accept-Encoding": "gzip, deflate", "Accept-Charset": "ISO-8859-1,utf-8;q=0.7,*;q=0.7", "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"}, "preference": -5, "protocol": "https", "format": "progressive_hd_src - unknown", "url": "https://video-mia1-1.xx.fbcdn.net/v/t43.1792-2/11067207_362921663914293_663837328_n.mp4?efg=eyJybHIiOjE1MDAsInJsYSI6MTAyNCwidmVuY29kZV90YWciOiJoZCJ9&rl=1500&vabr=760&oh=0b6d008a52ec9d8a699093a6e41b1528&oe=579345C3"}, {"format_id": "progressive_hd_src_no_ratelimit", "ext": "mp4", "http_headers": {"Accept-Language": "en-us,en;q=0.5", "User-Agent": "Mozilla/5.0 (X11; Linux x86_64; rv:10.0) Gecko/20150101 Firefox/47.0 (Chrome)", "Accept-Encoding": "gzip, deflate", "Accept-Charset": "ISO-8859-1,utf-8;q=0.7,*;q=0.7", "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"}, "preference": -5, "protocol": "https", "format": "progressive_hd_src_no_ratelimit - unknown", "url": "https://video-mia1-1.xx.fbcdn.net/v/t43.1792-2/11067207_362921663914293_663837328_n.mp4?efg=eyJ2ZW5jb2RlX3RhZyI6ImhkIn0%3D&oh=0b6d008a52ec9d8a699093a6e41b1528&oe=579345C3"}], "_filename": "/home/vxbinaca/.tubeup/downloads/Facebook_video_362921167247676-362921167247676.mp4", 
"preference": -5, "protocol": "https", "title": "Facebook video #362921167247676", "url": "https://video-mia1-1.xx.fbcdn.net/v/t43.1792-2/11067207_362921663914293_663837328_n.mp4?efg=eyJ2ZW5jb2RlX3RhZyI6ImhkIn0%3D&oh=0b6d008a52ec9d8a699093a6e41b1528&oe=579345C3", "ext": "mp4", "playlist_index": null, "id": "362921167247676"}

Error: "warning: s3 is overloaded, sleeping for 30 seconds and retrying. 9001 retries left."

I keep getting the following error whenever I try to upload videos by the Youtuber Shirrako (and only him):

prof_frink@DESKTOP-1PS5GGT:~$ tubeup https://www.youtube.com/watch?v=oTcwl232id8
[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2018.11.07
[debug] Python version 3.6.6 (CPython) - Linux-4.4.0-17134-Microsoft-x86_64-with-Ubuntu-18.04-bionic
[debug] exe versions: ffmpeg 3.4.4-0ubuntu0.18.04.1, ffprobe 3.4.4-0ubuntu0.18.04.1
[debug] Proxy map: {}
video doesn't have subtitles
 uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [                              uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [############################# uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [################################] 1/1 - 00:00:00
 warning: s3 is overloaded, sleeping for 30 seconds and retrying. 9001 retries left.
 uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [                              uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [############################# uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [################################] 1/1 - 00:00:00
 warning: s3 is overloaded, sleeping for 30 seconds and retrying. 9001 retries left.
 uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [                              uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [############################# uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [################################] 1/1 - 00:00:00
 warning: s3 is overloaded, sleeping for 30 seconds and retrying. 9001 retries left.
 uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [                              uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [############################# uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [################################] 1/1 - 00:00:00
 warning: s3 is overloaded, sleeping for 30 seconds and retrying. 9001 retries left.
 uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [                              uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [############################# uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [################################] 1/1 - 00:00:00
 warning: s3 is overloaded, sleeping for 30 seconds and retrying. 9001 retries left.
 uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [                              uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [############################# uploading Red_Dead_Redemption_2_-_Bringing_Black_Man_To_KKK-oTcwl232id8.annotations.xml: [################################] 1/1 - 00:00:00
 warning: s3 is overloaded, sleeping for 30 seconds and retrying. 9001 retries left.

It's weird, because all other videos upload just fine, and two of his videos were already posted to Mirrortube:

https://archive.org/details/youtube-MPYAM9AfRHo
https://archive.org/details/youtube-mZ7RicfseRU

Also, I'm new to this, so I'm wondering: is there something else I should be doing other than just 'tubeup <url>'? I can't add to any collections, but I noticed Sketch the Cow is adding them to Mirrortube. Just wondering if this is the proper procedure, or if I'm creating unnecessary work for someone.

As well, I noticed that the download page mentioned not uploading entire channels. Is this an absolute, or does it depend on how many videos are in the channel?

Thanks!

Check For Item In Internet Archive Prior To Performing Retrieval

The tubeup readme mentions this:

Obviously, if someone else uploaded the video to the Internet Archive, you will get a permissions error. We don't want duplicates, do we?

Attempted to archive https://www.youtube.com/watch?v=ARrNYyJEnFI, but was not aware it already existed as an item.

tubeup https://www.youtube.com/watch\?v\=ARrNYyJEnFI
[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2017.11.15
[debug] Python version 3.6.3 - Darwin-16.7.0-x86_64-i386-64bit
[debug] exe versions: ffmpeg 3.4, ffprobe 3.4, rtmpdump 2.4
[debug] Proxy map: {}
[download] 100.0% of 2.46GiB at  2.94MiB/s ETA 00:00
{'downloaded_bytes': 2637871187, 'total_bytes': 2637871187, 'filename': '/Users/brandon/.tubeup/downloads/Introduction_to_Investing_and_Finance_-_Lesson_1_by_Martin_Shkreli-ARrNYyJEnFI.f299.mp4', 'status': 'finished', 'elapsed': 855.510488986969, '_total_bytes_str': '2.46GiB', '_elapsed_str': '14:15'}
Downloaded /Users/brandon/.tubeup/downloads/Introduction_to_Investing_and_Finance_-_Lesson_1_by_Martin_Shkreli-ARrNYyJEnFI.f299.mp4
[download] 100.0% of 142.46MiB at  3.53MiB/s ETA 00:00
{'downloaded_bytes': 149377905, 'total_bytes': 149377905, 'filename': '/Users/brandon/.tubeup/downloads/Introduction_to_Investing_and_Finance_-_Lesson_1_by_Martin_Shkreli-ARrNYyJEnFI.f251.webm', 'status': 'finished', 'elapsed': 40.32209777832031, '_total_bytes_str': '142.46MiB', '_elapsed_str': '00:40'}
Downloaded /Users/brandon/.tubeup/downloads/Introduction_to_Investing_and_Finance_-_Lesson_1_by_Martin_Shkreli-ARrNYyJEnFI.f251.webm
 uploading Introduction_to_Investing_and_Finance_-_Lesson_1_by_Martin_Shkreli-AR uploading Introduction_to_Investing_and_Finance_-_Lesson_1_by_Martin_Shkreli-AR uploading Introduction_to_Investing_and_Finance_-_Lesson_1_by_Martin_Shkreli-ARrNYyJEnFI.annotations.xml: [################################] 1/1 - 00:00:00
 error uploading Introduction_to_Investing_and_Finance_-_Lesson_1_by_Martin_Shkreli-ARrNYyJEnFI.annotations.xml: Access Denied - You lack sufficient privileges to write to this item.

Is it sound logic to check for an item whose Originalurl tag is set to the specified URL, prior to the retrieval and upload process? If so, I'll proceed with a fork->PR.
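
A rough sketch of such a check using the internetarchive command-line tool's search subcommand; whether the Originalurl field mentioned above is actually indexed for search is an assumption, so treat this as illustrative only:

   ia search 'originalurl:"https://www.youtube.com/watch?v=ARrNYyJEnFI"' --itemlist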

Script for downloading the best possible quality + vtt subtitles and thumbnail [youtube-dl]

Can you include this in the default youtube-dl download script?

It downloads the best video version and the best audio, merges them as .mp4 file.

youtube-dl -ci --write-thumbnail --sub-format ass/srt/best --write-auto-sub -f 'bestvideo[ext=mp4]+bestaudio[ext=m4a]/mp4' -o "/%%(uploader)s - %%(title)s (%%(id)s).%%(ext)s" %input%

%input% is video URL, channel URL, or playlist URL.

There's also the naming template %%(uploader)s - %%(title)s (%%(id)s).%%(ext)s, which needs to be adjusted for Linux; I think the double % signs are not necessary there, they're only for Windows batch files.

TypeError: send() got multiple values for keyword argument 'timeout'

:: Uploading /root/.tubeup/downloads/Pong_AI_with_Policy_Gradients-YOW8m2YGtRg...
Traceback (most recent call last):
  File "/usr/local/bin/tubeup", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.5/dist-packages/tubeup/__main__.py", line 263, in main
    identifier, meta = upload_ia(video, custom_meta=md)
  File "/usr/local/lib/python3.5/dist-packages/tubeup/__main__.py", line 208, in upload_ia
    item.upload(vid_files, metadata=meta, retries=9001, request_kwargs=dict(timeout=9001), delete=True)
  File "/usr/local/lib/python3.5/dist-packages/internetarchive/item.py", line 830, in upload
    request_kwargs=request_kwargs)
  File "/usr/local/lib/python3.5/dist-packages/internetarchive/item.py", line 690, in upload_file
    **request_kwargs)
TypeError: send() got multiple values for keyword argument 'timeout'

504 Server Error when uploading yt playlist

I uploaded a playlist from youtube, this is the error I got and archiving stopped, similar error I posted some months ago, but it still happens.

:: Uploading ../home/.tubeup/downloads/The_Unreported_Facts_About_Libya_True_News-V11Bq1ZdSqM...
2017-06-27 20:24:35,188 - internetarchive.item - INFO - uploaded The_Unreported_Facts_About_Libya_True_News-V11Bq1ZdSqM.jpg to https://s3.us.archive.org/youtube-V11Bq1ZdSqM/The_Unreported_Facts_About_Libya_True_News-V11Bq1ZdSqM.jpg
2017-06-27 20:24:35,189 - internetarchive.item - INFO - The_Unreported_Facts_About_Libya_True_News-V11Bq1ZdSqM.jpg successfully uploaded to https://archive.org/download/youtube-V11Bq1ZdSqM/The_Unreported_Facts_About_Libya_True_News-V11Bq1ZdSqM.jpg and verified, deleting local copy
2017-06-27 22:24:37,074 - internetarchive.item - ERROR -  error uploading The_Unreported_Facts_About_Libya_True_News-V11Bq1ZdSqM.description to youtube-V11Bq1ZdSqM,
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/internetarchive/item.py", line 645, in upload_file
    response.raise_for_status()
  File "/usr/local/lib/python3.5/dist-packages/requests/models.py", line 937, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 504 Server Error: Gateway Time-out for url: https://s3.us.archive.org/youtube-V11Bq1ZdSqM/The_Unreported_Facts_About_Libya_True_News-V11Bq1ZdSqM.description

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/tubeup", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.5/dist-packages/tubeup/__main__.py", line 263, in main
    identifier, meta = upload_ia(video, custom_meta=md)
  File "/usr/local/lib/python3.5/dist-packages/tubeup/__main__.py", line 208, in upload_ia
    item.upload(vid_files, metadata=meta, retries=9001, request_kwargs=dict(timeout=9001), delete=True)
  File "/usr/local/lib/python3.5/dist-packages/internetarchive/item.py", line 771, in upload
    request_kwargs=request_kwargs)
  File "/usr/local/lib/python3.5/dist-packages/internetarchive/item.py", line 665, in upload_file
    raise type(exc)(error_msg, response=exc.response, request=exc.request)
requests.exceptions.HTTPError:  error uploading The_Unreported_Facts_About_Libya_True_News-V11Bq1ZdSqM.description to youtube-V11Bq1ZdSqM

YouTube video IDs starting with "-" are being missed from upload

Example:

https://www.youtube.com/watch?v=-QPLjNS3FYs
-QPLjNS3FYs

The script treats it like a flag instead of a URL: it thinks it's the option -Q, bails out, and skips to the next video. Nice for fault tolerance, not so great for archival.

This was missed during a pass of her entire channel. Now that I know what to look for, I can sort of compensate by adding the IDs to a list to re-rip later once this is fixed, but these videos may still be getting missed by others doing channel dumps with this tool.

I work around this by using --get-id in youtube-dl, copying the output into a text editor, searching for IDs that start with "-", and then ripping those as full URLs. Still, others are likely missing these.
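
On the command line, the simplest workaround is to always pass the full https://www.youtube.com/watch?v= URL rather than a bare ID. Inside tubeup, a hedged sketch of normalizing such IDs before argument handling (the helper is hypothetical, not tubeup's actual code):

    import re

    # Bare 11-character YouTube IDs may begin with "-" and therefore look
    # like command-line flags; expand them into full URLs first.
    YT_ID_RE = re.compile(r'^[A-Za-z0-9_-]{11}$')

    def normalize_url(arg):
        if YT_ID_RE.match(arg):
            return 'https://www.youtube.com/watch?v=' + arg
        return arg

    print(normalize_url('-QPLjNS3FYs'))
    # -> https://www.youtube.com/watch?v=-QPLjNS3FYs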

Garbage collection without interference

A regex (or similar basename-scoped matching) needs to be used to do this.

Goals:

  • Clean the video and its metadata from disk once they have been uploaded.
  • Don't interfere with another tubeup instance's work, e.g. one process uploading a channel while a second instance is simultaneously downloading it (for maximum bandwidth utilization). So no quick and easy rm ~/.tubeup/downloads/* on exit (see the sketch below this list).
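
A minimal sketch of basename-scoped cleanup, using a glob on the uploaded item's basename rather than a broad regex (the function name and directory layout are assumptions, not tubeup's actual code):

    import glob
    import os

    def clean_up(basename, downloads_dir=os.path.expanduser('~/.tubeup/downloads')):
        """Delete only the files belonging to one finished upload
        (video, .info.json, .description, thumbnail, ...), leaving any
        other instance's in-progress downloads untouched."""
        pattern = os.path.join(downloads_dir, glob.escape(basename) + '.*')
        for path in glob.glob(pattern):
            os.remove(path)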

.netrc + general auth for sites

This is a feature request to download a user's viewing history ("Watch Again") list using OAuth or some form of password/cookie auth.

Provide --username and --password to youtube-dl via Tubeup

We should add two arguments that pass a username and password through tubeup to youtube-dl, as some services require a login.

Some services used to allow scraping without a login (such as niconico), but that is becoming unreliable, so we need to be able to log in.

Example youtube-dl command:

    youtube-dl --username "[email protected]" --password "your_password" --write-annotations --write-description --write-info-json --write-thumbnail URL
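
Inside tubeup this would presumably just be forwarded into the youtube-dl options dict; a hedged sketch (the credentials and URL are placeholders; the option keys are standard youtube-dl parameters):

    import youtube_dl  # yt-dlp exposes the same option keys

    def build_opts(username=None, password=None):
        ydl_opts = {
            'writedescription': True,
            'writeinfojson': True,
            'writethumbnail': True,
        }
        if username and password:
            # Forwarded to the extractor's login step.
            ydl_opts['username'] = username
            ydl_opts['password'] = password
        return ydl_opts

    with youtube_dl.YoutubeDL(build_opts('user@example.com', 'hunter2')) as ydl:
        ydl.download(['https://www.nicovideo.jp/watch/sm9'])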

Add Support for Wenoo (wenoo.net)

This is the output I get when I try to re-upload a video from wenoo.net:

BLANK@BLANK-550P5C-550P7C:~⟫ tubeup http://wenoo.net/video/4d2de9a85124dc64613
[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2017.02.24.1
[debug] Python version 3.5.2 - Linux-4.4.0-62-generic-x86_64-with-Ubuntu-16.04-xenial
[debug] exe versions: avconv 2.8.11-0ubuntu0.16.04.1, avprobe 2.8.11-0ubuntu0.16.04.1, ffmpeg 2.8.11-0ubuntu0.16.04.1, ffprobe 2.8.11-0ubuntu0.16.04.1, rtmpdump 2.4
[debug] Proxy map: {}
[debug] Public IP address: 2601:600:9201:8030:4c3:a3e3:c362:986c
{'total_bytes_str': '7.59MiB', 'status': 'finished', 'filename': '/home/BLANK/.tubeup/downloads/Puyo_Puyo_Tetris-_Official_Nintendo_Switch_Trailer-4d2de9a85124dc64613.mp4', 'total_bytes': 7958264, 'elapsed': 3.245335578918457, 'elapsed_str': '00:03', 'downloaded_bytes': 7958264}
:: Downloaded: /home/BLANK/.tubeup/downloads/Puyo_Puyo_Tetris-_Official_Nintendo_Switch_Trailer-4d2de9a85124dc64613.mp4...
:: Uploading /home/BLANK/.tubeup/downloads/Puyo_Puyo_Tetris-_Official_Nintendo_Switch_Trailer-4d2de9a85124dc64613...
Traceback (most recent call last):
File "/usr/local/bin/tubeup", line 9, in
load_entry_point('tubeup==0.0.4', 'console_scripts', 'tubeup')()
File "/usr/local/lib/python3.5/dist-packages/tubeup/main.py", line 272, in main
identifier, meta = upload_ia(video, custom_meta=md)
File "/usr/local/lib/python3.5/dist-packages/tubeup/main.py", line 161, in upload_ia
d = datetime.strptime(vid_meta['upload_date'], '%Y%m%d')
KeyError: 'upload_date'
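
The traceback shows that the .info.json from this extractor simply has no upload_date key. A hedged sketch of a fallback (the variable names mirror the traceback; the fallback policy itself is only a suggestion):

    from datetime import datetime

    def pick_upload_date(vid_meta):
        """vid_meta is the dict loaded from the .info.json file."""
        upload_date = vid_meta.get('upload_date')  # 'YYYYMMDD' string, or absent
        if upload_date:
            return datetime.strptime(upload_date, '%Y%m%d')
        # Suggested fallback: use the archival date when the site reports
        # none, instead of crashing the whole run.
        return datetime.utcnow()

    print(pick_upload_date({}))                          # falls back to "now"
    print(pick_upload_date({'upload_date': '20170301'}))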

tubeup completely fails to grab video

tubeup https://www.youtube.com/watch?v=Yeb08cbUswk

[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2016.08.22
[debug] Python version 3.5.2 - Linux-4.4.0-34-generic-x86_64-with-Ubuntu-16.04-xenial
[debug] exe versions: avconv 2.8.6-1ubuntu2, avprobe 2.8.6-1ubuntu2, ffmpeg 2.8.6-1ubuntu2, ffprobe 2.8.6-1ubuntu2
[debug] Proxy map: {}
Traceback (most recent call last):
  File "/usr/lib/python3.5/urllib/request.py", line 1254, in do_open
    h.request(req.get_method(), req.selector, req.data, headers)
  File "/usr/lib/python3.5/http/client.py", line 1106, in request
    self._send_request(method, url, body, headers)
  File "/usr/lib/python3.5/http/client.py", line 1151, in _send_request
    self.endheaders(body)
  File "/usr/lib/python3.5/http/client.py", line 1102, in endheaders
    self._send_output(message_body)
  File "/usr/lib/python3.5/http/client.py", line 934, in _send_output
    self.send(msg)
  File "/usr/lib/python3.5/http/client.py", line 877, in send
    self.connect()
  File "/usr/lib/python3.5/http/client.py", line 1252, in connect
    super().connect()
  File "/usr/lib/python3.5/http/client.py", line 849, in connect
    (self.host,self.port), self.timeout, self.source_address)
  File "/usr/lib/python3.5/socket.py", line 711, in create_connection
    raise err
  File "/usr/lib/python3.5/socket.py", line 697, in create_connection
    sock = socket(af, socktype, proto)
  File "/usr/lib/python3.5/socket.py", line 134, in __init__
    _socket.socket.__init__(self, family, type, proto, fileno)
OSError: [Errno 97] Address family not supported by protocol

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/tubeup", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.5/dist-packages/tubeup/__main__.py", line 241, in main
    download(URLs, proxy_url)
  File "/usr/local/lib/python3.5/dist-packages/tubeup/__main__.py", line 110, in download
    with youtube_dl.YoutubeDL(ydl_opts) as ydl:
  File "/usr/local/lib/python3.5/dist-packages/youtube_dl/YoutubeDL.py", line 367, in __init__
    self.print_debug_header()
  File "/usr/local/lib/python3.5/dist-packages/youtube_dl/YoutubeDL.py", line 2055, in print_debug_header
    ipaddr = self.urlopen('https://yt-dl.org/ip').read().decode('utf-8')
  File "/usr/local/lib/python3.5/dist-packages/youtube_dl/YoutubeDL.py", line 1996, in urlopen
    return self._opener.open(req, timeout=self._socket_timeout)
  File "/usr/lib/python3.5/urllib/request.py", line 466, in open
    response = self._open(req, data)
  File "/usr/lib/python3.5/urllib/request.py", line 484, in _open
    '_open', req)
  File "/usr/lib/python3.5/urllib/request.py", line 444, in _call_chain
    result = func(*args)
  File "/usr/local/lib/python3.5/dist-packages/youtube_dl/utils.py", line 1004, in https_open
    req, **kwargs)
  File "/usr/lib/python3.5/urllib/request.py", line 1256, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [Errno 97] Address family not supported by protocol>

Youtube-dl gets the video, but tubeup doesn't. Need me to print out the dependency versions?

dependency versions:

youtube-dl: 2016.8.22
requests: 2.11.1
jsonpatch: 0.4
internetarchive: 1.0.9
tubeup: Version: 0.0.1

So this looks like a problem with requests. Is it related to the problem you had with internetarchive? I'll take a look at how that was fixed and try to replicate it.
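
The failing request in the traceback is youtube-dl's "call home" check against https://yt-dl.org/ip during the debug header, and Errno 97 usually points to the host's IPv6 stack being unavailable or misconfigured. A hedged sketch of two youtube-dl options that work around it (both are real option keys; whether tubeup should set them is a judgment call):

    import youtube_dl

    ydl_opts = {
        'call_home': False,           # skip the yt-dl.org/ip debug check entirely
        'source_address': '0.0.0.0',  # bind to IPv4 only (equivalent to --force-ipv4)
    }

    with youtube_dl.YoutubeDL(ydl_opts) as ydl:
        ydl.download(['https://www.youtube.com/watch?v=Yeb08cbUswk'])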

"s3 is overloaded, sleeping for 30 seconds and retrying. 9001 retries left." output again and again.

Log:

powerkitten@powerkitten-GA-78LMT-S2P:~$ tubeup "http://jp.channel.pandora.tv/channel/video.ptv?c1=&ch_userid=jpchan15&prgid=44933722"
[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2017.05.09
[debug] Python version 3.5.2 - Linux-4.4.0-57-generic-x86_64-with-elementary_OS-0.3.2-freya
[debug] exe versions: ffmpeg 2.4.3-1ubuntu1, ffprobe 2.4.3-1ubuntu1
[debug] Proxy map: {}
[debug] Public IP address: 2605:a601:a20:9500:d1a0:e13c:1342:4e7d
{'_elapsed_str': '13:52', 'elapsed': 832.6274771690369, 'total_bytes': 416286757, 'downloaded_bytes': 416286757, '_total_bytes_str': '397.00MiB', 'status': 'finished', 'filename': '/home/powerkitten/.tubeup/downloads/Super_Mario_64_120_star_speed_run_1_46_35_by_nero-44933722.flv'}
:: Downloaded: /home/powerkitten/.tubeup/downloads/Super_Mario_64_120_star_speed_run_1_46_35_by_nero-44933722.flv...
:: Uploading /home/powerkitten/.tubeup/downloads/Super_Mario_64_120_star_speed_run_1_46_35_by_nero-44933722...
2017-06-04 17:16:57,503 - internetarchive.item - INFO - s3 is overloaded, sleeping for 30 seconds and retrying. 9001 retries left.
2017-06-04 17:17:28,316 - internetarchive.item - INFO - s3 is overloaded, sleeping for 30 seconds and retrying. 9001 retries left.
2017-06-04 17:17:59,033 - internetarchive.item - INFO - s3 is overloaded, sleeping for 30 seconds and retrying. 9001 retries left.
[... the identical "s3 is overloaded, sleeping for 30 seconds and retrying. 9001 retries left." line repeats roughly every 30 seconds, with the retry count never decreasing, through 2017-06-04 17:51:51 ...]
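
There isn't much a client can do when S3 stays overloaded this long except cap the retries and back off longer between attempts; a hedged sketch using the internetarchive library's retry parameters (the identifier, local file, and values are placeholders):

    from internetarchive import get_item

    item = get_item('pandoratv-44933722')  # hypothetical identifier
    item.upload(
        ['Super_Mario_64_120_star_speed_run_1_46_35_by_nero-44933722.flv'],
        metadata={'mediatype': 'movies', 'title': 'Super Mario 64 120 star speed run'},
        retries=50,        # give up eventually instead of retrying 9001 times
        retries_sleep=60,  # wait longer between attempts while S3 is overloaded
    )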

Access Denied - You lack sufficient privileges to write to this item

I get this error when trying to archive a youtube video:

tubeup http://www.youtube.com/watch?v=KSIp5Jckg0M --use-download-archive
[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2018.10.05
[debug] Python version 3.5.3 (CPython) - Linux-4.14.52-v7+-armv7l-with-debian-9.4
[debug] exe versions: avconv 3.2.10-1, avprobe 3.2.10-1, ffmpeg 3.2.10-1, ffprobe 3.2.10-1
[debug] Proxy map: {}
video doesn't have subtitles
 uploading Luigi_s_Mansion_-_Not-So-Spooky_Trailer_-_Nintendo_3DS-KSIp5Jckg0M.annotations.xml: [################################] 1/1 - 00:00:00
 error uploading Luigi_s_Mansion_-_Not-So-Spooky_Trailer_-_Nintendo_3DS-KSIp5Jckg0M.annotations.xml: Access Denied - You lack sufficient privileges to write to this item.

An exception just occured, if you found this exception isn't related with any of your connection problem, please report this issue to https://github.com/bibanon/tubeup/issues
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/internetarchive/item.py", line 722, in upload_file
    response.raise_for_status()
  File "/usr/local/lib/python3.5/dist-packages/requests/models.py", line 939, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 403 Client Error: Forbidden for url: https://s3.us.archive.org/youtube-KSIp5Jckg0M/Luigi_s_Mansion_-_Not-So-Spooky_Trailer_-_Nintendo_3DS-KSIp5Jckg0M.annotations.xml

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/tubeup/__main__.py", line 91, in main
    use_download_archive):
  File "/usr/local/lib/python3.5/dist-packages/tubeup/TubeUp.py", line 359, in archive_urls
    identifier, meta = self.upload_ia(basename, custom_meta)
  File "/usr/local/lib/python3.5/dist-packages/tubeup/TubeUp.py", line 328, in upload_ia
    secret_key=s3_secret_key)
  File "/usr/local/lib/python3.5/dist-packages/internetarchive/item.py", line 860, in upload
    request_kwargs=request_kwargs)
  File "/usr/local/lib/python3.5/dist-packages/internetarchive/item.py", line 742, in upload_file
    raise type(exc)(error_msg, response=exc.response, request=exc.request)
requests.exceptions.HTTPError:  error uploading Luigi_s_Mansion_-_Not-So-Spooky_Trailer_-_Nintendo_3DS-KSIp5Jckg0M.annotations.xml to youtube-KSIp5Jckg0M, Access Denied - You lack sufficient privileges to write to this item.

Youtube-dl configs and tubeup

I'm getting youtube-dl from regular pip, not pip3.

How do I set configs for the pip3 version so the script will respect them?

My config is this:

    -4 --download-archive ~/.ytdlarchive --retries 100 --no-overwrites --call-home --continue --write-info-json --write-description --write-thumbnail --write-annotations --all-subs --convert-subs srt --write-sub --add-metadata -f bestvideo+bestaudio/best --merge-output-format 'mkv' --prefer-ffmpeg --embed-thumbnail

Edit:

I tested this. I love the script, but a few issues prevent me from relying on it (a sketch of the matching youtube-dl options follows this list):

  • Lack of subtitle ripping and uploading in a form that archive.org's video player can use automatically (they'd need to be in SRT format; VTT seems to be common on YouTube, and using --convert-subs srt would fix that).
  • Not using -f bestvideo+bestaudio/best to dictate video quality; the first part is for YouTube specifically, and the trailing /best is a catch-all for 99 percent of other sites out there.
  • Perhaps consider switching to MKV as the video container? All the TV-show rippers use it, it has maximum compatibility across OSes, and the only downside is a slight delay before the video is streamable because of the derive task.
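
A hedged sketch of those three suggestions as youtube-dl options (the option keys are standard youtube-dl/yt-dlp parameters; exactly how tubeup would wire them in is left open):

    import youtube_dl

    ydl_opts = {
        # Best quality: DASH video+audio on YouTube, /best as the catch-all
        # for sites that only offer muxed formats.
        'format': 'bestvideo+bestaudio/best',
        # Merge into MKV so any codec combination can be kept.
        'merge_output_format': 'mkv',
        # Grab subtitles and convert them to SRT for archive.org's player.
        'writesubtitles': True,
        'allsubtitles': True,
        'postprocessors': [
            {'key': 'FFmpegSubtitlesConvertor', 'format': 'srt'},
        ],
    }

    with youtube_dl.YoutubeDL(ydl_opts) as ydl:
        ydl.download(['https://www.youtube.com/watch?v=Yeb08cbUswk'])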

Tubeup error

An exception just occured, if you found this exception isn't related with any of your connection problem, please report this issue to https://github.com/bibanon/tubeup/issues
Traceback (most recent call last):
  File "/home/vxbinaca/.local/lib/python3.4/site-packages/tubeup/__main__.py", line 91, in main
    use_download_archive):
  File "/home/vxbinaca/.local/lib/python3.4/site-packages/tubeup/TubeUp.py", line 356, in archive_urls
    urls, proxy, ydl_username, ydl_password, use_download_archive)
  File "/home/vxbinaca/.local/lib/python3.4/site-packages/tubeup/TubeUp.py", line 144, in get_resource_basenames
    self.create_basenames_from_ydl_info_dict(ydl, info_dict)
  File "/home/vxbinaca/.local/lib/python3.4/site-packages/tubeup/TubeUp.py", line 172, in create_basenames_from_ydl_info_dict
    filenames.add(ydl.prepare_filename(video))
  File "/home/vxbinaca/.local/lib/python3.4/site-packages/youtube_dl/YoutubeDL.py", line 632, in prepare_filename
    template_dict = dict(info_dict)
TypeError: 'NoneType' object is not iterable

Edited by @refeed :
More info in #61 (comment)

I updated tubeup and this happened again when archiving a channel with 5-10 videos. When archiving individual YouTube URLs, it works.
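
The crash is prepare_filename() being handed None, which is what youtube-dl puts into a channel's or playlist's entries list for videos that fail to extract. A hedged sketch of the guard, mirroring the method named in the traceback (this is not tubeup's actual code):

    def create_basenames_from_ydl_info_dict(ydl, info_dict):
        filenames = set()
        entries = info_dict.get('entries') or [info_dict]
        for video in entries:
            if video is None:
                # Unavailable or blocked videos in a channel/playlist come
                # back as None; skip them instead of crashing the whole run.
                continue
            filenames.add(ydl.prepare_filename(video))
        return filenames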

Implement os.path.expanduser for filesystem access

See title.

Every time I have tried setting it to ~/ or $HOME it has failed. With $HOME I would have assumed we were dealing with a bash environment problem, but ~/ doesn't work either.

The current result is that the youtube-dl side of the script can't write to the archive file, so progress made through a channel or through individual items isn't recorded for subsequent runs on that channel.
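
For reference, the requested fix is just to run user-supplied paths through os.path.expanduser (and expandvars for $HOME) before using them:

    import os

    # '~/.ytdlarchive' or '$HOME/.ytdlarchive' as typed by the user
    archive_path = os.path.expanduser(os.path.expandvars('~/.ytdlarchive'))
    print(archive_path)   # e.g. /home/you/.ytdlarchive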

Log the download process

It would be nice to have output showing the download progress of the video, e.g. ETA, download percentage, download speed, etc. Currently tubeup only reports that the video has been downloaded, which is confusing when the video is large: the download takes a long time, so I sometimes think the program has hung when it hasn't.
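
youtube-dl already reports this through progress hooks, so tubeup could surface it with something like the sketch below (the hook keys are the ones youtube-dl passes to progress hooks; the formatting is an assumption):

    import youtube_dl

    def log_progress(d):
        if d['status'] == 'downloading':
            # youtube-dl pre-formats these fields for display.
            print('\r{} {} at {} ETA {}'.format(
                d['filename'],
                d.get('_percent_str', '?'),
                d.get('_speed_str', '?'),
                d.get('_eta_str', '?')), end='')
        elif d['status'] == 'finished':
            print('\nDownload finished, now post-processing/uploading ...')

    ydl_opts = {'progress_hooks': [log_progress]}
    with youtube_dl.YoutubeDL(ydl_opts) as ydl:
        ydl.download(['https://www.youtube.com/watch?v=uURbtP6oga0'])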

Archiving stops after a few youtube-dl copyright ERROR messages

So I was archiving, and then about 4 messages like this showed up:

ERROR: xxx: YouTube said: This video contains content from FranceTV mcn, who has blocked it on copyright grounds.
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/dist-packages/youtube_dl/YoutubeDL.py", line 694, in extract_info
    ie_result = ie.extract(url)
  File "/usr/local/lib/python3.5/dist-packages/youtube_dl/extractor/common.py", line 357, in extract
    return self._real_extract(url)
  File "/usr/local/lib/python3.5/dist-packages/youtube_dl/extractor/youtube.py", line 1327, in _real_extract
    expected=True, video_id=video_id)
youtube_dl.utils.ExtractorError: xxx: YouTube said: This video contains content from FranceTV mcn, who has blocked it on copyright grounds.

root@xxx:~# 

And the download/archiving process was cancelled.

I entered those 4 IDs manually into the .ytdlarchive file, but then it didn't work at all for this particular channel.

root@xxx:~# tubeup https://www.youtube.com/channel/xxx
[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2016.12.12
[debug] Python version 3.5.2+ - Linux-4.8.0-30-generic-x86_64-with-Ubuntu-16.10-yakkety
[debug] exe versions: avconv 3.0.2-1ubuntu3, avprobe 3.0.2-1ubuntu3, ffmpeg 3.0.2-1ubuntu3, ffprobe 3.0.2-1ubuntu3
[debug] Proxy map: {}
[debug] Public IP address: 2001:4xxx
root@xxx:~#

And the channel was not archived completely: I had only uploaded 15% to archive.org, and the remaining 85% can't be done because of these ERRORs.

And what about cancelling a download (by cancelling I mean closing the PuTTY window)? I have done that on other channels. It should resume the download from the last fragment, but sometimes it has already recorded all the IDs in the .ytdlarchive file as downloaded, and downloads don't start again for that channel; it just downloads the newest videos from the channel.

I'm also now reading this:

'download_archive': os.path.expanduser('~/.tubeup/.ytdlarchive'), ## I guess we will avoid doing this because it prevents failed uploads from being redone in our current system. Maybe when we turn it into an OOP library?

That's exactly what is happening:

it prevents failed uploads from being redone in our current system

A cancelled archiving process = failed uploads.
So I can't redo them other than by deleting the .ytdlarchive file?

If I delete it, I bet the files will be downloaded again with _(1) appended to their names, and I don't want that.
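
For the copyright-blocked videos themselves, youtube-dl's ignoreerrors option lets the rest of the channel continue instead of aborting; a hedged sketch with real option keys and illustrative values:

    import os
    import youtube_dl

    ydl_opts = {
        # Skip videos that fail to extract (copyright blocks, geo blocks,
        # deleted/private videos) instead of aborting the whole channel.
        'ignoreerrors': True,
        # An ID is only recorded here after it has actually been downloaded.
        'download_archive': os.path.expanduser('~/.tubeup/.ytdlarchive'),
    }

    with youtube_dl.YoutubeDL(ydl_opts) as ydl:
        ydl.download(['https://www.youtube.com/channel/xxx'])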

Modernize the front page

  • Take as much as possible and stick it into a FAQ file.
  • Issue submission guidelines need to be spelled out.
  • Trim outdated cruft.
  • Fix my markup problems I caused.
  • FreeBSD has better manpages because the Unix folks include examples. I'll need to do that too.
  • Let's document the "secret" flags like --use-download-archive so they aren't secret anymore.

Script fails an entire channel upload if one video has already been uploaded by someone else.

I attempted to use tubeup to automatically repost the 4,400 videos that https://www.youtube.com/channel/UC5VlSHEz9tXpli-iPEb5m5g has made, which consist of old advertisements and show bumpers which could be lost otherwise. Unfortunately, it quickly failed and died because someone else had already uploaded a single video from the channel and it didn't handle the error at all. The item that caused the error was http://archive.org/details/youtube-GjqajlJjhz4/, which was uploaded by DKL3. It needs to be able to resume uploads of a channel if a video has already been done.
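
A hedged sketch of the kind of pre-flight check that would let a channel run keep going: ask archive.org whether the youtube-<id> item already exists and skip the upload if it does (the identifier scheme matches the items linked above; the skip policy itself is only a suggestion):

    from internetarchive import get_item

    def should_upload(video_id):
        """Return False if the youtube-<id> item already exists on archive.org,
        e.g. because someone else has already mirrored that video."""
        item = get_item('youtube-' + video_id)
        return not item.exists

    print(should_upload('GjqajlJjhz4'))   # False: already uploaded by another user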

Facebook Uploading Issue

$ tubeup https://www.facebook.com/GreenDay/videos/10154561156789521/

[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2017.10.01
[debug] Python version 3.5.2 - Linux-4.10.0-35-generic-i686-with-LinuxMint-18.2-sonya
[debug] exe versions: ffmpeg 2.8.11-0ubuntu0.16.04.1, ffprobe 2.8.11-0ubuntu0.16.04.1
[debug] Proxy map: {}
[debug] Public IP address: 2601:600:9880:390:818a:d092:d4b2:18e
{'status': 'finished', 'downloaded_bytes': 38679766, '_elapsed_str': '00:06', 'total_bytes': 38679766, '_total_bytes_str': '36.89MiB', 'elapsed': 6.104719877243042, 'filename': '/home/USER/.tubeup/downloads/Green_Day_s_Bang_Bang_Video_Directed_By_Tim_Armstrong._Here_s_A_Behind_Th...-10154561156789521.f10154561198149521v.mp4'}
:: Downloaded: /home/USER/.tubeup/downloads/Green_Day_s_Bang_Bang_Video_Directed_By_Tim_Armstrong._Here_s_A_Behind_Th...-10154561156789521.f10154561198149521v.mp4...
{'status': 'finished', 'downloaded_bytes': 674879, '_elapsed_str': '00:00', 'total_bytes': 674879, '_total_bytes_str': '659.06KiB', 'elapsed': 0.5873575210571289, 'filename': '/home/USER/.tubeup/downloads/Green_Day_s_Bang_Bang_Video_Directed_By_Tim_Armstrong._Here_s_A_Behind_Th...-10154561156789521.f10154561191199521ad.m4a'}
:: Downloaded: /home/USER/.tubeup/downloads/Green_Day_s_Bang_Bang_Video_Directed_By_Tim_Armstrong._Here_s_A_Behind_Th...-10154561156789521.f10154561191199521ad.m4a...
:: Uploading /home/USER/.tubeup/downloads/Green_Day_s_Bang_Bang_Video_Directed_By_Tim_Armstrong._Here_s_A_Behind_Th...-10154561156789521v...
Traceback (most recent call last):
File "/usr/local/bin/tubeup", line 9, in
load_entry_point('tubeup==0.0.11', 'console_scripts', 'tubeup')()
File "/usr/local/lib/python3.5/dist-packages/tubeup/main.py", line 298, in main
identifier, meta = upload_ia(video, custom_meta=md)
File "/usr/local/lib/python3.5/dist-packages/tubeup/main.py", line 128, in upload_ia
with open(json_fname) as f:
FileNotFoundError: [Errno 2] No such file or directory: '/home/USER/.tubeup/downloads/Green_Day_s_Bang_Bang_Video_Directed_By_Tim_Armstrong._Here_s_A_Behind_Th...-10154561156789521v.info.json'

Make Filenames Deterministic

Confirmed tubeup is where the fix needs to be; youtube-dl on its own downloads successfully to my local workstation. Looking for suggestions on what would be considered "safe" sanitization of the tweet body to produce a safe filename; we keep the tweet identifier after the "-" delimiter. (One possible approach is sketched after the file listing below.)

tubeup https://twitter.com/StevieBuckley/status/1051762443851112448
[debug] Encodings: locale UTF-8, fs utf-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2018.10.05
[debug] Python version 3.7.0 (CPython) - Darwin-18.0.0-x86_64-i386-64bit
[debug] exe versions: ffmpeg 4.0.1, ffprobe 4.0.1, phantomjs 2.1.1, rtmpdump 2.4
[debug] Proxy map: {}
There are no annotations to write.
[download] 100.0% of ~2.61MiB at 746.84KiB/s ETA 00:00
{'downloaded_bytes': 2739348, 'total_bytes': 2739348, 'filename': '/Users/brandon/.tubeup/downloads/Stevie_Buckley_-_Managed_to_get_myself_blocked_on_LinkedIn_for_saying_that_this_is_the_best_parody_of_the_recruitment_industry_I_ve_ever_seen._Turns_out_Haigh_Associates_were_actually_being_sincer...-1051762443851112448.mp4', 'status': 'finished', 'elapsed': 5.218930959701538, '_total_bytes_str': '2.61MiB', '_elapsed_str': '00:05'}
Downloaded /Users/brandon/.tubeup/downloads/Stevie_Buckley_-_Managed_to_get_myself_blocked_on_LinkedIn_for_saying_that_this_is_the_best_parody_of_the_recruitment_industry_I_ve_ever_seen._Turns_out_Haigh_Associates_were_actually_being_sincer...-1051762443851112448.mp4
1051762443851112448: malformed AAC bitstream detected.

An exception just occured, if you found this exception isn't related with any of your connection problem, please report this issue to https://github.com/bibanon/tubeup/issues
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/tubeup/__main__.py", line 91, in main
    use_download_archive):
  File "/usr/local/lib/python3.7/site-packages/tubeup/TubeUp.py", line 359, in archive_urls
    identifier, meta = self.upload_ia(basename, custom_meta)
  File "/usr/local/lib/python3.7/site-packages/tubeup/TubeUp.py", line 273, in upload_ia
    with open(json_metadata_filepath) as f:
FileNotFoundError: [Errno 2] No such file or directory: '/Users/brandon/.tubeup/downloads/Stevie_Buckley_-_Managed_to_get_myself_blocked_on_LinkedIn_for_saying_that_this_is_the_best_parody_of_the_recruitment_industry_I_ve_ever_seen._Turns_out_Haigh_Associates_were_actually_being_sincere_with_this_video.-1051762443851112448.info.json'

Files in local tubeup cache (.tubeup/downloads):

Stevie_Buckley_-_Managed_to_get_myself_blocked_on_LinkedIn_for_saying_that_this_is_the_best_parody_of_the_recruitment_industry_I_ve_ever_seen._Turns_out_Haigh_Associates_were_actually_being_sincer...-1051762443851112448.annotations.xml
Stevie_Buckley_-_Managed_to_get_myself_blocked_on_LinkedIn_for_saying_that_this_is_the_best_parody_of_the_recruitment_industry_I_ve_ever_seen._Turns_out_Haigh_Associates_were_actually_being_sincer...-1051762443851112448.description
Stevie_Buckley_-_Managed_to_get_myself_blocked_on_LinkedIn_for_saying_that_this_is_the_best_parody_of_the_recruitment_industry_I_ve_ever_seen._Turns_out_Haigh_Associates_were_actually_being_sincer...-1051762443851112448.info.json
Stevie_Buckley_-_Managed_to_get_myself_blocked_on_LinkedIn_for_saying_that_this_is_the_best_parody_of_the_recruitment_industry_I_ve_ever_seen._Turns_out_Haigh_Associates_were_actually_being_sincer...-1051762443851112448.jpg
Stevie_Buckley_-_Managed_to_get_myself_blocked_on_LinkedIn_for_saying_that_this_is_the_best_parody_of_the_recruitment_industry_I_ve_ever_seen._Turns_out_Haigh_Associates_were_actually_being_sincer...-1051762443851112448.mp4
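
One possible approach (referenced above) is to bound the title length in the output template itself, since youtube-dl templates accept a string precision; a hedged sketch with an arbitrary 80-character cap:

    import youtube_dl

    ydl_opts = {
        # %(title).80s truncates the sanitized title to 80 characters, which
        # keeps the on-disk name short enough that it does not get truncated
        # differently from what prepare_filename() later expects.
        'outtmpl': '%(title).80s-%(id)s.%(ext)s',
        'restrictfilenames': True,   # ASCII-only, no spaces or "&"
    }

    with youtube_dl.YoutubeDL(ydl_opts) as ydl:
        ydl.download(['https://twitter.com/StevieBuckley/status/1051762443851112448'])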

Termination of Annotations collection and a call for PRs

I'm killing off the flag for annotations collection on January 18th, 2019, even though YouTube won't kill annotations until the next day. This is being done to prevent errors for bots or projects that might be impacted by the failure to fetch annotations; even though, if I recall correctly, Antonizoon and Refeed put checks in place to keep rips from failing in that case, all I'll be doing is removing the deprecated annotations flag.

If you're sitting on PRs or want to make a fix, any time between now and 1/18/19 would be a dandy time to submit them for testing so it can all be done in one shot.

TypeError: strptime() argument 1 must be str, not None

:: Uploading /root/.tubeup/downloads/Pong_AI_with_Policy_Gradients-YOW8m2YGtRg...
Traceback (most recent call last):
  File "/usr/local/bin/tubeup", line 11, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.5/dist-packages/tubeup/__main__.py", line 263, in main
    identifier, meta = upload_ia(video, custom_meta=md)
  File "/usr/local/lib/python3.5/dist-packages/tubeup/__main__.py", line 150, in upload_ia
    d = datetime.strptime(vid_meta['upload_date'], '%Y%m%d')
TypeError: strptime() argument 1 must be str, not None

Convert VTT closed caption files to SRT

I think we might be able to do this as part of an archive.org derive process, so they would be converted on archive.org servers after upload automatically. I will look into this.
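
If the derive route doesn't work out, converting locally before upload is a few lines with ffmpeg; a hedged sketch via subprocess (assumes an ffmpeg build with WebVTT and SubRip support on PATH; the helper name and example file are made up):

    import subprocess

    def vtt_to_srt(vtt_path):
        """Convert a WebVTT caption file to SRT alongside it using ffmpeg."""
        srt_path = vtt_path.rsplit('.', 1)[0] + '.srt'
        subprocess.run(['ffmpeg', '-y', '-i', vtt_path, srt_path], check=True)
        return srt_path

    vtt_to_srt('Example_Video-dQw4w9WgXcQ.en.vtt')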

Python 3.2 seems not to work

What Python 3 version do I need for this project? I tried Python 3.2 but got this error when downloading https://www.youtube.com/watch?v=uURbtP6oga0:

python3 tubeup.py http://www.youtube.com/watch?v=uURbtP6oga0
  File "tubeup.py", line 163
    meta = dict(mediatype='movies', creator=uploader, language=language, collection=collection, title=title, description=u'{0} <br/><br/>Source: <a href="{1}">{2}</a><br/>Uploader: <a href="http://www.youtube.com/user/{3}">{4}</a><br/>Upload date: {5}'.format(description, videourl, videourl, uploader, uploader, upload_date), date=upload_date, year=upload_year, subject=tags_string, originalurl=videourl, licenseurl=(cc and 'http://creativecommons.org/licenses/by/3.0/' or ''))
                                                                                                                                                                                                                                                           ^
SyntaxError: invalid syntax

Tubeup-only errors need to go to a logfile instead of STDOUT, with a printout of the file location

  • A fine place to put the log would be ~/.tubeup/errors.log

  • Youtube-dl errors (404, geo-restricted, terminated account, removed or private video) shouldn't print a pile of Python tracebacks to STDOUT. Those should be suppressed and kept out of the error log as well: print the human-readable part to STDOUT and discard the Python output for known problems with youtube-dl and video sites.

  • Similarly, if an item is darked, not owned by the user, or hits some other error, the full traceback shouldn't go to STDOUT. The human-readable part of the error should print to STDOUT along with the corresponding log location.

  • The last line, after the log location, should ask the user to open an issue here.

  • After a tubeup error, append the location of the log. Datestamp each error with year:month:day:hour. Try to keep each error to one line; if that isn't possible, segment errors clearly. That way people aren't dumping their entire terminal output into issues and we don't spend 30 minutes digging through what they did to trigger the log entry.

  • Taking actions based on string filters is cheating (i.e. a long list of rules for when to log an error, keyed on strings that will change whenever the output changes) and inefficient. This should be future-proof and keep working with little code maintenance for lasting results. Everything this code touches, including the code itself, is a library; there's no reason to match on strings.

Examples:

An error has occurred with Tubeup.
The error was logged to ~/.tubeup/error.log
Please open an issue at https://github.com/bibanon/tubeup/issues and include the last error recorded in that file.
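
A minimal sketch of the logging setup being described, with the file location and line format treated as assumptions:

    import logging
    import os

    LOG_PATH = os.path.expanduser('~/.tubeup/error.log')
    os.makedirs(os.path.dirname(LOG_PATH), exist_ok=True)

    logger = logging.getLogger('tubeup')
    handler = logging.FileHandler(LOG_PATH)
    # Datestamped (year:month:day:hour), one line per error.
    handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s',
                                           datefmt='%Y:%m:%d:%H'))
    logger.addHandler(handler)
    logger.setLevel(logging.ERROR)

    try:
        raise RuntimeError('Access Denied - You lack sufficient privileges')
    except RuntimeError as exc:
        logger.error(exc)  # full detail goes to the log file
        print('An error has occurred with Tubeup.')
        print('The error was logged to', LOG_PATH)
        print('Please open an issue at https://github.com/bibanon/tubeup/issues')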
