Merge branch 'master' into pr_echo360

commit cad8b67755
Daniel Vogt, 2024-04-18 19:05:57 +02:00 (committed by GitHub)
GPG Key ID: B5690EEEBB952194 (no known key found for this signature in database)
821 changed files with 48565 additions and 27795 deletions


@@ -1,5 +1,5 @@
name: Broken site
description: Report error in a supported site
name: Broken site support
description: Report issue with yt-dlp on a supported site
labels: [triage, site-bug]
body:
- type: checkboxes
@@ -16,9 +16,9 @@ body:
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm reporting that a **supported** site is broken
- label: I'm reporting that yt-dlp is broken on a **supported** site
required: true
- label: I've verified that I'm running yt-dlp version **2023.03.04** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
required: true
@@ -61,19 +61,18 @@ body:
description: |
It should start like this:
placeholder: |
[debug] Command-line config: ['-vU', 'test:youtube']
[debug] Portable config "yt-dlp.conf": ['-i']
[debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version 2023.03.04 [9d339c4] (win32_exe)
[debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] Checking exe version: ffmpeg -bsfs
[debug] Checking exe version: ffprobe -bsfs
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Proxy map: {}
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
Latest version: 2023.03.04, Current version: 2023.03.04
yt-dlp is up to date (2023.03.04)
[debug] Request Handlers: urllib, requests
[debug] Loaded 1893 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
[youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
<more lines>
render: shell
validations:


@@ -18,7 +18,7 @@ body:
options:
- label: I'm reporting a new site support request
required: true
- label: I've verified that I'm running yt-dlp version **2023.03.04** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
required: true
@@ -73,19 +73,18 @@ body:
description: |
It should start like this:
placeholder: |
[debug] Command-line config: ['-vU', 'test:youtube']
[debug] Portable config "yt-dlp.conf": ['-i']
[debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version 2023.03.04 [9d339c4] (win32_exe)
[debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] Checking exe version: ffmpeg -bsfs
[debug] Checking exe version: ffprobe -bsfs
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Proxy map: {}
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
Latest version: 2023.03.04, Current version: 2023.03.04
yt-dlp is up to date (2023.03.04)
[debug] Request Handlers: urllib, requests
[debug] Loaded 1893 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
[youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
<more lines>
render: shell
validations:


@@ -18,7 +18,7 @@ body:
options:
- label: I'm requesting a site-specific feature
required: true
- label: I've verified that I'm running yt-dlp version **2023.03.04** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
required: true
@@ -69,19 +69,18 @@ body:
description: |
It should start like this:
placeholder: |
[debug] Command-line config: ['-vU', 'test:youtube']
[debug] Portable config "yt-dlp.conf": ['-i']
[debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version 2023.03.04 [9d339c4] (win32_exe)
[debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] Checking exe version: ffmpeg -bsfs
[debug] Checking exe version: ffprobe -bsfs
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Proxy map: {}
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
Latest version: 2023.03.04, Current version: 2023.03.04
yt-dlp is up to date (2023.03.04)
[debug] Request Handlers: urllib, requests
[debug] Loaded 1893 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
[youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
<more lines>
render: shell
validations:


@@ -1,4 +1,4 @@
name: Bug report
name: Core bug report
description: Report a bug unrelated to any particular site or extractor
labels: [triage, bug]
body:
@@ -18,7 +18,7 @@ body:
options:
- label: I'm reporting a bug unrelated to a specific site
required: true
- label: I've verified that I'm running yt-dlp version **2023.03.04** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
required: true
@@ -54,19 +54,18 @@ body:
description: |
It should start like this:
placeholder: |
[debug] Command-line config: ['-vU', 'test:youtube']
[debug] Portable config "yt-dlp.conf": ['-i']
[debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version 2023.03.04 [9d339c4] (win32_exe)
[debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] Checking exe version: ffmpeg -bsfs
[debug] Checking exe version: ffprobe -bsfs
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Proxy map: {}
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
Latest version: 2023.03.04, Current version: 2023.03.04
yt-dlp is up to date (2023.03.04)
[debug] Request Handlers: urllib, requests
[debug] Loaded 1893 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
[youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
<more lines>
render: shell
validations:


@@ -20,7 +20,7 @@ body:
required: true
- label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
required: true
- label: I've verified that I'm running yt-dlp version **2023.03.04** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
required: true
@@ -50,18 +50,17 @@ body:
description: |
It should start like this:
placeholder: |
[debug] Command-line config: ['-vU', 'test:youtube']
[debug] Portable config "yt-dlp.conf": ['-i']
[debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version 2023.03.04 [9d339c4] (win32_exe)
[debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] Checking exe version: ffmpeg -bsfs
[debug] Checking exe version: ffprobe -bsfs
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Proxy map: {}
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
Latest version: 2023.03.04, Current version: 2023.03.04
yt-dlp is up to date (2023.03.04)
[debug] Request Handlers: urllib, requests
[debug] Loaded 1893 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
[youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
<more lines>
render: shell


@@ -26,7 +26,7 @@ body:
required: true
- label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
required: true
- label: I've verified that I'm running yt-dlp version **2023.03.04** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
required: true
@@ -56,18 +56,17 @@ body:
description: |
It should start like this:
placeholder: |
[debug] Command-line config: ['-vU', 'test:youtube']
[debug] Portable config "yt-dlp.conf": ['-i']
[debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version 2023.03.04 [9d339c4] (win32_exe)
[debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] Checking exe version: ffmpeg -bsfs
[debug] Checking exe version: ffprobe -bsfs
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Proxy map: {}
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
Latest version: 2023.03.04, Current version: 2023.03.04
yt-dlp is up to date (2023.03.04)
[debug] Request Handlers: urllib, requests
[debug] Loaded 1893 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
[youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
<more lines>
render: shell


@@ -1,5 +1,5 @@
name: Broken site
description: Report error in a supported site
name: Broken site support
description: Report issue with yt-dlp on a supported site
labels: [triage, site-bug]
body:
%(no_skip)s
@@ -10,9 +10,9 @@ body:
description: |
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
options:
- label: I'm reporting that a **supported** site is broken
- label: I'm reporting that yt-dlp is broken on a **supported** site
required: true
- label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
required: true


@@ -12,7 +12,7 @@ body:
options:
- label: I'm reporting a new site support request
required: true
- label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
required: true


@@ -12,7 +12,7 @@ body:
options:
- label: I'm requesting a site-specific feature
required: true
- label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
required: true


@@ -1,4 +1,4 @@
name: Bug report
name: Core bug report
description: Report a bug unrelated to any particular site or extractor
labels: [triage, bug]
body:
@@ -12,7 +12,7 @@ body:
options:
- label: I'm reporting a bug unrelated to a specific site
required: true
- label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
required: true
- label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
required: true


@@ -14,7 +14,7 @@ body:
required: true
- label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
required: true
- label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
required: true


@@ -20,7 +20,7 @@ body:
required: true
- label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
required: true
- label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
required: true
- label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
required: true

.github/banner.svg (vendored, 10 lines changed)

File diff suppressed because one or more lines are too long

Before: 24 KiB → After: 15 KiB


@@ -30,6 +30,10 @@ on:
meta_files:
default: true
type: boolean
origin:
required: false
default: ''
type: string
secrets:
GPG_SIGNING_KEY:
required: false
@@ -37,11 +41,13 @@ on:
workflow_dispatch:
inputs:
version:
description: Version tag (YYYY.MM.DD[.REV])
description: |
VERSION: yyyy.mm.dd[.rev] or rev
required: true
type: string
channel:
description: Update channel (stable/nightly)
description: |
SOURCE of this build's updates: stable/nightly/master/<repo>
required: true
default: stable
type: string
@@ -73,20 +79,40 @@ on:
description: SHA2-256SUMS, SHA2-512SUMS, _update_spec
default: true
type: boolean
origin:
description: Origin
required: false
default: 'current repo'
type: choice
options:
- 'current repo'
permissions:
contents: read
jobs:
process:
runs-on: ubuntu-latest
outputs:
origin: ${{ steps.process_origin.outputs.origin }}
steps:
- name: Process origin
id: process_origin
run: |
echo "origin=${{ inputs.origin == 'current repo' && github.repository || inputs.origin }}" | tee "$GITHUB_OUTPUT"
unix:
needs: process
if: inputs.unix
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
- uses: actions/checkout@v4
with:
fetch-depth: 0 # Needed for changelog
- uses: actions/setup-python@v5
with:
python-version: "3.10"
- uses: conda-incubator/setup-miniconda@v2
- uses: conda-incubator/setup-miniconda@v3
with:
miniforge-variant: Mambaforge
use-mamba: true
@@ -96,22 +122,21 @@ jobs:
auto-activate-base: false
- name: Install Requirements
run: |
sudo apt-get -y install zip pandoc man sed
python -m pip install -U pip setuptools wheel
python -m pip install -U Pyinstaller -r requirements.txt
reqs=$(mktemp)
cat > $reqs << EOF
sudo apt -y install zip pandoc man sed
cat > ./requirements.txt << EOF
python=3.10.*
pyinstaller
cffi
brotli-python
EOF
sed '/^brotli.*/d' requirements.txt >> $reqs
mamba create -n build --file $reqs
python devscripts/install_deps.py --print \
--exclude brotli --exclude brotlicffi \
--include secretstorage >> ./requirements.txt
mamba create -n build --file ./requirements.txt
- name: Prepare
run: |
python devscripts/update-version.py -c ${{ inputs.channel }} ${{ inputs.version }}
python devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
python devscripts/update_changelog.py -vv
python devscripts/make_lazy_extractors.py
- name: Build Unix platform-independent binary
run: |
@@ -121,22 +146,38 @@
run: |
unset LD_LIBRARY_PATH # Harmful; set by setup-python
conda activate build
python pyinst.py --onedir
python -m bundle.pyinstaller --onedir
(cd ./dist/yt-dlp_linux && zip -r ../yt-dlp_linux.zip .)
python pyinst.py
python -m bundle.pyinstaller
mv ./dist/yt-dlp_linux ./yt-dlp_linux
mv ./dist/yt-dlp_linux.zip ./yt-dlp_linux.zip
- name: Verify --update-to
if: vars.UPDATE_TO_VERIFICATION
run: |
binaries=("yt-dlp" "yt-dlp_linux")
for binary in "${binaries[@]}"; do
chmod +x ./${binary}
cp ./${binary} ./${binary}_downgraded
version="$(./${binary} --version)"
./${binary}_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
downgraded_version="$(./${binary}_downgraded --version)"
[[ "$version" != "$downgraded_version" ]]
done
- name: Upload artifacts
uses: actions/upload-artifact@v3
uses: actions/upload-artifact@v4
with:
name: build-bin-${{ github.job }}
path: |
yt-dlp
yt-dlp.tar.gz
yt-dlp_linux
yt-dlp_linux.zip
compression-level: 0
linux_arm:
needs: process
if: inputs.linux_arm
permissions:
contents: read
@@ -149,7 +190,7 @@ jobs:
- aarch64
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
path: ./repo
- name: Virtualized Install, Prepare & Build
@@ -164,59 +205,105 @@ jobs:
dockerRunArgs: --volume "${PWD}/repo:/repo"
install: | # Installing Python 3.10 from the Deadsnakes repo raises errors
apt update
apt -y install zlib1g-dev python3.8 python3.8-dev python3.8-distutils python3-pip
apt -y install zlib1g-dev libffi-dev python3.8 python3.8-dev python3.8-distutils python3-pip
python3.8 -m pip install -U pip setuptools wheel
# Cannot access requirements.txt from the repo directory at this stage
python3.8 -m pip install -U Pyinstaller mutagen pycryptodomex websockets brotli certifi
# Cannot access any files from the repo directory at this stage
python3.8 -m pip install -U Pyinstaller mutagen pycryptodomex websockets brotli certifi secretstorage cffi
run: |
cd repo
python3.8 -m pip install -U Pyinstaller -r requirements.txt # Cached version may be out of date
python3.8 devscripts/update-version.py -c ${{ inputs.channel }} ${{ inputs.version }}
python3.8 devscripts/install_deps.py -o --include build
python3.8 devscripts/install_deps.py --include pyinstaller --include secretstorage # Cached version may be out of date
python3.8 devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
python3.8 devscripts/make_lazy_extractors.py
python3.8 pyinst.py
python3.8 -m bundle.pyinstaller
if ${{ vars.UPDATE_TO_VERIFICATION && 'true' || 'false' }}; then
arch="${{ (matrix.architecture == 'armv7' && 'armv7l') || matrix.architecture }}"
chmod +x ./dist/yt-dlp_linux_${arch}
cp ./dist/yt-dlp_linux_${arch} ./dist/yt-dlp_linux_${arch}_downgraded
version="$(./dist/yt-dlp_linux_${arch} --version)"
./dist/yt-dlp_linux_${arch}_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
downgraded_version="$(./dist/yt-dlp_linux_${arch}_downgraded --version)"
[[ "$version" != "$downgraded_version" ]]
fi
- name: Upload artifacts
uses: actions/upload-artifact@v3
uses: actions/upload-artifact@v4
with:
name: build-bin-linux_${{ matrix.architecture }}
path: | # run-on-arch-action designates armv7l as armv7
repo/dist/yt-dlp_linux_${{ (matrix.architecture == 'armv7' && 'armv7l') || matrix.architecture }}
compression-level: 0
macos:
needs: process
if: inputs.macos
runs-on: macos-11
steps:
- uses: actions/checkout@v3
# NB: In order to create a universal2 application, the version of python3 in /usr/bin has to be used
- uses: actions/checkout@v4
# NB: Building universal2 does not work with python from actions/setup-python
- name: Install Requirements
run: |
brew install coreutils
/usr/bin/python3 -m pip install -U --user pip Pyinstaller -r requirements.txt
python3 devscripts/install_deps.py --user -o --include build
python3 devscripts/install_deps.py --print --include pyinstaller > requirements.txt
# We need to ignore wheels otherwise we break universal2 builds
python3 -m pip install -U --user --no-binary :all: -r requirements.txt
# We need to fuse our own universal2 wheels for curl_cffi
python3 -m pip install -U --user delocate
mkdir curl_cffi_whls curl_cffi_universal2
python3 devscripts/install_deps.py --print -o --include curl_cffi > requirements.txt
for platform in "macosx_11_0_arm64" "macosx_11_0_x86_64"; do
python3 -m pip download \
--only-binary=:all: \
--platform "${platform}" \
--pre -d curl_cffi_whls \
-r requirements.txt
done
python3 -m delocate.cmd.delocate_fuse curl_cffi_whls/curl_cffi*.whl -w curl_cffi_universal2
python3 -m delocate.cmd.delocate_fuse curl_cffi_whls/cffi*.whl -w curl_cffi_universal2
cd curl_cffi_universal2
for wheel in *cffi*.whl; do mv -n -- "${wheel}" "${wheel/x86_64/universal2}"; done
python3 -m pip install -U --user *cffi*.whl
- name: Prepare
run: |
/usr/bin/python3 devscripts/update-version.py -c ${{ inputs.channel }} ${{ inputs.version }}
/usr/bin/python3 devscripts/make_lazy_extractors.py
python3 devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
python3 devscripts/make_lazy_extractors.py
- name: Build
run: |
/usr/bin/python3 pyinst.py --target-architecture universal2 --onedir
python3 -m bundle.pyinstaller --target-architecture universal2 --onedir
(cd ./dist/yt-dlp_macos && zip -r ../yt-dlp_macos.zip .)
/usr/bin/python3 pyinst.py --target-architecture universal2
python3 -m bundle.pyinstaller --target-architecture universal2
- name: Verify --update-to
if: vars.UPDATE_TO_VERIFICATION
run: |
chmod +x ./dist/yt-dlp_macos
cp ./dist/yt-dlp_macos ./dist/yt-dlp_macos_downgraded
version="$(./dist/yt-dlp_macos --version)"
./dist/yt-dlp_macos_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
downgraded_version="$(./dist/yt-dlp_macos_downgraded --version)"
[[ "$version" != "$downgraded_version" ]]
- name: Upload artifacts
uses: actions/upload-artifact@v3
uses: actions/upload-artifact@v4
with:
name: build-bin-${{ github.job }}
path: |
dist/yt-dlp_macos
dist/yt-dlp_macos.zip
compression-level: 0
macos_legacy:
needs: process
if: inputs.macos_legacy
runs-on: macos-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Install Python
# We need the official Python, because the GA ones only support newer macOS versions
env:
@@ -232,89 +319,137 @@ jobs:
- name: Install Requirements
run: |
brew install coreutils
python3 -m pip install -U --user pip Pyinstaller -r requirements.txt
python3 devscripts/install_deps.py --user -o --include build
python3 devscripts/install_deps.py --user --include pyinstaller
- name: Prepare
run: |
python3 devscripts/update-version.py -c ${{ inputs.channel }} ${{ inputs.version }}
python3 devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
python3 devscripts/make_lazy_extractors.py
- name: Build
run: |
python3 pyinst.py
python3 -m bundle.pyinstaller
mv dist/yt-dlp_macos dist/yt-dlp_macos_legacy
- name: Verify --update-to
if: vars.UPDATE_TO_VERIFICATION
run: |
chmod +x ./dist/yt-dlp_macos_legacy
cp ./dist/yt-dlp_macos_legacy ./dist/yt-dlp_macos_legacy_downgraded
version="$(./dist/yt-dlp_macos_legacy --version)"
./dist/yt-dlp_macos_legacy_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
downgraded_version="$(./dist/yt-dlp_macos_legacy_downgraded --version)"
[[ "$version" != "$downgraded_version" ]]
- name: Upload artifacts
uses: actions/upload-artifact@v3
uses: actions/upload-artifact@v4
with:
name: build-bin-${{ github.job }}
path: |
dist/yt-dlp_macos_legacy
compression-level: 0
windows:
needs: process
if: inputs.windows
runs-on: windows-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with: # 3.8 is used for Win7 support
python-version: "3.8"
- name: Install Requirements
run: | # Custom pyinstaller built with https://github.com/yt-dlp/pyinstaller-builds
python -m pip install -U pip setuptools wheel py2exe
pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/x86_64/pyinstaller-5.8.0-py3-none-any.whl" -r requirements.txt
python devscripts/install_deps.py -o --include build
python devscripts/install_deps.py --include py2exe --include curl_cffi
python -m pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/x86_64/pyinstaller-5.8.0-py3-none-any.whl"
- name: Prepare
run: |
python devscripts/update-version.py -c ${{ inputs.channel }} ${{ inputs.version }}
python devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
python devscripts/make_lazy_extractors.py
- name: Build
run: |
python setup.py py2exe
python -m bundle.py2exe
Move-Item ./dist/yt-dlp.exe ./dist/yt-dlp_min.exe
python pyinst.py
python pyinst.py --onedir
python -m bundle.pyinstaller
python -m bundle.pyinstaller --onedir
Compress-Archive -Path ./dist/yt-dlp/* -DestinationPath ./dist/yt-dlp_win.zip
- name: Verify --update-to
if: vars.UPDATE_TO_VERIFICATION
run: |
foreach ($name in @("yt-dlp","yt-dlp_min")) {
Copy-Item "./dist/${name}.exe" "./dist/${name}_downgraded.exe"
$version = & "./dist/${name}.exe" --version
& "./dist/${name}_downgraded.exe" -v --update-to yt-dlp/yt-dlp@2023.03.04
$downgraded_version = & "./dist/${name}_downgraded.exe" --version
if ($version -eq $downgraded_version) {
exit 1
}
}
- name: Upload artifacts
uses: actions/upload-artifact@v3
uses: actions/upload-artifact@v4
with:
name: build-bin-${{ github.job }}
path: |
dist/yt-dlp.exe
dist/yt-dlp_min.exe
dist/yt-dlp_win.zip
compression-level: 0
windows32:
needs: process
if: inputs.windows32
runs-on: windows-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with: # 3.7 is used for Vista support. See https://github.com/yt-dlp/yt-dlp/issues/390
python-version: "3.7"
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: "3.8"
architecture: "x86"
- name: Install Requirements
run: |
python -m pip install -U pip setuptools wheel
pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/i686/pyinstaller-5.8.0-py3-none-any.whl" -r requirements.txt
python devscripts/install_deps.py -o --include build
python devscripts/install_deps.py
python -m pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/i686/pyinstaller-5.8.0-py3-none-any.whl"
- name: Prepare
run: |
python devscripts/update-version.py -c ${{ inputs.channel }} ${{ inputs.version }}
python devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
python devscripts/make_lazy_extractors.py
- name: Build
run: |
python pyinst.py
python -m bundle.pyinstaller
- name: Verify --update-to
if: vars.UPDATE_TO_VERIFICATION
run: |
foreach ($name in @("yt-dlp_x86")) {
Copy-Item "./dist/${name}.exe" "./dist/${name}_downgraded.exe"
$version = & "./dist/${name}.exe" --version
& "./dist/${name}_downgraded.exe" -v --update-to yt-dlp/yt-dlp@2023.03.04
$downgraded_version = & "./dist/${name}_downgraded.exe" --version
if ($version -eq $downgraded_version) {
exit 1
}
}
- name: Upload artifacts
uses: actions/upload-artifact@v3
uses: actions/upload-artifact@v4
with:
name: build-bin-${{ github.job }}
path: |
dist/yt-dlp_x86.exe
compression-level: 0
meta_files:
if: inputs.meta_files && always()
if: inputs.meta_files && always() && !cancelled()
needs:
- process
- unix
- linux_arm
- macos
@@ -323,19 +458,33 @@ jobs:
- windows32
runs-on: ubuntu-latest
steps:
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4
with:
path: artifact
pattern: build-bin-*
merge-multiple: true
- name: Make SHA2-SUMS files
run: |
cd ./artifact/
sha256sum * > ../SHA2-256SUMS
sha512sum * > ../SHA2-512SUMS
# make sure SHA sums are also printed to stdout
sha256sum * | tee ../SHA2-256SUMS
sha512sum * | tee ../SHA2-512SUMS
- name: Make Update spec
run: |
cat >> _update_spec << EOF
# This file is used for regulating self-update
lock 2022.08.18.36 .+ Python 3.6
lock 2022.08.18.36 .+ Python 3\.6
lock 2023.11.16 (?!win_x86_exe).+ Python 3\.7
lock 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 yt-dlp/yt-dlp 2022.08.18.36 .+ Python 3\.6
lockV2 yt-dlp/yt-dlp 2023.11.16 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 win_x86_exe .+ Windows-(?:Vista|2008Server)
EOF
- name: Sign checksum files
@@ -349,8 +498,11 @@ jobs:
done
- name: Upload artifacts
uses: actions/upload-artifact@v3
uses: actions/upload-artifact@v4
with:
name: build-${{ github.job }}
path: |
SHA*SUMS*
_update_spec
SHA*SUMS*
compression-level: 0
overwrite: true
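For context on the `_update_spec` entries added in the meta_files step above: each `lock` (or repo-qualified `lockV2`) line appears to pin clients whose variant/platform label matches the trailing regex to a maximum version; that is how the Python 3.7 and Windows Vista builds get frozen at 2023.11.16 while `(?!win_x86_exe)` spares the x86 build from the general lock. Below is a minimal sketch of how such a spec could be interpreted. The parsing and the `re.fullmatch` semantics are assumptions for illustration, not yt-dlp's actual updater code:

```python
import re

def max_allowed_version(spec: str, repo: str, label: str) -> str | None:
    """Return the version a client matching `label` is locked to, if any."""
    for line in spec.splitlines():
        if line.startswith('lockV2 '):
            _, line_repo, version, pattern = line.split(maxsplit=3)
            if line_repo != repo:
                continue
        elif line.startswith('lock '):
            _, version, pattern = line.split(maxsplit=2)
        else:
            continue  # comments and blank lines
        if re.fullmatch(pattern, label):
            return version  # assumption: the first matching lock wins
    return None

spec = 'lockV2 yt-dlp/yt-dlp 2023.11.16 (?!win_x86_exe).+ Python 3\\.7'
print(max_allowed_version(spec, 'yt-dlp/yt-dlp', 'win_exe Python 3.7'))      # 2023.11.16
print(max_allowed_version(spec, 'yt-dlp/yt-dlp', 'win_x86_exe Python 3.7'))  # None
```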

.github/workflows/codeql.yml (vendored, new file, 65 lines)

@@ -0,0 +1,65 @@
name: "CodeQL"
on:
push:
branches: [ 'master', 'gh-pages', 'release' ]
pull_request:
# The branches below must be a subset of the branches above
branches: [ 'master' ]
schedule:
- cron: '59 11 * * 5'
jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
permissions:
actions: read
contents: read
security-events: write
strategy:
fail-fast: false
matrix:
language: [ 'python' ]
# CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]
# Use only 'java' to analyze code written in Java, Kotlin or both
# Use only 'javascript' to analyze code written in JavaScript, TypeScript or both
# Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support
steps:
- name: Checkout repository
uses: actions/checkout@v4
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v2
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.
# For more details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
# queries: security-extended,security-and-quality
# Autobuild attempts to build any compiled languages (C/C++, C#, Go, Java, or Swift).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v2
# Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
# If the Autobuild fails above, remove it and uncomment the following three lines.
# modify them (or add more) to build your code if your project, please refer to the EXAMPLE below for guidance.
# - run: |
# echo "Run, Build Application using script"
# ./location_of_script_within_repo/buildscript.sh
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v2
with:
category: "/language:${{matrix.language}}"


@@ -1,8 +1,32 @@
name: Core Tests
on: [push, pull_request]
on:
push:
paths:
- .github/**
- devscripts/**
- test/**
- yt_dlp/**.py
- '!yt_dlp/extractor/*.py'
- yt_dlp/extractor/__init__.py
- yt_dlp/extractor/common.py
- yt_dlp/extractor/extractors.py
pull_request:
paths:
- .github/**
- devscripts/**
- test/**
- yt_dlp/**.py
- '!yt_dlp/extractor/*.py'
- yt_dlp/extractor/__init__.py
- yt_dlp/extractor/common.py
- yt_dlp/extractor/extractors.py
permissions:
contents: read
concurrency:
group: core-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}
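The `paths:` filters above mix positive globs with the negated pattern `'!yt_dlp/extractor/*.py'`. Per GitHub's documented semantics, order matters: the last pattern that matches a path decides whether it triggers the workflow, which is why `__init__.py`, `common.py` and `extractors.py` are re-included after the exclusion. A simplified sketch of that last-match-wins rule; note that `fnmatch` is only an approximation here, since its `*` also crosses `/`, unlike Actions globs:

```python
from fnmatch import fnmatch

def triggers(path: str, patterns: list[str]) -> bool:
    decision = False
    for pattern in patterns:
        negated = pattern.startswith('!')
        if fnmatch(path, pattern.lstrip('!')):
            decision = not negated  # the last matching pattern wins
    return decision

patterns = ['yt_dlp/**.py', '!yt_dlp/extractor/*.py', 'yt_dlp/extractor/common.py']
print(triggers('yt_dlp/utils.py', patterns))              # True
print(triggers('yt_dlp/extractor/youtube.py', patterns))  # False
print(triggers('yt_dlp/extractor/common.py', patterns))   # True
```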
jobs:
tests:
name: Core Tests
@@ -12,27 +36,26 @@ jobs:
fail-fast: false
matrix:
os: [ubuntu-latest]
# CPython 3.11 is in quick-test
python-version: ['3.8', '3.9', '3.10', pypy-3.7, pypy-3.8]
run-tests-ext: [sh]
# CPython 3.8 is in quick-test
python-version: ['3.9', '3.10', '3.11', '3.12', pypy-3.8, pypy-3.10]
include:
# atleast one of each CPython/PyPy tests must be in windows
- os: windows-latest
python-version: '3.7'
run-tests-ext: bat
python-version: '3.8'
- os: windows-latest
python-version: '3.12'
- os: windows-latest
python-version: pypy-3.9
run-tests-ext: bat
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: Install pytest
run: pip install pytest
- name: Install test requirements
run: python3 ./devscripts/install_deps.py --include dev --include curl_cffi
- name: Run tests
continue-on-error: False
run: |
python3 -m yt_dlp -v || true # Print debug head
./devscripts/run_tests.${{ matrix.run-tests-ext }} core
python3 ./devscripts/run_tests.py core


@@ -9,16 +9,16 @@ jobs:
if: "contains(github.event.head_commit.message, 'ci run dl')"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: 3.9
- name: Install test requirements
run: pip install pytest
run: python3 ./devscripts/install_deps.py --include dev
- name: Run tests
continue-on-error: true
run: ./devscripts/run_tests.sh download
run: python3 ./devscripts/run_tests.py download
full:
name: Full Download Tests
@@ -28,24 +28,21 @@ jobs:
fail-fast: true
matrix:
os: [ubuntu-latest]
python-version: ['3.7', '3.10', 3.11-dev, pypy-3.7, pypy-3.8]
run-tests-ext: [sh]
python-version: ['3.10', '3.11', '3.12', pypy-3.8, pypy-3.10]
include:
# atleast one of each CPython/PyPy tests must be in windows
- os: windows-latest
python-version: '3.8'
run-tests-ext: bat
- os: windows-latest
python-version: pypy-3.9
run-tests-ext: bat
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: Install pytest
run: pip install pytest
- name: Install test requirements
run: python3 ./devscripts/install_deps.py --include dev
- name: Run tests
continue-on-error: true
run: ./devscripts/run_tests.${{ matrix.run-tests-ext }} download
run: python3 ./devscripts/run_tests.py download


@@ -1,81 +0,0 @@
name: Publish
on:
workflow_call:
inputs:
nightly:
default: false
required: false
type: boolean
version:
required: true
type: string
target_commitish:
required: true
type: string
secrets:
ARCHIVE_REPO_TOKEN:
required: false
permissions:
contents: write
jobs:
publish:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
with:
fetch-depth: 0
- uses: actions/download-artifact@v3
- uses: actions/setup-python@v4
with:
python-version: "3.10"
- name: Generate release notes
run: |
cat >> ./RELEASE_NOTES << EOF
#### A description of the various files are in the [README](https://github.com/yt-dlp/yt-dlp#release-files)
---
<details><summary><h3>Changelog</h3></summary>
$(python ./devscripts/make_changelog.py -vv)
</details>
EOF
echo "**This is an automated nightly pre-release build**" >> ./PRERELEASE_NOTES
cat ./RELEASE_NOTES >> ./PRERELEASE_NOTES
echo "Generated from: https://github.com/${{ github.repository }}/commit/${{ inputs.target_commitish }}" >> ./ARCHIVE_NOTES
cat ./RELEASE_NOTES >> ./ARCHIVE_NOTES
- name: Archive nightly release
env:
GH_TOKEN: ${{ secrets.ARCHIVE_REPO_TOKEN }}
GH_REPO: ${{ vars.ARCHIVE_REPO }}
if: |
inputs.nightly && env.GH_TOKEN != '' && env.GH_REPO != ''
run: |
gh release create \
--notes-file ARCHIVE_NOTES \
--title "yt-dlp nightly ${{ inputs.version }}" \
${{ inputs.version }} \
artifact/*
- name: Prune old nightly release
if: inputs.nightly && !vars.ARCHIVE_REPO
env:
GH_TOKEN: ${{ github.token }}
run: |
gh release delete --yes --cleanup-tag "nightly" || true
git tag --delete "nightly" || true
sleep 5 # Enough time to cover deletion race condition
- name: Publish release${{ inputs.nightly && ' (nightly)' || '' }}
env:
GH_TOKEN: ${{ github.token }}
if: (inputs.nightly && !vars.ARCHIVE_REPO) || !inputs.nightly
run: |
gh release create \
--notes-file ${{ inputs.nightly && 'PRE' || '' }}RELEASE_NOTES \
--target ${{ inputs.target_commitish }} \
--title "yt-dlp ${{ inputs.nightly && 'nightly ' || '' }}${{ inputs.version }}" \
${{ inputs.nightly && '--prerelease "nightly"' || inputs.version }} \
artifact/*


@@ -9,27 +9,29 @@ jobs:
if: "!contains(github.event.head_commit.message, 'ci skip all')"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up Python 3.11
uses: actions/setup-python@v4
- uses: actions/checkout@v4
- name: Set up Python 3.8
uses: actions/setup-python@v5
with:
python-version: '3.11'
python-version: '3.8'
- name: Install test requirements
run: pip install pytest pycryptodomex
run: python3 ./devscripts/install_deps.py --include dev
- name: Run tests
run: |
python3 -m yt_dlp -v || true
./devscripts/run_tests.sh core
python3 ./devscripts/run_tests.py core
flake8:
name: Linter
if: "!contains(github.event.head_commit.message, 'ci skip all')"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: '3.8'
- name: Install flake8
run: pip install flake8
run: python3 ./devscripts/install_deps.py -o --include dev
- name: Make lazy extractors
run: python devscripts/make_lazy_extractors.py
run: python3 ./devscripts/make_lazy_extractors.py
- name: Run flake8
run: flake8 .

.github/workflows/release-master.yml (vendored, new file, 29 lines)

@@ -0,0 +1,29 @@
name: Release (master)
on:
push:
branches:
- master
paths:
- "yt_dlp/**.py"
- "!yt_dlp/version.py"
- "bundle/*.py"
- "pyproject.toml"
- "Makefile"
- ".github/workflows/build.yml"
concurrency:
group: release-master
permissions:
contents: read
jobs:
release:
if: vars.BUILD_MASTER != ''
uses: ./.github/workflows/release.yml
with:
prerelease: true
source: master
permissions:
contents: write
packages: write
id-token: write # mandatory for trusted publishing
secrets: inherit


@@ -1,51 +1,42 @@
name: Release (nightly)
on:
push:
branches:
- master
paths:
- "yt_dlp/**.py"
- "!yt_dlp/version.py"
concurrency:
group: release-nightly
cancel-in-progress: true
schedule:
- cron: '23 23 * * *'
permissions:
contents: read
jobs:
prepare:
check_nightly:
if: vars.BUILD_NIGHTLY != ''
runs-on: ubuntu-latest
outputs:
version: ${{ steps.get_version.outputs.version }}
commit: ${{ steps.check_for_new_commits.outputs.commit }}
steps:
- uses: actions/checkout@v3
- name: Get version
id: get_version
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Check for new commits
id: check_for_new_commits
run: |
python devscripts/update-version.py "$(date -u +"%H%M%S")" | grep -Po "version=\d+(\.\d+){3}" >> "$GITHUB_OUTPUT"
relevant_files=(
"yt_dlp/*.py"
':!yt_dlp/version.py'
"bundle/*.py"
"pyproject.toml"
"Makefile"
".github/workflows/build.yml"
)
echo "commit=$(git log --format=%H -1 --since="24 hours ago" -- "${relevant_files[@]}")" | tee "$GITHUB_OUTPUT"
build:
needs: prepare
uses: ./.github/workflows/build.yml
release:
needs: [check_nightly]
if: ${{ needs.check_nightly.outputs.commit }}
uses: ./.github/workflows/release.yml
with:
version: ${{ needs.prepare.outputs.version }}
channel: nightly
permissions:
contents: read
packages: write # For package cache
secrets:
GPG_SIGNING_KEY: ${{ secrets.GPG_SIGNING_KEY }}
publish:
needs: [prepare, build]
uses: ./.github/workflows/publish.yml
secrets:
ARCHIVE_REPO_TOKEN: ${{ secrets.ARCHIVE_REPO_TOKEN }}
prerelease: true
source: nightly
permissions:
contents: write
with:
nightly: true
version: ${{ needs.prepare.outputs.version }}
target_commitish: ${{ github.sha }}
packages: write
id-token: write # mandatory for trusted publishing
secrets: inherit


@@ -1,5 +1,53 @@
name: Release
on: workflow_dispatch
on:
workflow_call:
inputs:
prerelease:
required: false
default: true
type: boolean
source:
required: false
default: ''
type: string
target:
required: false
default: ''
type: string
version:
required: false
default: ''
type: string
workflow_dispatch:
inputs:
source:
description: |
SOURCE of this release's updates:
channel, repo, tag, or channel/repo@tag
(default: <current_repo>)
required: false
default: ''
type: string
target:
description: |
TARGET to publish this release to:
channel, tag, or channel@tag
(default: <source> if writable else <current_repo>[@source_tag])
required: false
default: ''
type: string
version:
description: |
VERSION: yyyy.mm.dd[.rev] or rev
(default: auto-generated)
required: false
default: ''
type: string
prerelease:
description: Pre-release
default: false
type: boolean
permissions:
contents: read
@@ -9,121 +57,327 @@ jobs:
contents: write
runs-on: ubuntu-latest
outputs:
version: ${{ steps.update_version.outputs.version }}
head_sha: ${{ steps.push_release.outputs.head_sha }}
channel: ${{ steps.setup_variables.outputs.channel }}
version: ${{ steps.setup_variables.outputs.version }}
target_repo: ${{ steps.setup_variables.outputs.target_repo }}
target_repo_token: ${{ steps.setup_variables.outputs.target_repo_token }}
target_tag: ${{ steps.setup_variables.outputs.target_tag }}
pypi_project: ${{ steps.setup_variables.outputs.pypi_project }}
pypi_suffix: ${{ steps.setup_variables.outputs.pypi_suffix }}
head_sha: ${{ steps.get_target.outputs.head_sha }}
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
with:
fetch-depth: 0
- uses: actions/setup-python@v4
- uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Update version
id: update_version
- name: Process inputs
id: process_inputs
run: |
python devscripts/update-version.py ${{ vars.PUSH_VERSION_COMMIT == '' && '"$(date -u +"%H%M%S")"' || '' }} | \
grep -Po "version=\d+\.\d+\.\d+(\.\d+)?" >> "$GITHUB_OUTPUT"
cat << EOF
::group::Inputs
prerelease=${{ inputs.prerelease }}
source=${{ inputs.source }}
target=${{ inputs.target }}
version=${{ inputs.version }}
::endgroup::
EOF
IFS='@' read -r source_repo source_tag <<<"${{ inputs.source }}"
IFS='@' read -r target_repo target_tag <<<"${{ inputs.target }}"
cat << EOF >> "$GITHUB_OUTPUT"
source_repo=${source_repo}
source_tag=${source_tag}
target_repo=${target_repo}
target_tag=${target_tag}
EOF
- name: Setup variables
id: setup_variables
env:
source_repo: ${{ steps.process_inputs.outputs.source_repo }}
source_tag: ${{ steps.process_inputs.outputs.source_tag }}
target_repo: ${{ steps.process_inputs.outputs.target_repo }}
target_tag: ${{ steps.process_inputs.outputs.target_tag }}
run: |
# unholy bash monstrosity (sincere apologies)
fallback_token () {
if ${{ !secrets.ARCHIVE_REPO_TOKEN }}; then
echo "::error::Repository access secret ${target_repo_token^^} not found"
exit 1
fi
target_repo_token=ARCHIVE_REPO_TOKEN
return 0
}
source_is_channel=0
[[ "${source_repo}" == 'stable' ]] && source_repo='yt-dlp/yt-dlp'
if [[ -z "${source_repo}" ]]; then
source_repo='${{ github.repository }}'
elif [[ '${{ vars[format('{0}_archive_repo', env.source_repo)] }}' ]]; then
source_is_channel=1
source_channel='${{ vars[format('{0}_archive_repo', env.source_repo)] }}'
elif [[ -z "${source_tag}" && "${source_repo}" != */* ]]; then
source_tag="${source_repo}"
source_repo='${{ github.repository }}'
fi
resolved_source="${source_repo}"
if [[ "${source_tag}" ]]; then
resolved_source="${resolved_source}@${source_tag}"
elif [[ "${source_repo}" == 'yt-dlp/yt-dlp' ]]; then
resolved_source='stable'
fi
revision="${{ (inputs.prerelease || !vars.PUSH_VERSION_COMMIT) && '$(date -u +"%H%M%S")' || '' }}"
version="$(
python devscripts/update-version.py \
-c "${resolved_source}" -r "${{ github.repository }}" ${{ inputs.version || '$revision' }} | \
grep -Po "version=\K\d+\.\d+\.\d+(\.\d+)?")"
if [[ "${target_repo}" ]]; then
if [[ -z "${target_tag}" ]]; then
if [[ '${{ vars[format('{0}_archive_repo', env.target_repo)] }}' ]]; then
target_tag="${source_tag:-${version}}"
else
target_tag="${target_repo}"
target_repo='${{ github.repository }}'
fi
fi
if [[ "${target_repo}" != '${{ github.repository}}' ]]; then
target_repo='${{ vars[format('{0}_archive_repo', env.target_repo)] }}'
target_repo_token='${{ env.target_repo }}_archive_repo_token'
${{ !!secrets[format('{0}_archive_repo_token', env.target_repo)] }} || fallback_token
pypi_project='${{ vars[format('{0}_pypi_project', env.target_repo)] }}'
pypi_suffix='${{ vars[format('{0}_pypi_suffix', env.target_repo)] }}'
fi
else
target_tag="${source_tag:-${version}}"
if ((source_is_channel)); then
target_repo="${source_channel}"
target_repo_token='${{ env.source_repo }}_archive_repo_token'
${{ !!secrets[format('{0}_archive_repo_token', env.source_repo)] }} || fallback_token
pypi_project='${{ vars[format('{0}_pypi_project', env.source_repo)] }}'
pypi_suffix='${{ vars[format('{0}_pypi_suffix', env.source_repo)] }}'
else
target_repo='${{ github.repository }}'
fi
fi
if [[ "${target_repo}" == '${{ github.repository }}' ]] && ${{ !inputs.prerelease }}; then
pypi_project='${{ vars.PYPI_PROJECT }}'
fi
echo "::group::Output variables"
cat << EOF | tee -a "$GITHUB_OUTPUT"
channel=${resolved_source}
version=${version}
target_repo=${target_repo}
target_repo_token=${target_repo_token}
target_tag=${target_tag}
pypi_project=${pypi_project}
pypi_suffix=${pypi_suffix}
EOF
echo "::endgroup::"
- name: Update documentation
env:
version: ${{ steps.setup_variables.outputs.version }}
target_repo: ${{ steps.setup_variables.outputs.target_repo }}
if: |
!inputs.prerelease && env.target_repo == github.repository
run: |
python devscripts/update_changelog.py -vv
make doc
sed '/### /Q' Changelog.md >> ./CHANGELOG
echo '### ${{ steps.update_version.outputs.version }}' >> ./CHANGELOG
python ./devscripts/make_changelog.py -vv -c >> ./CHANGELOG
echo >> ./CHANGELOG
grep -Poz '(?s)### \d+\.\d+\.\d+.+' 'Changelog.md' | head -n -1 >> ./CHANGELOG
cat ./CHANGELOG > Changelog.md
- name: Push to release
id: push_release
env:
version: ${{ steps.setup_variables.outputs.version }}
target_repo: ${{ steps.setup_variables.outputs.target_repo }}
if: |
!inputs.prerelease && env.target_repo == github.repository
run: |
git config --global user.name github-actions
git config --global user.email github-actions@example.com
git config --global user.name "github-actions[bot]"
git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
git add -u
git commit -m "Release ${{ steps.update_version.outputs.version }}" \
git commit -m "Release ${{ env.version }}" \
-m "Created by: ${{ github.event.sender.login }}" -m ":ci skip all :ci run dl"
git push origin --force ${{ github.event.ref }}:release
- name: Get target commitish
id: get_target
run: |
echo "head_sha=$(git rev-parse HEAD)" >> "$GITHUB_OUTPUT"
- name: Update master
if: vars.PUSH_VERSION_COMMIT != ''
env:
target_repo: ${{ steps.setup_variables.outputs.target_repo }}
if: |
vars.PUSH_VERSION_COMMIT != '' && !inputs.prerelease && env.target_repo == github.repository
run: git push origin ${{ github.event.ref }}
publish_pypi_homebrew:
needs: prepare
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: "3.10"
- name: Install Requirements
run: |
sudo apt-get -y install pandoc man
python -m pip install -U pip setuptools wheel twine
python -m pip install -U -r requirements.txt
- name: Prepare
run: |
python devscripts/update-version.py ${{ needs.prepare.outputs.version }}
python devscripts/make_lazy_extractors.py
- name: Build and publish on PyPI
env:
TWINE_USERNAME: __token__
TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
if: env.TWINE_PASSWORD != ''
run: |
rm -rf dist/*
make pypi-files
python devscripts/set-variant.py pip -M "You installed yt-dlp with pip or using the wheel from PyPi; Use that to update"
python setup.py sdist bdist_wheel
twine upload dist/*
- name: Checkout Homebrew repository
env:
BREW_TOKEN: ${{ secrets.BREW_TOKEN }}
PYPI_TOKEN: ${{ secrets.PYPI_TOKEN }}
if: env.BREW_TOKEN != '' && env.PYPI_TOKEN != ''
uses: actions/checkout@v3
with:
repository: yt-dlp/homebrew-taps
path: taps
ssh-key: ${{ secrets.BREW_TOKEN }}
- name: Update Homebrew Formulae
env:
BREW_TOKEN: ${{ secrets.BREW_TOKEN }}
PYPI_TOKEN: ${{ secrets.PYPI_TOKEN }}
if: env.BREW_TOKEN != '' && env.PYPI_TOKEN != ''
run: |
python devscripts/update-formulae.py taps/Formula/yt-dlp.rb "${{ needs.prepare.outputs.version }}"
git -C taps/ config user.name github-actions
git -C taps/ config user.email github-actions@example.com
git -C taps/ commit -am 'yt-dlp: ${{ needs.prepare.outputs.version }}'
git -C taps/ push
build:
needs: prepare
uses: ./.github/workflows/build.yml
with:
version: ${{ needs.prepare.outputs.version }}
channel: ${{ needs.prepare.outputs.channel }}
origin: ${{ needs.prepare.outputs.target_repo }}
permissions:
contents: read
packages: write # For package cache
secrets:
GPG_SIGNING_KEY: ${{ secrets.GPG_SIGNING_KEY }}
publish_pypi:
needs: [prepare, build]
if: ${{ needs.prepare.outputs.pypi_project }}
runs-on: ubuntu-latest
permissions:
id-token: write # mandatory for trusted publishing
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Install Requirements
run: |
sudo apt -y install pandoc man
python devscripts/install_deps.py -o --include build
- name: Prepare
env:
version: ${{ needs.prepare.outputs.version }}
suffix: ${{ needs.prepare.outputs.pypi_suffix }}
channel: ${{ needs.prepare.outputs.channel }}
target_repo: ${{ needs.prepare.outputs.target_repo }}
pypi_project: ${{ needs.prepare.outputs.pypi_project }}
run: |
python devscripts/update-version.py -c "${{ env.channel }}" -r "${{ env.target_repo }}" -s "${{ env.suffix }}" "${{ env.version }}"
python devscripts/update_changelog.py -vv
python devscripts/make_lazy_extractors.py
sed -i -E '0,/(name = ")[^"]+(")/s//\1${{ env.pypi_project }}\2/' pyproject.toml
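The `sed -i -E '0,/regex/s//.../'` invocation above rewrites only the first `name = "..."` occurrence in pyproject.toml, because the `0,/regex/` address range ends at the first matching line. A rough Python equivalent; the replacement project name is just a placeholder:

```python
import re
from pathlib import Path

pyproject = Path('pyproject.toml')
text = pyproject.read_text(encoding='utf-8')
# count=1 mirrors sed's '0,/regex/' address range: only the first match is rewritten
text = re.sub(r'(name = ")[^"]+(")', r'\g<1>example-pypi-project\g<2>', text, count=1)
pyproject.write_text(text, encoding='utf-8')
```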
- name: Build
run: |
rm -rf dist/*
make pypi-files
printf '%s\n\n' \
'Official repository: <https://github.com/yt-dlp/yt-dlp>' \
'**PS**: Some links in this document will not work since this is a copy of the README.md from Github' > ./README.md.new
cat ./README.md >> ./README.md.new && mv -f ./README.md.new ./README.md
python devscripts/set-variant.py pip -M "You installed yt-dlp with pip or using the wheel from PyPi; Use that to update"
make clean-cache
python -m build --no-isolation .
- name: Publish to PyPI
uses: pypa/gh-action-pypi-publish@release/v1
with:
verbose: true
publish:
needs: [prepare, build]
uses: ./.github/workflows/publish.yml
permissions:
contents: write
with:
version: ${{ needs.prepare.outputs.version }}
target_commitish: ${{ needs.prepare.outputs.head_sha }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- uses: actions/download-artifact@v4
with:
path: artifact
pattern: build-*
merge-multiple: true
- uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Generate release notes
env:
head_sha: ${{ needs.prepare.outputs.head_sha }}
target_repo: ${{ needs.prepare.outputs.target_repo }}
target_tag: ${{ needs.prepare.outputs.target_tag }}
run: |
printf '%s' \
'[![Installation](https://img.shields.io/badge/-Which%20file%20to%20download%3F-white.svg?style=for-the-badge)]' \
'(https://github.com/${{ github.repository }}#installation "Installation instructions") ' \
'[![Discord](https://img.shields.io/discord/807245652072857610?color=blue&labelColor=555555&label=&logo=discord&style=for-the-badge)]' \
'(https://discord.gg/H5MNcFW63r "Discord") ' \
'[![Donate](https://img.shields.io/badge/_-Donate-red.svg?logo=githubsponsors&labelColor=555555&style=for-the-badge)]' \
'(https://github.com/yt-dlp/yt-dlp/blob/master/Collaborators.md#collaborators "Donate") ' \
'[![Documentation](https://img.shields.io/badge/-Docs-brightgreen.svg?style=for-the-badge&logo=GitBook&labelColor=555555)]' \
'(https://github.com/${{ github.repository }}' \
'${{ env.target_repo == github.repository && format('/tree/{0}', env.target_tag) || '' }}#readme "Documentation") ' \
${{ env.target_repo == 'yt-dlp/yt-dlp' && '\
"[![Nightly](https://img.shields.io/badge/Nightly%20builds-purple.svg?style=for-the-badge)]" \
"(https://github.com/yt-dlp/yt-dlp-nightly-builds/releases/latest \"Nightly builds\") " \
"[![Master](https://img.shields.io/badge/Master%20builds-lightblue.svg?style=for-the-badge)]" \
"(https://github.com/yt-dlp/yt-dlp-master-builds/releases/latest \"Master builds\")"' || '' }} > ./RELEASE_NOTES
printf '\n\n' >> ./RELEASE_NOTES
cat >> ./RELEASE_NOTES << EOF
#### A description of the various files are in the [README](https://github.com/${{ github.repository }}#release-files)
---
$(python ./devscripts/make_changelog.py -vv --collapsible)
EOF
printf '%s\n\n' '**This is a pre-release build**' >> ./PRERELEASE_NOTES
cat ./RELEASE_NOTES >> ./PRERELEASE_NOTES
printf '%s\n\n' 'Generated from: https://github.com/${{ github.repository }}/commit/${{ env.head_sha }}' >> ./ARCHIVE_NOTES
cat ./RELEASE_NOTES >> ./ARCHIVE_NOTES
- name: Publish to archive repo
env:
GH_TOKEN: ${{ secrets[needs.prepare.outputs.target_repo_token] }}
GH_REPO: ${{ needs.prepare.outputs.target_repo }}
version: ${{ needs.prepare.outputs.version }}
channel: ${{ needs.prepare.outputs.channel }}
if: |
inputs.prerelease && env.GH_TOKEN != '' && env.GH_REPO != '' && env.GH_REPO != github.repository
run: |
title="${{ startswith(env.GH_REPO, 'yt-dlp/') && 'yt-dlp ' || '' }}${{ env.channel }}"
gh release create \
--notes-file ARCHIVE_NOTES \
--title "${title} ${{ env.version }}" \
${{ env.version }} \
artifact/*
- name: Prune old release
env:
GH_TOKEN: ${{ github.token }}
version: ${{ needs.prepare.outputs.version }}
target_repo: ${{ needs.prepare.outputs.target_repo }}
target_tag: ${{ needs.prepare.outputs.target_tag }}
if: |
env.target_repo == github.repository && env.target_tag != env.version
run: |
gh release delete --yes --cleanup-tag "${{ env.target_tag }}" || true
git tag --delete "${{ env.target_tag }}" || true
sleep 5 # Enough time to cover deletion race condition
- name: Publish release
env:
GH_TOKEN: ${{ github.token }}
version: ${{ needs.prepare.outputs.version }}
target_repo: ${{ needs.prepare.outputs.target_repo }}
target_tag: ${{ needs.prepare.outputs.target_tag }}
head_sha: ${{ needs.prepare.outputs.head_sha }}
if: |
env.target_repo == github.repository
run: |
title="${{ github.repository == 'yt-dlp/yt-dlp' && 'yt-dlp ' || '' }}"
title+="${{ env.target_tag != env.version && format('{0} ', env.target_tag) || '' }}"
gh release create \
--notes-file ${{ inputs.prerelease && 'PRERELEASE_NOTES' || 'RELEASE_NOTES' }} \
--target ${{ env.head_sha }} \
--title "${title}${{ env.version }}" \
${{ inputs.prerelease && '--prerelease' || '' }} \
${{ env.target_tag }} \
artifact/*

.gitignore

@ -33,6 +33,7 @@ cookies
*.gif
*.jpeg
*.jpg
*.lrc
*.m4a
*.m4v
*.mhtml
@ -40,6 +41,7 @@ cookies
*.mov
*.mp3
*.mp4
*.mpg
*.mpga
*.oga
*.ogg
@ -47,6 +49,7 @@ cookies
*.png
*.sbv
*.srt
*.ssa
*.swf
*.swp
*.tt

CONTRIBUTING.md

@ -79,7 +79,7 @@ Before reporting any issue, type `yt-dlp -U`. This should report that you're up-
### Is the issue already documented?
Make sure that someone has not already opened the issue you're trying to open. Search at the top of the window or browse the [GitHub Issues](https://github.com/yt-dlp/yt-dlp/search?type=Issues) of this repository. If there is an issue, feel free to write something along the lines of "This affects me as well, with version 2021.01.01. Here is some more information on the issue: ...". While some issues may be old, a new post into them often spurs rapid activity.
Make sure that someone has not already opened the issue you're trying to open. Search at the top of the window or browse the [GitHub Issues](https://github.com/yt-dlp/yt-dlp/search?type=Issues) of this repository. If there is an issue, subscribe to it to be notified when there is any progress. Unless you have something useful to add to the conversation, please refrain from commenting.
Additionally, it is also helpful to see if the issue has already been documented in the [youtube-dl issue tracker](https://github.com/ytdl-org/youtube-dl/issues). If similar issues have already been reported in youtube-dl (but not in our issue tracker), links to them can be included in your issue report here.
@ -138,14 +138,11 @@ Most users do not need to build yt-dlp and can [download the builds](https://git
To run yt-dlp as a developer, you don't need to build anything either. Simply execute
python -m yt_dlp
python3 -m yt_dlp
To run the test, simply invoke your favorite test runner, or execute a test file directly; any of the following work:
To run all the available core tests, use:
python -m unittest discover
python test/test_download.py
nosetests
pytest
python3 devscripts/run_tests.py
See item 6 of [new extractor tutorial](#adding-support-for-a-new-site) for how to run extractor specific test cases.
@ -154,7 +151,7 @@ If you want to create a build of yt-dlp yourself, you can follow the instruction
## Adding new feature or making overarching changes
Before you start writing code for implementing a new feature, open an issue explaining your feature request and atleast one use case. This allows the maintainers to decide whether such a feature is desired for the project in the first place, and will provide an avenue to discuss some implementation details. If you open a pull request for a new feature without discussing with us first, do not be surprised when we ask for large changes to the code, or even reject it outright.
Before you start writing code for implementing a new feature, open an issue explaining your feature request and at least one use case. This allows the maintainers to decide whether such a feature is desired for the project in the first place, and will provide an avenue to discuss some implementation details. If you open a pull request for a new feature without discussing with us first, do not be surprised when we ask for large changes to the code, or even reject it outright.
The same applies for changes to the documentation, code style, or overarching changes to the architecture
@ -187,15 +184,21 @@ After you have ensured this site is distributing its content legally, you can fo
'url': 'https://yourextractor.com/watch/42',
'md5': 'TODO: md5 sum of the first 10241 bytes of the video file (use --test)',
'info_dict': {
# For videos, only the 'id' and 'ext' fields are required to RUN the test:
'id': '42',
'ext': 'mp4',
'title': 'Video title goes here',
'thumbnail': r're:^https?://.*\.jpg$',
# TODO more properties, either as:
# * A value
# * MD5 checksum; start the string with md5:
# * A regular expression; start the string with re:
# * Any Python type, e.g. int or float
# Then if the test run fails, it will output the missing/incorrect fields.
# Properties can be added as:
# * A value, e.g.
# 'title': 'Video title goes here',
# * MD5 checksum; start the string with 'md5:', e.g.
# 'description': 'md5:098f6bcd4621d373cade4e832627b4f6',
# * A regular expression; start the string with 're:', e.g.
# 'thumbnail': r're:^https?://.*\.jpg$',
# * A count of elements in a list; start the string with 'count:', e.g.
# 'tags': 'count:10',
# * Any Python type, e.g.
# 'view_count': int,
}
}]
@ -215,14 +218,14 @@ After you have ensured this site is distributing its content legally, you can fo
}
```
1. Add an import in [`yt_dlp/extractor/_extractors.py`](yt_dlp/extractor/_extractors.py). Note that the class name must end with `IE`.
1. Run `python test/test_download.py TestDownload.test_YourExtractor` (note that `YourExtractor` doesn't end with `IE`). This *should fail* at first, but you can continually re-run it until you're done. If you decide to add more than one test, the tests will then be named `TestDownload.test_YourExtractor`, `TestDownload.test_YourExtractor_1`, `TestDownload.test_YourExtractor_2`, etc. Note that tests with `only_matching` key in test's dict are not counted in. You can also run all the tests in one go with `TestDownload.test_YourExtractor_all`
1. Make sure you have atleast one test for your extractor. Even if all videos covered by the extractor are expected to be inaccessible for automated testing, tests should still be added with a `skip` parameter indicating why the particular test is disabled from running.
1. Have a look at [`yt_dlp/extractor/common.py`](yt_dlp/extractor/common.py) for possible helper methods and a [detailed description of what your extractor should and may return](yt_dlp/extractor/common.py#L91-L426). Add tests and code for as many as you want.
1. Run `python3 devscripts/run_tests.py YourExtractor`. This *may fail* at first, but you can continually re-run it until you're done. Upon failure, it will output the missing fields and/or correct values which you can copy. If you decide to add more than one test, the tests will then be named `YourExtractor`, `YourExtractor_1`, `YourExtractor_2`, etc. Note that tests with an `only_matching` key in the test's dict are not included in the count. You can also run all the tests in one go with `YourExtractor_all`
1. Make sure you have at least one test for your extractor. Even if all videos covered by the extractor are expected to be inaccessible for automated testing, tests should still be added with a `skip` parameter indicating why the particular test is disabled from running; a sketch of such an entry is shown below this list.
1. Have a look at [`yt_dlp/extractor/common.py`](yt_dlp/extractor/common.py) for possible helper methods and a [detailed description of what your extractor should and may return](yt_dlp/extractor/common.py#L119-L440). Add tests and code for as many as you want.
1. Make sure your code follows [yt-dlp coding conventions](#yt-dlp-coding-conventions) and check the code with [flake8](https://flake8.pycqa.org/en/latest/index.html#quickstart):
$ flake8 yt_dlp/extractor/yourextractor.py
1. Make sure your code works under all [Python](https://www.python.org/) versions supported by yt-dlp, namely CPython and PyPy for Python 3.7 and above. Backward compatibility is not required for even older versions of Python.
1. Make sure your code works under all [Python](https://www.python.org/) versions supported by yt-dlp, namely CPython and PyPy for Python 3.8 and above. Backward compatibility is not required for even older versions of Python.
1. When the tests pass, [add](https://git-scm.com/docs/git-add) the new files, [commit](https://git-scm.com/docs/git-commit) them and [push](https://git-scm.com/docs/git-push) the result, like this:
$ git add yt_dlp/extractor/_extractors.py
@ -234,7 +237,7 @@ After you have ensured this site is distributing its content legally, you can fo
In any case, thank you very much for your contributions!
**Tip:** To test extractors that require login information, create a file `test/local_parameters.json` and add `"usenetrc": true` or your username and password in it:
**Tip:** To test extractors that require login information, create a file `test/local_parameters.json` and add `"usenetrc": true` or your `username`&`password` or `cookiefile`/`cookiesfrombrowser` in it:
```json
{
"username": "your user name",
@ -246,12 +249,12 @@ In any case, thank you very much for your contributions!
This section introduces guidelines for writing idiomatic, robust and future-proof extractor code.
Extractors are very fragile by nature since they depend on the layout of the source data provided by 3rd party media hosters out of your control and this layout tends to change. As an extractor implementer your task is not only to write code that will extract media links and metadata correctly but also to minimize dependency on the source's layout and even to make the code foresee potential future changes and be ready for that. This is important because it will allow the extractor not to break on minor layout changes thus keeping old yt-dlp versions working. Even though this breakage issue may be easily fixed by a new version of yt-dlp, this could take some time, during which the the extractor will remain broken.
Extractors are very fragile by nature since they depend on the layout of the source data provided by 3rd party media hosters out of your control and this layout tends to change. As an extractor implementer your task is not only to write code that will extract media links and metadata correctly but also to minimize dependency on the source's layout and even to make the code foresee potential future changes and be ready for that. This is important because it will allow the extractor not to break on minor layout changes thus keeping old yt-dlp versions working. Even though this breakage issue may be easily fixed by a new version of yt-dlp, this could take some time, during which the extractor will remain broken.
### Mandatory and optional metafields
For extraction to work yt-dlp relies on metadata your extractor extracts and provides to yt-dlp expressed by an [information dictionary](yt_dlp/extractor/common.py#L91-L426) or simply *info dict*. Only the following meta fields in the *info dict* are considered mandatory for a successful extraction process by yt-dlp:
For extraction to work yt-dlp relies on metadata your extractor extracts and provides to yt-dlp expressed by an [information dictionary](yt_dlp/extractor/common.py#L119-L440) or simply *info dict*. Only the following meta fields in the *info dict* are considered mandatory for a successful extraction process by yt-dlp:
- `id` (media identifier)
- `title` (media title)
@ -261,7 +264,7 @@ The aforementioned metafields are the critical data that the extraction does not
For pornographic sites, appropriate `age_limit` must also be returned.
The extractor is allowed to return the info dict without url or formats in some special cases if it allows the user to extract usefull information with `--ignore-no-formats-error` - e.g. when the video is a live stream that has not started yet.
The extractor is allowed to return the info dict without url or formats in some special cases if it allows the user to extract useful information with `--ignore-no-formats-error` - e.g. when the video is a live stream that has not started yet.
[Any field](yt_dlp/extractor/common.py#L219-L426) apart from the aforementioned ones is considered **optional**. That means that extraction should be **tolerant** to situations when sources for these fields can potentially be unavailable (even if they are always available at the moment) and **future-proof** in order not to break the extraction of general purpose mandatory fields.
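As a rough sketch, a minimal `_real_extract` satisfying the mandatory fields might look like this (`_match_id`, `_download_webpage` and the `_og_search_*` helpers are real `InfoExtractor` methods; the site layout is hypothetical):

```python
def _real_extract(self, url):
    video_id = self._match_id(url)
    webpage = self._download_webpage(url, video_id)

    return {
        'id': video_id,                             # mandatory
        'title': self._og_search_title(webpage),    # mandatory
        'url': self._og_search_video_url(webpage),  # or a 'formats' list
        # everything else (description, thumbnail, ...) is optional
    }
```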
@ -696,7 +699,7 @@ formats = [
### Use convenience conversion and parsing functions
Wrap all extracted numeric data into safe functions from [`yt_dlp/utils.py`](yt_dlp/utils.py): `int_or_none`, `float_or_none`. Use them for string to number conversions as well.
Wrap all extracted numeric data into safe functions from [`yt_dlp/utils/`](yt_dlp/utils/): `int_or_none`, `float_or_none`. Use them for string to number conversions as well.
Use `url_or_none` for safe URL processing.
@ -704,7 +707,7 @@ Use `traverse_obj` and `try_call` (supersedes `dict_get` and `try_get`) for safe
Use `unified_strdate` for uniform `upload_date` or any `YYYYMMDD` meta field extraction, `unified_timestamp` for uniform `timestamp` extraction, `parse_filesize` for `filesize` extraction, `parse_count` for count meta fields extraction, `parse_resolution`, `parse_duration` for `duration` extraction, `parse_age_limit` for `age_limit` extraction.
Explore [`yt_dlp/utils.py`](yt_dlp/utils.py) for more useful convenience functions.
Explore [`yt_dlp/utils/`](yt_dlp/utils/) for more useful convenience functions.
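For instance, a hedged sketch of how these helpers compose, where `data` stands in for a made-up API response:

```python
from yt_dlp.utils import int_or_none, parse_duration, traverse_obj, unified_strdate, url_or_none

data = {'stats': {'views': '1024'}, 'date': '2023/05/01', 'length': '1:02:03'}

view_count = int_or_none(traverse_obj(data, ('stats', 'views')))  # 1024
upload_date = unified_strdate(data.get('date'))                   # '20230501'
duration = parse_duration(data.get('length'))                     # 3723.0
thumbnail = url_or_none(data.get('thumbnail'))                    # None, not an exception
```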
#### Examples

CONTRIBUTORS

@ -2,7 +2,6 @@ pukkandan (owner)
shirt-dev (collaborator)
coletdjnz/colethedj (collaborator)
Ashish0804 (collaborator)
nao20010128nao/Lesmiscore (collaborator)
bashonly (collaborator)
Grub4K (collaborator)
h-h-h-h
@ -409,3 +408,205 @@ Hill-98
LXYan2333
mushbite
venkata-krishnas
7vlad7
alexklapheke
arobase-che
bepvte
bergoid
blmarket
brandon-dacrib
c-basalt
CoryTibbettsDev
Cyberes
D0LLYNH0
danog
DataGhost
falbrechtskirchinger
foreignBlade
garret1317
hasezoey
hoaluvn
ItzMaxTV
ivanskodje
jo-nike
kangalio
linsui
makew0rld
menschel
mikf
mrscrapy
NDagestad
Neurognostic
NextFire
nick-cd
permunkle
pzhlkj6612
ringus1
rjy
Schmoaaaaah
sjthespian
theperfectpunk
toomyzoom
truedread
TxI5
unbeatable-101
vampirefrog
vidiot720
viktor-enzell
zhgwn
barthelmannk
berkanteber
OverlordQ
rexlambert22
Ti4eeT4e
AmanSal1
bbilly1
meliber
nnoboa
rdamas
RfadnjdExt
urectanc
nao20010128nao/Lesmiscore
04-pasha-04
aaruni96
aky-01
AmirAflak
ApoorvShah111
at-wat
davinkevin
demon071
denhotte
FinnRG
fireattack
Frankgoji
GD-Slime
hatsomatt
ifan-t
kshitiz305
kylegustavo
mabdelfattah
nathantouze
niemands
Rajeshwaran2001
RedDeffender
Rohxn16
sb0stn
SevenLives
simon300000
snixon
soundchaser128
szabyg
trainman261
trislee
wader
Yalab7
zhallgato
zhong-yiyu
Zprokkel
AS6939
drzraf
handlerug
jiru
madewokherd
xofe
awalgarg
midnightveil
naginatana
Riteo
1100101
aniolpages
bartbroere
CrendKing
Esokrates
HitomaruKonpaku
LoserFox
peci1
saintliao
shubhexists
SirElderling
almx
elivinsky
starius
TravisDupes
amir16yp
Fymyte
Ganesh910
hashFactory
kclauhk
Kyraminol
lstrojny
middlingphys
NickCis
nicodato
prettykool
S-Aarab
sonmezberkay
TSRBerry
114514ns
agibson-fl
alard
alien-developers
antonkesy
ArnauvGilotra
Arthurszzz
Bibhav48
Bl4Cc4t
boredzo
Caesim404
chkuendig
chtk
Danish-H
dasidiot
diman8
divStar
DmitryScaletta
feederbox826
gmes78
gonzalezjo
hui1601
infanf
jazz1611
jingtra
jkmartindale
johnvictorfs
llistochek
marcdumais
martinxyz
michal-repo
mrmedieval
nbr23
Nicals
Noor-5
NurTasin
pompos02
Pranaxcau
pwaldhauer
RaduManole
RalphORama
rrgomes
ruiminggu
rvsit
sefidel
shmohawk
Snack-X
src-tinkerer
stilor
syntaxsurge
t-nil
ufukk
vista-narvas
x11x
xpadev-net
Xpl0itU
YoshichikaAAA
zhijinwuu
alb
hruzgar
kasper93
leoheitmannruiz
luiso1979
nipotan
Offert4324
sta1us
Tomoka1
trwstin

Changelog.md (diff suppressed because it is too large)

Collaborators.md

@ -8,7 +8,7 @@ You can also find lists of all [contributors of yt-dlp](CONTRIBUTORS) and [autho
## [pukkandan](https://github.com/pukkandan)
[![ko-fi](https://img.shields.io/badge/_-Ko--fi-red.svg?logo=kofi&labelColor=555555&style=for-the-badge)](https://ko-fi.com/pukkandan)
[![gh-sponsor](https://img.shields.io/badge/_-Github-red.svg?logo=github&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/pukkandan)
[![gh-sponsor](https://img.shields.io/badge/_-Github-white.svg?logo=github&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/pukkandan)
* Owner of the fork
@ -26,9 +26,10 @@ You can also find lists of all [contributors of yt-dlp](CONTRIBUTORS) and [autho
## [coletdjnz](https://github.com/coletdjnz)
[![gh-sponsor](https://img.shields.io/badge/_-Github-red.svg?logo=github&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/coletdjnz)
[![gh-sponsor](https://img.shields.io/badge/_-Github-white.svg?logo=github&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/coletdjnz)
* Improved plugin architecture
* Rewrote the networking infrastructure, implemented support for `requests`
* YouTube improvements including: age-gate bypass, private playlists, multiple-clients (to avoid throttling) and a lot of under-the-hood improvements
* Added support for new websites YoutubeWebArchive, MainStreaming, PRX, nzherald, Mediaklikk, StarTV etc
* Improved/fixed support for Patreon, panopto, gfycat, itv, pbs, SouthParkDE etc
@ -44,28 +45,19 @@ You can also find lists of all [contributors of yt-dlp](CONTRIBUTORS) and [autho
* Improved/fixed support for HiDive, HotStar, Hungama, LBRY, LinkedInLearning, Mxplayer, SonyLiv, TV2, Vimeo, VLive etc
## [Lesmiscore](https://github.com/Lesmiscore) <sub><sup>(nao20010128nao)</sup></sub>
**Bitcoin**: bc1qfd02r007cutfdjwjmyy9w23rjvtls6ncve7r3s
**Monacoin**: mona1q3tf7dzvshrhfe3md379xtvt2n22duhglv5dskr
* Download live from start to end for YouTube
* Added support for new websites AbemaTV, mildom, PixivSketch, skeb, radiko, voicy, mirrativ, openrec, whowatch, damtomo, 17.live, mixch etc
* Improved/fixed support for fc2, YahooJapanNews, tver, iwara etc
## [bashonly](https://github.com/bashonly)
* `--update-to`, automated release, nightly builds
* `--cookies-from-browser` support for Firefox containers
* Added support for new websites Genius, Kick, NBCStations, Triller, VideoKen etc
* Improved/fixed support for Anvato, Brightcove, Instagram, ParamountPlus, Reddit, SlidesLive, TikTok, Twitter, Vimeo etc
* `--update-to`, self-updater rewrite, automated/nightly/master releases
* `--cookies-from-browser` support for Firefox containers, external downloader cookie handling overhaul
* Added support for new websites like Dacast, Kick, NBCStations, Triller, VideoKen, Weverse, WrestleUniverse etc
* Improved/fixed support for Anvato, Brightcove, Reddit, SlidesLive, TikTok, Twitter, Vimeo etc
## [Grub4K](https://github.com/Grub4K)
[![ko-fi](https://img.shields.io/badge/_-Ko--fi-red.svg?logo=kofi&labelColor=555555&style=for-the-badge)](https://ko-fi.com/Grub4K) [![gh-sponsor](https://img.shields.io/badge/_-Github-red.svg?logo=github&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/Grub4K)
[![gh-sponsor](https://img.shields.io/badge/_-Github-white.svg?logo=github&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/Grub4K) [![ko-fi](https://img.shields.io/badge/_-Ko--fi-red.svg?logo=kofi&labelColor=555555&style=for-the-badge)](https://ko-fi.com/Grub4K)
* `--update-to`, automated release, nightly builds
* Rework internals like `traverse_obj`, various core refactors and bugs fixes
* Helped fix crunchyroll, Twitter, wrestleuniverse, wistia, slideslive etc
* `--update-to`, self-updater rewrite, automated/nightly/master releases
* Reworked internals like `traverse_obj`, various core refactors and bugs fixes
* Implemented proper progress reporting for parallel downloads
* Improved/fixed/added Bundestag, crunchyroll, pr0gramm, Twitter, WrestleUniverse etc

MANIFEST.in (deleted)

@ -1,10 +0,0 @@
include AUTHORS
include Changelog.md
include LICENSE
include README.md
include completions/*/*
include supportedsites.md
include yt-dlp.1
include requirements.txt
recursive-include devscripts *
recursive-include test *

Makefile

@ -2,26 +2,29 @@ all: lazy-extractors yt-dlp doc pypi-files
clean: clean-test clean-dist
clean-all: clean clean-cache
completions: completion-bash completion-fish completion-zsh
doc: README.md CONTRIBUTING.md issuetemplates supportedsites
doc: README.md CONTRIBUTING.md CONTRIBUTORS issuetemplates supportedsites
ot: offlinetest
tar: yt-dlp.tar.gz
# Keep this list in sync with MANIFEST.in
# Keep this list in sync with pyproject.toml includes/artifacts
# intended use: when building a source distribution,
# make pypi-files && python setup.py sdist
# make pypi-files && python3 -m build -sn .
pypi-files: AUTHORS Changelog.md LICENSE README.md README.txt supportedsites \
completions yt-dlp.1 requirements.txt setup.cfg devscripts/* test/*
completions yt-dlp.1 pyproject.toml setup.cfg devscripts/* test/*
.PHONY: all clean install test tar pypi-files completions ot offlinetest codetest supportedsites
.PHONY: all clean clean-all clean-test clean-dist clean-cache \
completions completion-bash completion-fish completion-zsh \
doc issuetemplates supportedsites ot offlinetest codetest test \
tar pypi-files lazy-extractors install uninstall
clean-test:
rm -rf test/testdata/sigs/player-*.js tmp/ *.annotations.xml *.aria2 *.description *.dump *.frag \
*.frag.aria2 *.frag.urls *.info.json *.live_chat.json *.meta *.part* *.tmp *.temp *.unknown_video *.ytdl \
*.3gp *.ape *.ass *.avi *.desktop *.f4v *.flac *.flv *.gif *.jpeg *.jpg *.m4a *.m4v *.mhtml *.mkv *.mov *.mp3 \
*.mp4 *.mpga *.oga *.ogg *.opus *.png *.sbv *.srt *.swf *.swp *.tt *.ttml *.url *.vtt *.wav *.webloc *.webm *.webp
*.3gp *.ape *.ass *.avi *.desktop *.f4v *.flac *.flv *.gif *.jpeg *.jpg *.lrc *.m4a *.m4v *.mhtml *.mkv *.mov *.mp3 *.mp4 \
*.mpg *.mpga *.oga *.ogg *.opus *.png *.sbv *.srt *.ssa *.swf *.swp *.tt *.ttml *.url *.vtt *.wav *.webloc *.webm *.webp
clean-dist:
rm -rf yt-dlp.1.temp.md yt-dlp.1 README.txt MANIFEST build/ dist/ .coverage cover/ yt-dlp.tar.gz completions/ \
yt_dlp/extractor/lazy_extractors.py *.spec CONTRIBUTING.md.tmp yt-dlp yt-dlp.exe yt_dlp.egg-info/ AUTHORS .mailmap
yt_dlp/extractor/lazy_extractors.py *.spec CONTRIBUTING.md.tmp yt-dlp yt-dlp.exe yt_dlp.egg-info/ AUTHORS
clean-cache:
find . \( \
-type d -name .pytest_cache -o -type d -name __pycache__ -o -name "*.pyc" -o -name "*.class" \
@ -37,12 +40,15 @@ BINDIR ?= $(PREFIX)/bin
MANDIR ?= $(PREFIX)/man
SHAREDIR ?= $(PREFIX)/share
PYTHON ?= /usr/bin/env python3
GNUTAR ?= tar
# set SYSCONFDIR to /etc if PREFIX=/usr or PREFIX=/usr/local
SYSCONFDIR = $(shell if [ $(PREFIX) = /usr -o $(PREFIX) = /usr/local ]; then echo /etc; else echo $(PREFIX)/etc; fi)
# set markdown input format to "markdown-smart" for pandoc version 2 and to "markdown" for pandoc prior to version 2
MARKDOWN = $(shell if [ `pandoc -v | head -n1 | cut -d" " -f2 | head -c1` = "2" ]; then echo markdown-smart; else echo markdown; fi)
# set markdown input format to "markdown-smart" for pandoc version 2+ and to "markdown" for pandoc prior to version 2
PANDOC_VERSION_CMD = pandoc -v 2>/dev/null | head -n1 | cut -d' ' -f2 | head -c1
PANDOC_VERSION != $(PANDOC_VERSION_CMD)
PANDOC_VERSION ?= $(shell $(PANDOC_VERSION_CMD))
MARKDOWN_CMD = if [ "$(PANDOC_VERSION)" = "1" -o "$(PANDOC_VERSION)" = "0" ]; then echo markdown; else echo markdown-smart; fi
MARKDOWN != $(MARKDOWN_CMD)
MARKDOWN ?= $(shell $(MARKDOWN_CMD))
install: lazy-extractors yt-dlp yt-dlp.1 completions
mkdir -p $(DESTDIR)$(BINDIR)
@ -73,24 +79,28 @@ test:
offlinetest: codetest
$(PYTHON) -m pytest -k "not download"
# XXX: This is hard to maintain
CODE_FOLDERS = yt_dlp yt_dlp/downloader yt_dlp/extractor yt_dlp/postprocessor yt_dlp/compat yt_dlp/dependencies
yt-dlp: yt_dlp/*.py yt_dlp/*/*.py
CODE_FOLDERS_CMD = find yt_dlp -type f -name '__init__.py' | sed 's,/__init__.py,,' | grep -v '/__' | sort
CODE_FOLDERS != $(CODE_FOLDERS_CMD)
CODE_FOLDERS ?= $(shell $(CODE_FOLDERS_CMD))
CODE_FILES_CMD = for f in $(CODE_FOLDERS) ; do echo "$$f" | sed 's,$$,/*.py,' ; done
CODE_FILES != $(CODE_FILES_CMD)
CODE_FILES ?= $(shell $(CODE_FILES_CMD))
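# The lists above are derived from the __init__.py files, so new subpackages
# are picked up automatically (dunder folders such as yt_dlp/__pyinstaller are
# skipped); the `!=`/`?=` pairs keep this working under both BSD and GNU make.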
yt-dlp: $(CODE_FILES)
mkdir -p zip
for d in $(CODE_FOLDERS) ; do \
mkdir -p zip/$$d ;\
cp -pPR $$d/*.py zip/$$d/ ;\
done
touch -t 200001010101 zip/yt_dlp/*.py zip/yt_dlp/*/*.py
(cd zip && touch -t 200001010101 $(CODE_FILES))
mv zip/yt_dlp/__main__.py zip/
cd zip ; zip -q ../yt-dlp yt_dlp/*.py yt_dlp/*/*.py __main__.py
(cd zip && zip -q ../yt-dlp $(CODE_FILES) __main__.py)
rm -rf zip
echo '#!$(PYTHON)' > yt-dlp
cat yt-dlp.zip >> yt-dlp
rm yt-dlp.zip
chmod a+x yt-dlp
README.md: yt_dlp/*.py yt_dlp/*/*.py devscripts/make_readme.py
README.md: $(CODE_FILES) devscripts/make_readme.py
COLUMNS=80 $(PYTHON) yt_dlp/__main__.py --ignore-config --help | $(PYTHON) devscripts/make_readme.py
CONTRIBUTING.md: README.md devscripts/make_contributing.py
@ -115,24 +125,26 @@ yt-dlp.1: README.md devscripts/prepare_manpage.py
pandoc -s -f $(MARKDOWN) -t man yt-dlp.1.temp.md -o yt-dlp.1
rm -f yt-dlp.1.temp.md
completions/bash/yt-dlp: yt_dlp/*.py yt_dlp/*/*.py devscripts/bash-completion.in
completions/bash/yt-dlp: $(CODE_FILES) devscripts/bash-completion.in
mkdir -p completions/bash
$(PYTHON) devscripts/bash-completion.py
completions/zsh/_yt-dlp: yt_dlp/*.py yt_dlp/*/*.py devscripts/zsh-completion.in
completions/zsh/_yt-dlp: $(CODE_FILES) devscripts/zsh-completion.in
mkdir -p completions/zsh
$(PYTHON) devscripts/zsh-completion.py
completions/fish/yt-dlp.fish: yt_dlp/*.py yt_dlp/*/*.py devscripts/fish-completion.in
completions/fish/yt-dlp.fish: $(CODE_FILES) devscripts/fish-completion.in
mkdir -p completions/fish
$(PYTHON) devscripts/fish-completion.py
_EXTRACTOR_FILES = $(shell find yt_dlp/extractor -name '*.py' -and -not -name 'lazy_extractors.py')
_EXTRACTOR_FILES_CMD = find yt_dlp/extractor -name '*.py' -and -not -name 'lazy_extractors.py'
_EXTRACTOR_FILES != $(_EXTRACTOR_FILES_CMD)
_EXTRACTOR_FILES ?= $(shell $(_EXTRACTOR_FILES_CMD))
yt_dlp/extractor/lazy_extractors.py: devscripts/make_lazy_extractors.py devscripts/lazy_load_template.py $(_EXTRACTOR_FILES)
$(PYTHON) devscripts/make_lazy_extractors.py $@
yt-dlp.tar.gz: all
@tar -czf yt-dlp.tar.gz --transform "s|^|yt-dlp/|" --owner 0 --group 0 \
@$(GNUTAR) -czf yt-dlp.tar.gz --transform "s|^|yt-dlp/|" --owner 0 --group 0 \
--exclude '*.DS_Store' \
--exclude '*.kate-swp' \
--exclude '*.pyc' \
@ -144,12 +156,17 @@ yt-dlp.tar.gz: all
-- \
README.md supportedsites.md Changelog.md LICENSE \
CONTRIBUTING.md Collaborators.md CONTRIBUTORS AUTHORS \
Makefile MANIFEST.in yt-dlp.1 README.txt completions \
setup.py setup.cfg yt-dlp yt_dlp requirements.txt \
devscripts test
Makefile yt-dlp.1 README.txt completions .gitignore \
setup.cfg yt-dlp yt_dlp pyproject.toml devscripts test
AUTHORS: .mailmap
git shortlog -s -n | cut -f2 | sort > AUTHORS
AUTHORS: Changelog.md
@if [ -d '.git' ] && command -v git > /dev/null ; then \
echo 'Generating $@ from git commit history' ; \
git shortlog -s -n HEAD | cut -f2 | sort > $@ ; \
fi
.mailmap:
git shortlog -s -e -n | awk '!(out[$$NF]++) { $$1="";sub(/^[ \t]+/,""); print}' > .mailmap
CONTRIBUTORS: Changelog.md
@if [ -d '.git' ] && command -v git > /dev/null ; then \
echo 'Updating $@ from git commit history' ; \
$(PYTHON) devscripts/make_changelog.py -v -c > /dev/null ; \
fi

README.md (diff suppressed because it is too large)

bundle/__init__.py (new, empty file)

bundle/py2exe.py (new executable file)

@ -0,0 +1,59 @@
#!/usr/bin/env python3
# Allow execution from anywhere
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import warnings
from py2exe import freeze
from devscripts.utils import read_version
VERSION = read_version()
def main():
warnings.warn(
'py2exe builds do not support pycryptodomex and need VC++14 to run. '
'It is recommended to run "pyinst.py" to build using pyinstaller instead')
freeze(
console=[{
'script': './yt_dlp/__main__.py',
'dest_base': 'yt-dlp',
'icon_resources': [(1, 'devscripts/logo.ico')],
}],
version_info={
'version': VERSION,
'description': 'A feature-rich command-line audio/video downloader',
'comments': 'Official repository: <https://github.com/yt-dlp/yt-dlp>',
'product_name': 'yt-dlp',
'product_version': VERSION,
},
options={
'bundle_files': 0,
'compressed': 1,
'optimize': 2,
'dist_dir': './dist',
'excludes': [
# py2exe cannot import Crypto
'Crypto',
'Cryptodome',
# py2exe appears to confuse this with our socks library.
# We don't use pysocks and urllib3.contrib.socks would fail to import if tried.
'urllib3.contrib.socks'
],
'dll_excludes': ['w9xpopen.exe', 'crypt32.dll'],
# Modules that are only imported dynamically must be added here
'includes': ['yt_dlp.compat._legacy', 'yt_dlp.compat._deprecated',
'yt_dlp.utils._legacy', 'yt_dlp.utils._deprecated'],
},
zipfile=None,
)
if __name__ == '__main__':
main()

bundle/pyinstaller.py (new executable file)

@ -0,0 +1,132 @@
#!/usr/bin/env python3
# Allow direct execution
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import platform
from PyInstaller.__main__ import run as run_pyinstaller
from devscripts.utils import read_version
OS_NAME, MACHINE, ARCH = sys.platform, platform.machine().lower(), platform.architecture()[0][:2]
if MACHINE in ('x86', 'x86_64', 'amd64', 'i386', 'i686'):
MACHINE = 'x86' if ARCH == '32' else ''
def main():
opts, version = parse_options(), read_version()
onedir = '--onedir' in opts or '-D' in opts
if not onedir and '-F' not in opts and '--onefile' not in opts:
opts.append('--onefile')
name, final_file = exe(onedir)
print(f'Building yt-dlp v{version} for {OS_NAME} {platform.machine()} with options {opts}')
print('Remember to update the version using "devscripts/update-version.py"')
if not os.path.isfile('yt_dlp/extractor/lazy_extractors.py'):
print('WARNING: Building without lazy_extractors. Run '
'"devscripts/make_lazy_extractors.py" to build lazy extractors', file=sys.stderr)
print(f'Destination: {final_file}\n')
opts = [
f'--name={name}',
'--icon=devscripts/logo.ico',
'--upx-exclude=vcruntime140.dll',
'--noconfirm',
'--additional-hooks-dir=yt_dlp/__pyinstaller',
*opts,
'yt_dlp/__main__.py',
]
print(f'Running PyInstaller with {opts}')
run_pyinstaller(opts)
set_version_info(final_file, version)
def parse_options():
# Compatibility with older arguments
opts = sys.argv[1:]
if opts[0:1] in (['32'], ['64']):
if ARCH != opts[0]:
raise Exception(f'{opts[0]}bit executable cannot be built on a {ARCH}bit system')
opts = opts[1:]
return opts
def exe(onedir):
"""@returns (name, path)"""
name = '_'.join(filter(None, (
'yt-dlp',
{'win32': '', 'darwin': 'macos'}.get(OS_NAME, OS_NAME),
MACHINE,
)))
return name, ''.join(filter(None, (
'dist/',
onedir and f'{name}/',
name,
OS_NAME == 'win32' and '.exe'
)))
def version_to_list(version):
version_list = version.split('.')
return list(map(int, version_list)) + [0] * (4 - len(version_list))
def set_version_info(exe, version):
if OS_NAME == 'win32':
windows_set_version(exe, version)
def windows_set_version(exe, version):
from PyInstaller.utils.win32.versioninfo import (
FixedFileInfo,
StringFileInfo,
StringStruct,
StringTable,
VarFileInfo,
VarStruct,
VSVersionInfo,
)
try:
from PyInstaller.utils.win32.versioninfo import SetVersion
except ImportError: # Pyinstaller >= 5.8
from PyInstaller.utils.win32.versioninfo import write_version_info_to_executable as SetVersion
version_list = version_to_list(version)
suffix = MACHINE and f'_{MACHINE}'
SetVersion(exe, VSVersionInfo(
ffi=FixedFileInfo(
filevers=version_list,
prodvers=version_list,
mask=0x3F,
flags=0x0,
OS=0x4,
fileType=0x1,
subtype=0x0,
date=(0, 0),
),
kids=[
StringFileInfo([StringTable('040904B0', [
StringStruct('Comments', 'yt-dlp%s Command Line Interface' % suffix),
StringStruct('CompanyName', 'https://github.com/yt-dlp'),
StringStruct('FileDescription', 'yt-dlp%s' % (MACHINE and f' ({MACHINE})')),
StringStruct('FileVersion', version),
StringStruct('InternalName', f'yt-dlp{suffix}'),
StringStruct('LegalCopyright', 'pukkandan.ytdlp@gmail.com | UNLICENSE'),
StringStruct('OriginalFilename', f'yt-dlp{suffix}.exe'),
StringStruct('ProductName', f'yt-dlp{suffix}'),
StringStruct(
'ProductVersion', f'{version}{suffix} on Python {platform.python_version()}'),
])]), VarFileInfo([VarStruct('Translation', [0, 1200])])
]
))
if __name__ == '__main__':
main()


devscripts/__init__.py

@ -1 +0,0 @@
# Empty file needed to make devscripts.utils properly importable from outside

devscripts/changelog_override.json

@ -1,12 +1,151 @@
[
{
"action": "add",
"when": "776d1c3f0c9b00399896dd2e40e78e9a43218109",
"when": "29cb20bd563c02671b31dd840139e93dd37150a1",
"short": "[priority] **A new release type has been added!**\n * [`nightly`](https://github.com/yt-dlp/yt-dlp/releases/tag/nightly) builds will be made after each push, containing the latest fixes (but also possibly bugs).\n * When using `--update`/`-U`, a release binary will only update to its current channel (either `stable` or `nightly`).\n * The `--update-to` option has been added allowing the user more control over program upgrades (or downgrades).\n * `--update-to` can change the release channel (`stable`, `nightly`) and also upgrade or downgrade to specific tags.\n * **Usage**: `--update-to CHANNEL`, `--update-to TAG`, `--update-to CHANNEL@TAG`"
},
{
"action": "add",
"when": "776d1c3f0c9b00399896dd2e40e78e9a43218109",
"when": "5038f6d713303e0967d002216e7a88652401c22a",
"short": "[priority] **YouTube throttling fixes!**"
},
{
"action": "remove",
"when": "2e023649ea4e11151545a34dc1360c114981a236"
},
{
"action": "add",
"when": "01aba2519a0884ef17d5f85608dbd2a455577147",
"short": "[priority] YouTube: Improved throttling and signature fixes"
},
{
"action": "change",
"when": "c86e433c35fe5da6cb29f3539eef97497f84ed38",
"short": "[extractor/niconico:series] Fix extraction (#6898)",
"authors": ["sqrtNOT"]
},
{
"action": "change",
"when": "69a40e4a7f6caa5662527ebd2f3c4e8aa02857a2",
"short": "[extractor/youtube:music_search_url] Extract title (#7102)",
"authors": ["kangalio"]
},
{
"action": "change",
"when": "8417f26b8a819cd7ffcd4e000ca3e45033e670fb",
"short": "Add option `--color` (#6904)",
"authors": ["Grub4K"]
},
{
"action": "change",
"when": "b4e0d75848e9447cee2cd3646ce54d4744a7ff56",
"short": "Improve `--download-sections`\n - Support negative time-ranges\n - Add `*from-url` to obey time-ranges in URL",
"authors": ["pukkandan"]
},
{
"action": "change",
"when": "1e75d97db21152acc764b30a688e516f04b8a142",
"short": "[extractor/youtube] Add `ios` to default clients used\n - IOS is affected neither by 403 nor by nsig so helps mitigate them preemptively\n - IOS also has higher bit-rate 'premium' formats though they are not labeled as such",
"authors": ["pukkandan"]
},
{
"action": "change",
"when": "f2ff0f6f1914b82d4a51681a72cc0828115dcb4a",
"short": "[extractor/motherless] Add gallery support, fix groups (#7211)",
"authors": ["rexlambert22", "Ti4eeT4e"]
},
{
"action": "change",
"when": "a4486bfc1dc7057efca9dd3fe70d7fa25c56f700",
"short": "[misc] Revert \"Add automatic duplicate issue detection\"",
"authors": ["pukkandan"]
},
{
"action": "add",
"when": "1ceb657bdd254ad961489e5060f2ccc7d556b729",
"short": "[priority] Security: [[CVE-2023-35934](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-35934)] Fix [Cookie leak](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-v8mc-9377-rwjj)\n - `--add-header Cookie:` is deprecated and auto-scoped to input URL domains\n - Cookies are scoped when passed to external downloaders\n - Add `cookies` field to info.json and deprecate `http_headers.Cookie`"
},
{
"action": "change",
"when": "b03fa7834579a01cc5fba48c0e73488a16683d48",
"short": "[ie/twitter] Revert 92315c03774cfabb3a921884326beb4b981f786b",
"authors": ["pukkandan"]
},
{
"action": "change",
"when": "fcd6a76adc49d5cd8783985c7ce35384b72e545f",
"short": "[test] Add tests for socks proxies (#7908)",
"authors": ["coletdjnz"]
},
{
"action": "change",
"when": "4bf912282a34b58b6b35d8f7e6be535770c89c76",
"short": "[rh:urllib] Remove dot segments during URL normalization (#7662)",
"authors": ["coletdjnz"]
},
{
"action": "change",
"when": "59e92b1f1833440bb2190f847eb735cf0f90bc85",
"short": "[rh:urllib] Simplify gzip decoding (#7611)",
"authors": ["Grub4K"]
},
{
"action": "add",
"when": "c1d71d0d9f41db5e4306c86af232f5f6220a130b",
"short": "[priority] **The minimum *recommended* Python version has been raised to 3.8**\nSince Python 3.7 has reached end-of-life, support for it will be dropped soon. [Read more](https://github.com/yt-dlp/yt-dlp/issues/7803)"
},
{
"action": "add",
"when": "61bdf15fc7400601c3da1aa7a43917310a5bf391",
"short": "[priority] Security: [[CVE-2023-40581](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-40581)] [Prevent RCE when using `--exec` with `%q` on Windows](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-42h4-v29r-42qg)\n - The shell escape function is now using `\"\"` instead of `\\\"`.\n - `utils.Popen` has been patched to properly quote commands."
},
{
"action": "change",
"when": "8a8b54523addf46dfd50ef599761a81bc22362e6",
"short": "[rh:requests] Add handler for `requests` HTTP library (#3668)\n\n\tAdds support for HTTPS proxies and persistent connections (keep-alive)",
"authors": ["bashonly", "coletdjnz", "Grub4K"]
},
{
"action": "add",
"when": "1d03633c5a1621b9f3a756f0a4f9dc61fab3aeaa",
"short": "[priority] **The release channels have been adjusted!**\n\t* [`master`](https://github.com/yt-dlp/yt-dlp-master-builds) builds are made after each push, containing the latest fixes (but also possibly bugs). This was previously the `nightly` channel.\n\t* [`nightly`](https://github.com/yt-dlp/yt-dlp-nightly-builds) builds are now made once a day, if there were any changes."
},
{
"action": "add",
"when": "f04b5bedad7b281bee9814686bba1762bae092eb",
"short": "[priority] Security: [[CVE-2023-46121](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-46121)] Patch [Generic Extractor MITM Vulnerability via Arbitrary Proxy Injection](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-3ch3-jhc6-5r8x)\n\t- Disallow smuggling of arbitrary `http_headers`; extractors now only use specific headers"
},
{
"action": "change",
"when": "15f22b4880b6b3f71f350c64d70976ae65b9f1ca",
"short": "[webvtt] Allow spaces before newlines for CueBlock (#7681)",
"authors": ["TSRBerry"]
},
{
"action": "change",
"when": "4ce57d3b873c2887814cbec03d029533e82f7db5",
"short": "[ie] Support multi-period MPD streams (#6654)",
"authors": ["alard", "pukkandan"]
},
{
"action": "change",
"when": "aa7e9ae4f48276bd5d0173966c77db9484f65a0a",
"short": "[ie/xvideos] Support new URL format (#9502)",
"authors": ["sta1us"]
},
{
"action": "remove",
"when": "22e4dfacb61f62dfbb3eb41b31c7b69ba1059b80"
},
{
"action": "change",
"when": "e3a3ed8a981d9395c4859b6ef56cd02bc3148db2",
"short": "[cleanup:ie] No `from` stdlib imports in extractors",
"authors": ["pukkandan"]
},
{
"action": "add",
"when": "9590cc6b4768e190183d7d071a6c78170889116a",
"short": "[priority] Security: [[CVE-2024-22423](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2024-22423)] [Prevent RCE when using `--exec` with `%q` on Windows](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-hjq6-52gw-2g7p)\n - The shell escape function now properly escapes `%`, `\\` and `\\n`.\n - `utils.Popen` has been patched accordingly."
}
]

devscripts/cli_to_api.py (new file)

@ -0,0 +1,48 @@
# Allow direct execution
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import yt_dlp
import yt_dlp.options
create_parser = yt_dlp.options.create_parser
def parse_patched_options(opts):
patched_parser = create_parser()
patched_parser.defaults.update({
'ignoreerrors': False,
'retries': 0,
'fragment_retries': 0,
'extract_flat': False,
'concat_playlist': 'never',
})
yt_dlp.options.create_parser = lambda: patched_parser
try:
return yt_dlp.parse_options(opts)
finally:
yt_dlp.options.create_parser = create_parser
default_opts = parse_patched_options([]).ydl_opts
def cli_to_api(opts, cli_defaults=False):
opts = (yt_dlp.parse_options if cli_defaults else parse_patched_options)(opts).ydl_opts
diff = {k: v for k, v in opts.items() if default_opts[k] != v}
if 'postprocessors' in diff:
diff['postprocessors'] = [pp for pp in diff['postprocessors']
if pp not in default_opts['postprocessors']]
return diff
if __name__ == '__main__':
from pprint import pprint
print('\nThe arguments passed translate to:\n')
pprint(cli_to_api(sys.argv[1:]))
print('\nCombining these with the CLI defaults gives:\n')
pprint(cli_to_api(sys.argv[1:], True))

devscripts/install_deps.py (new executable file)

@ -0,0 +1,73 @@
#!/usr/bin/env python3
# Allow execution from anywhere
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import argparse
import re
import subprocess
from pathlib import Path
from devscripts.tomlparse import parse_toml
from devscripts.utils import read_file
def parse_args():
parser = argparse.ArgumentParser(description='Install dependencies for yt-dlp')
parser.add_argument(
'input', nargs='?', metavar='TOMLFILE', default=Path(__file__).parent.parent / 'pyproject.toml',
help='input file (default: %(default)s)')
parser.add_argument(
'-e', '--exclude', metavar='DEPENDENCY', action='append',
help='exclude a dependency')
parser.add_argument(
'-i', '--include', metavar='GROUP', action='append',
help='include an optional dependency group')
parser.add_argument(
'-o', '--only-optional', action='store_true',
help='only install optional dependencies')
parser.add_argument(
'-p', '--print', action='store_true',
help='only print requirements to stdout')
parser.add_argument(
'-u', '--user', action='store_true',
help='install with pip as --user')
return parser.parse_args()
def main():
args = parse_args()
project_table = parse_toml(read_file(args.input))['project']
optional_groups = project_table['optional-dependencies']
excludes = args.exclude or []
targets = []
if not args.only_optional: # `-o` should exclude 'dependencies' and the 'default' group
targets.extend(project_table['dependencies'])
if 'default' not in excludes: # `--exclude default` should exclude entire 'default' group
targets.extend(optional_groups['default'])
for include in filter(None, map(optional_groups.get, args.include or [])):
targets.extend(include)
targets = [t for t in targets if re.match(r'[\w-]+', t).group(0).lower() not in excludes]
if args.print:
for target in targets:
print(target)
return
pip_args = [sys.executable, '-m', 'pip', 'install', '-U']
if args.user:
pip_args.append('--user')
pip_args.extend(targets)
return subprocess.call(pip_args)
if __name__ == '__main__':
sys.exit(main())

devscripts/lazy_load_template.py

@ -6,6 +6,7 @@ from ..utils import (
age_restricted,
bug_reports_message,
classproperty,
variadic,
write_string,
)

devscripts/make_changelog.py

@ -26,58 +26,60 @@ logger = logging.getLogger(__name__)
class CommitGroup(enum.Enum):
UPSTREAM = None
PRIORITY = 'Important'
CORE = 'Core'
EXTRACTOR = 'Extractor'
DOWNLOADER = 'Downloader'
POSTPROCESSOR = 'Postprocessor'
NETWORKING = 'Networking'
MISC = 'Misc.'
@classmethod
@lru_cache
def commit_lookup(cls):
def subgroup_lookup(cls):
return {
name: group
for group, names in {
cls.PRIORITY: {''},
cls.UPSTREAM: {'upstream'},
cls.CORE: {
'aes',
'cache',
'compat_utils',
'compat',
'cookies',
'core',
'dependencies',
'jsinterp',
'outtmpl',
'plugins',
'update',
'utils',
},
cls.MISC: {
'build',
'ci',
'cleanup',
'devscripts',
'docs',
'misc',
'test',
},
cls.EXTRACTOR: {'extractor', 'extractors'},
cls.DOWNLOADER: {'downloader'},
cls.POSTPROCESSOR: {'postprocessor'},
cls.NETWORKING: {
'rh',
},
}.items()
for name in names
}
@classmethod
def get(cls, value):
result = cls.commit_lookup().get(value)
if result:
logger.debug(f'Mapped {value!r} => {result.name}')
@lru_cache
def group_lookup(cls):
result = {
'fd': cls.DOWNLOADER,
'ie': cls.EXTRACTOR,
'pp': cls.POSTPROCESSOR,
'upstream': cls.CORE,
}
result.update({item.name.lower(): item for item in iter(cls)})
return result
@classmethod
def get(cls, value: str) -> tuple[CommitGroup | None, str | None]:
group, _, subgroup = (group.strip().lower() for group in value.partition('/'))
result = cls.group_lookup().get(group)
if not result:
if subgroup:
return None, value
subgroup = group
result = cls.subgroup_lookup().get(subgroup)
return result, subgroup or None
@dataclass
class Commit:
@ -111,22 +113,36 @@ class CommitInfo:
return ((self.details or '').lower(), self.sub_details, self.message)
def unique(items):
return sorted({item.strip().lower(): item for item in items if item}.values())
class Changelog:
MISC_RE = re.compile(r'(?:^|\b)(?:lint(?:ing)?|misc|format(?:ting)?|fixes)(?:\b|$)', re.IGNORECASE)
ALWAYS_SHOWN = (CommitGroup.PRIORITY,)
def __init__(self, groups, repo):
def __init__(self, groups, repo, collapsible=False):
self._groups = groups
self._repo = repo
self._collapsible = collapsible
def __str__(self):
return '\n'.join(self._format_groups(self._groups)).replace('\t', ' ')
def _format_groups(self, groups):
first = True
for item in CommitGroup:
if self._collapsible and item not in self.ALWAYS_SHOWN and first:
first = False
yield '\n<details><summary><h3>Changelog</h3></summary>\n'
group = groups[item]
if group:
yield self.format_module(item.value, group)
if self._collapsible:
yield '\n</details>'
def format_module(self, name, group):
result = f'\n#### {name} changes\n' if name else '\n'
return result + '\n'.join(self._format_group(group))
@ -137,65 +153,59 @@ class Changelog:
for _, items in detail_groups:
items = list(items)
details = items[0].details
if not details:
indent = ''
else:
yield f'- {details}'
indent = '\t'
if details == 'cleanup':
items, cleanup_misc_items = self._filter_cleanup_misc_items(items)
items = self._prepare_cleanup_misc_items(items)
prefix = '-'
if details:
if len(items) == 1:
prefix = f'- **{details}**:'
else:
yield f'- **{details}**'
prefix = '\t-'
sub_detail_groups = itertools.groupby(items, lambda item: tuple(map(str.lower, item.sub_details)))
for sub_details, entries in sub_detail_groups:
if not sub_details:
for entry in entries:
yield f'{indent}- {self.format_single_change(entry)}'
yield f'{prefix} {self.format_single_change(entry)}'
continue
entries = list(entries)
prefix = f'{indent}- {", ".join(entries[0].sub_details)}'
sub_prefix = f'{prefix} {", ".join(entries[0].sub_details)}'
if len(entries) == 1:
yield f'{prefix}: {self.format_single_change(entries[0])}'
yield f'{sub_prefix}: {self.format_single_change(entries[0])}'
continue
yield prefix
yield sub_prefix
for entry in entries:
yield f'{indent}\t- {self.format_single_change(entry)}'
yield f'\t{prefix} {self.format_single_change(entry)}'
if details == 'cleanup' and cleanup_misc_items:
yield from self._format_cleanup_misc_sub_group(cleanup_misc_items)
def _filter_cleanup_misc_items(self, items):
def _prepare_cleanup_misc_items(self, items):
cleanup_misc_items = defaultdict(list)
non_misc_items = []
sorted_items = []
for item in items:
if self.MISC_RE.search(item.message):
cleanup_misc_items[tuple(item.commit.authors)].append(item)
else:
non_misc_items.append(item)
sorted_items.append(item)
return non_misc_items, cleanup_misc_items
for commit_infos in cleanup_misc_items.values():
sorted_items.append(CommitInfo(
'cleanup', ('Miscellaneous',), ', '.join(
self._format_message_link(None, info.commit.hash)
for info in sorted(commit_infos, key=lambda item: item.commit.hash or '')),
[], Commit(None, '', commit_infos[0].commit.authors), []))
def _format_cleanup_misc_sub_group(self, group):
prefix = '\t- Miscellaneous'
if len(group) == 1:
yield f'{prefix}: {next(self._format_cleanup_misc_items(group))}'
return
return sorted_items
yield prefix
for message in self._format_cleanup_misc_items(group):
yield f'\t\t- {message}'
def format_single_change(self, info: CommitInfo):
message, sep, rest = info.message.partition('\n')
if '[' not in message:
# If the message doesn't already contain markdown links, try to add a link to the commit
message = self._format_message_link(message, info.commit.hash)
def _format_cleanup_misc_items(self, group):
for authors, infos in group.items():
message = ', '.join(
self._format_message_link(None, info.commit.hash)
for info in sorted(infos, key=lambda item: item.commit.hash or ''))
yield f'{message} by {self._format_authors(authors)}'
def format_single_change(self, info):
message = self._format_message_link(info.message, info.commit.hash)
if info.issues:
message = f'{message} ({self._format_issues(info.issues)})'
@ -211,7 +221,7 @@ class Changelog:
message = f'{message} (With fixes in {fix_message})'
return message
return message if not sep else f'{message}{sep}{rest}'
def _format_message_link(self, message, hash):
assert message or hash, 'Improperly defined commit message or override'
@ -236,17 +246,14 @@ class CommitRange:
AUTHOR_INDICATOR_RE = re.compile(r'Authored by:? ', re.IGNORECASE)
MESSAGE_RE = re.compile(r'''
(?:\[
(?P<prefix>[^\]\/:,]+)
(?:/(?P<details>[^\]:,]+))?
(?:[:,](?P<sub_details>[^\]]+))?
\]\ )?
(?:(?P<sub_details_alt>`?[^:`]+`?): )?
(?:\[(?P<prefix>[^\]]+)\]\ )?
(?:(?P<sub_details>`?[\w.-]+`?): )?
(?P<message>.+?)
(?:\ \((?P<issues>\#\d+(?:,\ \#\d+)*)\))?
''', re.VERBOSE | re.DOTALL)
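# For example, on the hypothetical commit message
# '[ie/youtube] player: Fix nsig (#1234)', MESSAGE_RE.fullmatch() yields
# prefix='ie/youtube', sub_details='player', message='Fix nsig', issues='#1234'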
EXTRACTOR_INDICATOR_RE = re.compile(r'(?:Fix|Add)\s+Extractors?', re.IGNORECASE)
FIXES_RE = re.compile(r'(?i:Fix(?:es)?(?:\s+bugs?)?(?:\s+in|\s+for)?|Revert)\s+([\da-f]{40})')
REVERT_RE = re.compile(r'(?:\[[^\]]+\]\s+)?(?i:Revert)\s+([\da-f]{40})')
FIXES_RE = re.compile(r'(?i:Fix(?:es)?(?:\s+bugs?)?(?:\s+in|\s+for)?|Revert|Improve)\s+([\da-f]{40})')
UPSTREAM_MERGE_RE = re.compile(r'Update to ytdl-commit-([\da-f]+)')
def __init__(self, start, end, default_author=None):
@ -273,7 +280,7 @@ class CommitRange:
self.COMMAND, 'log', f'--format=%H%n%s%n%b%n{self.COMMIT_SEPARATOR}',
f'{self._start}..{self._end}' if self._start else self._end).stdout
commits = {}
commits, reverts = {}, {}
fixes = defaultdict(list)
lines = iter(result.splitlines(False))
for i, commit_hash in enumerate(lines):
@ -294,6 +301,11 @@ class CommitRange:
logger.debug(f'Reached Release commit, breaking: {commit}')
break
revert_match = self.REVERT_RE.fullmatch(commit.short)
if revert_match:
reverts[revert_match.group(1)] = commit
continue
fix_match = self.FIXES_RE.search(commit.short)
if fix_match:
commitish = fix_match.group(1)
@ -301,6 +313,13 @@ class CommitRange:
commits[commit.hash] = commit
for commitish, revert_commit in reverts.items():
reverted = commits.pop(commitish, None)
if reverted:
logger.debug(f'{commitish} fully reverted {reverted}')
else:
commits[revert_commit.hash] = revert_commit
for commitish, fix_commits in fixes.items():
if commitish in commits:
hashes = ', '.join(commit.hash[:HASH_LENGTH] for commit in fix_commits)
@ -316,10 +335,10 @@ class CommitRange:
for override in overrides:
when = override.get('when')
if when and when not in self and when != self._start:
logger.debug(f'Ignored {when!r}, not in commits {self._start!r}')
logger.debug(f'Ignored {when!r} override')
continue
override_hash = override.get('hash')
override_hash = override.get('hash') or when
if override['action'] == 'add':
commit = Commit(override.get('hash'), override['short'], override.get('authors') or [])
logger.info(f'ADD {commit}')
@ -333,67 +352,78 @@ class CommitRange:
elif override['action'] == 'change':
if override_hash not in self._commits:
continue
commit = Commit(override_hash, override['short'], override['authors'])
commit = Commit(override_hash, override['short'], override.get('authors') or [])
logger.info(f'CHANGE {self._commits[commit.hash]} -> {commit}')
self._commits[commit.hash] = commit
self._commits = {key: value for key, value in reversed(self._commits.items())}
def groups(self):
groups = defaultdict(list)
group_dict = defaultdict(list)
for commit in self:
upstream_re = self.UPSTREAM_MERGE_RE.match(commit.short)
upstream_re = self.UPSTREAM_MERGE_RE.search(commit.short)
if upstream_re:
commit.short = f'[upstream] Merge up to youtube-dl {upstream_re.group(1)}'
commit.short = f'[upstream] Merged with youtube-dl {upstream_re.group(1)}'
match = self.MESSAGE_RE.fullmatch(commit.short)
if not match:
logger.error(f'Error parsing short commit message: {commit.short!r}')
continue
prefix, details, sub_details, sub_details_alt, message, issues = match.groups()
group = None
if prefix:
if prefix == 'priority':
prefix, _, details = (details or '').partition('/')
logger.debug(f'Priority: {message!r}')
group = CommitGroup.PRIORITY
if not details and prefix:
if prefix not in ('core', 'downloader', 'extractor', 'misc', 'postprocessor', 'upstream'):
logger.debug(f'Replaced details with {prefix!r}')
details = prefix or None
if details == 'common':
details = None
if details:
details = details.strip()
else:
group = CommitGroup.CORE
sub_details = f'{sub_details or ""},{sub_details_alt or ""}'.replace(':', ',')
sub_details = tuple(filter(None, map(str.strip, sub_details.split(','))))
prefix, sub_details_alt, message, issues = match.groups()
issues = [issue.strip()[1:] for issue in issues.split(',')] if issues else []
if prefix:
groups, details, sub_details = zip(*map(self.details_from_prefix, prefix.split(',')))
group = next(iter(filter(None, groups)), None)
details = ', '.join(unique(details))
sub_details = list(itertools.chain.from_iterable(sub_details))
else:
group = CommitGroup.CORE
details = None
sub_details = []
if sub_details_alt:
sub_details.append(sub_details_alt)
sub_details = tuple(unique(sub_details))
if not group:
group = CommitGroup.get(prefix.lower())
if not group:
if self.EXTRACTOR_INDICATOR_RE.search(commit.short):
group = CommitGroup.EXTRACTOR
else:
group = CommitGroup.POSTPROCESSOR
logger.warning(f'Failed to map {commit.short!r}, selected {group.name}')
if self.EXTRACTOR_INDICATOR_RE.search(commit.short):
group = CommitGroup.EXTRACTOR
logger.error(f'Assuming [ie] group for {commit.short!r}')
else:
group = CommitGroup.CORE
commit_info = CommitInfo(
details, sub_details, message.strip(),
issues, commit, self._fixes[commit.hash])
logger.debug(f'Resolved {commit.short!r} to {commit_info!r}')
groups[group].append(commit_info)
return groups
logger.debug(f'Resolved {commit.short!r} to {commit_info!r}')
group_dict[group].append(commit_info)
return group_dict
@staticmethod
def details_from_prefix(prefix):
if not prefix:
return CommitGroup.CORE, None, ()
prefix, *sub_details = prefix.split(':')
group, details = CommitGroup.get(prefix)
if group is CommitGroup.PRIORITY and details:
details = details.partition('/')[2].strip()
if details and '/' in details:
logger.error(f'Prefix is overnested, using first part: {prefix}')
details = details.partition('/')[0].strip()
if details == 'common':
details = None
elif group is CommitGroup.NETWORKING and details == 'rh':
details = 'Request Handler'
return group, details, sub_details
def get_new_contributors(contributors_path, commits):
@ -415,7 +445,32 @@ def get_new_contributors(contributors_path, commits):
return sorted(new_contributors, key=str.casefold)
if __name__ == '__main__':
def create_changelog(args):
logging.basicConfig(
datefmt='%Y-%m-%d %H-%M-%S', format='{asctime} | {levelname:<8} | {message}',
level=logging.WARNING - 10 * args.verbosity, style='{', stream=sys.stderr)
commits = CommitRange(None, args.commitish, args.default_author)
if not args.no_override:
if args.override_path.exists():
overrides = json.loads(read_file(args.override_path))
commits.apply_overrides(overrides)
else:
logger.warning(f'File {args.override_path.as_posix()} does not exist')
logger.info(f'Loaded {len(commits)} commits')
new_contributors = get_new_contributors(args.contributors_path, commits)
if new_contributors:
if args.contributors:
write_file(args.contributors_path, '\n'.join(new_contributors) + '\n', mode='a')
logger.info(f'New contributors: {", ".join(new_contributors)}')
return Changelog(commits.groups(), args.repo, args.collapsible)
def create_parser():
import argparse
parser = argparse.ArgumentParser(
@ -444,27 +499,12 @@ if __name__ == '__main__':
parser.add_argument(
'--repo', default='yt-dlp/yt-dlp',
help='the github repository to use for the operations (default: %(default)s)')
args = parser.parse_args()
parser.add_argument(
'--collapsible', action='store_true',
help='make changelog collapsible (default: %(default)s)')
logging.basicConfig(
datefmt='%Y-%m-%d %H-%M-%S', format='{asctime} | {levelname:<8} | {message}',
level=logging.WARNING - 10 * args.verbosity, style='{', stream=sys.stderr)
return parser
commits = CommitRange(None, args.commitish, args.default_author)
if not args.no_override:
if args.override_path.exists():
overrides = json.loads(read_file(args.override_path))
commits.apply_overrides(overrides)
else:
logger.warning(f'File {args.override_path.as_posix()} does not exist')
logger.info(f'Loaded {len(commits)} commits')
new_contributors = get_new_contributors(args.contributors_path, commits)
if new_contributors:
if args.contributors:
write_file(args.contributors_path, '\n'.join(new_contributors) + '\n', mode='a')
logger.info(f'New contributors: {", ".join(new_contributors)}')
print(Changelog(commits.groups(), args.repo))
if __name__ == '__main__':
print(create_changelog(create_parser().parse_args()))

View File

@ -9,12 +9,7 @@ sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import re
from devscripts.utils import (
get_filename_args,
read_file,
read_version,
write_file,
)
from devscripts.utils import get_filename_args, read_file, write_file
VERBOSE_TMPL = '''
- type: checkboxes
@ -35,19 +30,18 @@ VERBOSE_TMPL = '''
description: |
It should start like this:
placeholder: |
[debug] Command-line config: ['-vU', 'test:youtube']
[debug] Portable config "yt-dlp.conf": ['-i']
[debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version %(version)s [9d339c4] (win32_exe)
[debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
[debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
[debug] Checking exe version: ffmpeg -bsfs
[debug] Checking exe version: ffprobe -bsfs
[debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
[debug] Proxy map: {}
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
Latest version: %(version)s, Current version: %(version)s
yt-dlp is up to date (%(version)s)
[debug] Request Handlers: urllib, requests
[debug] Loaded 1893 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
[youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
<more lines>
render: shell
validations:
@ -66,7 +60,7 @@ NO_SKIP = '''
def main():
fields = {'version': read_version(), 'no_skip': NO_SKIP}
fields = {'no_skip': NO_SKIP}
fields['verbose'] = VERBOSE_TMPL % fields
fields['verbose_optional'] = re.sub(r'(\n\s+validations:)?\n\s+required: true', '', fields['verbose'])

View File

@ -24,7 +24,7 @@ PREFIX = r'''%yt-dlp(1)
# NAME
yt\-dlp \- A youtube-dl fork with additional features and patches
yt\-dlp \- A feature\-rich command\-line audio/video downloader
# SYNOPSIS
@ -43,6 +43,27 @@ def filter_excluded_sections(readme):
'', readme)
def _convert_code_blocks(readme):
current_code_block = None
for line in readme.splitlines(True):
if current_code_block:
if line == current_code_block:
current_code_block = None
yield '\n'
else:
yield f' {line}'
elif line.startswith('```'):
current_code_block = line.count('`') * '`' + '\n'
yield '\n'
else:
yield line
def convert_code_blocks(readme):
return ''.join(_convert_code_blocks(readme))
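# A worked example (a sketch): the fences become blank lines and the body is
# indented four spaces, which pandoc renders as a literal block:
#   convert_code_blocks('```\nyt-dlp -F URL\n```\n')
#   -> '\n    yt-dlp -F URL\n\n'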
def move_sections(readme):
MOVE_TAG_TEMPLATE = '<!-- MANPAGE: MOVE "%s" SECTION HERE -->'
sections = re.findall(r'(?m)^%s$' % (
@ -65,8 +86,10 @@ def move_sections(readme):
def filter_options(readme):
section = re.search(r'(?sm)^# USAGE AND OPTIONS\n.+?(?=^# )', readme).group(0)
section_new = section.replace('*', R'\*')
options = '# OPTIONS\n'
for line in section.split('\n')[1:]:
for line in section_new.split('\n')[1:]:
mobj = re.fullmatch(r'''(?x)
\s{4}(?P<opt>-(?:,\s|[^\s])+)
(?:\s(?P<meta>(?:[^\s]|\s(?!\s))+))?
@ -86,7 +109,7 @@ def filter_options(readme):
return readme.replace(section, options, 1)
TRANSFORM = compose_functions(filter_excluded_sections, move_sections, filter_options)
TRANSFORM = compose_functions(filter_excluded_sections, convert_code_blocks, move_sections, filter_options)
def main():

View File

@ -1,17 +1,4 @@
@setlocal
@echo off
cd /d %~dp0..
if ["%~1"]==[""] (
set "test_set="test""
) else if ["%~1"]==["core"] (
set "test_set="-m not download""
) else if ["%~1"]==["download"] (
set "test_set="-m "download""
) else (
echo.Invalid test type "%~1". Use "core" ^| "download"
exit /b 1
)
set PYTHONWARNINGS=error
pytest %test_set%
>&2 echo run_tests.bat is deprecated. Please use `devscripts/run_tests.py` instead
python %~dp0run_tests.py %~1

devscripts/run_tests.py Executable file
View File

@ -0,0 +1,71 @@
#!/usr/bin/env python3
import argparse
import functools
import os
import re
import subprocess
import sys
from pathlib import Path
fix_test_name = functools.partial(re.compile(r'IE(_all|_\d+)?$').sub, r'\1')
def parse_args():
parser = argparse.ArgumentParser(description='Run selected yt-dlp tests')
parser.add_argument(
'test', help='an extractor test, or one of "core" or "download"', nargs='*')
parser.add_argument(
'-k', help='run a test matching EXPRESSION. Same as "pytest -k"', metavar='EXPRESSION')
return parser.parse_args()
def run_tests(*tests, pattern=None, ci=False):
run_core = 'core' in tests or (not pattern and not tests)
run_download = 'download' in tests
tests = list(map(fix_test_name, tests))
arguments = ['pytest', '-Werror', '--tb=short']
if ci:
arguments.append('--color=yes')
if run_core:
arguments.extend(['-m', 'not download'])
elif run_download:
arguments.extend(['-m', 'download'])
elif pattern:
arguments.extend(['-k', pattern])
else:
arguments.extend(
f'test/test_download.py::TestDownload::test_{test}' for test in tests)
print(f'Running {arguments}', flush=True)
try:
return subprocess.call(arguments)
except FileNotFoundError:
pass
arguments = [sys.executable, '-Werror', '-m', 'unittest']
if run_core:
print('"pytest" needs to be installed to run core tests', file=sys.stderr, flush=True)
return 1
elif run_download:
arguments.append('test.test_download')
elif pattern:
arguments.extend(['-k', pattern])
else:
arguments.extend(
f'test.test_download.TestDownload.test_{test}' for test in tests)
print(f'Running {arguments}', flush=True)
return subprocess.call(arguments)
if __name__ == '__main__':
try:
args = parse_args()
os.chdir(Path(__file__).parent.parent)
sys.exit(run_tests(*args.test, pattern=args.k, ci=bool(os.getenv('CI'))))
except KeyboardInterrupt:
pass
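# Typical invocations, inferred from the argument handling above (a sketch):
#   python devscripts/run_tests.py core           # pytest -m "not download"
#   python devscripts/run_tests.py download       # pytest -m download
#   python devscripts/run_tests.py YoutubeIE      # test/test_download.py::TestDownload::test_Youtube
#   python devscripts/run_tests.py -k "not live"  # pytest -k "not live"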

View File

@ -1,14 +1,4 @@
#!/usr/bin/env sh
if [ -z "$1" ]; then
test_set='test'
elif [ "$1" = 'core' ]; then
test_set="-m not download"
elif [ "$1" = 'download' ]; then
test_set="-m download"
else
echo 'Invalid test type "'"$1"'". Use "core" | "download"'
exit 1
fi
python3 -bb -Werror -m pytest "$test_set"
>&2 echo 'run_tests.sh is deprecated. Please use `devscripts/run_tests.py` instead'
python3 devscripts/run_tests.py "$1"

devscripts/tomlparse.py Executable file
View File

@ -0,0 +1,189 @@
#!/usr/bin/env python3
"""
Simple parser for spec compliant toml files
A simple toml parser for files that comply with the spec.
Should only be used to parse `pyproject.toml` for `install_deps.py`.
IMPORTANT: INVALID FILES OR MULTILINE STRINGS ARE NOT SUPPORTED!
"""
from __future__ import annotations
import datetime as dt
import json
import re
WS = r'(?:[\ \t]*)'
STRING_RE = re.compile(r'"(?:\\.|[^\\"\n])*"|\'[^\'\n]*\'')
SINGLE_KEY_RE = re.compile(rf'{STRING_RE.pattern}|[A-Za-z0-9_-]+')
KEY_RE = re.compile(rf'{WS}(?:{SINGLE_KEY_RE.pattern}){WS}(?:\.{WS}(?:{SINGLE_KEY_RE.pattern}){WS})*')
EQUALS_RE = re.compile(rf'={WS}')
WS_RE = re.compile(WS)
_SUBTABLE = rf'(?P<subtable>^\[(?P<is_list>\[)?(?P<path>{KEY_RE.pattern})\]\]?)'
EXPRESSION_RE = re.compile(rf'^(?:{_SUBTABLE}|{KEY_RE.pattern}=)', re.MULTILINE)
LIST_WS_RE = re.compile(rf'{WS}((#[^\n]*)?\n{WS})*')
LEFTOVER_VALUE_RE = re.compile(r'[^,}\]\t\n#]+')
def parse_key(value: str):
for match in SINGLE_KEY_RE.finditer(value):
if match[0][0] == '"':
yield json.loads(match[0])
elif match[0][0] == '\'':
yield match[0][1:-1]
else:
yield match[0]
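# For instance (a sketch), quoted and bare keys mix freely in a dotted path:
#   list(parse_key('tool."hatch".version'))  -> ['tool', 'hatch', 'version']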
def get_target(root: dict, paths: list[str], is_list=False):
target = root
for index, key in enumerate(paths, 1):
use_list = is_list and index == len(paths)
result = target.get(key)
if result is None:
result = [] if use_list else {}
target[key] = result
if isinstance(result, dict):
target = result
elif use_list:
target = {}
result.append(target)
else:
target = result[-1]
assert isinstance(target, dict)
return target
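# A minimal sketch of how nested tables are materialized in place:
#   root = {}
#   get_target(root, ['tool', 'hatch', 'version'])['path'] = 'yt_dlp/version.py'
#   # root == {'tool': {'hatch': {'version': {'path': 'yt_dlp/version.py'}}}}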
def parse_enclosed(data: str, index: int, end: str, ws_re: re.Pattern):
index += 1
if match := ws_re.match(data, index):
index = match.end()
while data[index] != end:
index = yield True, index
if match := ws_re.match(data, index):
index = match.end()
if data[index] == ',':
index += 1
if match := ws_re.match(data, index):
index = match.end()
assert data[index] == end
yield False, index + 1
def parse_value(data: str, index: int):
if data[index] == '[':
result = []
indices = parse_enclosed(data, index, ']', LIST_WS_RE)
valid, index = next(indices)
while valid:
index, value = parse_value(data, index)
result.append(value)
valid, index = indices.send(index)
return index, result
if data[index] == '{':
result = {}
indices = parse_enclosed(data, index, '}', WS_RE)
valid, index = next(indices)
while valid:
valid, index = indices.send(parse_kv_pair(data, index, result))
return index, result
if match := STRING_RE.match(data, index):
return match.end(), json.loads(match[0]) if match[0][0] == '"' else match[0][1:-1]
match = LEFTOVER_VALUE_RE.match(data, index)
assert match
value = match[0].strip()
for func in [
int,
float,
dt.time.fromisoformat,
dt.date.fromisoformat,
dt.datetime.fromisoformat,
{'true': True, 'false': False}.get,
]:
try:
value = func(value)
break
except Exception:
pass
return match.end(), value
def parse_kv_pair(data: str, index: int, target: dict):
match = KEY_RE.match(data, index)
if not match:
return None
*keys, key = parse_key(match[0])
match = EQUALS_RE.match(data, match.end())
assert match
index = match.end()
index, value = parse_value(data, index)
get_target(target, keys)[key] = value
return index
def parse_toml(data: str):
root = {}
target = root
index = 0
while True:
match = EXPRESSION_RE.search(data, index)
if not match:
break
if match.group('subtable'):
index = match.end()
path, is_list = match.group('path', 'is_list')
target = get_target(root, list(parse_key(path)), bool(is_list))
continue
index = parse_kv_pair(data, match.start(), target)
assert index is not None
return root
def main():
import argparse
from pathlib import Path
parser = argparse.ArgumentParser()
parser.add_argument('infile', type=Path, help='The TOML file to read as input')
args = parser.parse_args()
with args.infile.open('r', encoding='utf-8') as file:
data = file.read()
def default(obj):
if isinstance(obj, (dt.date, dt.time, dt.datetime)):
return obj.isoformat()
print(json.dumps(parse_toml(data), default=default))
if __name__ == '__main__':
main()
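# Usage sketch (paths illustrative): the module doubles as a CLI that dumps the
# parsed document as JSON, and parse_toml() can also be imported directly:
#   python devscripts/tomlparse.py pyproject.toml
#   parse_toml('[project]\nname = "yt-dlp"\n')  -> {'project': {'name': 'yt-dlp'}}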

View File

@ -1,39 +0,0 @@
#!/usr/bin/env python3
"""
Usage: python3 ./devscripts/update-formulae.py <path-to-formulae-rb> <version>
version can be either 0-aligned (yt-dlp version) or normalized (PyPi version)
"""
# Allow direct execution
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import json
import re
import urllib.request
from devscripts.utils import read_file, write_file
filename, version = sys.argv[1:]
normalized_version = '.'.join(str(int(x)) for x in version.split('.'))
pypi_release = json.loads(urllib.request.urlopen(
'https://pypi.org/pypi/yt-dlp/%s/json' % normalized_version
).read().decode())
tarball_file = next(x for x in pypi_release['urls'] if x['filename'].endswith('.tar.gz'))
sha256sum = tarball_file['digests']['sha256']
url = tarball_file['url']
formulae_text = read_file(filename)
formulae_text = re.sub(r'sha256 "[0-9a-f]*?"', 'sha256 "%s"' % sha256sum, formulae_text, count=1)
formulae_text = re.sub(r'url "[^"]*?"', 'url "%s"' % url, formulae_text, count=1)
write_file(filename, formulae_text)

View File

@ -9,18 +9,18 @@ sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import argparse
import contextlib
import datetime as dt
import sys
from datetime import datetime
from devscripts.utils import read_version, run_process, write_file
def get_new_version(version, revision):
if not version:
version = datetime.utcnow().strftime('%Y.%m.%d')
version = dt.datetime.now(dt.timezone.utc).strftime('%Y.%m.%d')
if revision:
assert revision.isdigit(), 'Revision must be a number'
assert revision.isdecimal(), 'Revision must be a number'
else:
old_version = read_version().split('.')
if version.split('.') == old_version[:3]:
@ -46,13 +46,23 @@ VARIANT = None
UPDATE_HINT = None
CHANNEL = {channel!r}
ORIGIN = {origin!r}
_pkg_version = {package_version!r}
'''
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Update the version.py file')
parser.add_argument(
'-c', '--channel', choices=['stable', 'nightly'], default='stable',
'-c', '--channel', default='stable',
help='Select update channel (default: %(default)s)')
parser.add_argument(
'-r', '--origin', default='local',
help='Select origin/repository (default: %(default)s)')
parser.add_argument(
'-s', '--suffix', default='',
help='Add an alphanumeric suffix to the package version, e.g. "dev"')
parser.add_argument(
'-o', '--output', default='yt_dlp/version.py',
help='The output file to write to (default: %(default)s)')
@ -66,6 +76,7 @@ if __name__ == '__main__':
args.version if args.version and '.' in args.version
else get_new_version(None, args.version))
write_file(args.output, VERSION_TEMPLATE.format(
version=version, git_head=git_head, channel=args.channel))
version=version, git_head=git_head, channel=args.channel, origin=args.origin,
package_version=f'{version}{args.suffix}'))
print(f'version={version} ({args.channel}), head={git_head}')
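# Putting the new flags together (a sketch; values illustrative):
#   python devscripts/update-version.py -c nightly -r yt-dlp/yt-dlp-nightly-builds -s dev
#   -> writes CHANNEL, ORIGIN and _pkg_version = '<version>dev' into yt_dlp/version.py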

devscripts/update_changelog.py Executable file
View File

@ -0,0 +1,26 @@
#!/usr/bin/env python3
# Allow direct execution
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from pathlib import Path
from devscripts.make_changelog import create_changelog, create_parser
from devscripts.utils import read_file, read_version, write_file
# Always run after devscripts/update-version.py, and run before `make doc|pypi-files|tar|all`
if __name__ == '__main__':
parser = create_parser()
parser.description = 'Update an existing changelog file with an entry for a new release'
parser.add_argument(
'--changelog-path', type=Path, default=Path(__file__).parent.parent / 'Changelog.md',
help='path to the Changelog file')
args = parser.parse_args()
new_entry = create_changelog(args)
header, sep, changelog = read_file(args.changelog_path).partition('\n### ')
write_file(args.changelog_path, f'{header}{sep}{read_version()}\n{new_entry}\n{sep}{changelog}')
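# The net effect is to splice the new entry directly under the header; a sketch
# of the transformation (contents illustrative):
#   before: '<header>\n### 2024.03.10\n<old entry>'
#   after:  '<header>\n### <new version>\n<new entry>\n\n### 2024.03.10\n<old entry>'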

View File

@ -13,10 +13,11 @@ def write_file(fname, content, mode='w'):
return f.write(content)
def read_version(fname='yt_dlp/version.py'):
def read_version(fname='yt_dlp/version.py', varname='__version__'):
"""Get the version without importing the package"""
exec(compile(read_file(fname), fname, 'exec'))
return locals()['__version__']
items = {}
exec(compile(read_file(fname), fname, 'exec'), items)
return items[varname]
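# Usage sketch: any module-level name in version.py can now be read, e.g.
# (varnames taken from the version template in devscripts/update-version.py):
#   read_version()                   -> value of __version__
#   read_version(varname='CHANNEL')  -> e.g. 'stable'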
def get_filename_args(has_infile=False, default_outfile=None):

pyinst.py Normal file → Executable file
View File

@ -1,132 +1,17 @@
#!/usr/bin/env python3
# Allow direct execution
# Allow execution from anywhere
import os
import sys
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))
import platform
import warnings
from PyInstaller.__main__ import run as run_pyinstaller
from devscripts.utils import read_version
OS_NAME, MACHINE, ARCH = sys.platform, platform.machine().lower(), platform.architecture()[0][:2]
if MACHINE in ('x86', 'x86_64', 'amd64', 'i386', 'i686'):
MACHINE = 'x86' if ARCH == '32' else ''
def main():
opts, version = parse_options(), read_version()
onedir = '--onedir' in opts or '-D' in opts
if not onedir and '-F' not in opts and '--onefile' not in opts:
opts.append('--onefile')
name, final_file = exe(onedir)
print(f'Building yt-dlp v{version} for {OS_NAME} {platform.machine()} with options {opts}')
print('Remember to update the version using "devscripts/update-version.py"')
if not os.path.isfile('yt_dlp/extractor/lazy_extractors.py'):
print('WARNING: Building without lazy_extractors. Run '
'"devscripts/make_lazy_extractors.py" to build lazy extractors', file=sys.stderr)
print(f'Destination: {final_file}\n')
opts = [
f'--name={name}',
'--icon=devscripts/logo.ico',
'--upx-exclude=vcruntime140.dll',
'--noconfirm',
'--additional-hooks-dir=yt_dlp/__pyinstaller',
*opts,
'yt_dlp/__main__.py',
]
print(f'Running PyInstaller with {opts}')
run_pyinstaller(opts)
set_version_info(final_file, version)
def parse_options():
# Compatibility with older arguments
opts = sys.argv[1:]
if opts[0:1] in (['32'], ['64']):
if ARCH != opts[0]:
raise Exception(f'{opts[0]}bit executable cannot be built on a {ARCH}bit system')
opts = opts[1:]
return opts
def exe(onedir):
"""@returns (name, path)"""
name = '_'.join(filter(None, (
'yt-dlp',
{'win32': '', 'darwin': 'macos'}.get(OS_NAME, OS_NAME),
MACHINE,
)))
return name, ''.join(filter(None, (
'dist/',
onedir and f'{name}/',
name,
OS_NAME == 'win32' and '.exe'
)))
def version_to_list(version):
version_list = version.split('.')
return list(map(int, version_list)) + [0] * (4 - len(version_list))
def set_version_info(exe, version):
if OS_NAME == 'win32':
windows_set_version(exe, version)
def windows_set_version(exe, version):
from PyInstaller.utils.win32.versioninfo import (
FixedFileInfo,
StringFileInfo,
StringStruct,
StringTable,
VarFileInfo,
VarStruct,
VSVersionInfo,
)
try:
from PyInstaller.utils.win32.versioninfo import SetVersion
except ImportError: # Pyinstaller >= 5.8
from PyInstaller.utils.win32.versioninfo import write_version_info_to_executable as SetVersion
version_list = version_to_list(version)
suffix = MACHINE and f'_{MACHINE}'
SetVersion(exe, VSVersionInfo(
ffi=FixedFileInfo(
filevers=version_list,
prodvers=version_list,
mask=0x3F,
flags=0x0,
OS=0x4,
fileType=0x1,
subtype=0x0,
date=(0, 0),
),
kids=[
StringFileInfo([StringTable('040904B0', [
StringStruct('Comments', 'yt-dlp%s Command Line Interface' % suffix),
StringStruct('CompanyName', 'https://github.com/yt-dlp'),
StringStruct('FileDescription', 'yt-dlp%s' % (MACHINE and f' ({MACHINE})')),
StringStruct('FileVersion', version),
StringStruct('InternalName', f'yt-dlp{suffix}'),
StringStruct('LegalCopyright', 'pukkandan.ytdlp@gmail.com | UNLICENSE'),
StringStruct('OriginalFilename', f'yt-dlp{suffix}.exe'),
StringStruct('ProductName', f'yt-dlp{suffix}'),
StringStruct(
'ProductVersion', f'{version}{suffix} on Python {platform.python_version()}'),
])]), VarFileInfo([VarStruct('Translation', [0, 1200])])
]
))
from bundle.pyinstaller import main
warnings.warn(DeprecationWarning('`pyinst.py` is deprecated and will be removed in a future version. '
'Use `bundle.pyinstaller` instead'))
if __name__ == '__main__':
main()

View File

@ -1,5 +1,124 @@
[build-system]
build-backend = 'setuptools.build_meta'
# https://github.com/yt-dlp/yt-dlp/issues/5941
# https://github.com/pypa/distutils/issues/17
requires = ['setuptools > 50']
requires = ["hatchling"]
build-backend = "hatchling.build"
[project]
name = "yt-dlp"
maintainers = [
{name = "pukkandan", email = "pukkandan.ytdlp@gmail.com"},
{name = "Grub4K", email = "contact@grub4k.xyz"},
{name = "bashonly", email = "bashonly@protonmail.com"},
{name = "coletdjnz", email = "coletdjnz@protonmail.com"},
]
description = "A feature-rich command-line audio/video downloader"
readme = "README.md"
requires-python = ">=3.8"
keywords = [
"youtube-dl",
"video-downloader",
"youtube-downloader",
"sponsorblock",
"youtube-dlc",
"yt-dlp",
]
license = {file = "LICENSE"}
classifiers = [
"Topic :: Multimedia :: Video",
"Development Status :: 5 - Production/Stable",
"Environment :: Console",
"Programming Language :: Python",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
"Programming Language :: Python :: Implementation",
"Programming Language :: Python :: Implementation :: CPython",
"Programming Language :: Python :: Implementation :: PyPy",
"License :: OSI Approved :: The Unlicense (Unlicense)",
"Operating System :: OS Independent",
]
dynamic = ["version"]
dependencies = [
"brotli; implementation_name=='cpython'",
"brotlicffi; implementation_name!='cpython'",
"certifi",
"mutagen",
"pycryptodomex",
"requests>=2.31.0,<3",
"urllib3>=1.26.17,<3",
"websockets>=12.0",
]
[project.optional-dependencies]
default = []
curl_cffi = ["curl-cffi==0.5.10; implementation_name=='cpython'"]
secretstorage = [
"cffi",
"secretstorage",
]
build = [
"build",
"hatchling",
"pip",
"wheel",
]
dev = [
"flake8",
"isort",
"pytest",
]
pyinstaller = [
"pyinstaller>=6.3; sys_platform!='darwin'",
"pyinstaller==5.13.2; sys_platform=='darwin'", # needed for curl_cffi
]
py2exe = ["py2exe>=0.12"]
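# The groups above double as pip extras; an illustrative install command
# (extras names from this table, invocation assumed):
#   python -m pip install "yt-dlp[default,secretstorage]"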
[project.urls]
Documentation = "https://github.com/yt-dlp/yt-dlp#readme"
Repository = "https://github.com/yt-dlp/yt-dlp"
Tracker = "https://github.com/yt-dlp/yt-dlp/issues"
Funding = "https://github.com/yt-dlp/yt-dlp/blob/master/Collaborators.md#collaborators"
[project.scripts]
yt-dlp = "yt_dlp:main"
[project.entry-points.pyinstaller40]
hook-dirs = "yt_dlp.__pyinstaller:get_hook_dirs"
[tool.hatch.build.targets.sdist]
include = [
"/yt_dlp",
"/devscripts",
"/test",
"/.gitignore", # included by default, needed for auto-excludes
"/Changelog.md",
"/LICENSE", # included as license
"/pyproject.toml", # included by default
"/README.md", # included as readme
"/setup.cfg",
"/supportedsites.md",
]
artifacts = [
"/yt_dlp/extractor/lazy_extractors.py",
"/completions",
"/AUTHORS", # included by default
"/README.txt",
"/yt-dlp.1",
]
[tool.hatch.build.targets.wheel]
packages = ["yt_dlp"]
artifacts = ["/yt_dlp/extractor/lazy_extractors.py"]
[tool.hatch.build.targets.wheel.shared-data]
"completions/bash/yt-dlp" = "share/bash-completion/completions/yt-dlp"
"completions/zsh/_yt-dlp" = "share/zsh/site-functions/_yt-dlp"
"completions/fish/yt-dlp.fish" = "share/fish/vendor_completions.d/yt-dlp.fish"
"README.txt" = "share/doc/yt_dlp/README.txt"
"yt-dlp.1" = "share/man/man1/yt-dlp.1"
[tool.hatch.version]
path = "yt_dlp/version.py"
pattern = "_pkg_version = '(?P<version>[^']+)'"
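# For reference, this pattern matches the line that devscripts/update-version.py
# writes into yt_dlp/version.py (value illustrative):
#   _pkg_version = '2024.04.09'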

View File

@ -1,6 +0,0 @@
mutagen
pycryptodomex
websockets
brotli; platform_python_implementation=='CPython'
brotlicffi; platform_python_implementation!='CPython'
certifi

View File

@ -1,7 +1,3 @@
[wheel]
universal = true
[flake8]
exclude = build,venv,.tox,.git,.pytest_cache
ignore = E402,E501,E731,E741,W503
@ -26,7 +22,7 @@ markers =
[tox:tox]
skipsdist = true
envlist = py{36,37,38,39,310,311},pypy{36,37,38,39}
envlist = py{38,39,310,311,312},pypy{38,39,310}
skip_missing_interpreters = true
[testenv] # tox
@ -39,7 +35,7 @@ setenv =
[isort]
py_version = 37
py_version = 38
multi_line_output = VERTICAL_HANGING_INDENT
line_length = 80
reverse_relative = true

setup.py Normal file → Executable file
View File

@ -6,170 +6,31 @@ import sys
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))
import subprocess
import warnings
try:
from setuptools import Command, find_packages, setup
setuptools_available = True
except ImportError:
from distutils.core import Command, setup
setuptools_available = False
from devscripts.utils import read_file, read_version
if sys.argv[1:2] == ['py2exe']:
warnings.warn(DeprecationWarning('`setup.py py2exe` is deprecated and will be removed in a future version. '
'Use `bundle.py2exe` instead'))
VERSION = read_version()
import bundle.py2exe
DESCRIPTION = 'A youtube-dl fork with additional features and patches'
bundle.py2exe.main()
LONG_DESCRIPTION = '\n\n'.join((
'Official repository: <https://github.com/yt-dlp/yt-dlp>',
'**PS**: Some links in this document will not work since this is a copy of the README.md from Github',
read_file('README.md')))
elif 'build_lazy_extractors' in sys.argv:
warnings.warn(DeprecationWarning('`setup.py build_lazy_extractors` is deprecated and will be removed in a future version. '
'Use `devscripts.make_lazy_extractors` instead'))
REQUIREMENTS = read_file('requirements.txt').splitlines()
import subprocess
os.chdir(sys.path[0])
print('running build_lazy_extractors')
subprocess.run([sys.executable, 'devscripts/make_lazy_extractors.py'])
def packages():
if setuptools_available:
return find_packages(exclude=('youtube_dl', 'youtube_dlc', 'test', 'ytdlp_plugins', 'devscripts'))
else:
return [
'yt_dlp', 'yt_dlp.extractor', 'yt_dlp.downloader', 'yt_dlp.postprocessor', 'yt_dlp.compat',
]
def py2exe_params():
warnings.warn(
'py2exe builds do not support pycryptodomex and needs VC++14 to run. '
'It is recommended to run "pyinst.py" to build using pyinstaller instead')
return {
'console': [{
'script': './yt_dlp/__main__.py',
'dest_base': 'yt-dlp',
'icon_resources': [(1, 'devscripts/logo.ico')],
}],
'version_info': {
'version': VERSION,
'description': DESCRIPTION,
'comments': LONG_DESCRIPTION.split('\n')[0],
'product_name': 'yt-dlp',
'product_version': VERSION,
},
'options': {
'bundle_files': 0,
'compressed': 1,
'optimize': 2,
'dist_dir': './dist',
'excludes': ['Crypto', 'Cryptodome'], # py2exe cannot import Crypto
'dll_excludes': ['w9xpopen.exe', 'crypt32.dll'],
# Modules that are only imported dynamically must be added here
'includes': ['yt_dlp.compat._legacy'],
},
'zipfile': None,
}
def build_params():
files_spec = [
('share/bash-completion/completions', ['completions/bash/yt-dlp']),
('share/zsh/site-functions', ['completions/zsh/_yt-dlp']),
('share/fish/vendor_completions.d', ['completions/fish/yt-dlp.fish']),
('share/doc/yt_dlp', ['README.txt']),
('share/man/man1', ['yt-dlp.1'])
]
data_files = []
for dirname, files in files_spec:
resfiles = []
for fn in files:
if not os.path.exists(fn):
warnings.warn(f'Skipping file {fn} since it is not present. Try running " make pypi-files " first')
else:
resfiles.append(fn)
data_files.append((dirname, resfiles))
params = {'data_files': data_files}
if setuptools_available:
params['entry_points'] = {
'console_scripts': ['yt-dlp = yt_dlp:main'],
'pyinstaller40': ['hook-dirs = yt_dlp.__pyinstaller:get_hook_dirs'],
}
else:
params['scripts'] = ['yt-dlp']
return params
class build_lazy_extractors(Command):
description = 'Build the extractor lazy loading module'
user_options = []
def initialize_options(self):
pass
def finalize_options(self):
pass
def run(self):
if self.dry_run:
print('Skipping build of lazy extractors in dry run mode')
return
subprocess.run([sys.executable, 'devscripts/make_lazy_extractors.py'])
def main():
if sys.argv[1:2] == ['py2exe']:
params = py2exe_params()
try:
from py2exe import freeze
except ImportError:
import py2exe # noqa: F401
warnings.warn('You are using an outdated version of py2exe. Support for this version will be removed in the future')
params['console'][0].update(params.pop('version_info'))
params['options'] = {'py2exe': params.pop('options')}
else:
return freeze(**params)
else:
params = build_params()
setup(
name='yt-dlp',
version=VERSION,
maintainer='pukkandan',
maintainer_email='pukkandan.ytdlp@gmail.com',
description=DESCRIPTION,
long_description=LONG_DESCRIPTION,
long_description_content_type='text/markdown',
url='https://github.com/yt-dlp/yt-dlp',
packages=packages(),
install_requires=REQUIREMENTS,
python_requires='>=3.7',
project_urls={
'Documentation': 'https://github.com/yt-dlp/yt-dlp#readme',
'Source': 'https://github.com/yt-dlp/yt-dlp',
'Tracker': 'https://github.com/yt-dlp/yt-dlp/issues',
'Funding': 'https://github.com/yt-dlp/yt-dlp/blob/master/Collaborators.md#collaborators',
},
classifiers=[
'Topic :: Multimedia :: Video',
'Development Status :: 5 - Production/Stable',
'Environment :: Console',
'Programming Language :: Python',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: Implementation',
'Programming Language :: Python :: Implementation :: CPython',
'Programming Language :: Python :: Implementation :: PyPy',
'License :: Public Domain',
'Operating System :: OS Independent',
],
cmdclass={'build_lazy_extractors': build_lazy_extractors},
**params
)
main()
print(
'ERROR: Building by calling `setup.py` is deprecated. '
'Use a build frontend like `build` instead. ',
'Refer to https://build.pypa.io for more info', file=sys.stderr)
sys.exit(1)

File diff suppressed because it is too large

test/conftest.py Normal file
View File

@ -0,0 +1,26 @@
import functools
import inspect
import pytest
from yt_dlp.networking import RequestHandler
from yt_dlp.networking.common import _REQUEST_HANDLERS
from yt_dlp.utils._utils import _YDLLogger as FakeLogger
@pytest.fixture
def handler(request):
RH_KEY = request.param
if inspect.isclass(RH_KEY) and issubclass(RH_KEY, RequestHandler):
handler = RH_KEY
elif RH_KEY in _REQUEST_HANDLERS:
handler = _REQUEST_HANDLERS[RH_KEY]
else:
pytest.skip(f'{RH_KEY} request handler is not available')
return functools.partial(handler, logger=FakeLogger)
def validate_and_send(rh, req):
rh.validate(req)
return rh.send(req)
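# Intended use (a sketch mirroring how yt-dlp's networking tests consume this
# fixture; 'Urllib' is assumed to be a registered handler key):
#   @pytest.mark.parametrize('handler', ['Urllib'], indirect=True)
#   def test_example(handler):
#       with handler() as rh:
#           validate_and_send(rh, Request('http://127.0.0.1:8080'))  # Request from yt_dlp.networking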

View File

@ -10,7 +10,7 @@ import types
import yt_dlp.extractor
from yt_dlp import YoutubeDL
from yt_dlp.compat import compat_os_name
from yt_dlp.utils import preferredencoding, write_string
from yt_dlp.utils import preferredencoding, try_call, write_string, find_available_port
if 'pytest' in sys.modules:
import pytest
@ -194,8 +194,8 @@ def sanitize_got_info_dict(got_dict):
'formats', 'thumbnails', 'subtitles', 'automatic_captions', 'comments', 'entries',
# Auto-generated
'autonumber', 'playlist', 'format_index', 'video_ext', 'audio_ext', 'duration_string', 'epoch',
'fulltitle', 'extractor', 'extractor_key', 'filepath', 'infojson_filename', 'original_url', 'n_entries',
'autonumber', 'playlist', 'format_index', 'video_ext', 'audio_ext', 'duration_string', 'epoch', 'n_entries',
'fulltitle', 'extractor', 'extractor_key', 'filename', 'filepath', 'infojson_filename', 'original_url',
# Only live_status needs to be checked
'is_live', 'was_live',
@ -214,14 +214,23 @@ def sanitize_got_info_dict(got_dict):
test_info_dict = {
key: sanitize(key, value) for key, value in got_dict.items()
if value is not None and key not in IGNORED_FIELDS and not any(
key.startswith(f'{prefix}_') for prefix in IGNORED_PREFIXES)
if value is not None and key not in IGNORED_FIELDS and (
not any(key.startswith(f'{prefix}_') for prefix in IGNORED_PREFIXES)
or key == '_old_archive_ids')
}
# display_id may be generated from id
if test_info_dict.get('display_id') == test_info_dict.get('id'):
test_info_dict.pop('display_id')
# Remove deprecated fields
for old in YoutubeDL._deprecated_multivalue_fields.keys():
test_info_dict.pop(old, None)
# release_year may be generated from release_date
if try_call(lambda: test_info_dict['release_year'] == int(test_info_dict['release_date'][:4])):
test_info_dict.pop('release_year')
# Check url for flat entries
if got_dict.get('_type', 'video') != 'video' and got_dict.get('url'):
test_info_dict['url'] = got_dict['url']
@ -324,3 +333,8 @@ def http_server_port(httpd):
else:
sock = httpd.socket
return sock.getsockname()[1]
def verify_address_availability(address):
if find_available_port(address) is None:
pytest.skip(f'Unable to bind to source address {address} (address may not exist)')

View File

@ -917,8 +917,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'acodec': 'mp4a.40.2',
'video_ext': 'mp4',
'audio_ext': 'none',
'vbr': 263.851,
'abr': 0,
}, {
'format_id': '577',
'format_index': None,
@ -936,8 +934,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'acodec': 'mp4a.40.2',
'video_ext': 'mp4',
'audio_ext': 'none',
'vbr': 577.61,
'abr': 0,
}, {
'format_id': '915',
'format_index': None,
@ -955,8 +951,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'acodec': 'mp4a.40.2',
'video_ext': 'mp4',
'audio_ext': 'none',
'vbr': 915.905,
'abr': 0,
}, {
'format_id': '1030',
'format_index': None,
@ -974,8 +968,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'acodec': 'mp4a.40.2',
'video_ext': 'mp4',
'audio_ext': 'none',
'vbr': 1030.138,
'abr': 0,
}, {
'format_id': '1924',
'format_index': None,
@ -993,8 +985,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'acodec': 'mp4a.40.2',
'video_ext': 'mp4',
'audio_ext': 'none',
'vbr': 1924.009,
'abr': 0,
}],
{
'en': [{
@ -1406,6 +1396,7 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'vcodec': 'none',
'acodec': 'AACL',
'protocol': 'ism',
'audio_channels': 2,
'_download_params': {
'stream_type': 'audio',
'duration': 8880746666,
@ -1419,9 +1410,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
'audio_ext': 'isma',
'video_ext': 'none',
'abr': 128,
}, {
'format_id': 'video-100',
'url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/Manifest',
@ -1445,9 +1433,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
'video_ext': 'ismv',
'audio_ext': 'none',
'vbr': 100,
}, {
'format_id': 'video-326',
'url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/Manifest',
@ -1471,9 +1456,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
'video_ext': 'ismv',
'audio_ext': 'none',
'vbr': 326,
}, {
'format_id': 'video-698',
'url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/Manifest',
@ -1497,9 +1479,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
'video_ext': 'ismv',
'audio_ext': 'none',
'vbr': 698,
}, {
'format_id': 'video-1493',
'url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/Manifest',
@ -1523,9 +1502,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
'video_ext': 'ismv',
'audio_ext': 'none',
'vbr': 1493,
}, {
'format_id': 'video-4482',
'url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/Manifest',
@ -1549,9 +1525,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
'video_ext': 'ismv',
'audio_ext': 'none',
'vbr': 4482,
}],
{
'eng': [
@ -1575,34 +1548,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'ec-3_test',
'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
[{
'format_id': 'audio_deu_1-224',
'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
'manifest_url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
'ext': 'isma',
'tbr': 224,
'asr': 48000,
'vcodec': 'none',
'acodec': 'EC-3',
'protocol': 'ism',
'_download_params':
{
'stream_type': 'audio',
'duration': 370000000,
'timescale': 10000000,
'width': 0,
'height': 0,
'fourcc': 'EC-3',
'language': 'deu',
'codec_private_data': '00063F000000AF87FBA7022DFB42A4D405CD93843BDD0700200F00',
'sampling_rate': 48000,
'channels': 6,
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
'audio_ext': 'isma',
'video_ext': 'none',
'abr': 224,
}, {
'format_id': 'audio_deu-127',
'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
'manifest_url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@ -1612,8 +1557,9 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'vcodec': 'none',
'acodec': 'AACL',
'protocol': 'ism',
'_download_params':
{
'language': 'deu',
'audio_channels': 2,
'_download_params': {
'stream_type': 'audio',
'duration': 370000000,
'timescale': 10000000,
@ -1627,9 +1573,32 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
'audio_ext': 'isma',
'video_ext': 'none',
'abr': 127,
}, {
'format_id': 'audio_deu_1-224',
'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
'manifest_url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
'ext': 'isma',
'tbr': 224,
'asr': 48000,
'vcodec': 'none',
'acodec': 'EC-3',
'protocol': 'ism',
'language': 'deu',
'audio_channels': 6,
'_download_params': {
'stream_type': 'audio',
'duration': 370000000,
'timescale': 10000000,
'width': 0,
'height': 0,
'fourcc': 'EC-3',
'language': 'deu',
'codec_private_data': '00063F000000AF87FBA7022DFB42A4D405CD93843BDD0700200F00',
'sampling_rate': 48000,
'channels': 6,
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
}, {
'format_id': 'video_deu-23',
'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@ -1641,8 +1610,8 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'vcodec': 'AVC1',
'acodec': 'none',
'protocol': 'ism',
'_download_params':
{
'language': 'deu',
'_download_params': {
'stream_type': 'video',
'duration': 370000000,
'timescale': 10000000,
@ -1655,9 +1624,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
'video_ext': 'ismv',
'audio_ext': 'none',
'vbr': 23,
}, {
'format_id': 'video_deu-403',
'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@ -1669,8 +1635,8 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'vcodec': 'AVC1',
'acodec': 'none',
'protocol': 'ism',
'_download_params':
{
'language': 'deu',
'_download_params': {
'stream_type': 'video',
'duration': 370000000,
'timescale': 10000000,
@ -1683,9 +1649,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
'video_ext': 'ismv',
'audio_ext': 'none',
'vbr': 403,
}, {
'format_id': 'video_deu-680',
'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@ -1697,8 +1660,8 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'vcodec': 'AVC1',
'acodec': 'none',
'protocol': 'ism',
'_download_params':
{
'language': 'deu',
'_download_params': {
'stream_type': 'video',
'duration': 370000000,
'timescale': 10000000,
@ -1711,9 +1674,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
'video_ext': 'ismv',
'audio_ext': 'none',
'vbr': 680,
}, {
'format_id': 'video_deu-1253',
'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@ -1725,8 +1685,9 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'vcodec': 'AVC1',
'acodec': 'none',
'protocol': 'ism',
'_download_params':
{
'vbr': 1253,
'language': 'deu',
'_download_params': {
'stream_type': 'video',
'duration': 370000000,
'timescale': 10000000,
@ -1739,9 +1700,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
'video_ext': 'ismv',
'audio_ext': 'none',
'vbr': 1253,
}, {
'format_id': 'video_deu-2121',
'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@ -1753,8 +1711,8 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'vcodec': 'AVC1',
'acodec': 'none',
'protocol': 'ism',
'_download_params':
{
'language': 'deu',
'_download_params': {
'stream_type': 'video',
'duration': 370000000,
'timescale': 10000000,
@ -1767,9 +1725,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
'video_ext': 'ismv',
'audio_ext': 'none',
'vbr': 2121,
}, {
'format_id': 'video_deu-3275',
'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@ -1781,8 +1736,8 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'vcodec': 'AVC1',
'acodec': 'none',
'protocol': 'ism',
'_download_params':
{
'language': 'deu',
'_download_params': {
'stream_type': 'video',
'duration': 370000000,
'timescale': 10000000,
@ -1795,9 +1750,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
'video_ext': 'ismv',
'audio_ext': 'none',
'vbr': 3275,
}, {
'format_id': 'video_deu-5300',
'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@ -1809,8 +1761,8 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'vcodec': 'AVC1',
'acodec': 'none',
'protocol': 'ism',
'_download_params':
{
'language': 'deu',
'_download_params': {
'stream_type': 'video',
'duration': 370000000,
'timescale': 10000000,
@ -1823,9 +1775,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
'video_ext': 'ismv',
'audio_ext': 'none',
'vbr': 5300,
}, {
'format_id': 'video_deu-8079',
'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@ -1837,8 +1786,8 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'vcodec': 'AVC1',
'acodec': 'none',
'protocol': 'ism',
'_download_params':
{
'language': 'deu',
'_download_params': {
'stream_type': 'video',
'duration': 370000000,
'timescale': 10000000,
@ -1851,9 +1800,6 @@ jwplayer("mediaplayer").setup({"abouttext":"Visit Indie DB","aboutlink":"http:\/
'bits_per_sample': 16,
'nal_unit_length_field': 4
},
'video_ext': 'ismv',
'audio_ext': 'none',
'vbr': 8079,
}],
{},
),

View File

@ -10,9 +10,8 @@ sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import copy
import json
import urllib.error
from test.helper import FakeYDL, assertRegexpMatches
from test.helper import FakeYDL, assertRegexpMatches, try_rm
from yt_dlp import YoutubeDL
from yt_dlp.compat import compat_os_name
from yt_dlp.extractor import YoutubeIE
@ -25,6 +24,7 @@ from yt_dlp.utils import (
int_or_none,
match_filter_func,
)
from yt_dlp.utils.traversal import traverse_obj
TEST_URL = 'http://localhost/sample.mp4'
@ -140,6 +140,8 @@ class TestFormatSelection(unittest.TestCase):
test('example-with-dashes', 'example-with-dashes')
test('all', '2', '47', '45', 'example-with-dashes', '35')
test('mergeall', '2+47+45+example-with-dashes+35', multi=True)
# See: https://github.com/yt-dlp/yt-dlp/pulls/8797
test('7_a/worst', '35')
def test_format_selection_audio(self):
formats = [
@ -181,7 +183,7 @@ class TestFormatSelection(unittest.TestCase):
]
info_dict = _make_result(formats)
ydl = YDL({'format': 'best'})
ydl = YDL({'format': 'best', 'format_sort': ['abr', 'ext']})
ydl.sort_formats(info_dict)
ydl.process_ie_result(copy.deepcopy(info_dict))
downloaded = ydl.downloaded_info_dicts[0]
@ -193,7 +195,7 @@ class TestFormatSelection(unittest.TestCase):
downloaded = ydl.downloaded_info_dicts[0]
self.assertEqual(downloaded['format_id'], 'mp3-64')
ydl = YDL({'prefer_free_formats': True})
ydl = YDL({'prefer_free_formats': True, 'format_sort': ['abr', 'ext']})
ydl.sort_formats(info_dict)
ydl.process_ie_result(copy.deepcopy(info_dict))
downloaded = ydl.downloaded_info_dicts[0]
@ -669,7 +671,7 @@ class TestYoutubeDL(unittest.TestCase):
for (name, got), expect in zip((('outtmpl', out), ('filename', fname)), expected):
if callable(expect):
self.assertTrue(expect(got), f'Wrong {name} from {tmpl}')
else:
elif expect is not None:
self.assertEqual(got, expect, f'Wrong {name} from {tmpl}')
# Side-effects
@ -684,7 +686,8 @@ class TestYoutubeDL(unittest.TestCase):
test('%(id)s.%(ext)s', '1234.mp4')
test('%(duration_string)s', ('27:46:40', '27-46-40'))
test('%(resolution)s', '1080p')
test('%(playlist_index)s', '001')
test('%(playlist_index|)s', '001')
test('%(playlist_index&{}!)s', '1!')
test('%(playlist_autonumber)s', '02')
test('%(autonumber)s', '00001')
test('%(autonumber+2)03d', '005', autonumber_start=3)
@ -727,7 +730,7 @@ class TestYoutubeDL(unittest.TestCase):
self.assertEqual(got_dict.get(info_field), expected, info_field)
return True
test('%()j', (expect_same_infodict, str))
test('%()j', (expect_same_infodict, None))
# NA placeholder
NA_TEST_OUTTMPL = '%(uploader_date)s-%(width)d-%(x|def)s-%(id)s.%(ext)s'
@ -755,20 +758,23 @@ class TestYoutubeDL(unittest.TestCase):
test('%(ext)c', 'm')
test('%(id)d %(id)r', "1234 '1234'")
test('%(id)r %(height)r', "'1234' 1080")
test('%(title5)a %(height)a', (R"'\xe1\xe9\xed \U0001d400' 1080", None))
test('%(ext)s-%(ext|def)d', 'mp4-def')
test('%(width|0)04d', '0000')
test('a%(width|)d', 'a', outtmpl_na_placeholder='none')
test('%(width|0)04d', '0')
test('a%(width|b)d', 'ab', outtmpl_na_placeholder='none')
FORMATS = self.outtmpl_info['formats']
sanitize = lambda x: x.replace(':', '').replace('"', "").replace('\n', ' ')
# Custom type casting
test('%(formats.:.id)l', 'id 1, id 2, id 3')
test('%(formats.:.id)#l', ('id 1\nid 2\nid 3', 'id 1 id 2 id 3'))
test('%(ext)l', 'mp4')
test('%(formats.:.id) 18l', ' id 1, id 2, id 3')
test('%(formats)j', (json.dumps(FORMATS), sanitize(json.dumps(FORMATS))))
test('%(formats)#j', (json.dumps(FORMATS, indent=4), sanitize(json.dumps(FORMATS, indent=4))))
test('%(formats)j', (json.dumps(FORMATS), None))
test('%(formats)#j', (
json.dumps(FORMATS, indent=4),
json.dumps(FORMATS, indent=4).replace(':', '').replace('"', "").replace('\n', ' ')
))
test('%(title5).3B', 'á')
test('%(title5)U', 'áéí 𝐀')
test('%(title5)#U', 'a\u0301e\u0301i\u0301 𝐀')
@ -780,9 +786,9 @@ class TestYoutubeDL(unittest.TestCase):
test('%(title4)#S', 'foo_bar_test')
test('%(title4).10S', ('foo bar ', 'foo bar' + ('#' if compat_os_name == 'nt' else ' ')))
if compat_os_name == 'nt':
test('%(title4)q', ('"foo \\"bar\\" test"', "foo bar test"))
test('%(formats.:.id)#q', ('"id 1" "id 2" "id 3"', 'id 1 id 2 id 3'))
test('%(formats.0.id)#q', ('"id 1"', 'id 1'))
test('%(title4)q', ('"foo ""bar"" test"', None))
test('%(formats.:.id)#q', ('"id 1" "id 2" "id 3"', None))
test('%(formats.0.id)#q', ('"id 1"', None))
else:
test('%(title4)q', ('\'foo "bar" test\'', '\'foo bar test\''))
test('%(formats.:.id)#q', "'id 1' 'id 2' 'id 3'")
@ -793,8 +799,9 @@ class TestYoutubeDL(unittest.TestCase):
test('%(title|%)s %(title|%%)s', '% %%')
test('%(id+1-height+3)05d', '00158')
test('%(width+100)05d', 'NA')
test('%(formats.0) 15s', ('% 15s' % FORMATS[0], '% 15s' % sanitize(str(FORMATS[0]))))
test('%(formats.0)r', (repr(FORMATS[0]), sanitize(repr(FORMATS[0]))))
test('%(filesize*8)d', '8192')
test('%(formats.0) 15s', ('% 15s' % FORMATS[0], None))
test('%(formats.0)r', (repr(FORMATS[0]), None))
test('%(height.0)03d', '001')
test('%(-height.0)04d', '-001')
test('%(formats.-1.id)s', FORMATS[-1]['id'])
@ -806,7 +813,7 @@ class TestYoutubeDL(unittest.TestCase):
out = json.dumps([{'id': f['id'], 'height.:2': str(f['height'])[:2]}
if 'height' in f else {'id': f['id']}
for f in FORMATS])
test('%(formats.:.{id,height.:2})j', (out, sanitize(out)))
test('%(formats.:.{id,height.:2})j', (out, None))
test('%(formats.:.{id,height}.id)l', ', '.join(f['id'] for f in FORMATS))
test('%(.{id,title})j', ('{"id": "1234"}', '{id 1234}'))
@ -822,6 +829,11 @@ class TestYoutubeDL(unittest.TestCase):
test('%(title&foo|baz)s.bar', 'baz.bar')
test('%(x,id&foo|baz)s.bar', 'foo.bar')
test('%(x,title&foo|baz)s.bar', 'baz.bar')
test('%(id&a\nb|)s', ('a\nb', 'a b'))
test('%(id&hi {:>10} {}|)s', 'hi 1234 1234')
test(R'%(id&{0} {}|)s', 'NA')
test(R'%(id&{0.1}|)s', 'NA')
test('%(height&{:,d})S', '1,080')
# Laziness
def gen():
@ -867,12 +879,12 @@ class TestYoutubeDL(unittest.TestCase):
class SimplePP(PostProcessor):
def run(self, info):
with open(audiofile, 'wt') as f:
with open(audiofile, 'w') as f:
f.write('EXAMPLE')
return [info['filepath']], info
def run_pp(params, PP):
with open(filename, 'wt') as f:
with open(filename, 'w') as f:
f.write('EXAMPLE')
ydl = YoutubeDL(params)
ydl.add_post_processor(PP())
@ -891,7 +903,7 @@ class TestYoutubeDL(unittest.TestCase):
class ModifierPP(PostProcessor):
def run(self, info):
with open(info['filepath'], 'wt') as f:
with open(info['filepath'], 'w') as f:
f.write('MODIFIED')
return [], info
@ -929,7 +941,7 @@ class TestYoutubeDL(unittest.TestCase):
def get_videos(filter_=None):
ydl = YDL({'match_filter': filter_, 'simulate': True})
for v in videos:
ydl.process_ie_result(v, download=True)
ydl.process_ie_result(v.copy(), download=True)
return [v['id'] for v in ydl.downloaded_info_dicts]
res = get_videos()
@ -1093,11 +1105,6 @@ class TestYoutubeDL(unittest.TestCase):
test_selection({'playlist_items': '-15::2'}, INDICES[1::2], True)
test_selection({'playlist_items': '-15::15'}, [], True)
def test_urlopen_no_file_protocol(self):
# see https://github.com/ytdl-org/youtube-dl/issues/8227
ydl = YDL()
self.assertRaises(urllib.error.URLError, ydl.urlopen, 'file:///etc/passwd')
def test_do_not_override_ie_key_in_url_transparent(self):
ydl = YDL()
@ -1211,6 +1218,129 @@ class TestYoutubeDL(unittest.TestCase):
self.assertEqual(downloaded['extractor'], 'Video')
self.assertEqual(downloaded['extractor_key'], 'Video')
def test_header_cookies(self):
from http.cookiejar import Cookie
ydl = FakeYDL()
ydl.report_warning = lambda *_, **__: None
def cookie(name, value, version=None, domain='', path='', secure=False, expires=None):
return Cookie(
version or 0, name, value, None, False,
domain, bool(domain), bool(domain), path, bool(path),
secure, expires, False, None, None, rest={})
_test_url = 'https://yt.dlp/test'
def test(encoded_cookies, cookies, *, headers=False, round_trip=None, error_re=None):
def _test():
ydl.cookiejar.clear()
ydl._load_cookies(encoded_cookies, autoscope=headers)
if headers:
ydl._apply_header_cookies(_test_url)
data = {'url': _test_url}
ydl._calc_headers(data)
self.assertCountEqual(
map(vars, ydl.cookiejar), map(vars, cookies),
'Extracted cookiejar.Cookie is not the same')
if not headers:
self.assertEqual(
data.get('cookies'), round_trip or encoded_cookies,
'Cookie is not the same as round trip')
ydl.__dict__['_YoutubeDL__header_cookies'] = []
with self.subTest(msg=encoded_cookies):
if not error_re:
_test()
return
with self.assertRaisesRegex(Exception, error_re):
_test()
test('test=value; Domain=.yt.dlp', [cookie('test', 'value', domain='.yt.dlp')])
test('test=value', [cookie('test', 'value')], error_re=r'Unscoped cookies are not allowed')
test('cookie1=value1; Domain=.yt.dlp; Path=/test; cookie2=value2; Domain=.yt.dlp; Path=/', [
cookie('cookie1', 'value1', domain='.yt.dlp', path='/test'),
cookie('cookie2', 'value2', domain='.yt.dlp', path='/')])
test('test=value; Domain=.yt.dlp; Path=/test; Secure; Expires=9999999999', [
cookie('test', 'value', domain='.yt.dlp', path='/test', secure=True, expires=9999999999)])
test('test="value; "; path=/test; domain=.yt.dlp', [
cookie('test', 'value; ', domain='.yt.dlp', path='/test')],
round_trip='test="value\\073 "; Domain=.yt.dlp; Path=/test')
test('name=; Domain=.yt.dlp', [cookie('name', '', domain='.yt.dlp')],
round_trip='name=""; Domain=.yt.dlp')
test('test=value', [cookie('test', 'value', domain='.yt.dlp')], headers=True)
test('cookie1=value; Domain=.yt.dlp; cookie2=value', [], headers=True, error_re=r'Invalid syntax')
ydl.deprecated_feature = ydl.report_error
test('test=value', [], headers=True, error_re=r'Passing cookies as a header is a potential security risk')
def test_infojson_cookies(self):
TEST_FILE = 'test_infojson_cookies.info.json'
TEST_URL = 'https://example.com/example.mp4'
COOKIES = 'a=b; Domain=.example.com; c=d; Domain=.example.com'
COOKIE_HEADER = {'Cookie': 'a=b; c=d'}
ydl = FakeYDL()
ydl.process_info = lambda x: ydl._write_info_json('test', x, TEST_FILE)
def make_info(info_header_cookies=False, fmts_header_cookies=False, cookies_field=False):
fmt = {'url': TEST_URL}
if fmts_header_cookies:
fmt['http_headers'] = COOKIE_HEADER
if cookies_field:
fmt['cookies'] = COOKIES
return _make_result([fmt], http_headers=COOKIE_HEADER if info_header_cookies else None)
def test(initial_info, note):
result = {}
result['processed'] = ydl.process_ie_result(initial_info)
self.assertTrue(ydl.cookiejar.get_cookies_for_url(TEST_URL),
msg=f'No cookies set in cookiejar after initial process when {note}')
ydl.cookiejar.clear()
with open(TEST_FILE) as infojson:
result['loaded'] = ydl.sanitize_info(json.load(infojson), True)
result['final'] = ydl.process_ie_result(result['loaded'].copy(), download=False)
self.assertTrue(ydl.cookiejar.get_cookies_for_url(TEST_URL),
msg=f'No cookies set in cookiejar after final process when {note}')
ydl.cookiejar.clear()
for key in ('processed', 'loaded', 'final'):
info = result[key]
self.assertIsNone(
traverse_obj(info, ((None, ('formats', 0)), 'http_headers', 'Cookie'), casesense=False, get_all=False),
msg=f'Cookie header not removed in {key} result when {note}')
self.assertEqual(
traverse_obj(info, ((None, ('formats', 0)), 'cookies'), get_all=False), COOKIES,
msg=f'No cookies field found in {key} result when {note}')
test({'url': TEST_URL, 'http_headers': COOKIE_HEADER, 'id': '1', 'title': 'x'}, 'no formats field')
test(make_info(info_header_cookies=True), 'info_dict header cookies')
test(make_info(fmts_header_cookies=True), 'format header cookies')
test(make_info(info_header_cookies=True, fmts_header_cookies=True), 'info_dict and format header cookies')
test(make_info(info_header_cookies=True, fmts_header_cookies=True, cookies_field=True), 'all cookies fields')
test(make_info(cookies_field=True), 'cookies format field')
test({'url': TEST_URL, 'cookies': COOKIES, 'id': '1', 'title': 'x'}, 'info_dict cookies field only')
try_rm(TEST_FILE)
def test_add_headers_cookie(self):
def check_for_cookie_header(result):
return traverse_obj(result, ((None, ('formats', 0)), 'http_headers', 'Cookie'), casesense=False, get_all=False)
ydl = FakeYDL({'http_headers': {'Cookie': 'a=b'}})
ydl._apply_header_cookies(_make_result([])['webpage_url']) # Scope to input webpage URL: .example.com
fmt = {'url': 'https://example.com/video.mp4'}
result = ydl.process_ie_result(_make_result([fmt]), download=False)
self.assertIsNone(check_for_cookie_header(result), msg='http_headers cookies in result info_dict')
self.assertEqual(result.get('cookies'), 'a=b; Domain=.example.com', msg='No cookies were set in cookies field')
self.assertIn('a=b', ydl.cookiejar.get_cookie_header(fmt['url']), msg='No cookies were set in cookiejar')
fmt = {'url': 'https://wrong.com/video.mp4'}
result = ydl.process_ie_result(_make_result([fmt]), download=False)
self.assertIsNone(check_for_cookie_header(result), msg='http_headers cookies for wrong domain')
self.assertFalse(result.get('cookies'), msg='Cookies set in cookies field for wrong domain')
self.assertFalse(ydl.cookiejar.get_cookie_header(fmt['url']), msg='Cookies set in cookiejar for wrong domain')
if __name__ == '__main__':
unittest.main()
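A minimal sketch of what the header-cookie scoping tested above amounts to, using only the stdlib (scope_header_cookie is a hypothetical helper, not yt-dlp API):

import http.cookiejar

def scope_header_cookie(name, value, domain, path='/'):
    # Build a cookie equivalent to 'name=value; Domain=<domain>; Path=<path>'
    return http.cookiejar.Cookie(
        version=0, name=name, value=value, port=None, port_specified=False,
        domain=domain, domain_specified=True,
        domain_initial_dot=domain.startswith('.'),
        path=path, path_specified=True, secure=False, expires=None,
        discard=True, comment=None, comment_url=None, rest={})

jar = http.cookiejar.CookieJar()
jar.set_cookie(scope_header_cookie('a', 'b', '.example.com'))
# The jar now matches requests to https://example.com/ but not https://wrong.com/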

View File

@ -11,16 +11,16 @@ sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import re
import tempfile
from yt_dlp.utils import YoutubeDLCookieJar
from yt_dlp.cookies import YoutubeDLCookieJar
class TestYoutubeDLCookieJar(unittest.TestCase):
def test_keep_session_cookies(self):
cookiejar = YoutubeDLCookieJar('./test/testdata/cookies/session_cookies.txt')
cookiejar.load(ignore_discard=True, ignore_expires=True)
cookiejar.load()
tf = tempfile.NamedTemporaryFile(delete=False)
try:
cookiejar.save(filename=tf.name, ignore_discard=True, ignore_expires=True)
cookiejar.save(filename=tf.name)
temp = tf.read().decode()
self.assertTrue(re.search(
r'www\.foobar\.foobar\s+FALSE\s+/\s+TRUE\s+0\s+YoutubeDLExpiresEmpty\s+YoutubeDLExpiresEmptyValue', temp))
@ -32,7 +32,7 @@ class TestYoutubeDLCookieJar(unittest.TestCase):
def test_strip_httponly_prefix(self):
cookiejar = YoutubeDLCookieJar('./test/testdata/cookies/httponly_cookies.txt')
cookiejar.load(ignore_discard=True, ignore_expires=True)
cookiejar.load()
def assert_cookie_has_value(key):
self.assertEqual(cookiejar._cookies['www.foobar.foobar']['/'][key].value, key + '_VALUE')
@ -42,11 +42,25 @@ class TestYoutubeDLCookieJar(unittest.TestCase):
def test_malformed_cookies(self):
cookiejar = YoutubeDLCookieJar('./test/testdata/cookies/malformed_cookies.txt')
cookiejar.load(ignore_discard=True, ignore_expires=True)
cookiejar.load()
# Cookies should be empty since all malformed cookie file entries
# will be ignored
self.assertFalse(cookiejar._cookies)
def test_get_cookie_header(self):
cookiejar = YoutubeDLCookieJar('./test/testdata/cookies/httponly_cookies.txt')
cookiejar.load()
header = cookiejar.get_cookie_header('https://www.foobar.foobar')
self.assertIn('HTTPONLY_COOKIE', header)
def test_get_cookies_for_url(self):
cookiejar = YoutubeDLCookieJar('./test/testdata/cookies/session_cookies.txt')
cookiejar.load()
cookies = cookiejar.get_cookies_for_url('https://www.foobar.foobar/')
self.assertEqual(len(cookies), 2)
cookies = cookiejar.get_cookies_for_url('https://foobar.foobar/')
self.assertFalse(cookies)
if __name__ == '__main__':
unittest.main()
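For reference, the fixture files loaded above use the Netscape cookie file format: one tab-separated line per cookie, in the order domain, include-subdomains flag, path, secure flag, expiry, name, value. The session-cookie regex earlier in this file corresponds to a fixture line like:

www.foobar.foobar	FALSE	/	TRUE	0	YoutubeDLExpiresEmpty	YoutubeDLExpiresEmptyValue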

View File

@ -9,15 +9,16 @@ sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import struct
import urllib.parse
from yt_dlp import compat
from yt_dlp.compat import urllib # isort: split
from yt_dlp.compat import (
compat_etree_fromstring,
compat_expanduser,
compat_urllib_parse_unquote,
compat_urllib_parse_urlencode,
)
from yt_dlp.compat.urllib.request import getproxies
class TestCompat(unittest.TestCase):
@ -28,8 +29,7 @@ class TestCompat(unittest.TestCase):
with self.assertWarns(DeprecationWarning):
compat.WINDOWS_VT_MODE
# TODO: Test submodule
# compat.asyncio.events # Must not raise error
self.assertEqual(urllib.request.getproxies, getproxies)
with self.assertWarns(DeprecationWarning):
compat.compat_pycrypto_AES # Must not raise error

View File

@ -1,5 +1,5 @@
import datetime as dt
import unittest
from datetime import datetime, timezone
from yt_dlp import cookies
from yt_dlp.cookies import (
@ -49,32 +49,38 @@ class TestCookies(unittest.TestCase):
""" based on https://chromium.googlesource.com/chromium/src/+/refs/heads/main/base/nix/xdg_util_unittest.cc """
test_cases = [
({}, _LinuxDesktopEnvironment.OTHER),
({'DESKTOP_SESSION': 'my_custom_de'}, _LinuxDesktopEnvironment.OTHER),
({'XDG_CURRENT_DESKTOP': 'my_custom_de'}, _LinuxDesktopEnvironment.OTHER),
({'DESKTOP_SESSION': 'gnome'}, _LinuxDesktopEnvironment.GNOME),
({'DESKTOP_SESSION': 'mate'}, _LinuxDesktopEnvironment.GNOME),
({'DESKTOP_SESSION': 'kde4'}, _LinuxDesktopEnvironment.KDE),
({'DESKTOP_SESSION': 'kde'}, _LinuxDesktopEnvironment.KDE),
({'DESKTOP_SESSION': 'kde4'}, _LinuxDesktopEnvironment.KDE4),
({'DESKTOP_SESSION': 'kde'}, _LinuxDesktopEnvironment.KDE3),
({'DESKTOP_SESSION': 'xfce'}, _LinuxDesktopEnvironment.XFCE),
({'GNOME_DESKTOP_SESSION_ID': 1}, _LinuxDesktopEnvironment.GNOME),
({'KDE_FULL_SESSION': 1}, _LinuxDesktopEnvironment.KDE),
({'KDE_FULL_SESSION': 1}, _LinuxDesktopEnvironment.KDE3),
({'KDE_FULL_SESSION': 1, 'DESKTOP_SESSION': 'kde4'}, _LinuxDesktopEnvironment.KDE4),
({'XDG_CURRENT_DESKTOP': 'X-Cinnamon'}, _LinuxDesktopEnvironment.CINNAMON),
({'XDG_CURRENT_DESKTOP': 'Deepin'}, _LinuxDesktopEnvironment.DEEPIN),
({'XDG_CURRENT_DESKTOP': 'GNOME'}, _LinuxDesktopEnvironment.GNOME),
({'XDG_CURRENT_DESKTOP': 'GNOME:GNOME-Classic'}, _LinuxDesktopEnvironment.GNOME),
({'XDG_CURRENT_DESKTOP': 'GNOME : GNOME-Classic'}, _LinuxDesktopEnvironment.GNOME),
({'XDG_CURRENT_DESKTOP': 'Unity', 'DESKTOP_SESSION': 'gnome-fallback'}, _LinuxDesktopEnvironment.GNOME),
({'XDG_CURRENT_DESKTOP': 'KDE', 'KDE_SESSION_VERSION': '5'}, _LinuxDesktopEnvironment.KDE),
({'XDG_CURRENT_DESKTOP': 'KDE'}, _LinuxDesktopEnvironment.KDE),
({'XDG_CURRENT_DESKTOP': 'KDE', 'KDE_SESSION_VERSION': '5'}, _LinuxDesktopEnvironment.KDE5),
({'XDG_CURRENT_DESKTOP': 'KDE', 'KDE_SESSION_VERSION': '6'}, _LinuxDesktopEnvironment.KDE6),
({'XDG_CURRENT_DESKTOP': 'KDE'}, _LinuxDesktopEnvironment.KDE4),
({'XDG_CURRENT_DESKTOP': 'Pantheon'}, _LinuxDesktopEnvironment.PANTHEON),
({'XDG_CURRENT_DESKTOP': 'UKUI'}, _LinuxDesktopEnvironment.UKUI),
({'XDG_CURRENT_DESKTOP': 'Unity'}, _LinuxDesktopEnvironment.UNITY),
({'XDG_CURRENT_DESKTOP': 'Unity:Unity7'}, _LinuxDesktopEnvironment.UNITY),
({'XDG_CURRENT_DESKTOP': 'Unity:Unity8'}, _LinuxDesktopEnvironment.UNITY),
]
for env, expected_desktop_environment in test_cases:
self.assertEqual(_get_linux_desktop_environment(env), expected_desktop_environment)
self.assertEqual(_get_linux_desktop_environment(env, Logger()), expected_desktop_environment)
def test_chrome_cookie_decryptor_linux_derive_key(self):
key = LinuxChromeCookieDecryptor.derive_key(b'abc')
@ -132,7 +138,7 @@ class TestCookies(unittest.TestCase):
self.assertEqual(cookie.name, 'foo')
self.assertEqual(cookie.value, 'test%20%3Bcookie')
self.assertFalse(cookie.secure)
expected_expiration = datetime(2021, 6, 18, 21, 39, 19, tzinfo=timezone.utc)
expected_expiration = dt.datetime(2021, 6, 18, 21, 39, 19, tzinfo=dt.timezone.utc)
self.assertEqual(cookie.expires, int(expected_expiration.timestamp()))
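# Worked check of the expiry above (stdlib only): the UTC datetime maps to the
# Unix timestamp stored in cookie.expires.
# >>> import datetime as dt
# >>> int(dt.datetime(2021, 6, 18, 21, 39, 19, tzinfo=dt.timezone.utc).timestamp())
# 1624052359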
def test_pbkdf2_sha1(self):

View File

@ -10,10 +10,7 @@ sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import collections
import hashlib
import http.client
import json
import socket
import urllib.error
from test.helper import (
assertGreaterEqual,
@ -29,10 +26,12 @@ from test.helper import (
import yt_dlp.YoutubeDL # isort: split
from yt_dlp.extractor import get_info_extractor
from yt_dlp.networking.exceptions import HTTPError, TransportError
from yt_dlp.utils import (
DownloadError,
ExtractorError,
UnavailableVideoError,
YoutubeDLError,
format_bytes,
join_nonempty,
)
@ -102,6 +101,8 @@ def generator(test_case, tname):
print_skipping('IE marked as not _WORKING')
for tc in test_cases:
if tc.get('expected_exception'):
continue
info_dict = tc.get('info_dict', {})
params = tc.get('params', {})
if not info_dict.get('id'):
@ -141,6 +142,17 @@ def generator(test_case, tname):
res_dict = None
def match_exception(err):
expected_exception = test_case.get('expected_exception')
if not expected_exception:
return False
if err.__class__.__name__ == expected_exception:
return True
for exc in err.exc_info:
if exc.__class__.__name__ == expected_exception:
return True
return False
def try_rm_tcs_files(tcs=None):
if tcs is None:
tcs = test_cases
@ -162,8 +174,9 @@ def generator(test_case, tname):
force_generic_extractor=params.get('force_generic_extractor', False))
except (DownloadError, ExtractorError) as err:
# Check if the exception is not a network related one
if (err.exc_info[0] not in (urllib.error.URLError, socket.timeout, UnavailableVideoError, http.client.BadStatusLine)
or (err.exc_info[0] == urllib.error.HTTPError and err.exc_info[1].code == 503)):
if not isinstance(err.exc_info[1], (TransportError, UnavailableVideoError)) or (isinstance(err.exc_info[1], HTTPError) and err.exc_info[1].status == 503):
if match_exception(err):
return
err.msg = f'{getattr(err, "msg", err)} ({tname})'
raise
@ -174,6 +187,10 @@ def generator(test_case, tname):
print(f'Retrying: {try_num} failed tries\n\n##########\n\n')
try_num += 1
except YoutubeDLError as err:
if match_exception(err):
return
raise
else:
break
@ -249,7 +266,7 @@ def generator(test_case, tname):
# extractor returns full results even with extract_flat
res_tcs = [{'info_dict': e} for e in res_dict['entries']]
try_rm_tcs_files(res_tcs)
ydl.close()
return test_template

View File

@ -0,0 +1,139 @@
#!/usr/bin/env python3
# Allow direct execution
import os
import sys
import unittest
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import http.cookiejar
from test.helper import FakeYDL
from yt_dlp.downloader.external import (
Aria2cFD,
AxelFD,
CurlFD,
FFmpegFD,
HttpieFD,
WgetFD,
)
TEST_COOKIE = {
'version': 0,
'name': 'test',
'value': 'ytdlp',
'port': None,
'port_specified': False,
'domain': '.example.com',
'domain_specified': True,
'domain_initial_dot': False,
'path': '/',
'path_specified': True,
'secure': False,
'expires': None,
'discard': False,
'comment': None,
'comment_url': None,
'rest': {},
}
TEST_INFO = {'url': 'http://www.example.com/'}
class TestHttpieFD(unittest.TestCase):
def test_make_cmd(self):
with FakeYDL() as ydl:
downloader = HttpieFD(ydl, {})
self.assertEqual(
downloader._make_cmd('test', TEST_INFO),
['http', '--download', '--output', 'test', 'http://www.example.com/'])
# Test cookie header is added
ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
self.assertEqual(
downloader._make_cmd('test', TEST_INFO),
['http', '--download', '--output', 'test', 'http://www.example.com/', 'Cookie:test=ytdlp'])
class TestAxelFD(unittest.TestCase):
def test_make_cmd(self):
with FakeYDL() as ydl:
downloader = AxelFD(ydl, {})
self.assertEqual(
downloader._make_cmd('test', TEST_INFO),
['axel', '-o', 'test', '--', 'http://www.example.com/'])
# Test cookie header is added
ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
self.assertEqual(
downloader._make_cmd('test', TEST_INFO),
['axel', '-o', 'test', '-H', 'Cookie: test=ytdlp', '--max-redirect=0', '--', 'http://www.example.com/'])
class TestWgetFD(unittest.TestCase):
def test_make_cmd(self):
with FakeYDL() as ydl:
downloader = WgetFD(ydl, {})
self.assertNotIn('--load-cookies', downloader._make_cmd('test', TEST_INFO))
# Test cookiejar tempfile arg is added
ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
self.assertIn('--load-cookies', downloader._make_cmd('test', TEST_INFO))
class TestCurlFD(unittest.TestCase):
def test_make_cmd(self):
with FakeYDL() as ydl:
downloader = CurlFD(ydl, {})
self.assertNotIn('--cookie', downloader._make_cmd('test', TEST_INFO))
# Test cookie header is added
ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
self.assertIn('--cookie', downloader._make_cmd('test', TEST_INFO))
self.assertIn('test=ytdlp', downloader._make_cmd('test', TEST_INFO))
class TestAria2cFD(unittest.TestCase):
def test_make_cmd(self):
with FakeYDL() as ydl:
downloader = Aria2cFD(ydl, {})
downloader._make_cmd('test', TEST_INFO)
self.assertFalse(hasattr(downloader, '_cookies_tempfile'))
# Test cookiejar tempfile arg is added
ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
cmd = downloader._make_cmd('test', TEST_INFO)
self.assertIn(f'--load-cookies={downloader._cookies_tempfile}', cmd)
@unittest.skipUnless(FFmpegFD.available(), 'ffmpeg not found')
class TestFFmpegFD(unittest.TestCase):
_args = []
def _test_cmd(self, args):
self._args = args
def test_make_cmd(self):
with FakeYDL() as ydl:
downloader = FFmpegFD(ydl, {})
downloader._debug_cmd = self._test_cmd
downloader._call_downloader('test', {**TEST_INFO, 'ext': 'mp4'})
self.assertEqual(self._args, [
'ffmpeg', '-y', '-hide_banner', '-i', 'http://www.example.com/',
'-c', 'copy', '-f', 'mp4', 'file:test'])
# Test cookies arg is added
ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
downloader._call_downloader('test', {**TEST_INFO, 'ext': 'mp4'})
self.assertEqual(self._args, [
'ffmpeg', '-y', '-hide_banner', '-cookies', 'test=ytdlp; path=/; domain=.example.com;\r\n',
'-i', 'http://www.example.com/', '-c', 'copy', '-f', 'mp4', 'file:test'])
# Test with non-url input (ffmpeg reads from stdin '-' for websockets)
downloader._call_downloader('test', {'url': 'x', 'ext': 'mp4'})
self.assertEqual(self._args, [
'ffmpeg', '-y', '-hide_banner', '-i', 'x', '-c', 'copy', '-f', 'mp4', 'file:test'])
if __name__ == '__main__':
unittest.main()
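As a rough sketch of what these _make_cmd tests exercise (curl_cmd below is illustrative, not yt-dlp's implementation), serializing jar cookies into an external-downloader command line might look like:

import http.cookiejar

def curl_cmd(url, jar, output='out.bin'):
    # curl's --cookie flag accepts 'NAME1=VALUE1; NAME2=VALUE2' data directly
    header = '; '.join(f'{c.name}={c.value}' for c in jar)
    cmd = ['curl', '--location', '--output', output, url]
    if header:
        cmd += ['--cookie', header]
    return cmd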

View File

@ -16,6 +16,7 @@ from test.helper import http_server_port, try_rm
from yt_dlp import YoutubeDL
from yt_dlp.downloader.http import HttpFD
from yt_dlp.utils import encodeFilename
from yt_dlp.utils._utils import _YDLLogger as FakeLogger
TEST_DIR = os.path.dirname(os.path.abspath(__file__))
@ -67,17 +68,6 @@ class HTTPTestRequestHandler(http.server.BaseHTTPRequestHandler):
assert False
class FakeLogger:
def debug(self, msg):
pass
def warning(self, msg):
pass
def error(self, msg):
pass
class TestHttpFD(unittest.TestCase):
def setUp(self):
self.httpd = http.server.HTTPServer(

View File

@ -45,6 +45,9 @@ class TestExecution(unittest.TestCase):
self.assertTrue(os.path.exists(LAZY_EXTRACTORS))
_, stderr = self.run_yt_dlp(opts=('-s', 'test:'))
# `MIN_RECOMMENDED` emits a deprecated feature warning for deprecated Python versions
if stderr and stderr.startswith('Deprecated Feature: Support for Python'):
stderr = ''
self.assertFalse(stderr)
subprocess.check_call([sys.executable, 'test/test_all_urls.py'], cwd=rootDir, stdout=subprocess.DEVNULL)

View File

@ -1,192 +0,0 @@
#!/usr/bin/env python3
# Allow direct execution
import os
import sys
import unittest
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import http.server
import ssl
import threading
import urllib.request
from test.helper import http_server_port
from yt_dlp import YoutubeDL
TEST_DIR = os.path.dirname(os.path.abspath(__file__))
class HTTPTestRequestHandler(http.server.BaseHTTPRequestHandler):
def log_message(self, format, *args):
pass
def do_GET(self):
if self.path == '/video.html':
self.send_response(200)
self.send_header('Content-Type', 'text/html; charset=utf-8')
self.end_headers()
self.wfile.write(b'<html><video src="/vid.mp4" /></html>')
elif self.path == '/vid.mp4':
self.send_response(200)
self.send_header('Content-Type', 'video/mp4')
self.end_headers()
self.wfile.write(b'\x00\x00\x00\x00\x20\x66\x74[video]')
elif self.path == '/%E4%B8%AD%E6%96%87.html':
self.send_response(200)
self.send_header('Content-Type', 'text/html; charset=utf-8')
self.end_headers()
self.wfile.write(b'<html><video src="/vid.mp4" /></html>')
else:
assert False
class FakeLogger:
def debug(self, msg):
pass
def warning(self, msg):
pass
def error(self, msg):
pass
class TestHTTP(unittest.TestCase):
def setUp(self):
self.httpd = http.server.HTTPServer(
('127.0.0.1', 0), HTTPTestRequestHandler)
self.port = http_server_port(self.httpd)
self.server_thread = threading.Thread(target=self.httpd.serve_forever)
self.server_thread.daemon = True
self.server_thread.start()
class TestHTTPS(unittest.TestCase):
def setUp(self):
certfn = os.path.join(TEST_DIR, 'testcert.pem')
self.httpd = http.server.HTTPServer(
('127.0.0.1', 0), HTTPTestRequestHandler)
sslctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
sslctx.load_cert_chain(certfn, None)
self.httpd.socket = sslctx.wrap_socket(self.httpd.socket, server_side=True)
self.port = http_server_port(self.httpd)
self.server_thread = threading.Thread(target=self.httpd.serve_forever)
self.server_thread.daemon = True
self.server_thread.start()
def test_nocheckcertificate(self):
ydl = YoutubeDL({'logger': FakeLogger()})
self.assertRaises(
Exception,
ydl.extract_info, 'https://127.0.0.1:%d/video.html' % self.port)
ydl = YoutubeDL({'logger': FakeLogger(), 'nocheckcertificate': True})
r = ydl.extract_info('https://127.0.0.1:%d/video.html' % self.port)
self.assertEqual(r['url'], 'https://127.0.0.1:%d/vid.mp4' % self.port)
class TestClientCert(unittest.TestCase):
def setUp(self):
certfn = os.path.join(TEST_DIR, 'testcert.pem')
self.certdir = os.path.join(TEST_DIR, 'testdata', 'certificate')
cacertfn = os.path.join(self.certdir, 'ca.crt')
self.httpd = http.server.HTTPServer(('127.0.0.1', 0), HTTPTestRequestHandler)
sslctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
sslctx.verify_mode = ssl.CERT_REQUIRED
sslctx.load_verify_locations(cafile=cacertfn)
sslctx.load_cert_chain(certfn, None)
self.httpd.socket = sslctx.wrap_socket(self.httpd.socket, server_side=True)
self.port = http_server_port(self.httpd)
self.server_thread = threading.Thread(target=self.httpd.serve_forever)
self.server_thread.daemon = True
self.server_thread.start()
def _run_test(self, **params):
ydl = YoutubeDL({
'logger': FakeLogger(),
# Disable client-side validation of unacceptable self-signed testcert.pem
# The test is of a check on the server side, so unaffected
'nocheckcertificate': True,
**params,
})
r = ydl.extract_info('https://127.0.0.1:%d/video.html' % self.port)
self.assertEqual(r['url'], 'https://127.0.0.1:%d/vid.mp4' % self.port)
def test_certificate_combined_nopass(self):
self._run_test(client_certificate=os.path.join(self.certdir, 'clientwithkey.crt'))
def test_certificate_nocombined_nopass(self):
self._run_test(client_certificate=os.path.join(self.certdir, 'client.crt'),
client_certificate_key=os.path.join(self.certdir, 'client.key'))
def test_certificate_combined_pass(self):
self._run_test(client_certificate=os.path.join(self.certdir, 'clientwithencryptedkey.crt'),
client_certificate_password='foobar')
def test_certificate_nocombined_pass(self):
self._run_test(client_certificate=os.path.join(self.certdir, 'client.crt'),
client_certificate_key=os.path.join(self.certdir, 'clientencrypted.key'),
client_certificate_password='foobar')
def _build_proxy_handler(name):
class HTTPTestRequestHandler(http.server.BaseHTTPRequestHandler):
proxy_name = name
def log_message(self, format, *args):
pass
def do_GET(self):
self.send_response(200)
self.send_header('Content-Type', 'text/plain; charset=utf-8')
self.end_headers()
self.wfile.write(f'{self.proxy_name}: {self.path}'.encode())
return HTTPTestRequestHandler
class TestProxy(unittest.TestCase):
def setUp(self):
self.proxy = http.server.HTTPServer(
('127.0.0.1', 0), _build_proxy_handler('normal'))
self.port = http_server_port(self.proxy)
self.proxy_thread = threading.Thread(target=self.proxy.serve_forever)
self.proxy_thread.daemon = True
self.proxy_thread.start()
self.geo_proxy = http.server.HTTPServer(
('127.0.0.1', 0), _build_proxy_handler('geo'))
self.geo_port = http_server_port(self.geo_proxy)
self.geo_proxy_thread = threading.Thread(target=self.geo_proxy.serve_forever)
self.geo_proxy_thread.daemon = True
self.geo_proxy_thread.start()
def test_proxy(self):
geo_proxy = f'127.0.0.1:{self.geo_port}'
ydl = YoutubeDL({
'proxy': f'127.0.0.1:{self.port}',
'geo_verification_proxy': geo_proxy,
})
url = 'http://foo.com/bar'
response = ydl.urlopen(url).read().decode()
self.assertEqual(response, f'normal: {url}')
req = urllib.request.Request(url)
req.add_header('Ytdl-request-proxy', geo_proxy)
response = ydl.urlopen(req).read().decode()
self.assertEqual(response, f'geo: {url}')
def test_proxy_with_idn(self):
ydl = YoutubeDL({
'proxy': f'127.0.0.1:{self.port}',
})
url = 'http://中文.tw/'
response = ydl.urlopen(url).read().decode()
# b'xn--fiq228c' is '中文'.encode('idna')
self.assertEqual(response, 'normal: http://xn--fiq228c.tw/')
if __name__ == '__main__':
unittest.main()
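The IDN assertion in the removed test relied on Python's built-in 'idna' codec, which applies the same punycode mapping:

assert '中文'.encode('idna') == b'xn--fiq228c'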

View File

@ -8,442 +8,372 @@ import unittest
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import math
import re
from yt_dlp.jsinterp import JS_Undefined, JSInterpreter
class NaN:
pass
class TestJSInterpreter(unittest.TestCase):
def _test(self, jsi_or_code, expected, func='f', args=()):
if isinstance(jsi_or_code, str):
jsi_or_code = JSInterpreter(jsi_or_code)
got = jsi_or_code.call_function(func, *args)
if expected is NaN:
self.assertTrue(math.isnan(got), f'{got} is not NaN')
else:
self.assertEqual(got, expected)
def test_basic(self):
jsi = JSInterpreter('function x(){;}')
self.assertEqual(jsi.call_function('x'), None)
jsi = JSInterpreter('function f(){;}')
self.assertEqual(repr(jsi.extract_function('f')), 'F<f>')
self._test(jsi, None)
jsi = JSInterpreter('function x3(){return 42;}')
self.assertEqual(jsi.call_function('x3'), 42)
self._test('function f(){return 42;}', 42)
self._test('function f(){42}', None)
self._test('var f = function(){return 42;}', 42)
jsi = JSInterpreter('function x3(){42}')
self.assertEqual(jsi.call_function('x3'), None)
def test_add(self):
self._test('function f(){return 42 + 7;}', 49)
self._test('function f(){return 42 + undefined;}', NaN)
self._test('function f(){return 42 + null;}', 42)
jsi = JSInterpreter('var x5 = function(){return 42;}')
self.assertEqual(jsi.call_function('x5'), 42)
def test_sub(self):
self._test('function f(){return 42 - 7;}', 35)
self._test('function f(){return 42 - undefined;}', NaN)
self._test('function f(){return 42 - null;}', 42)
def test_mul(self):
self._test('function f(){return 42 * 7;}', 294)
self._test('function f(){return 42 * undefined;}', NaN)
self._test('function f(){return 42 * null;}', 0)
def test_div(self):
jsi = JSInterpreter('function f(a, b){return a / b;}')
self._test(jsi, NaN, args=(0, 0))
self._test(jsi, NaN, args=(JS_Undefined, 1))
self._test(jsi, float('inf'), args=(2, 0))
self._test(jsi, 0, args=(0, 3))
def test_mod(self):
self._test('function f(){return 42 % 7;}', 0)
self._test('function f(){return 42 % 0;}', NaN)
self._test('function f(){return 42 % undefined;}', NaN)
def test_exp(self):
self._test('function f(){return 42 ** 2;}', 1764)
self._test('function f(){return 42 ** undefined;}', NaN)
self._test('function f(){return 42 ** null;}', 1)
self._test('function f(){return undefined ** 42;}', NaN)
def test_calc(self):
jsi = JSInterpreter('function x4(a){return 2*a+1;}')
self.assertEqual(jsi.call_function('x4', 3), 7)
self._test('function f(a){return 2*a+1;}', 7, args=[3])
def test_empty_return(self):
jsi = JSInterpreter('function f(){return; y()}')
self.assertEqual(jsi.call_function('f'), None)
self._test('function f(){return; y()}', None)
def test_morespace(self):
jsi = JSInterpreter('function x (a) { return 2 * a + 1 ; }')
self.assertEqual(jsi.call_function('x', 3), 7)
jsi = JSInterpreter('function f () { x = 2 ; return x; }')
self.assertEqual(jsi.call_function('f'), 2)
self._test('function f (a) { return 2 * a + 1 ; }', 7, args=[3])
self._test('function f () { x = 2 ; return x; }', 2)
def test_strange_chars(self):
jsi = JSInterpreter('function $_xY1 ($_axY1) { var $_axY2 = $_axY1 + 1; return $_axY2; }')
self.assertEqual(jsi.call_function('$_xY1', 20), 21)
self._test('function $_xY1 ($_axY1) { var $_axY2 = $_axY1 + 1; return $_axY2; }',
21, args=[20], func='$_xY1')
def test_operators(self):
jsi = JSInterpreter('function f(){return 1 << 5;}')
self.assertEqual(jsi.call_function('f'), 32)
jsi = JSInterpreter('function f(){return 2 ** 5}')
self.assertEqual(jsi.call_function('f'), 32)
jsi = JSInterpreter('function f(){return 19 & 21;}')
self.assertEqual(jsi.call_function('f'), 17)
jsi = JSInterpreter('function f(){return 11 >> 2;}')
self.assertEqual(jsi.call_function('f'), 2)
jsi = JSInterpreter('function f(){return []? 2+3: 4;}')
self.assertEqual(jsi.call_function('f'), 5)
jsi = JSInterpreter('function f(){return 1 == 2}')
self.assertEqual(jsi.call_function('f'), False)
jsi = JSInterpreter('function f(){return 0 && 1 || 2;}')
self.assertEqual(jsi.call_function('f'), 2)
jsi = JSInterpreter('function f(){return 0 ?? 42;}')
self.assertEqual(jsi.call_function('f'), 0)
jsi = JSInterpreter('function f(){return "life, the universe and everything" < 42;}')
self.assertFalse(jsi.call_function('f'))
self._test('function f(){return 1 << 5;}', 32)
self._test('function f(){return 2 ** 5}', 32)
self._test('function f(){return 19 & 21;}', 17)
self._test('function f(){return 11 >> 2;}', 2)
self._test('function f(){return []? 2+3: 4;}', 5)
self._test('function f(){return 1 == 2}', False)
self._test('function f(){return 0 && 1 || 2;}', 2)
self._test('function f(){return 0 ?? 42;}', 0)
self._test('function f(){return "life, the universe and everything" < 42;}', False)
def test_array_access(self):
jsi = JSInterpreter('function f(){var x = [1,2,3]; x[0] = 4; x[0] = 5; x[2.0] = 7; return x;}')
self.assertEqual(jsi.call_function('f'), [5, 2, 7])
self._test('function f(){var x = [1,2,3]; x[0] = 4; x[0] = 5; x[2.0] = 7; return x;}', [5, 2, 7])
def test_parens(self):
jsi = JSInterpreter('function f(){return (1) + (2) * ((( (( (((((3)))))) )) ));}')
self.assertEqual(jsi.call_function('f'), 7)
jsi = JSInterpreter('function f(){return (1 + 2) * 3;}')
self.assertEqual(jsi.call_function('f'), 9)
self._test('function f(){return (1) + (2) * ((( (( (((((3)))))) )) ));}', 7)
self._test('function f(){return (1 + 2) * 3;}', 9)
def test_quotes(self):
jsi = JSInterpreter(R'function f(){return "a\"\\("}')
self.assertEqual(jsi.call_function('f'), R'a"\(')
self._test(R'function f(){return "a\"\\("}', R'a"\(')
def test_assignments(self):
jsi = JSInterpreter('function f(){var x = 20; x = 30 + 1; return x;}')
self.assertEqual(jsi.call_function('f'), 31)
jsi = JSInterpreter('function f(){var x = 20; x += 30 + 1; return x;}')
self.assertEqual(jsi.call_function('f'), 51)
jsi = JSInterpreter('function f(){var x = 20; x -= 30 + 1; return x;}')
self.assertEqual(jsi.call_function('f'), -11)
self._test('function f(){var x = 20; x = 30 + 1; return x;}', 31)
self._test('function f(){var x = 20; x += 30 + 1; return x;}', 51)
self._test('function f(){var x = 20; x -= 30 + 1; return x;}', -11)
@unittest.skip('Not implemented')
def test_comments(self):
'Skipping: Not yet fully implemented'
return
jsi = JSInterpreter('''
function x() {
var x = /* 1 + */ 2;
var y = /* 30
* 40 */ 50;
return x + y;
}
''')
self.assertEqual(jsi.call_function('x'), 52)
self._test('''
function f() {
var x = /* 1 + */ 2;
var y = /* 30
* 40 */ 50;
return x + y;
}
''', 52)
jsi = JSInterpreter('''
function f() {
var x = "/*";
var y = 1 /* comment */ + 2;
return y;
}
''')
self.assertEqual(jsi.call_function('f'), 3)
self._test('''
function f() {
var x = "/*";
var y = 1 /* comment */ + 2;
return y;
}
''', 3)
def test_precedence(self):
jsi = JSInterpreter('''
function x() {
var a = [10, 20, 30, 40, 50];
var b = 6;
a[0]=a[b%a.length];
return a;
}''')
self.assertEqual(jsi.call_function('x'), [20, 20, 30, 40, 50])
self._test('''
function f() {
var a = [10, 20, 30, 40, 50];
var b = 6;
a[0]=a[b%a.length];
return a;
}
''', [20, 20, 30, 40, 50])
def test_builtins(self):
jsi = JSInterpreter('''
function x() { return NaN }
''')
self.assertTrue(math.isnan(jsi.call_function('x')))
self._test('function f() { return NaN }', NaN)
jsi = JSInterpreter('''
function x() { return new Date('Wednesday 31 December 1969 18:01:26 MDT') - 0; }
''')
self.assertEqual(jsi.call_function('x'), 86000)
jsi = JSInterpreter('''
function x(dt) { return new Date(dt) - 0; }
''')
self.assertEqual(jsi.call_function('x', 'Wednesday 31 December 1969 18:01:26 MDT'), 86000)
def test_date(self):
self._test('function f() { return new Date("Wednesday 31 December 1969 18:01:26 MDT") - 0; }', 86000)
jsi = JSInterpreter('function f(dt) { return new Date(dt) - 0; }')
self._test(jsi, 86000, args=['Wednesday 31 December 1969 18:01:26 MDT'])
self._test(jsi, 86000, args=['12/31/1969 18:01:26 MDT']) # m/d/y
self._test(jsi, 0, args=['1 January 1970 00:00:00 UTC'])
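# Worked check: MDT is UTC-6, so 1969-12-31 18:01:26 MDT is 1970-01-01 00:01:26 UTC,
# i.e. 86 seconds after the epoch; JS Date subtraction yields milliseconds, hence 86000.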
def test_call(self):
jsi = JSInterpreter('''
function x() { return 2; }
function y(a) { return x() + (a?a:0); }
function z() { return y(3); }
function x() { return 2; }
function y(a) { return x() + (a?a:0); }
function z() { return y(3); }
''')
self.assertEqual(jsi.call_function('z'), 5)
self.assertEqual(jsi.call_function('y'), 2)
self._test(jsi, 5, func='z')
self._test(jsi, 2, func='y')
def test_if(self):
jsi = JSInterpreter('''
function x() {
let a = 9;
if (0==0) {a++}
return a
}''')
self.assertEqual(jsi.call_function('x'), 10)
self._test('''
function f() {
let a = 9;
if (0==0) {a++}
return a
}
''', 10)
jsi = JSInterpreter('''
function x() {
if (0==0) {return 10}
}''')
self.assertEqual(jsi.call_function('x'), 10)
self._test('''
function f() {
if (0==0) {return 10}
}
''', 10)
jsi = JSInterpreter('''
function x() {
if (0!=0) {return 1}
else {return 10}
}''')
self.assertEqual(jsi.call_function('x'), 10)
self._test('''
function f() {
if (0!=0) {return 1}
else {return 10}
}
''', 10)
""" # Unsupported
jsi = JSInterpreter('''
function x() {
if (0!=0) {return 1}
else if (1==0) {return 2}
else {return 10}
}''')
self.assertEqual(jsi.call_function('x'), 10)
self._test('''
function f() {
if (0!=0) {return 1}
else if (1==0) {return 2}
else {return 10}
}
''', 10)
"""
def test_for_loop(self):
jsi = JSInterpreter('''
function x() { a=0; for (i=0; i-10; i++) {a++} return a }
''')
self.assertEqual(jsi.call_function('x'), 10)
self._test('function f() { a=0; for (i=0; i-10; i++) {a++} return a }', 10)
def test_switch(self):
jsi = JSInterpreter('''
function x(f) { switch(f){
case 1:f+=1;
case 2:f+=2;
case 3:f+=3;break;
case 4:f+=4;
default:f=0;
} return f }
function f(x) { switch(x){
case 1:x+=1;
case 2:x+=2;
case 3:x+=3;break;
case 4:x+=4;
default:x=0;
} return x }
''')
self.assertEqual(jsi.call_function('x', 1), 7)
self.assertEqual(jsi.call_function('x', 3), 6)
self.assertEqual(jsi.call_function('x', 5), 0)
self._test(jsi, 7, args=[1])
self._test(jsi, 6, args=[3])
self._test(jsi, 0, args=[5])
def test_switch_default(self):
jsi = JSInterpreter('''
function x(f) { switch(f){
case 2: f+=2;
default: f-=1;
case 5:
case 6: f+=6;
case 0: break;
case 1: f+=1;
} return f }
function f(x) { switch(x){
case 2: x+=2;
default: x-=1;
case 5:
case 6: x+=6;
case 0: break;
case 1: x+=1;
} return x }
''')
self.assertEqual(jsi.call_function('x', 1), 2)
self.assertEqual(jsi.call_function('x', 5), 11)
self.assertEqual(jsi.call_function('x', 9), 14)
self._test(jsi, 2, args=[1])
self._test(jsi, 11, args=[5])
self._test(jsi, 14, args=[9])
def test_try(self):
jsi = JSInterpreter('''
function x() { try{return 10} catch(e){return 5} }
''')
self.assertEqual(jsi.call_function('x'), 10)
self._test('function f() { try{return 10} catch(e){return 5} }', 10)
def test_catch(self):
jsi = JSInterpreter('''
function x() { try{throw 10} catch(e){return 5} }
''')
self.assertEqual(jsi.call_function('x'), 5)
self._test('function f() { try{throw 10} catch(e){return 5} }', 5)
def test_finally(self):
jsi = JSInterpreter('''
function x() { try{throw 10} finally {return 42} }
''')
self.assertEqual(jsi.call_function('x'), 42)
jsi = JSInterpreter('''
function x() { try{throw 10} catch(e){return 5} finally {return 42} }
''')
self.assertEqual(jsi.call_function('x'), 42)
self._test('function f() { try{throw 10} finally {return 42} }', 42)
self._test('function f() { try{throw 10} catch(e){return 5} finally {return 42} }', 42)
def test_nested_try(self):
jsi = JSInterpreter('''
function x() {try {
try{throw 10} finally {throw 42}
} catch(e){return 5} }
''')
self.assertEqual(jsi.call_function('x'), 5)
self._test('''
function f() {try {
try{throw 10} finally {throw 42}
} catch(e){return 5} }
''', 5)
def test_for_loop_continue(self):
jsi = JSInterpreter('''
function x() { a=0; for (i=0; i-10; i++) { continue; a++ } return a }
''')
self.assertEqual(jsi.call_function('x'), 0)
self._test('function f() { a=0; for (i=0; i-10; i++) { continue; a++ } return a }', 0)
def test_for_loop_break(self):
jsi = JSInterpreter('''
function x() { a=0; for (i=0; i-10; i++) { break; a++ } return a }
''')
self.assertEqual(jsi.call_function('x'), 0)
self._test('function f() { a=0; for (i=0; i-10; i++) { break; a++ } return a }', 0)
def test_for_loop_try(self):
jsi = JSInterpreter('''
function x() {
for (i=0; i-10; i++) { try { if (i == 5) throw i} catch {return 10} finally {break} };
return 42 }
''')
self.assertEqual(jsi.call_function('x'), 42)
self._test('''
function f() {
for (i=0; i-10; i++) { try { if (i == 5) throw i} catch {return 10} finally {break} };
return 42 }
''', 42)
def test_literal_list(self):
jsi = JSInterpreter('''
function x() { return [1, 2, "asdf", [5, 6, 7]][3] }
''')
self.assertEqual(jsi.call_function('x'), [5, 6, 7])
self._test('function f() { return [1, 2, "asdf", [5, 6, 7]][3] }', [5, 6, 7])
def test_comma(self):
jsi = JSInterpreter('''
function x() { a=5; a -= 1, a+=3; return a }
''')
self.assertEqual(jsi.call_function('x'), 7)
jsi = JSInterpreter('''
function x() { a=5; return (a -= 1, a+=3, a); }
''')
self.assertEqual(jsi.call_function('x'), 7)
jsi = JSInterpreter('''
function x() { return (l=[0,1,2,3], function(a, b){return a+b})((l[1], l[2]), l[3]) }
''')
self.assertEqual(jsi.call_function('x'), 5)
self._test('function f() { a=5; a -= 1, a+=3; return a }', 7)
self._test('function f() { a=5; return (a -= 1, a+=3, a); }', 7)
self._test('function f() { return (l=[0,1,2,3], function(a, b){return a+b})((l[1], l[2]), l[3]) }', 5)
def test_void(self):
jsi = JSInterpreter('''
function x() { return void 42; }
''')
self.assertEqual(jsi.call_function('x'), None)
self._test('function f() { return void 42; }', None)
def test_return_function(self):
jsi = JSInterpreter('''
function x() { return [1, function(){return 1}][1] }
function f() { return [1, function(){return 1}][1] }
''')
self.assertEqual(jsi.call_function('x')([]), 1)
self.assertEqual(jsi.call_function('f')([]), 1)
def test_null(self):
jsi = JSInterpreter('''
function x() { return null; }
''')
self.assertEqual(jsi.call_function('x'), None)
jsi = JSInterpreter('''
function x() { return [null > 0, null < 0, null == 0, null === 0]; }
''')
self.assertEqual(jsi.call_function('x'), [False, False, False, False])
jsi = JSInterpreter('''
function x() { return [null >= 0, null <= 0]; }
''')
self.assertEqual(jsi.call_function('x'), [True, True])
self._test('function f() { return null; }', None)
self._test('function f() { return [null > 0, null < 0, null == 0, null === 0]; }',
[False, False, False, False])
self._test('function f() { return [null >= 0, null <= 0]; }', [True, True])
def test_undefined(self):
jsi = JSInterpreter('''
function x() { return undefined === undefined; }
''')
self.assertEqual(jsi.call_function('x'), True)
self._test('function f() { return undefined === undefined; }', True)
self._test('function f() { return undefined; }', JS_Undefined)
self._test('function f() {return undefined ?? 42; }', 42)
self._test('function f() { let v; return v; }', JS_Undefined)
self._test('function f() { let v; return v**0; }', 1)
self._test('function f() { let v; return [v>42, v<=42, v&&42, 42&&v]; }',
[False, False, JS_Undefined, JS_Undefined])
self._test('''
function f() { return [
undefined === undefined,
undefined == undefined,
undefined == null,
undefined < undefined,
undefined > undefined,
undefined === 0,
undefined == 0,
undefined < 0,
undefined > 0,
undefined >= 0,
undefined <= 0,
undefined > null,
undefined < null,
undefined === null
]; }
''', list(map(bool, (1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0))))
jsi = JSInterpreter('''
function x() { return undefined; }
function f() { let v; return [42+v, v+42, v**42, 42**v, 0**v]; }
''')
self.assertEqual(jsi.call_function('x'), JS_Undefined)
jsi = JSInterpreter('''
function x() { let v; return v; }
''')
self.assertEqual(jsi.call_function('x'), JS_Undefined)
jsi = JSInterpreter('''
function x() { return [undefined === undefined, undefined == undefined, undefined < undefined, undefined > undefined]; }
''')
self.assertEqual(jsi.call_function('x'), [True, True, False, False])
jsi = JSInterpreter('''
function x() { return [undefined === 0, undefined == 0, undefined < 0, undefined > 0]; }
''')
self.assertEqual(jsi.call_function('x'), [False, False, False, False])
jsi = JSInterpreter('''
function x() { return [undefined >= 0, undefined <= 0]; }
''')
self.assertEqual(jsi.call_function('x'), [False, False])
jsi = JSInterpreter('''
function x() { return [undefined > null, undefined < null, undefined == null, undefined === null]; }
''')
self.assertEqual(jsi.call_function('x'), [False, False, True, False])
jsi = JSInterpreter('''
function x() { return [undefined === null, undefined == null, undefined < null, undefined > null]; }
''')
self.assertEqual(jsi.call_function('x'), [False, True, False, False])
jsi = JSInterpreter('''
function x() { let v; return [42+v, v+42, v**42, 42**v, 0**v]; }
''')
for y in jsi.call_function('x'):
for y in jsi.call_function('f'):
self.assertTrue(math.isnan(y))
jsi = JSInterpreter('''
function x() { let v; return v**0; }
''')
self.assertEqual(jsi.call_function('x'), 1)
jsi = JSInterpreter('''
function x() { let v; return [v>42, v<=42, v&&42, 42&&v]; }
''')
self.assertEqual(jsi.call_function('x'), [False, False, JS_Undefined, JS_Undefined])
jsi = JSInterpreter('function x(){return undefined ?? 42; }')
self.assertEqual(jsi.call_function('x'), 42)
def test_object(self):
jsi = JSInterpreter('''
function x() { return {}; }
''')
self.assertEqual(jsi.call_function('x'), {})
jsi = JSInterpreter('''
function x() { let a = {m1: 42, m2: 0 }; return [a["m1"], a.m2]; }
''')
self.assertEqual(jsi.call_function('x'), [42, 0])
jsi = JSInterpreter('''
function x() { let a; return a?.qq; }
''')
self.assertEqual(jsi.call_function('x'), JS_Undefined)
jsi = JSInterpreter('''
function x() { let a = {m1: 42, m2: 0 }; return a?.qq; }
''')
self.assertEqual(jsi.call_function('x'), JS_Undefined)
self._test('function f() { return {}; }', {})
self._test('function f() { let a = {m1: 42, m2: 0 }; return [a["m1"], a.m2]; }', [42, 0])
self._test('function f() { let a; return a?.qq; }', JS_Undefined)
self._test('function f() { let a = {m1: 42, m2: 0 }; return a?.qq; }', JS_Undefined)
def test_regex(self):
jsi = JSInterpreter('''
function x() { let a=/,,[/,913,/](,)}/; }
''')
self.assertEqual(jsi.call_function('x'), None)
self._test('function f() { let a=/,,[/,913,/](,)}/; }', None)
self._test('function f() { let a=/,,[/,913,/](,)}/; return a; }', R'/,,[/,913,/](,)}/0')
jsi = JSInterpreter('''
function x() { let a=/,,[/,913,/](,)}/; return a; }
''')
self.assertIsInstance(jsi.call_function('x'), re.Pattern)
R''' # We are not compiling regex
jsi = JSInterpreter('function f() { let a=/,,[/,913,/](,)}/; return a; }')
self.assertIsInstance(jsi.call_function('f'), re.Pattern)
jsi = JSInterpreter('''
function x() { let a=/,,[/,913,/](,)}/i; return a; }
''')
self.assertEqual(jsi.call_function('x').flags & re.I, re.I)
jsi = JSInterpreter('function f() { let a=/,,[/,913,/](,)}/i; return a; }')
self.assertEqual(jsi.call_function('f').flags & re.I, re.I)
jsi = JSInterpreter(R'''
function x() { let a=/,][}",],()}(\[)/; return a; }
''')
self.assertEqual(jsi.call_function('x').pattern, r',][}",],()}(\[)')
jsi = JSInterpreter(R'function f() { let a=/,][}",],()}(\[)/; return a; }')
self.assertEqual(jsi.call_function('f').pattern, r',][}",],()}(\[)')
jsi = JSInterpreter(R'''
function x() { let a=[/[)\\]/]; return a[0]; }
''')
self.assertEqual(jsi.call_function('x').pattern, r'[)\\]')
jsi = JSInterpreter(R'function f() { let a=[/[)\\]/]; return a[0]; }')
self.assertEqual(jsi.call_function('f').pattern, r'[)\\]')
'''
@unittest.skip('Not implemented')
def test_replace(self):
self._test('function f() { let a="data-name".replace("data-", ""); return a }',
'name')
self._test('function f() { let a="data-name".replace(new RegExp("^.+-"), ""); return a; }',
'name')
self._test('function f() { let a="data-name".replace(/^.+-/, ""); return a; }',
'name')
self._test('function f() { let a="data-name".replace(/a/g, "o"); return a; }',
'doto-nome')
self._test('function f() { let a="data-name".replaceAll("a", "o"); return a; }',
'doto-nome')
def test_char_code_at(self):
jsi = JSInterpreter('function x(i){return "test".charCodeAt(i)}')
self.assertEqual(jsi.call_function('x', 0), 116)
self.assertEqual(jsi.call_function('x', 1), 101)
self.assertEqual(jsi.call_function('x', 2), 115)
self.assertEqual(jsi.call_function('x', 3), 116)
self.assertEqual(jsi.call_function('x', 4), None)
self.assertEqual(jsi.call_function('x', 'not_a_number'), 116)
jsi = JSInterpreter('function f(i){return "test".charCodeAt(i)}')
self._test(jsi, 116, args=[0])
self._test(jsi, 101, args=[1])
self._test(jsi, 115, args=[2])
self._test(jsi, 116, args=[3])
self._test(jsi, None, args=[4])
self._test(jsi, 116, args=['not_a_number'])
def test_bitwise_operators_overflow(self):
jsi = JSInterpreter('function x(){return -524999584 << 5}')
self.assertEqual(jsi.call_function('x'), 379882496)
self._test('function f(){return -524999584 << 5}', 379882496)
self._test('function f(){return 1236566549 << 5}', 915423904)
jsi = JSInterpreter('function x(){return 1236566549 << 5}')
self.assertEqual(jsi.call_function('x'), 915423904)
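# For reference, a Python model of the JS '<<' semantics exercised above
# (illustrative, not the interpreter's implementation):
def _js_shl(a, b):
    # Shift, truncate to 32 bits, then reinterpret as a signed 32-bit int
    result = (a << (b & 31)) & 0xFFFFFFFF
    return result - 0x100000000 if result >= 0x80000000 else result

assert _js_shl(-524999584, 5) == 379882496
assert _js_shl(1236566549, 5) == 915423904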
def test_bitwise_operators_typecast(self):
self._test('function f(){return null << 5}', 0)
self._test('function f(){return undefined >> 5}', 0)
self._test('function f(){return 42 << NaN}', 42)
def test_negative(self):
self._test('function f(){return 2 * -2.0 ;}', -4)
self._test('function f(){return 2 - - -2 ;}', 0)
self._test('function f(){return 2 - - - -2 ;}', 4)
self._test('function f(){return 2 - + + - -2;}', 0)
self._test('function f(){return 2 + - + - -2;}', 0)
@unittest.skip('Not implemented')
def test_packed(self):
jsi = JSInterpreter('''function f(p,a,c,k,e,d){while(c--)if(k[c])p=p.replace(new RegExp('\\b'+c.toString(a)+'\\b','g'),k[c]);return p}''')
self.assertEqual(jsi.call_function('f', '''h 7=g("1j");7.7h({7g:[{33:"w://7f-7e-7d-7c.v.7b/7a/79/78/77/76.74?t=73&s=2s&e=72&f=2t&71=70.0.0.1&6z=6y&6x=6w"}],6v:"w://32.v.u/6u.31",16:"r%",15:"r%",6t:"6s",6r:"",6q:"l",6p:"l",6o:"6n",6m:\'6l\',6k:"6j",9:[{33:"/2u?b=6i&n=50&6h=w://32.v.u/6g.31",6f:"6e"}],1y:{6d:1,6c:\'#6b\',6a:\'#69\',68:"67",66:30,65:r,},"64":{63:"%62 2m%m%61%5z%5y%5x.u%5w%5v%5u.2y%22 2k%m%1o%22 5t%m%1o%22 5s%m%1o%22 2j%m%5r%22 16%m%5q%22 15%m%5p%22 5o%2z%5n%5m%2z",5l:"w://v.u/d/1k/5k.2y",5j:[]},\'5i\':{"5h":"5g"},5f:"5e",5d:"w://v.u",5c:{},5b:l,1x:[0.25,0.50,0.75,1,1.25,1.5,2]});h 1m,1n,5a;h 59=0,58=0;h 7=g("1j");h 2x=0,57=0,56=0;$.55({54:{\'53-52\':\'2i-51\'}});7.j(\'4z\',6(x){c(5>0&&x.1l>=5&&1n!=1){1n=1;$(\'q.4y\').4x(\'4w\')}});7.j(\'13\',6(x){2x=x.1l});7.j(\'2g\',6(x){2w(x)});7.j(\'4v\',6(){$(\'q.2v\').4u()});6 2w(x){$(\'q.2v\').4t();c(1m)19;1m=1;17=0;c(4s.4r===l){17=1}$.4q(\'/2u?b=4p&2l=1k&4o=2t-4n-4m-2s-4l&4k=&4j=&4i=&17=\'+17,6(2r){$(\'#4h\').4g(2r)});$(\'.3-8-4f-4e:4d("4c")\').2h(6(e){2q();g().4b(0);g().4a(l)});6 2q(){h $14=$("<q />").2p({1l:"49",16:"r%",15:"r%",48:0,2n:0,2o:47,46:"45(10%, 10%, 10%, 0.4)","44-43":"42"});$("<41 />").2p({16:"60%",15:"60%",2o:40,"3z-2n":"3y"}).3x({\'2m\':\'/?b=3w&2l=1k\',\'2k\':\'0\',\'2j\':\'2i\'}).2f($14);$14.2h(6(){$(3v).3u();g().2g()});$14.2f($(\'#1j\'))}g().13(0);}6 3t(){h 9=7.1b(2e);2d.2c(9);c(9.n>1){1r(i=0;i<9.n;i++){c(9[i].1a==2e){2d.2c(\'!!=\'+i);7.1p(i)}}}}7.j(\'3s\',6(){g().1h("/2a/3r.29","3q 10 28",6(){g().13(g().27()+10)},"2b");$("q[26=2b]").23().21(\'.3-20-1z\');g().1h("/2a/3p.29","3o 10 28",6(){h 12=g().27()-10;c(12<0)12=0;g().13(12)},"24");$("q[26=24]").23().21(\'.3-20-1z\');});6 1i(){}7.j(\'3n\',6(){1i()});7.j(\'3m\',6(){1i()});7.j("k",6(y){h 9=7.1b();c(9.n<2)19;$(\'.3-8-3l-3k\').3j(6(){$(\'#3-8-a-k\').1e(\'3-8-a-z\');$(\'.3-a-k\').p(\'o-1f\',\'11\')});7.1h("/3i/3h.3g","3f 3e",6(){$(\'.3-1w\').3d(\'3-8-1v\');$(\'.3-8-1y, .3-8-1x\').p(\'o-1g\',\'11\');c($(\'.3-1w\').3c(\'3-8-1v\')){$(\'.3-a-k\').p(\'o-1g\',\'l\');$(\'.3-a-k\').p(\'o-1f\',\'l\');$(\'.3-8-a\').1e(\'3-8-a-z\');$(\'.3-8-a:1u\').3b(\'3-8-a-z\')}3a{$(\'.3-a-k\').p(\'o-1g\',\'11\');$(\'.3-a-k\').p(\'o-1f\',\'11\');$(\'.3-8-a:1u\').1e(\'3-8-a-z\')}},"39");7.j("38",6(y){1d.37(\'1c\',y.9[y.36].1a)});c(1d.1t(\'1c\')){35("1s(1d.1t(\'1c\'));",34)}});h 18;6 1s(1q){h 
9=7.1b();c(9.n>1){1r(i=0;i<9.n;i++){c(9[i].1a==1q){c(i==18){19}18=i;7.1p(i)}}}}',36,270,'|||jw|||function|player|settings|tracks|submenu||if||||jwplayer|var||on|audioTracks|true|3D|length|aria|attr|div|100|||sx|filemoon|https||event|active||false|tt|seek|dd|height|width|adb|current_audio|return|name|getAudioTracks|default_audio|localStorage|removeClass|expanded|checked|addButton|callMeMaybe|vplayer|0fxcyc2ajhp1|position|vvplay|vvad|220|setCurrentAudioTrack|audio_name|for|audio_set|getItem|last|open|controls|playbackRates|captions|rewind|icon|insertAfter||detach|ff00||button|getPosition|sec|png|player8|ff11|log|console|track_name|appendTo|play|click|no|scrolling|frameborder|file_code|src|top|zIndex|css|showCCform|data|1662367683|383371|dl|video_ad|doPlay|prevt|mp4|3E||jpg|thumbs|file|300|setTimeout|currentTrack|setItem|audioTrackChanged|dualSound|else|addClass|hasClass|toggleClass|Track|Audio|svg|dualy|images|mousedown|buttons|topbar|playAttemptFailed|beforePlay|Rewind|fr|Forward|ff|ready|set_audio_track|remove|this|upload_srt|prop|50px|margin|1000001|iframe|center|align|text|rgba|background|1000000|left|absolute|pause|setCurrentCaptions|Upload|contains|item|content|html|fviews|referer|prem|embed|3e57249ef633e0d03bf76ceb8d8a4b65|216|83|hash|view|get|TokenZir|window|hide|show|complete|slow|fadeIn|video_ad_fadein|time||cache|Cache|Content|headers|ajaxSetup|v2done|tott|vastdone2|vastdone1|vvbefore|playbackRateControls|cast|aboutlink|FileMoon|abouttext|UHD|1870|qualityLabels|sites|GNOME_POWER|link|2Fiframe|3C|allowfullscreen|22360|22640|22no|marginheight|marginwidth|2FGNOME_POWER|2F0fxcyc2ajhp1|2Fe|2Ffilemoon|2F|3A||22https|3Ciframe|code|sharing|fontOpacity|backgroundOpacity|Tahoma|fontFamily|303030|backgroundColor|FFFFFF|color|userFontScale|thumbnails|kind|0fxcyc2ajhp10000|url|get_slides|start|startparam|none|preload|html5|primary|hlshtml|androidhls|duration|uniform|stretching|0fxcyc2ajhp1_xt|image|2048|sp|6871|asn|127|srv|43200|_g3XlBcu2lmD9oDexD2NLWSmah2Nu3XcDrl93m9PwXY|m3u8||master|0fxcyc2ajhp1_x|00076|01|hls2|to|s01|delivery|storage|moon|sources|setup'''.split('|')))
if __name__ == '__main__':
unittest.main()

2009 test/test_networking.py Normal file

File diff suppressed because it is too large

View File

@ -0,0 +1,208 @@
#!/usr/bin/env python3
# Allow direct execution
import os
import sys
import pytest
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import io
import random
import ssl
from yt_dlp.cookies import YoutubeDLCookieJar
from yt_dlp.dependencies import certifi
from yt_dlp.networking import Response
from yt_dlp.networking._helper import (
InstanceStoreMixin,
add_accept_encoding_header,
get_redirect_method,
make_socks_proxy_opts,
select_proxy,
ssl_load_certs,
)
from yt_dlp.networking.exceptions import (
HTTPError,
IncompleteRead,
)
from yt_dlp.socks import ProxyType
from yt_dlp.utils.networking import HTTPHeaderDict
TEST_DIR = os.path.dirname(os.path.abspath(__file__))
class TestNetworkingUtils:
def test_select_proxy(self):
proxies = {
'all': 'socks5://example.com',
'http': 'http://example.com:1080',
'no': 'bypass.example.com,yt-dl.org'
}
assert select_proxy('https://example.com', proxies) == proxies['all']
assert select_proxy('http://example.com', proxies) == proxies['http']
assert select_proxy('http://bypass.example.com', proxies) is None
assert select_proxy('https://yt-dl.org', proxies) is None
@pytest.mark.parametrize('socks_proxy,expected', [
('socks5h://example.com', {
'proxytype': ProxyType.SOCKS5,
'addr': 'example.com',
'port': 1080,
'rdns': True,
'username': None,
'password': None
}),
('socks5://user:@example.com:5555', {
'proxytype': ProxyType.SOCKS5,
'addr': 'example.com',
'port': 5555,
'rdns': False,
'username': 'user',
'password': ''
}),
('socks4://u%40ser:pa%20ss@127.0.0.1:1080', {
'proxytype': ProxyType.SOCKS4,
'addr': '127.0.0.1',
'port': 1080,
'rdns': False,
'username': 'u@ser',
'password': 'pa ss'
}),
('socks4a://:pa%20ss@127.0.0.1', {
'proxytype': ProxyType.SOCKS4A,
'addr': '127.0.0.1',
'port': 1080,
'rdns': True,
'username': '',
'password': 'pa ss'
})
])
def test_make_socks_proxy_opts(self, socks_proxy, expected):
assert make_socks_proxy_opts(socks_proxy) == expected
def test_make_socks_proxy_unknown(self):
with pytest.raises(ValueError, match='Unknown SOCKS proxy version: socks'):
make_socks_proxy_opts('socks://127.0.0.1')
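# For reference, the username/password expectations above follow from
# percent-decoding the URL userinfo (urllib does not unquote it automatically):
# >>> from urllib.parse import unquote, urlparse
# >>> parts = urlparse('socks4://u%40ser:pa%20ss@127.0.0.1:1080')
# >>> unquote(parts.username), unquote(parts.password)
# ('u@ser', 'pa ss')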
@pytest.mark.skipif(not certifi, reason='certifi is not installed')
def test_load_certifi(self):
context_certifi = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context_certifi.load_verify_locations(cafile=certifi.where())
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ssl_load_certs(context, use_certifi=True)
assert context.get_ca_certs() == context_certifi.get_ca_certs()
context_default = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context_default.load_default_certs()
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ssl_load_certs(context, use_certifi=False)
assert context.get_ca_certs() == context_default.get_ca_certs()
if context_default.get_ca_certs() == context_certifi.get_ca_certs():
pytest.skip('System uses certifi as default. The test is not valid')
@pytest.mark.parametrize('method,status,expected', [
('GET', 303, 'GET'),
('HEAD', 303, 'HEAD'),
('PUT', 303, 'GET'),
('POST', 301, 'GET'),
('HEAD', 301, 'HEAD'),
('POST', 302, 'GET'),
('HEAD', 302, 'HEAD'),
('PUT', 302, 'PUT'),
('POST', 308, 'POST'),
('POST', 307, 'POST'),
('HEAD', 308, 'HEAD'),
('HEAD', 307, 'HEAD'),
])
def test_get_redirect_method(self, method, status, expected):
assert get_redirect_method(method, status) == expected
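# For reference: 307/308 preserve the request method; 301/302 historically
# downgrade POST to GET (matching browser behaviour); 303 always becomes GET,
# except that HEAD stays HEAD.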
@pytest.mark.parametrize('headers,supported_encodings,expected', [
({'Accept-Encoding': 'br'}, ['gzip', 'br'], {'Accept-Encoding': 'br'}),
({}, ['gzip', 'br'], {'Accept-Encoding': 'gzip, br'}),
({'Content-type': 'application/json'}, [], {'Content-type': 'application/json', 'Accept-Encoding': 'identity'}),
])
def test_add_accept_encoding_header(self, headers, supported_encodings, expected):
headers = HTTPHeaderDict(headers)
add_accept_encoding_header(headers, supported_encodings)
assert headers == HTTPHeaderDict(expected)
class TestInstanceStoreMixin:
class FakeInstanceStoreMixin(InstanceStoreMixin):
def _create_instance(self, **kwargs):
return random.randint(0, 1000000)
def _close_instance(self, instance):
pass
def test_mixin(self):
mixin = self.FakeInstanceStoreMixin()
assert mixin._get_instance(d={'a': 1, 'b': 2, 'c': {'d', 4}}) == mixin._get_instance(d={'a': 1, 'b': 2, 'c': {'d', 4}})
assert mixin._get_instance(d={'a': 1, 'b': 2, 'c': {'e', 4}}) != mixin._get_instance(d={'a': 1, 'b': 2, 'c': {'d', 4}})
assert mixin._get_instance(d={'a': 1, 'b': 2, 'c': {'d', 4}}) != mixin._get_instance(d={'a': 1, 'b': 2, 'g': {'d', 4}})
assert mixin._get_instance(d={'a': 1}, e=[1, 2, 3]) == mixin._get_instance(d={'a': 1}, e=[1, 2, 3])
assert mixin._get_instance(d={'a': 1}, e=[1, 2, 3]) != mixin._get_instance(d={'a': 1}, e=[1, 2, 3, 4])
cookiejar = YoutubeDLCookieJar()
assert mixin._get_instance(b=[1, 2], c=cookiejar) == mixin._get_instance(b=[1, 2], c=cookiejar)
assert mixin._get_instance(b=[1, 2], c=cookiejar) != mixin._get_instance(b=[1, 2], c=YoutubeDLCookieJar())
# Different order
assert mixin._get_instance(c=cookiejar, b=[1, 2]) == mixin._get_instance(b=[1, 2], c=cookiejar)
m = mixin._get_instance(t=1234)
assert mixin._get_instance(t=1234) == m
mixin._clear_instances()
assert mixin._get_instance(t=1234) != m
class TestNetworkingExceptions:
@staticmethod
def create_response(status):
return Response(fp=io.BytesIO(b'test'), url='http://example.com', headers={'tesT': 'test'}, status=status)
def test_http_error(self):
response = self.create_response(403)
error = HTTPError(response)
assert error.status == 403
assert str(error) == error.msg == 'HTTP Error 403: Forbidden'
assert error.reason == response.reason
assert error.response is response
data = error.response.read()
assert data == b'test'
assert repr(error) == '<HTTPError 403: Forbidden>'
def test_redirect_http_error(self):
response = self.create_response(301)
error = HTTPError(response, redirect_loop=True)
assert str(error) == error.msg == 'HTTP Error 301: Moved Permanently (redirect loop detected)'
assert error.reason == 'Moved Permanently'
def test_incomplete_read_error(self):
error = IncompleteRead(4, 3, cause='test')
assert isinstance(error, IncompleteRead)
assert repr(error) == '<IncompleteRead: 4 bytes read, 3 more expected>'
assert str(error) == error.msg == '4 bytes read, 3 more expected'
assert error.partial == 4
assert error.expected == 3
assert error.cause == 'test'
error = IncompleteRead(3)
assert repr(error) == '<IncompleteRead: 3 bytes read>'
assert str(error) == '3 bytes read'

View File

@ -1,113 +1,471 @@
#!/usr/bin/env python3
# Allow direct execution
import os
import sys
import threading
import unittest
import pytest
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import abc
import contextlib
import enum
import functools
import http.server
import json
import random
import subprocess
import urllib.request
import socket
import struct
import time
from socketserver import (
BaseRequestHandler,
StreamRequestHandler,
ThreadingTCPServer,
)
from test.helper import FakeYDL, get_params, is_download_test
from test.helper import http_server_port, verify_address_availability
from yt_dlp.networking import Request
from yt_dlp.networking.exceptions import ProxyError, TransportError
from yt_dlp.socks import (
SOCKS4_REPLY_VERSION,
SOCKS4_VERSION,
SOCKS5_USER_AUTH_SUCCESS,
SOCKS5_USER_AUTH_VERSION,
SOCKS5_VERSION,
Socks5AddressType,
Socks5Auth,
)
SOCKS5_USER_AUTH_FAILURE = 0x1
@is_download_test
class TestMultipleSocks(unittest.TestCase):
@staticmethod
def _check_params(attrs):
params = get_params()
for attr in attrs:
if attr not in params:
print('Missing %s. Skipping.' % attr)
class Socks4CD(enum.IntEnum):
REQUEST_GRANTED = 90
REQUEST_REJECTED_OR_FAILED = 91
REQUEST_REJECTED_CANNOT_CONNECT_TO_IDENTD = 92
REQUEST_REJECTED_DIFFERENT_USERID = 93
class Socks5Reply(enum.IntEnum):
SUCCEEDED = 0x0
GENERAL_FAILURE = 0x1
CONNECTION_NOT_ALLOWED = 0x2
NETWORK_UNREACHABLE = 0x3
HOST_UNREACHABLE = 0x4
CONNECTION_REFUSED = 0x5
TTL_EXPIRED = 0x6
COMMAND_NOT_SUPPORTED = 0x7
ADDRESS_TYPE_NOT_SUPPORTED = 0x8
class SocksTestRequestHandler(BaseRequestHandler):
def __init__(self, *args, socks_info=None, **kwargs):
self.socks_info = socks_info
super().__init__(*args, **kwargs)
class SocksProxyHandler(BaseRequestHandler):
def __init__(self, request_handler_class, socks_server_kwargs, *args, **kwargs):
self.socks_kwargs = socks_server_kwargs or {}
self.request_handler_class = request_handler_class
super().__init__(*args, **kwargs)
class Socks5ProxyHandler(StreamRequestHandler, SocksProxyHandler):
# SOCKS5 protocol https://tools.ietf.org/html/rfc1928
# SOCKS5 username/password authentication https://tools.ietf.org/html/rfc1929
def handle(self):
sleep = self.socks_kwargs.get('sleep')
if sleep:
time.sleep(sleep)
version, nmethods = self.connection.recv(2)
assert version == SOCKS5_VERSION
methods = list(self.connection.recv(nmethods))
auth = self.socks_kwargs.get('auth')
if auth is not None and Socks5Auth.AUTH_USER_PASS not in methods:
self.connection.sendall(struct.pack('!BB', SOCKS5_VERSION, Socks5Auth.AUTH_NO_ACCEPTABLE))
self.server.close_request(self.request)
return
elif Socks5Auth.AUTH_USER_PASS in methods:
self.connection.sendall(struct.pack("!BB", SOCKS5_VERSION, Socks5Auth.AUTH_USER_PASS))
_, user_len = struct.unpack('!BB', self.connection.recv(2))
username = self.connection.recv(user_len).decode()
pass_len = ord(self.connection.recv(1))
password = self.connection.recv(pass_len).decode()
if username == auth[0] and password == auth[1]:
self.connection.sendall(struct.pack('!BB', SOCKS5_USER_AUTH_VERSION, SOCKS5_USER_AUTH_SUCCESS))
else:
self.connection.sendall(struct.pack('!BB', SOCKS5_USER_AUTH_VERSION, SOCKS5_USER_AUTH_FAILURE))
self.server.close_request(self.request)
return
return params
def test_proxy_http(self):
params = self._check_params(['primary_proxy', 'primary_server_ip'])
if params is None:
return
ydl = FakeYDL({
'proxy': params['primary_proxy']
})
self.assertEqual(
ydl.urlopen('http://yt-dl.org/ip').read().decode(),
params['primary_server_ip'])
def test_proxy_https(self):
params = self._check_params(['primary_proxy', 'primary_server_ip'])
if params is None:
return
ydl = FakeYDL({
'proxy': params['primary_proxy']
})
self.assertEqual(
ydl.urlopen('https://yt-dl.org/ip').read().decode(),
params['primary_server_ip'])
def test_secondary_proxy_http(self):
params = self._check_params(['secondary_proxy', 'secondary_server_ip'])
if params is None:
return
ydl = FakeYDL()
req = urllib.request.Request('http://yt-dl.org/ip')
req.add_header('Ytdl-request-proxy', params['secondary_proxy'])
self.assertEqual(
ydl.urlopen(req).read().decode(),
params['secondary_server_ip'])
def test_secondary_proxy_https(self):
params = self._check_params(['secondary_proxy', 'secondary_server_ip'])
if params is None:
return
ydl = FakeYDL()
req = urllib.request.Request('https://yt-dl.org/ip')
req.add_header('Ytdl-request-proxy', params['secondary_proxy'])
self.assertEqual(
ydl.urlopen(req).read().decode(),
params['secondary_server_ip'])
@is_download_test
class TestSocks(unittest.TestCase):
_SKIP_SOCKS_TEST = True
def setUp(self):
if self._SKIP_SOCKS_TEST:
elif Socks5Auth.AUTH_NONE in methods:
self.connection.sendall(struct.pack('!BB', SOCKS5_VERSION, Socks5Auth.AUTH_NONE))
else:
self.connection.sendall(struct.pack('!BB', SOCKS5_VERSION, Socks5Auth.AUTH_NO_ACCEPTABLE))
self.server.close_request(self.request)
return
self.port = random.randint(20000, 30000)
self.server_process = subprocess.Popen([
'srelay', '-f', '-i', '127.0.0.1:%d' % self.port],
stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
version, command, _, address_type = struct.unpack('!BBBB', self.connection.recv(4))
socks_info = {
'version': version,
'auth_methods': methods,
'command': command,
'client_address': self.client_address,
'ipv4_address': None,
'domain_address': None,
'ipv6_address': None,
}
if address_type == Socks5AddressType.ATYP_IPV4:
socks_info['ipv4_address'] = socket.inet_ntoa(self.connection.recv(4))
elif address_type == Socks5AddressType.ATYP_DOMAINNAME:
socks_info['domain_address'] = self.connection.recv(ord(self.connection.recv(1))).decode()
elif address_type == Socks5AddressType.ATYP_IPV6:
socks_info['ipv6_address'] = socket.inet_ntop(socket.AF_INET6, self.connection.recv(16))
else:
self.server.close_request(self.request)
def tearDown(self):
if self._SKIP_SOCKS_TEST:
socks_info['port'] = struct.unpack('!H', self.connection.recv(2))[0]
# dummy response, the returned IP is just a placeholder
self.connection.sendall(struct.pack(
'!BBBBIH', SOCKS5_VERSION, self.socks_kwargs.get('reply', Socks5Reply.SUCCEEDED), 0x0, 0x1, 0x7f000001, 40000))
self.request_handler_class(self.request, self.client_address, self.server, socks_info=socks_info)
class Socks4ProxyHandler(StreamRequestHandler, SocksProxyHandler):
# SOCKS4 protocol http://www.openssh.com/txt/socks4.protocol
# SOCKS4A protocol http://www.openssh.com/txt/socks4a.protocol
def _read_until_null(self):
return b''.join(iter(functools.partial(self.connection.recv, 1), b'\x00'))
def handle(self):
sleep = self.socks_kwargs.get('sleep')
if sleep:
time.sleep(sleep)
socks_info = {
'version': SOCKS4_VERSION,
'command': None,
'client_address': self.client_address,
'ipv4_address': None,
'port': None,
'domain_address': None,
}
version, command, dest_port, dest_ip = struct.unpack('!BBHI', self.connection.recv(8))
socks_info['port'] = dest_port
socks_info['command'] = command
if version != SOCKS4_VERSION:
self.server.close_request(self.request)
return
use_remote_dns = False
if 0x0 < dest_ip <= 0xFF:
use_remote_dns = True
else:
socks_info['ipv4_address'] = socket.inet_ntoa(struct.pack("!I", dest_ip))
user_id = self._read_until_null().decode()
if user_id != (self.socks_kwargs.get('user_id') or ''):
self.connection.sendall(struct.pack(
'!BBHI', SOCKS4_REPLY_VERSION, Socks4CD.REQUEST_REJECTED_DIFFERENT_USERID, 0x00, 0x00000000))
self.server.close_request(self.request)
return
self.server_process.terminate()
self.server_process.communicate()
if use_remote_dns:
socks_info['domain_address'] = self._read_until_null().decode()
def _get_ip(self, protocol):
if self._SKIP_SOCKS_TEST:
return '127.0.0.1'
# dummy response, the returned IP is just a placeholder
self.connection.sendall(
struct.pack(
'!BBHI', SOCKS4_REPLY_VERSION,
self.socks_kwargs.get('cd_reply', Socks4CD.REQUEST_GRANTED), 40000, 0x7f000001))
ydl = FakeYDL({
'proxy': '%s://127.0.0.1:%d' % (protocol, self.port),
})
return ydl.urlopen('http://yt-dl.org/ip').read().decode()
self.request_handler_class(self.request, self.client_address, self.server, socks_info=socks_info)
def test_socks4(self):
self.assertTrue(isinstance(self._get_ip('socks4'), str))
def test_socks4a(self):
self.assertTrue(isinstance(self._get_ip('socks4a'), str))
class IPv6ThreadingTCPServer(ThreadingTCPServer):
address_family = socket.AF_INET6
def test_socks5(self):
self.assertTrue(isinstance(self._get_ip('socks5'), str))
class SocksHTTPTestRequestHandler(http.server.BaseHTTPRequestHandler, SocksTestRequestHandler):
def do_GET(self):
if self.path == '/socks_info':
payload = json.dumps(self.socks_info.copy())
self.send_response(200)
self.send_header('Content-Type', 'application/json; charset=utf-8')
self.send_header('Content-Length', str(len(payload)))
self.end_headers()
self.wfile.write(payload.encode())
class SocksWebSocketTestRequestHandler(SocksTestRequestHandler):
def handle(self):
import websockets.sync.server
protocol = websockets.ServerProtocol()
connection = websockets.sync.server.ServerConnection(socket=self.request, protocol=protocol, close_timeout=0)
connection.handshake()
connection.send(json.dumps(self.socks_info))
connection.close()
@contextlib.contextmanager
def socks_server(socks_server_class, request_handler, bind_ip=None, **socks_server_kwargs):
server = server_thread = None
try:
bind_address = bind_ip or '127.0.0.1'
server_type = ThreadingTCPServer if '.' in bind_address else IPv6ThreadingTCPServer
server = server_type(
(bind_address, 0), functools.partial(socks_server_class, request_handler, socks_server_kwargs))
server_port = http_server_port(server)
server_thread = threading.Thread(target=server.serve_forever)
server_thread.daemon = True
server_thread.start()
if '.' not in bind_address:
yield f'[{bind_address}]:{server_port}'
else:
yield f'{bind_address}:{server_port}'
finally:
server.shutdown()
server.server_close()
server_thread.join(2.0)
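# Usage sketch for the context manager above (assumes the handler classes
# defined in this module):
#
#     with socks_server(Socks5ProxyHandler, SocksHTTPTestRequestHandler) as addr:
#         ...  # point a client at f'socks5://{addr}'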
class SocksProxyTestContext(abc.ABC):
REQUEST_HANDLER_CLASS = None
def socks_server(self, server_class, *args, **kwargs):
return socks_server(server_class, self.REQUEST_HANDLER_CLASS, *args, **kwargs)
@abc.abstractmethod
def socks_info_request(self, handler, target_domain=None, target_port=None, **req_kwargs) -> dict:
"""return a dict of socks_info"""
class HTTPSocksTestProxyContext(SocksProxyTestContext):
REQUEST_HANDLER_CLASS = SocksHTTPTestRequestHandler
def socks_info_request(self, handler, target_domain=None, target_port=None, **req_kwargs):
request = Request(f'http://{target_domain or "127.0.0.1"}:{target_port or "40000"}/socks_info', **req_kwargs)
handler.validate(request)
return json.loads(handler.send(request).read().decode())
class WebSocketSocksTestProxyContext(SocksProxyTestContext):
REQUEST_HANDLER_CLASS = SocksWebSocketTestRequestHandler
def socks_info_request(self, handler, target_domain=None, target_port=None, **req_kwargs):
request = Request(f'ws://{target_domain or "127.0.0.1"}:{target_port or "40000"}', **req_kwargs)
handler.validate(request)
ws = handler.send(request)
ws.send('socks_info')
socks_info = ws.recv()
ws.close()
return json.loads(socks_info)
CTX_MAP = {
'http': HTTPSocksTestProxyContext,
'ws': WebSocketSocksTestProxyContext,
}
@pytest.fixture(scope='module')
def ctx(request):
return CTX_MAP[request.param]()
@pytest.mark.parametrize(
'handler,ctx', [
('Urllib', 'http'),
('Requests', 'http'),
('Websockets', 'ws'),
('CurlCFFI', 'http')
], indirect=True)
class TestSocks4Proxy:
def test_socks4_no_auth(self, handler, ctx):
with handler() as rh:
with ctx.socks_server(Socks4ProxyHandler) as server_address:
response = ctx.socks_info_request(
rh, proxies={'all': f'socks4://{server_address}'})
assert response['version'] == 4
def test_socks4_auth(self, handler, ctx):
with handler() as rh:
with ctx.socks_server(Socks4ProxyHandler, user_id='user') as server_address:
with pytest.raises(ProxyError):
ctx.socks_info_request(rh, proxies={'all': f'socks4://{server_address}'})
response = ctx.socks_info_request(
rh, proxies={'all': f'socks4://user:@{server_address}'})
assert response['version'] == 4
def test_socks4a_ipv4_target(self, handler, ctx):
with ctx.socks_server(Socks4ProxyHandler) as server_address:
with handler(proxies={'all': f'socks4a://{server_address}'}) as rh:
response = ctx.socks_info_request(rh, target_domain='127.0.0.1')
assert response['version'] == 4
assert (response['ipv4_address'] == '127.0.0.1') != (response['domain_address'] == '127.0.0.1')
def test_socks4a_domain_target(self, handler, ctx):
with ctx.socks_server(Socks4ProxyHandler) as server_address:
with handler(proxies={'all': f'socks4a://{server_address}'}) as rh:
response = ctx.socks_info_request(rh, target_domain='localhost')
assert response['version'] == 4
assert response['ipv4_address'] is None
assert response['domain_address'] == 'localhost'
def test_ipv4_client_source_address(self, handler, ctx):
with ctx.socks_server(Socks4ProxyHandler) as server_address:
source_address = f'127.0.0.{random.randint(5, 255)}'
verify_address_availability(source_address)
with handler(proxies={'all': f'socks4://{server_address}'},
source_address=source_address) as rh:
response = ctx.socks_info_request(rh)
assert response['client_address'][0] == source_address
assert response['version'] == 4
@pytest.mark.parametrize('reply_code', [
Socks4CD.REQUEST_REJECTED_OR_FAILED,
Socks4CD.REQUEST_REJECTED_CANNOT_CONNECT_TO_IDENTD,
Socks4CD.REQUEST_REJECTED_DIFFERENT_USERID,
])
def test_socks4_errors(self, handler, ctx, reply_code):
with ctx.socks_server(Socks4ProxyHandler, cd_reply=reply_code) as server_address:
with handler(proxies={'all': f'socks4://{server_address}'}) as rh:
with pytest.raises(ProxyError):
ctx.socks_info_request(rh)
def test_ipv6_socks4_proxy(self, handler, ctx):
with ctx.socks_server(Socks4ProxyHandler, bind_ip='::1') as server_address:
with handler(proxies={'all': f'socks4://{server_address}'}) as rh:
response = ctx.socks_info_request(rh, target_domain='127.0.0.1')
assert response['client_address'][0] == '::1'
assert response['ipv4_address'] == '127.0.0.1'
assert response['version'] == 4
def test_timeout(self, handler, ctx):
with ctx.socks_server(Socks4ProxyHandler, sleep=2) as server_address:
with handler(proxies={'all': f'socks4://{server_address}'}, timeout=0.5) as rh:
with pytest.raises(TransportError):
ctx.socks_info_request(rh)
@pytest.mark.parametrize(
'handler,ctx', [
('Urllib', 'http'),
('Requests', 'http'),
('Websockets', 'ws'),
('CurlCFFI', 'http')
], indirect=True)
class TestSocks5Proxy:
def test_socks5_no_auth(self, handler, ctx):
with ctx.socks_server(Socks5ProxyHandler) as server_address:
with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
response = ctx.socks_info_request(rh)
assert response['auth_methods'] == [0x0]
assert response['version'] == 5
def test_socks5_user_pass(self, handler, ctx):
with ctx.socks_server(Socks5ProxyHandler, auth=('test', 'testpass')) as server_address:
with handler() as rh:
with pytest.raises(ProxyError):
ctx.socks_info_request(rh, proxies={'all': f'socks5://{server_address}'})
response = ctx.socks_info_request(
rh, proxies={'all': f'socks5://test:testpass@{server_address}'})
assert response['auth_methods'] == [Socks5Auth.AUTH_NONE, Socks5Auth.AUTH_USER_PASS]
assert response['version'] == 5
def test_socks5_ipv4_target(self, handler, ctx):
with ctx.socks_server(Socks5ProxyHandler) as server_address:
with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
response = ctx.socks_info_request(rh, target_domain='127.0.0.1')
assert response['ipv4_address'] == '127.0.0.1'
assert response['version'] == 5
def test_socks5_domain_target(self, handler, ctx):
with ctx.socks_server(Socks5ProxyHandler) as server_address:
with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
response = ctx.socks_info_request(rh, target_domain='localhost')
assert (response['ipv4_address'] == '127.0.0.1') != (response['ipv6_address'] == '::1')
assert response['version'] == 5
def test_socks5h_domain_target(self, handler, ctx):
with ctx.socks_server(Socks5ProxyHandler) as server_address:
with handler(proxies={'all': f'socks5h://{server_address}'}) as rh:
response = ctx.socks_info_request(rh, target_domain='localhost')
assert response['ipv4_address'] is None
assert response['domain_address'] == 'localhost'
assert response['version'] == 5
def test_socks5h_ip_target(self, handler, ctx):
with ctx.socks_server(Socks5ProxyHandler) as server_address:
with handler(proxies={'all': f'socks5h://{server_address}'}) as rh:
response = ctx.socks_info_request(rh, target_domain='127.0.0.1')
assert response['ipv4_address'] == '127.0.0.1'
assert response['domain_address'] is None
assert response['version'] == 5
def test_socks5_ipv6_destination(self, handler, ctx):
with ctx.socks_server(Socks5ProxyHandler) as server_address:
with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
response = ctx.socks_info_request(rh, target_domain='[::1]')
assert response['ipv6_address'] == '::1'
assert response['version'] == 5
def test_ipv6_socks5_proxy(self, handler, ctx):
with ctx.socks_server(Socks5ProxyHandler, bind_ip='::1') as server_address:
with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
response = ctx.socks_info_request(rh, target_domain='127.0.0.1')
assert response['client_address'][0] == '::1'
assert response['ipv4_address'] == '127.0.0.1'
assert response['version'] == 5
# XXX: is there any feasible way of testing IPv6 source addresses?
# Same would go for non-proxy source_address test...
def test_ipv4_client_source_address(self, handler, ctx):
with ctx.socks_server(Socks5ProxyHandler) as server_address:
source_address = f'127.0.0.{random.randint(5, 255)}'
verify_address_availability(source_address)
with handler(proxies={'all': f'socks5://{server_address}'}, source_address=source_address) as rh:
response = ctx.socks_info_request(rh)
assert response['client_address'][0] == source_address
assert response['version'] == 5
@pytest.mark.parametrize('reply_code', [
Socks5Reply.GENERAL_FAILURE,
Socks5Reply.CONNECTION_NOT_ALLOWED,
Socks5Reply.NETWORK_UNREACHABLE,
Socks5Reply.HOST_UNREACHABLE,
Socks5Reply.CONNECTION_REFUSED,
Socks5Reply.TTL_EXPIRED,
Socks5Reply.COMMAND_NOT_SUPPORTED,
Socks5Reply.ADDRESS_TYPE_NOT_SUPPORTED,
])
def test_socks5_errors(self, handler, ctx, reply_code):
with ctx.socks_server(Socks5ProxyHandler, reply=reply_code) as server_address:
with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
with pytest.raises(ProxyError):
ctx.socks_info_request(rh)
def test_timeout(self, handler, ctx):
with ctx.socks_server(Socks5ProxyHandler, sleep=2) as server_address:
with handler(proxies={'all': f'socks5://{server_address}'}, timeout=1) as rh:
with pytest.raises(TransportError):
ctx.socks_info_request(rh)
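# End-user sketch (assumption: a reachable SOCKS5 proxy on 127.0.0.1:1080);
# mirrors the 'proxy' parameter used by the legacy FakeYDL tests above.
import yt_dlp

with yt_dlp.YoutubeDL({'proxy': 'socks5://127.0.0.1:1080'}) as ydl:
    ydl.download(['https://example.com/video'])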
if __name__ == '__main__':
    unittest.main()

test/test_traversal.py (new file)

@ -0,0 +1,444 @@
import http.cookies
import re
import xml.etree.ElementTree
import pytest
from yt_dlp.utils import dict_get, int_or_none, str_or_none
from yt_dlp.utils.traversal import traverse_obj
_TEST_DATA = {
100: 100,
1.2: 1.2,
'str': 'str',
'None': None,
'...': ...,
'urls': [
{'index': 0, 'url': 'https://www.example.com/0'},
{'index': 1, 'url': 'https://www.example.com/1'},
],
'data': (
{'index': 2},
{'index': 3},
),
'dict': {},
}
class TestTraversal:
def test_traversal_base(self):
assert traverse_obj(_TEST_DATA, ('str',)) == 'str', \
'allow tuple path'
assert traverse_obj(_TEST_DATA, ['str']) == 'str', \
'allow list path'
assert traverse_obj(_TEST_DATA, (value for value in ("str",))) == 'str', \
'allow iterable path'
assert traverse_obj(_TEST_DATA, 'str') == 'str', \
'single items should be treated as a path'
assert traverse_obj(_TEST_DATA, 100) == 100, \
'allow int path'
assert traverse_obj(_TEST_DATA, 1.2) == 1.2, \
'allow float path'
assert traverse_obj(_TEST_DATA, None) == _TEST_DATA, \
'`None` should not perform any modification'
def test_traversal_ellipsis(self):
assert traverse_obj(_TEST_DATA, ...) == [x for x in _TEST_DATA.values() if x not in (None, {})], \
'`...` should give all non-discarded values'
assert traverse_obj(_TEST_DATA, ('urls', 0, ...)) == list(_TEST_DATA['urls'][0].values()), \
'`...` selection for dicts should select all values'
assert traverse_obj(_TEST_DATA, (..., ..., 'url')) == ['https://www.example.com/0', 'https://www.example.com/1'], \
'nested `...` queries should work'
assert traverse_obj(_TEST_DATA, (..., ..., 'index')) == list(range(4)), \
'`...` query result should be flattened'
assert traverse_obj(iter(range(4)), ...) == list(range(4)), \
'`...` should accept iterables'
def test_traversal_function(self):
filter_func = lambda x, y: x == 'urls' and isinstance(y, list)
assert traverse_obj(_TEST_DATA, filter_func) == [_TEST_DATA['urls']], \
'function as query key should perform a filter based on (key, value)'
assert traverse_obj(_TEST_DATA, lambda _, x: isinstance(x[0], str)) == ['str'], \
'exceptions in the query function should be caught'
assert traverse_obj(iter(range(4)), lambda _, x: x % 2 == 0) == [0, 2], \
'function key should accept iterables'
# Wrong function signature should raise (debug mode)
with pytest.raises(Exception):
traverse_obj(_TEST_DATA, lambda a: ...)
with pytest.raises(Exception):
traverse_obj(_TEST_DATA, lambda a, b, c: ...)
def test_traversal_set(self):
# transformation/type, like `expected_type`
assert traverse_obj(_TEST_DATA, (..., {str.upper}, )) == ['STR'], \
'Function in set should be a transformation'
assert traverse_obj(_TEST_DATA, (..., {str})) == ['str'], \
'Type in set should be a type filter'
assert traverse_obj(_TEST_DATA, (..., {str, int})) == [100, 'str'], \
'Multiple types in set should be a type filter'
assert traverse_obj(_TEST_DATA, {dict}) == _TEST_DATA, \
'A single set should be wrapped into a path'
assert traverse_obj(_TEST_DATA, (..., {str.upper})) == ['STR'], \
'Transformation function should not raise'
expected = [x for x in map(str_or_none, _TEST_DATA.values()) if x is not None]
assert traverse_obj(_TEST_DATA, (..., {str_or_none})) == expected, \
'Function in set should be a transformation'
assert traverse_obj(_TEST_DATA, ('fail', {lambda _: 'const'})) == 'const', \
'Function in set should always be called'
# Sets that are empty, or that have multiple members which are not all types, should raise
with pytest.raises(Exception):
traverse_obj(_TEST_DATA, set())
with pytest.raises(Exception):
traverse_obj(_TEST_DATA, {str.upper, str})
def test_traversal_slice(self):
_SLICE_DATA = [0, 1, 2, 3, 4]
assert traverse_obj(_TEST_DATA, ('dict', slice(1))) is None, \
'slice on a dictionary should not throw'
assert traverse_obj(_SLICE_DATA, slice(1)) == _SLICE_DATA[:1], \
'slice key should apply slice to sequence'
assert traverse_obj(_SLICE_DATA, slice(1, 2)) == _SLICE_DATA[1:2], \
'slice key should apply slice to sequence'
assert traverse_obj(_SLICE_DATA, slice(1, 4, 2)) == _SLICE_DATA[1:4:2], \
'slice key should apply slice to sequence'
def test_traversal_alternatives(self):
assert traverse_obj(_TEST_DATA, 'fail', 'str') == 'str', \
'multiple `paths` should be treated as alternative paths'
assert traverse_obj(_TEST_DATA, 'str', 100) == 'str', \
'alternatives should exit early'
assert traverse_obj(_TEST_DATA, 'fail', 'fail') is None, \
'alternatives should return `default` if exhausted'
assert traverse_obj(_TEST_DATA, (..., 'fail'), 100) == 100, \
'alternatives should track their own branching return'
assert traverse_obj(_TEST_DATA, ('dict', ...), ('data', ...)) == list(_TEST_DATA['data']), \
'alternatives on empty objects should search further'
def test_traversal_branching_nesting(self):
assert traverse_obj(_TEST_DATA, ('urls', (3, 0), 'url')) == ['https://www.example.com/0'], \
'tuple as key should be treated as branches'
assert traverse_obj(_TEST_DATA, ('urls', [3, 0], 'url')) == ['https://www.example.com/0'], \
'list as key should be treated as branches'
assert traverse_obj(_TEST_DATA, ('urls', ((1, 'fail'), (0, 'url')))) == ['https://www.example.com/0'], \
'double nesting in path should be treated as paths'
assert traverse_obj(['0', [1, 2]], [(0, 1), 0]) == [1], \
'do not fail early on branching'
expected = ['https://www.example.com/0', 'https://www.example.com/1']
assert traverse_obj(_TEST_DATA, ('urls', ((0, ('fail', 'url')), (1, 'url')))) == expected, \
'triple nesting in path should be treated as branches'
assert traverse_obj(_TEST_DATA, ('urls', ('fail', (..., 'url')))) == expected, \
'ellipsis as branch path start gets flattened'
def test_traversal_dict(self):
assert traverse_obj(_TEST_DATA, {0: 100, 1: 1.2}) == {0: 100, 1: 1.2}, \
'dict key should result in a dict with the same keys'
expected = {0: 'https://www.example.com/0'}
assert traverse_obj(_TEST_DATA, {0: ('urls', 0, 'url')}) == expected, \
'dict key should allow paths'
expected = {0: ['https://www.example.com/0']}
assert traverse_obj(_TEST_DATA, {0: ('urls', (3, 0), 'url')}) == expected, \
'tuple in dict path should be treated as branches'
assert traverse_obj(_TEST_DATA, {0: ('urls', ((1, 'fail'), (0, 'url')))}) == expected, \
'double nesting in dict path should be treated as paths'
expected = {0: ['https://www.example.com/1', 'https://www.example.com/0']}
assert traverse_obj(_TEST_DATA, {0: ('urls', ((1, ('fail', 'url')), (0, 'url')))}) == expected, \
'triple nesting in dict path should be treated as branches'
assert traverse_obj(_TEST_DATA, {0: 'fail'}) == {}, \
'remove `None` values when top level dict key fails'
assert traverse_obj(_TEST_DATA, {0: 'fail'}, default=...) == {0: ...}, \
'use `default` if key fails and `default`'
assert traverse_obj(_TEST_DATA, {0: 'dict'}) == {}, \
'remove empty values when dict key'
assert traverse_obj(_TEST_DATA, {0: 'dict'}, default=...) == {0: ...}, \
'use `default` when dict key and `default`'
assert traverse_obj(_TEST_DATA, {0: {0: 'fail'}}) == {}, \
'remove empty values when nested dict key fails'
assert traverse_obj(None, {0: 'fail'}) == {}, \
'default to dict if pruned'
assert traverse_obj(None, {0: 'fail'}, default=...) == {0: ...}, \
'default to dict if pruned and default is given'
assert traverse_obj(_TEST_DATA, {0: {0: 'fail'}}, default=...) == {0: {0: ...}}, \
'use nested `default` when nested dict key fails and `default`'
assert traverse_obj(_TEST_DATA, {0: ('dict', ...)}) == {}, \
'remove key if branch in dict key not successful'
def test_traversal_default(self):
_DEFAULT_DATA = {'None': None, 'int': 0, 'list': []}
assert traverse_obj(_DEFAULT_DATA, 'fail') is None, \
'default value should be `None`'
assert traverse_obj(_DEFAULT_DATA, 'fail', 'fail', default=...) == ..., \
'chained fails should result in default'
assert traverse_obj(_DEFAULT_DATA, 'None', 'int') == 0, \
'should not short-circuit on `None`'
assert traverse_obj(_DEFAULT_DATA, 'fail', default=1) == 1, \
'invalid dict key should result in `default`'
assert traverse_obj(_DEFAULT_DATA, 'None', default=1) == 1, \
'`None` is a deliberate sentinel and should become `default`'
assert traverse_obj(_DEFAULT_DATA, ('list', 10)) is None, \
'`IndexError` should result in `default`'
assert traverse_obj(_DEFAULT_DATA, (..., 'fail'), default=1) == 1, \
'if branched but not successful return `default` if defined, not `[]`'
assert traverse_obj(_DEFAULT_DATA, (..., 'fail'), default=None) is None, \
'if branched but not successful return `default` even if `default` is `None`'
assert traverse_obj(_DEFAULT_DATA, (..., 'fail')) == [], \
'if branched but not successful return `[]`, not `default`'
assert traverse_obj(_DEFAULT_DATA, ('list', ...)) == [], \
'if branched but object is empty return `[]`, not `default`'
assert traverse_obj(None, ...) == [], \
'if branched but object is `None` return `[]`, not `default`'
assert traverse_obj({0: None}, (0, ...)) == [], \
'if branched but state is `None` return `[]`, not `default`'
@pytest.mark.parametrize('path', [
('fail', ...),
(..., 'fail'),
100 * ('fail',) + (...,),
(...,) + 100 * ('fail',),
])
def test_traversal_branching(self, path):
assert traverse_obj({}, path) == [], \
'if branched but state is `None`, return `[]` (not `default`)'
assert traverse_obj({}, 'fail', path) == [], \
'if branching in last alternative and previous did not match, return `[]` (not `default`)'
assert traverse_obj({0: 'x'}, 0, path) == 'x', \
'if branching in last alternative and previous did match, return single value'
assert traverse_obj({0: 'x'}, path, 0) == 'x', \
'if branching in first alternative and non-branching path does match, return single value'
assert traverse_obj({}, path, 'fail') is None, \
'if branching in first alternative and non-branching path does not match, return `default`'
def test_traversal_expected_type(self):
_EXPECTED_TYPE_DATA = {'str': 'str', 'int': 0}
assert traverse_obj(_EXPECTED_TYPE_DATA, 'str', expected_type=str) == 'str', \
'accept matching `expected_type` type'
assert traverse_obj(_EXPECTED_TYPE_DATA, 'str', expected_type=int) is None, \
'reject non matching `expected_type` type'
assert traverse_obj(_EXPECTED_TYPE_DATA, 'int', expected_type=lambda x: str(x)) == '0', \
'transform type using type function'
assert traverse_obj(_EXPECTED_TYPE_DATA, 'str', expected_type=lambda _: 1 / 0) is None, \
'wrap expected_type function in try_call'
assert traverse_obj(_EXPECTED_TYPE_DATA, ..., expected_type=str) == ['str'], \
'eliminate items that expected_type fails on'
assert traverse_obj(_TEST_DATA, {0: 100, 1: 1.2}, expected_type=int) == {0: 100}, \
'type as expected_type should filter dict values'
assert traverse_obj(_TEST_DATA, {0: 100, 1: 1.2, 2: 'None'}, expected_type=str_or_none) == {0: '100', 1: '1.2'}, \
'function as expected_type should transform dict values'
assert traverse_obj(_TEST_DATA, ({0: 1.2}, 0, {int_or_none}), expected_type=int) == 1, \
'expected_type should not filter non final dict values'
assert traverse_obj(_TEST_DATA, {0: {0: 100, 1: 'str'}}, expected_type=int) == {0: {0: 100}}, \
'expected_type should transform deep dict values'
assert traverse_obj(_TEST_DATA, [({0: '...'}, {0: '...'})], expected_type=type(...)) == [{0: ...}, {0: ...}], \
'expected_type should transform branched dict values'
assert traverse_obj({1: {3: 4}}, [(1, 2), 3], expected_type=int) == [4], \
'expected_type regression for type matching in tuple branching'
assert traverse_obj(_TEST_DATA, ['data', ...], expected_type=int) == [], \
'expected_type regression for type matching in dict result'
def test_traversal_get_all(self):
_GET_ALL_DATA = {'key': [0, 1, 2]}
assert traverse_obj(_GET_ALL_DATA, ('key', ...), get_all=False) == 0, \
'if not `get_all`, return only first matching value'
assert traverse_obj(_GET_ALL_DATA, ..., get_all=False) == [0, 1, 2], \
'do not overflatten if not `get_all`'
def test_traversal_casesense(self):
_CASESENSE_DATA = {
'KeY': 'value0',
0: {
'KeY': 'value1',
0: {'KeY': 'value2'},
},
}
assert traverse_obj(_CASESENSE_DATA, 'key') is None, \
'dict keys should be case sensitive unless `casesense`'
assert traverse_obj(_CASESENSE_DATA, 'keY', casesense=False) == 'value0', \
'allow non matching key case if `casesense`'
assert traverse_obj(_CASESENSE_DATA, [0, ('keY',)], casesense=False) == ['value1'], \
'allow non matching key case in branch if `casesense`'
assert traverse_obj(_CASESENSE_DATA, [0, ([0, 'keY'],)], casesense=False) == ['value2'], \
'allow non matching key case in branch path if `casesense`'
def test_traversal_traverse_string(self):
_TRAVERSE_STRING_DATA = {'str': 'str', 1.2: 1.2}
assert traverse_obj(_TRAVERSE_STRING_DATA, ('str', 0)) is None, \
'do not traverse into string if not `traverse_string`'
assert traverse_obj(_TRAVERSE_STRING_DATA, ('str', 0), traverse_string=True) == 's', \
'traverse into string if `traverse_string`'
assert traverse_obj(_TRAVERSE_STRING_DATA, (1.2, 1), traverse_string=True) == '.', \
'traverse into converted data if `traverse_string`'
assert traverse_obj(_TRAVERSE_STRING_DATA, ('str', ...), traverse_string=True) == 'str', \
'`...` should result in string (same value) if `traverse_string`'
assert traverse_obj(_TRAVERSE_STRING_DATA, ('str', slice(0, None, 2)), traverse_string=True) == 'sr', \
'`slice` should result in string if `traverse_string`'
assert traverse_obj(_TRAVERSE_STRING_DATA, ('str', lambda i, v: i or v == "s"), traverse_string=True) == 'str', \
'function should result in string if `traverse_string`'
assert traverse_obj(_TRAVERSE_STRING_DATA, ('str', (0, 2)), traverse_string=True) == ['s', 'r'], \
'branching should result in list if `traverse_string`'
assert traverse_obj({}, (0, ...), traverse_string=True) == [], \
'branching should result in list if `traverse_string`'
assert traverse_obj({}, (0, lambda x, y: True), traverse_string=True) == [], \
'branching should result in list if `traverse_string`'
assert traverse_obj({}, (0, slice(1)), traverse_string=True) == [], \
'branching should result in list if `traverse_string`'
def test_traversal_re(self):
mobj = re.fullmatch(r'0(12)(?P<group>3)(4)?', '0123')
assert traverse_obj(mobj, ...) == [x for x in mobj.groups() if x is not None], \
'`...` on a `re.Match` should give its `groups()`'
assert traverse_obj(mobj, lambda k, _: k in (0, 2)) == ['0123', '3'], \
'function on a `re.Match` should give groupno, value starting at 0'
assert traverse_obj(mobj, 'group') == '3', \
'str key on a `re.Match` should give group with that name'
assert traverse_obj(mobj, 2) == '3', \
'int key on a `re.Match` should give the group at that index'
assert traverse_obj(mobj, 'gRoUp', casesense=False) == '3', \
'str key on a `re.Match` should respect casesense'
assert traverse_obj(mobj, 'fail') is None, \
'failing str key on a `re.Match` should return `default`'
assert traverse_obj(mobj, 'gRoUpS', casesense=False) is None, \
'failing str key on a `re.Match` should return `default`'
assert traverse_obj(mobj, 8) is None, \
'failing int key on a `re.Match` should return `default`'
assert traverse_obj(mobj, lambda k, _: k in (0, 'group')) == ['0123', '3'], \
'function on a `re.Match` should give group name as well'
def test_traversal_xml_etree(self):
etree = xml.etree.ElementTree.fromstring('''<?xml version="1.0"?>
<data>
<country name="Liechtenstein">
<rank>1</rank>
<year>2008</year>
<gdppc>141100</gdppc>
<neighbor name="Austria" direction="E"/>
<neighbor name="Switzerland" direction="W"/>
</country>
<country name="Singapore">
<rank>4</rank>
<year>2011</year>
<gdppc>59900</gdppc>
<neighbor name="Malaysia" direction="N"/>
</country>
<country name="Panama">
<rank>68</rank>
<year>2011</year>
<gdppc>13600</gdppc>
<neighbor name="Costa Rica" direction="W"/>
<neighbor name="Colombia" direction="E"/>
</country>
</data>''')
assert traverse_obj(etree, '') == etree, \
'empty str key should return the element itself'
assert traverse_obj(etree, 'country') == list(etree), \
'str key should yield all children with that tag name'
assert traverse_obj(etree, ...) == list(etree), \
'`...` as key should return all children'
assert traverse_obj(etree, lambda _, x: x[0].text == '4') == [etree[1]], \
'function as key should get element as value'
assert traverse_obj(etree, lambda i, _: i == 1) == [etree[1]], \
'function as key should get index as key'
assert traverse_obj(etree, 0) == etree[0], \
'int key should return the nth child'
expected = ['Austria', 'Switzerland', 'Malaysia', 'Costa Rica', 'Colombia']
assert traverse_obj(etree, './/neighbor/@name') == expected, \
'`@<attribute>` at end of path should give that attribute'
assert traverse_obj(etree, '//neighbor/@fail') == [None, None, None, None, None], \
'`@<nonexistent>` at end of path should give `None`'
assert traverse_obj(etree, ('//neighbor/@', 2)) == {'name': 'Malaysia', 'direction': 'N'}, \
'`@` should give the full attribute dict'
assert traverse_obj(etree, '//year/text()') == ['2008', '2011', '2011'], \
'`text()` at end of path should give the inner text'
assert traverse_obj(etree, '//*[@direction]/@direction') == ['E', 'W', 'N', 'W', 'E'], \
'full Python xpath features should be supported'
assert traverse_obj(etree, (0, '@name')) == 'Liechtenstein', \
'special transformations should act on current element'
assert traverse_obj(etree, ('country', 0, ..., 'text()', {int_or_none})) == [1, 2008, 141100], \
'special transformations should act on current element'
def test_traversal_unbranching(self):
assert traverse_obj(_TEST_DATA, [(100, 1.2), all]) == [100, 1.2], \
'`all` should give all results as list'
assert traverse_obj(_TEST_DATA, [(100, 1.2), any]) == 100, \
'`any` should give the first result'
assert traverse_obj(_TEST_DATA, [100, all]) == [100], \
'`all` should give list if non branching'
assert traverse_obj(_TEST_DATA, [100, any]) == 100, \
'`any` should give single item if non branching'
assert traverse_obj(_TEST_DATA, [('dict', 'None', 100), all]) == [100], \
'`all` should filter `None` and empty dict'
assert traverse_obj(_TEST_DATA, [('dict', 'None', 100), any]) == 100, \
'`any` should filter `None` and empty dict'
assert traverse_obj(_TEST_DATA, [{
'all': [('dict', 'None', 100, 1.2), all],
'any': [('dict', 'None', 100, 1.2), any],
}]) == {'all': [100, 1.2], 'any': 100}, \
'`all`/`any` should apply to each dict path separately'
assert traverse_obj(_TEST_DATA, [{
'all': [('dict', 'None', 100, 1.2), all],
'any': [('dict', 'None', 100, 1.2), any],
}], get_all=False) == {'all': [100, 1.2], 'any': 100}, \
'`all`/`any` should apply to dict regardless of `get_all`'
assert traverse_obj(_TEST_DATA, [('dict', 'None', 100, 1.2), all, {float}]) is None, \
'`all` should reset branching status'
assert traverse_obj(_TEST_DATA, [('dict', 'None', 100, 1.2), any, {float}]) is None, \
'`any` should reset branching status'
assert traverse_obj(_TEST_DATA, [('dict', 'None', 100, 1.2), all, ..., {float}]) == [1.2], \
'`all` should allow further branching'
assert traverse_obj(_TEST_DATA, [('dict', 'None', 'urls', 'data'), any, ..., 'index']) == [0, 1], \
'`any` should allow further branching'
def test_traversal_morsel(self):
values = {
'expires': 'a',
'path': 'b',
'comment': 'c',
'domain': 'd',
'max-age': 'e',
'secure': 'f',
'httponly': 'g',
'version': 'h',
'samesite': 'i',
}
morsel = http.cookies.Morsel()
morsel.set('item_key', 'item_value', 'coded_value')
morsel.update(values)
values['key'] = 'item_key'
values['value'] = 'item_value'
for key, value in values.items():
assert traverse_obj(morsel, key) == value, \
'Morsel should provide access to all values'
assert traverse_obj(morsel, ...) == list(values.values()), \
'`...` should yield all values'
assert traverse_obj(morsel, lambda k, v: True) == list(values.values()), \
'function key should yield all values'
assert traverse_obj(morsel, [(None,), any]) == morsel, \
'Morsel should not be implicitly changed to dict on usage'
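# Extractor-style sketch of the behaviors asserted above, on a hypothetical
# `response` payload (`url_or_none` is an existing yt_dlp.utils helper):
from yt_dlp.utils import url_or_none
from yt_dlp.utils.traversal import traverse_obj

response = {'items': [{'id': 1, 'url': 'https://example.com/1'}, {'id': None}]}
ids = traverse_obj(response, ('items', ..., 'id', {int}))  # == [1]
first_url = traverse_obj(response, ('items', ..., 'url', {url_or_none}, any))
# first_url == 'https://example.com/1'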
class TestDictGet:
def test_dict_get(self):
FALSE_VALUES = {
'none': None,
'false': False,
'zero': 0,
'empty_string': '',
'empty_list': [],
}
d = {**FALSE_VALUES, 'a': 42}
assert dict_get(d, 'a') == 42
assert dict_get(d, 'b') is None
assert dict_get(d, 'b', 42) == 42
assert dict_get(d, ('a',)) == 42
assert dict_get(d, ('b', 'a')) == 42
assert dict_get(d, ('b', 'c', 'a', 'd')) == 42
assert dict_get(d, ('b', 'c')) is None
assert dict_get(d, ('b', 'c'), 42) == 42
for key, false_value in FALSE_VALUES.items():
assert dict_get(d, ('b', 'c', key)) is None
assert dict_get(d, ('b', 'c', key), skip_false_values=False) == false_value
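# Note: as asserted above, dict_get skips falsy-but-present values by default;
# for this data the same lookup can be written as traverse_obj alternatives:
#     dict_get(d, ('b', 'a')) == traverse_obj(d, 'b', 'a') == 42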

test/test_update.py (new file)

@ -0,0 +1,228 @@
#!/usr/bin/env python3
# Allow direct execution
import os
import sys
import unittest
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from test.helper import FakeYDL, report_warning
from yt_dlp.update import UpdateInfo, Updater
# XXX: Keep in sync with yt_dlp.update.UPDATE_SOURCES
TEST_UPDATE_SOURCES = {
'stable': 'yt-dlp/yt-dlp',
'nightly': 'yt-dlp/yt-dlp-nightly-builds',
'master': 'yt-dlp/yt-dlp-master-builds',
}
TEST_API_DATA = {
'yt-dlp/yt-dlp/latest': {
'tag_name': '2023.12.31',
'target_commitish': 'bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb',
'name': 'yt-dlp 2023.12.31',
'body': 'BODY',
},
'yt-dlp/yt-dlp-nightly-builds/latest': {
'tag_name': '2023.12.31.123456',
'target_commitish': 'master',
'name': 'yt-dlp nightly 2023.12.31.123456',
'body': 'Generated from: https://github.com/yt-dlp/yt-dlp/commit/cccccccccccccccccccccccccccccccccccccccc',
},
'yt-dlp/yt-dlp-master-builds/latest': {
'tag_name': '2023.12.31.987654',
'target_commitish': 'master',
'name': 'yt-dlp master 2023.12.31.987654',
'body': 'Generated from: https://github.com/yt-dlp/yt-dlp/commit/dddddddddddddddddddddddddddddddddddddddd',
},
'yt-dlp/yt-dlp/tags/testing': {
'tag_name': 'testing',
'target_commitish': '9999999999999999999999999999999999999999',
'name': 'testing',
'body': 'BODY',
},
'fork/yt-dlp/latest': {
'tag_name': '2050.12.31',
'target_commitish': 'eeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee',
'name': '2050.12.31',
'body': 'BODY',
},
'fork/yt-dlp/tags/pr0000': {
'tag_name': 'pr0000',
'target_commitish': 'ffffffffffffffffffffffffffffffffffffffff',
'name': 'pr1234 2023.11.11.000000',
'body': 'BODY',
},
'fork/yt-dlp/tags/pr1234': {
'tag_name': 'pr1234',
'target_commitish': '0000000000000000000000000000000000000000',
'name': 'pr1234 2023.12.31.555555',
'body': 'BODY',
},
'fork/yt-dlp/tags/pr9999': {
'tag_name': 'pr9999',
'target_commitish': '1111111111111111111111111111111111111111',
'name': 'pr9999',
'body': 'BODY',
},
'fork/yt-dlp-satellite/tags/pr987': {
'tag_name': 'pr987',
'target_commitish': 'master',
'name': 'pr987',
'body': 'Generated from: https://github.com/yt-dlp/yt-dlp/commit/2222222222222222222222222222222222222222',
},
}
TEST_LOCKFILE_COMMENT = '# This file is used for regulating self-update'
TEST_LOCKFILE_V1 = r'''%s
lock 2022.08.18.36 .+ Python 3\.6
lock 2023.11.16 (?!win_x86_exe).+ Python 3\.7
lock 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
''' % TEST_LOCKFILE_COMMENT
TEST_LOCKFILE_V2_TMPL = r'''%s
lockV2 yt-dlp/yt-dlp 2022.08.18.36 .+ Python 3\.6
lockV2 yt-dlp/yt-dlp 2023.11.16 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 win_x86_exe .+ Windows-(?:Vista|2008Server)
'''
TEST_LOCKFILE_V2 = TEST_LOCKFILE_V2_TMPL % TEST_LOCKFILE_COMMENT
TEST_LOCKFILE_ACTUAL = TEST_LOCKFILE_V2_TMPL % TEST_LOCKFILE_V1.rstrip('\n')
TEST_LOCKFILE_FORK = r'''%s# Test if a fork blocks updates to non-numeric tags
lockV2 fork/yt-dlp pr0000 .+ Python 3.6
lockV2 fork/yt-dlp pr1234 (?!win_x86_exe).+ Python 3\.7
lockV2 fork/yt-dlp pr1234 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 fork/yt-dlp pr9999 .+ Python 3.11
''' % TEST_LOCKFILE_ACTUAL
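# Hedged sketch of the lockV2 semantics implied by the data above (format:
# "lockV2 <repo> <last_allowed_version> <identifier regex>"; the real parsing
# lives in yt_dlp.update and may differ):
import re

def find_lock(spec, repo, identifier):
    for line in spec.splitlines():
        parts = line.split(maxsplit=3)
        if (len(parts) == 4 and parts[0] == 'lockV2'
                and parts[1] == repo and re.match(parts[3], identifier)):
            return parts[2]  # the newest version this build may update to
    return None  # unlocked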
class FakeUpdater(Updater):
current_version = '2022.01.01'
current_commit = 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa'
_channel = 'stable'
_origin = 'yt-dlp/yt-dlp'
_update_sources = TEST_UPDATE_SOURCES
def _download_update_spec(self, *args, **kwargs):
return TEST_LOCKFILE_ACTUAL
def _call_api(self, tag):
tag = f'tags/{tag}' if tag != 'latest' else tag
return TEST_API_DATA[f'{self.requested_repo}/{tag}']
def _report_error(self, msg, *args, **kwargs):
report_warning(msg)
class TestUpdate(unittest.TestCase):
maxDiff = None
def test_update_spec(self):
ydl = FakeYDL()
updater = FakeUpdater(ydl, 'stable')
def test(lockfile, identifier, input_tag, expect_tag, exact=False, repo='yt-dlp/yt-dlp'):
updater._identifier = identifier
updater._exact = exact
updater.requested_repo = repo
result = updater._process_update_spec(lockfile, input_tag)
self.assertEqual(
result, expect_tag,
f'{identifier!r} requesting {repo}@{input_tag} (exact={exact}) '
f'returned {result!r} instead of {expect_tag!r}')
for lockfile in (TEST_LOCKFILE_V1, TEST_LOCKFILE_V2, TEST_LOCKFILE_ACTUAL, TEST_LOCKFILE_FORK):
# Normal operation
test(lockfile, 'zip Python 3.12.0', '2023.12.31', '2023.12.31')
test(lockfile, 'zip stable Python 3.12.0', '2023.12.31', '2023.12.31', exact=True)
# Python 3.6 --update should update only to its lock
test(lockfile, 'zip Python 3.6.0', '2023.11.16', '2022.08.18.36')
# --update-to an exact version later than the lock should return None
test(lockfile, 'zip stable Python 3.6.0', '2023.11.16', None, exact=True)
# Python 3.7 should be able to update to its lock
test(lockfile, 'zip Python 3.7.0', '2023.11.16', '2023.11.16')
test(lockfile, 'zip stable Python 3.7.1', '2023.11.16', '2023.11.16', exact=True)
# Non-win_x86_exe builds on py3.7 must be locked
test(lockfile, 'zip Python 3.7.1', '2023.12.31', '2023.11.16')
test(lockfile, 'zip stable Python 3.7.1', '2023.12.31', None, exact=True)
test( # Windows Vista w/ win_x86_exe must be locked
lockfile, 'win_x86_exe stable Python 3.7.9 (CPython x86 32bit) - Windows-Vista-6.0.6003-SP2',
'2023.12.31', '2023.11.16')
test( # Windows 2008Server w/ win_x86_exe must be locked
lockfile, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-2008Server',
'2023.12.31', None, exact=True)
test( # Windows 7 w/ win_x86_exe py3.7 build should be able to update beyond lock
lockfile, 'win_x86_exe stable Python 3.7.9 (CPython x86 32bit) - Windows-7-6.1.7601-SP1',
'2023.12.31', '2023.12.31')
test( # Windows 8.1 w/ '2008Server' in platform string should be able to update beyond lock
lockfile, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-post2008Server-6.2.9200',
'2023.12.31', '2023.12.31', exact=True)
# Forks can block updates to non-numeric tags rather than lock
test(TEST_LOCKFILE_FORK, 'zip Python 3.6.3', 'pr0000', None, repo='fork/yt-dlp')
test(TEST_LOCKFILE_FORK, 'zip stable Python 3.7.4', 'pr0000', 'pr0000', repo='fork/yt-dlp')
test(TEST_LOCKFILE_FORK, 'zip stable Python 3.7.4', 'pr1234', None, repo='fork/yt-dlp')
test(TEST_LOCKFILE_FORK, 'zip Python 3.8.1', 'pr1234', 'pr1234', repo='fork/yt-dlp', exact=True)
test(
TEST_LOCKFILE_FORK, 'win_x86_exe stable Python 3.7.9 (CPython x86 32bit) - Windows-Vista-6.0.6003-SP2',
'pr1234', None, repo='fork/yt-dlp')
test(
TEST_LOCKFILE_FORK, 'win_x86_exe stable Python 3.7.9 (CPython x86 32bit) - Windows-7-6.1.7601-SP1',
'2023.12.31', '2023.12.31', repo='fork/yt-dlp')
test(TEST_LOCKFILE_FORK, 'zip Python 3.11.2', 'pr9999', None, repo='fork/yt-dlp', exact=True)
test(TEST_LOCKFILE_FORK, 'zip stable Python 3.12.0', 'pr9999', 'pr9999', repo='fork/yt-dlp')
def test_query_update(self):
ydl = FakeYDL()
def test(target, expected, current_version=None, current_commit=None, identifier=None):
updater = FakeUpdater(ydl, target)
if current_version:
updater.current_version = current_version
if current_commit:
updater.current_commit = current_commit
updater._identifier = identifier or 'zip'
update_info = updater.query_update(_output=True)
self.assertDictEqual(
update_info.__dict__ if update_info else {}, expected.__dict__ if expected else {})
test('yt-dlp/yt-dlp@latest', UpdateInfo(
'2023.12.31', version='2023.12.31', requested_version='2023.12.31', commit='b' * 40))
test('yt-dlp/yt-dlp-nightly-builds@latest', UpdateInfo(
'2023.12.31.123456', version='2023.12.31.123456', requested_version='2023.12.31.123456', commit='c' * 40))
test('yt-dlp/yt-dlp-master-builds@latest', UpdateInfo(
'2023.12.31.987654', version='2023.12.31.987654', requested_version='2023.12.31.987654', commit='d' * 40))
test('fork/yt-dlp@latest', UpdateInfo(
'2050.12.31', version='2050.12.31', requested_version='2050.12.31', commit='e' * 40))
test('fork/yt-dlp@pr0000', UpdateInfo(
'pr0000', version='2023.11.11.000000', requested_version='2023.11.11.000000', commit='f' * 40))
test('fork/yt-dlp@pr1234', UpdateInfo(
'pr1234', version='2023.12.31.555555', requested_version='2023.12.31.555555', commit='0' * 40))
test('fork/yt-dlp@pr9999', UpdateInfo(
'pr9999', version=None, requested_version=None, commit='1' * 40))
test('fork/yt-dlp-satellite@pr987', UpdateInfo(
'pr987', version=None, requested_version=None, commit='2' * 40))
test('yt-dlp/yt-dlp', None, current_version='2024.01.01')
test('stable', UpdateInfo(
'2023.12.31', version='2023.12.31', requested_version='2023.12.31', commit='b' * 40))
test('nightly', UpdateInfo(
'2023.12.31.123456', version='2023.12.31.123456', requested_version='2023.12.31.123456', commit='c' * 40))
test('master', UpdateInfo(
'2023.12.31.987654', version='2023.12.31.987654', requested_version='2023.12.31.987654', commit='d' * 40))
test('testing', None, current_commit='9' * 40)
test('testing', UpdateInfo('testing', commit='9' * 40))
if __name__ == '__main__':
unittest.main()
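# For reference, the targets exercised by test_query_update map onto the CLI's
# --update-to option, e.g.:
#     yt-dlp --update-to nightly
#     yt-dlp --update-to fork/yt-dlp@pr1234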


@ -1,30 +0,0 @@
#!/usr/bin/env python3
# Allow direct execution
import os
import sys
import unittest
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import json
from yt_dlp.update import rsa_verify
class TestUpdate(unittest.TestCase):
def test_rsa_verify(self):
UPDATES_RSA_KEY = (0x9d60ee4d8f805312fdb15a62f87b95bd66177b91df176765d13514a0f1754bcd2057295c5b6f1d35daa6742c3ffc9a82d3e118861c207995a8031e151d863c9927e304576bc80692bc8e094896fcf11b66f3e29e04e3a71e9a11558558acea1840aec37fc396fb6b65dc81a1c4144e03bd1c011de62e3f1357b327d08426fe93, 65537)
with open(os.path.join(os.path.dirname(os.path.abspath(__file__)), 'versions.json'), 'rb') as f:
versions_info = f.read().decode()
versions_info = json.loads(versions_info)
signature = versions_info['signature']
del versions_info['signature']
self.assertTrue(rsa_verify(
json.dumps(versions_info, sort_keys=True).encode(),
signature, UPDATES_RSA_KEY))
if __name__ == '__main__':
unittest.main()

test/test_utils.py

@ -2,9 +2,9 @@
# Allow direct execution
import os
import re
import sys
import unittest
import warnings
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
@ -13,6 +13,7 @@ import contextlib
import io
import itertools
import json
import subprocess
import xml.etree.ElementTree
from yt_dlp.compat import (
@ -27,6 +28,7 @@ from yt_dlp.utils import (
InAdvancePagedList,
LazyList,
OnDemandPagedList,
Popen,
age_restricted,
args_to_str,
base_url,
@ -42,14 +44,12 @@ from yt_dlp.utils import (
determine_ext,
determine_file_encoding,
dfxp2srt,
dict_get,
encode_base_n,
encode_compat_str,
encodeFilename,
escape_rfc3986,
escape_url,
expand_path,
extract_attributes,
extract_basic_auth,
find_xpath_attr,
fix_xml_ampersands,
float_or_none,
@ -102,16 +102,14 @@ from yt_dlp.utils import (
sanitize_filename,
sanitize_path,
sanitize_url,
sanitized_Request,
shell_quote,
smuggle_url,
str_or_none,
str_to_int,
strip_jsonp,
strip_or_none,
subtitles_filename,
timeconvert,
traverse_obj,
try_call,
unescapeHTML,
unified_strdate,
unified_timestamp,
@ -123,12 +121,19 @@ from yt_dlp.utils import (
urlencode_postdata,
urljoin,
urshift,
variadic,
version_tuple,
xpath_attr,
xpath_element,
xpath_text,
xpath_with_ns,
)
from yt_dlp.utils.networking import (
HTTPHeaderDict,
escape_rfc3986,
normalize_url,
remove_dot_segments,
)
class TestUtil(unittest.TestCase):
@ -255,15 +260,6 @@ class TestUtil(unittest.TestCase):
self.assertEqual(sanitize_url('https://foo.bar'), 'https://foo.bar')
self.assertEqual(sanitize_url('foo bar'), 'foo bar')
def test_extract_basic_auth(self):
auth_header = lambda url: sanitized_Request(url).get_header('Authorization')
self.assertFalse(auth_header('http://foo.bar'))
self.assertFalse(auth_header('http://:foo.bar'))
self.assertEqual(auth_header('http://@foo.bar'), 'Basic Og==')
self.assertEqual(auth_header('http://:pass@foo.bar'), 'Basic OnBhc3M=')
self.assertEqual(auth_header('http://user:@foo.bar'), 'Basic dXNlcjo=')
self.assertEqual(auth_header('http://user:pass@foo.bar'), 'Basic dXNlcjpwYXNz')
def test_expand_path(self):
def env(var):
return f'%{var}%' if sys.platform == 'win32' else f'${var}'
@ -660,6 +656,8 @@ class TestUtil(unittest.TestCase):
self.assertEqual(parse_duration('P0Y0M0DT0H4M20.880S'), 260.88)
self.assertEqual(parse_duration('01:02:03:050'), 3723.05)
self.assertEqual(parse_duration('103:050'), 103.05)
self.assertEqual(parse_duration('1HR 3MIN'), 3780)
self.assertEqual(parse_duration('2hrs 3mins'), 7380)
def test_fix_xml_ampersands(self):
self.assertEqual(
@ -753,28 +751,6 @@ class TestUtil(unittest.TestCase):
self.assertRaises(
ValueError, multipart_encode, {b'field': b'value'}, boundary='value')
def test_dict_get(self):
FALSE_VALUES = {
'none': None,
'false': False,
'zero': 0,
'empty_string': '',
'empty_list': [],
}
d = FALSE_VALUES.copy()
d['a'] = 42
self.assertEqual(dict_get(d, 'a'), 42)
self.assertEqual(dict_get(d, 'b'), None)
self.assertEqual(dict_get(d, 'b', 42), 42)
self.assertEqual(dict_get(d, ('a', )), 42)
self.assertEqual(dict_get(d, ('b', 'a', )), 42)
self.assertEqual(dict_get(d, ('b', 'c', 'a', 'd', )), 42)
self.assertEqual(dict_get(d, ('b', 'c', )), None)
self.assertEqual(dict_get(d, ('b', 'c', ), 42), 42)
for key, false_value in FALSE_VALUES.items():
self.assertEqual(dict_get(d, ('b', 'c', key, )), None)
self.assertEqual(dict_get(d, ('b', 'c', key, ), skip_false_values=False), false_value)
def test_merge_dicts(self):
self.assertEqual(merge_dicts({'a': 1}, {'b': 2}), {'a': 1, 'b': 2})
self.assertEqual(merge_dicts({'a': 1}, {'a': 2}), {'a': 1})
@ -936,24 +912,45 @@ class TestUtil(unittest.TestCase):
self.assertEqual(escape_rfc3986('foo bar'), 'foo%20bar')
self.assertEqual(escape_rfc3986('foo%20bar'), 'foo%20bar')
def test_escape_url(self):
def test_normalize_url(self):
self.assertEqual(
escape_url('http://wowza.imust.org/srv/vod/telemb/new/UPLOAD/UPLOAD/20224_IncendieHavré_FD.mp4'),
normalize_url('http://wowza.imust.org/srv/vod/telemb/new/UPLOAD/UPLOAD/20224_IncendieHavré_FD.mp4'),
'http://wowza.imust.org/srv/vod/telemb/new/UPLOAD/UPLOAD/20224_IncendieHavre%CC%81_FD.mp4'
)
self.assertEqual(
escape_url('http://www.ardmediathek.de/tv/Sturm-der-Liebe/Folge-2036-Zu-Mann-und-Frau-erklärt/Das-Erste/Video?documentId=22673108&bcastId=5290'),
normalize_url('http://www.ardmediathek.de/tv/Sturm-der-Liebe/Folge-2036-Zu-Mann-und-Frau-erklärt/Das-Erste/Video?documentId=22673108&bcastId=5290'),
'http://www.ardmediathek.de/tv/Sturm-der-Liebe/Folge-2036-Zu-Mann-und-Frau-erkl%C3%A4rt/Das-Erste/Video?documentId=22673108&bcastId=5290'
)
self.assertEqual(
escape_url('http://тест.рф/фрагмент'),
normalize_url('http://тест.рф/фрагмент'),
'http://xn--e1aybc.xn--p1ai/%D1%84%D1%80%D0%B0%D0%B3%D0%BC%D0%B5%D0%BD%D1%82'
)
self.assertEqual(
escape_url('http://тест.рф/абв?абв=абв#абв'),
normalize_url('http://тест.рф/абв?абв=абв#абв'),
'http://xn--e1aybc.xn--p1ai/%D0%B0%D0%B1%D0%B2?%D0%B0%D0%B1%D0%B2=%D0%B0%D0%B1%D0%B2#%D0%B0%D0%B1%D0%B2'
)
self.assertEqual(escape_url('http://vimeo.com/56015672#at=0'), 'http://vimeo.com/56015672#at=0')
self.assertEqual(normalize_url('http://vimeo.com/56015672#at=0'), 'http://vimeo.com/56015672#at=0')
self.assertEqual(normalize_url('http://www.example.com/../a/b/../c/./d.html'), 'http://www.example.com/a/c/d.html')
def test_remove_dot_segments(self):
self.assertEqual(remove_dot_segments('/a/b/c/./../../g'), '/a/g')
self.assertEqual(remove_dot_segments('mid/content=5/../6'), 'mid/6')
self.assertEqual(remove_dot_segments('/ad/../cd'), '/cd')
self.assertEqual(remove_dot_segments('/ad/../cd/'), '/cd/')
self.assertEqual(remove_dot_segments('/..'), '/')
self.assertEqual(remove_dot_segments('/./'), '/')
self.assertEqual(remove_dot_segments('/./a'), '/a')
self.assertEqual(remove_dot_segments('/abc/./.././d/././e/.././f/./../../ghi'), '/ghi')
self.assertEqual(remove_dot_segments('/'), '/')
self.assertEqual(remove_dot_segments('/t'), '/t')
self.assertEqual(remove_dot_segments('t'), 't')
self.assertEqual(remove_dot_segments(''), '')
self.assertEqual(remove_dot_segments('/../a/b/c'), '/a/b/c')
self.assertEqual(remove_dot_segments('../a'), 'a')
self.assertEqual(remove_dot_segments('./a'), 'a')
self.assertEqual(remove_dot_segments('.'), '')
self.assertEqual(remove_dot_segments('////'), '////')
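# Hedged reference sketch of RFC 3986 section 5.2.4, consistent with every
# expectation above (yt-dlp's real implementation is
# yt_dlp.utils.networking.remove_dot_segments and may differ internally):
def _remove_dot_segments_sketch(path):
    output = []
    while path:
        if path.startswith('../'):
            path = path[3:]
        elif path.startswith('./'):
            path = path[2:]
        elif path.startswith('/./'):
            path = '/' + path[3:]
        elif path == '/.':
            path = '/'
        elif path.startswith('/../'):
            path = '/' + path[4:]
            if output:
                output.pop()
        elif path == '/..':
            path = '/'
            if output:
                output.pop()
        elif path in ('.', '..'):
            path = ''
        else:
            # move the first segment (including its leading '/') to the output
            cut = path.find('/', 1)
            output.append(path if cut == -1 else path[:cut])
            path = '' if cut == -1 else path[cut:]
    return ''.join(output)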
def test_js_to_json_vars_strings(self):
self.assertDictEqual(
@ -1186,10 +1183,28 @@ class TestUtil(unittest.TestCase):
on = js_to_json('\'"\\""\'')
self.assertEqual(json.loads(on), '"""', msg='Unnecessary quote escape should be escaped')
on = js_to_json('[new Date("spam"), \'("eggs")\']')
self.assertEqual(json.loads(on), ['spam', '("eggs")'], msg='Date regex should match a single string')
def test_js_to_json_malformed(self):
self.assertEqual(js_to_json('42a1'), '42"a1"')
self.assertEqual(js_to_json('42a-1'), '42"a"-1')
def test_js_to_json_template_literal(self):
self.assertEqual(js_to_json('`Hello ${name}`', {'name': '"world"'}), '"Hello world"')
self.assertEqual(js_to_json('`${name}${name}`', {'name': '"X"'}), '"XX"')
self.assertEqual(js_to_json('`${name}${name}`', {'name': '5'}), '"55"')
self.assertEqual(js_to_json('`${name}"${name}"`', {'name': '5'}), '"5\\"5\\""')
self.assertEqual(js_to_json('`${name}`', {}), '"name"')
def test_js_to_json_common_constructors(self):
self.assertEqual(json.loads(js_to_json('new Map([["a", 5]])')), {'a': 5})
self.assertEqual(json.loads(js_to_json('Array(5, 10)')), [5, 10])
self.assertEqual(json.loads(js_to_json('new Array(15,5)')), [15, 5])
self.assertEqual(json.loads(js_to_json('new Map([Array(5, 10),new Array(15,5)])')), {'5': 10, '15': 5})
self.assertEqual(json.loads(js_to_json('new Date("123")')), "123")
self.assertEqual(json.loads(js_to_json('new Date(\'2023-10-19\')')), "2023-10-19")
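# Quick illustration (hypothetical snippet) combining behaviors the suite
# covers: unquoted keys, single quotes, and `new Date(...)` values:
#     json.loads(js_to_json("{title: 'demo', date: new Date('2023-10-19')}"))
#     # -> {'title': 'demo', 'date': '2023-10-19'}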
def test_extract_attributes(self):
self.assertEqual(extract_attributes('<e x="y">'), {'x': 'y'})
self.assertEqual(extract_attributes("<e x='y'>"), {'x': 'y'})
@ -1825,6 +1840,8 @@ Line 1
def test_clean_podcast_url(self):
self.assertEqual(clean_podcast_url('https://www.podtrac.com/pts/redirect.mp3/chtbl.com/track/5899E/traffic.megaphone.fm/HSW7835899191.mp3'), 'https://traffic.megaphone.fm/HSW7835899191.mp3')
self.assertEqual(clean_podcast_url('https://play.podtrac.com/npr-344098539/edge1.pod.npr.org/anon.npr-podcasts/podcast/npr/waitwait/2020/10/20201003_waitwait_wwdtmpodcast201003-015621a5-f035-4eca-a9a1-7c118d90bc3c.mp3'), 'https://edge1.pod.npr.org/anon.npr-podcasts/podcast/npr/waitwait/2020/10/20201003_waitwait_wwdtmpodcast201003-015621a5-f035-4eca-a9a1-7c118d90bc3c.mp3')
self.assertEqual(clean_podcast_url('https://pdst.fm/e/2.gum.fm/chtbl.com/track/chrt.fm/track/34D33/pscrb.fm/rss/p/traffic.megaphone.fm/ITLLC7765286967.mp3?updated=1687282661'), 'https://traffic.megaphone.fm/ITLLC7765286967.mp3?updated=1687282661')
self.assertEqual(clean_podcast_url('https://pdst.fm/e/https://mgln.ai/e/441/www.buzzsprout.com/1121972/13019085-ep-252-the-deep-life-stack.mp3'), 'https://www.buzzsprout.com/1121972/13019085-ep-252-the-deep-life-stack.mp3')
def test_LazyList(self):
it = list(range(10))
@ -1967,300 +1984,98 @@ Line 1
self.assertEqual(get_compatible_ext(
vcodecs=['av1'], acodecs=['mp4a'], vexts=['webm'], aexts=['m4a'], preferences=('webm', 'mkv')), 'mkv')
def test_traverse_obj(self):
_TEST_DATA = {
100: 100,
1.2: 1.2,
'str': 'str',
'None': None,
'...': ...,
'urls': [
{'index': 0, 'url': 'https://www.example.com/0'},
{'index': 1, 'url': 'https://www.example.com/1'},
],
'data': (
{'index': 2},
{'index': 3},
),
'dict': {},
}
def test_try_call(self):
def total(*x, **kwargs):
return sum(x) + sum(kwargs.values())
# Test base functionality
self.assertEqual(traverse_obj(_TEST_DATA, ('str',)), 'str',
msg='allow tuple path')
self.assertEqual(traverse_obj(_TEST_DATA, ['str']), 'str',
msg='allow list path')
self.assertEqual(traverse_obj(_TEST_DATA, (value for value in ("str",))), 'str',
msg='allow iterable path')
self.assertEqual(traverse_obj(_TEST_DATA, 'str'), 'str',
msg='single items should be treated as a path')
self.assertEqual(traverse_obj(_TEST_DATA, None), _TEST_DATA)
self.assertEqual(traverse_obj(_TEST_DATA, 100), 100)
self.assertEqual(traverse_obj(_TEST_DATA, 1.2), 1.2)
self.assertEqual(try_call(None), None,
msg='not a fn should give None')
self.assertEqual(try_call(lambda: 1), 1,
msg='int fn with no expected_type should give int')
self.assertEqual(try_call(lambda: 1, expected_type=int), 1,
msg='int fn with expected_type int should give int')
self.assertEqual(try_call(lambda: 1, expected_type=dict), None,
msg='int fn with wrong expected_type should give None')
self.assertEqual(try_call(total, args=(0, 1, 0, ), expected_type=int), 1,
msg='fn should accept arglist')
self.assertEqual(try_call(total, kwargs={'a': 0, 'b': 1, 'c': 0}, expected_type=int), 1,
msg='fn should accept kwargs')
self.assertEqual(try_call(lambda: 1, expected_type=dict), None,
msg='int fn with wrong expected_type should give None')
self.assertEqual(try_call(lambda x: {}, total, args=(42, ), expected_type=int), 42,
msg='expect first int result with expected_type int')
# Test Ellipsis behavior
self.assertCountEqual(traverse_obj(_TEST_DATA, ...),
(item for item in _TEST_DATA.values() if item not in (None, {})),
msg='`...` should give all non discarded values')
self.assertCountEqual(traverse_obj(_TEST_DATA, ('urls', 0, ...)), _TEST_DATA['urls'][0].values(),
msg='`...` selection for dicts should select all values')
self.assertEqual(traverse_obj(_TEST_DATA, (..., ..., 'url')),
['https://www.example.com/0', 'https://www.example.com/1'],
msg='nested `...` queries should work')
self.assertCountEqual(traverse_obj(_TEST_DATA, (..., ..., 'index')), range(4),
msg='`...` query result should be flattened')
def test_variadic(self):
self.assertEqual(variadic(None), (None, ))
self.assertEqual(variadic('spam'), ('spam', ))
self.assertEqual(variadic('spam', allowed_types=dict), 'spam')
with warnings.catch_warnings():
warnings.simplefilter('ignore')
self.assertEqual(variadic('spam', allowed_types=[dict]), 'spam')
# Test function as key
self.assertEqual(traverse_obj(_TEST_DATA, lambda x, y: x == 'urls' and isinstance(y, list)),
[_TEST_DATA['urls']],
msg='function as query key should perform a filter based on (key, value)')
self.assertCountEqual(traverse_obj(_TEST_DATA, lambda _, x: isinstance(x[0], str)), {'str'},
msg='exceptions in the query function should be caught')
if __debug__:
with self.assertRaises(Exception, msg='Wrong function signature should raise in debug'):
traverse_obj(_TEST_DATA, lambda a: ...)
with self.assertRaises(Exception, msg='Wrong function signature should raise in debug'):
traverse_obj(_TEST_DATA, lambda a, b, c: ...)
def test_http_header_dict(self):
headers = HTTPHeaderDict()
headers['ytdl-test'] = b'0'
self.assertEqual(list(headers.items()), [('Ytdl-Test', '0')])
headers['ytdl-test'] = 1
self.assertEqual(list(headers.items()), [('Ytdl-Test', '1')])
headers['Ytdl-test'] = '2'
self.assertEqual(list(headers.items()), [('Ytdl-Test', '2')])
self.assertTrue('ytDl-Test' in headers)
self.assertEqual(str(headers), str(dict(headers)))
self.assertEqual(repr(headers), str(dict(headers)))
# Test set as key (transformation/type, like `expected_type`)
self.assertEqual(traverse_obj(_TEST_DATA, (..., {str.upper}, )), ['STR'],
msg='Function in set should be a transformation')
self.assertEqual(traverse_obj(_TEST_DATA, (..., {str})), ['str'],
msg='Type in set should be a type filter')
self.assertEqual(traverse_obj(_TEST_DATA, {dict}), _TEST_DATA,
msg='A single set should be wrapped into a path')
self.assertEqual(traverse_obj(_TEST_DATA, (..., {str.upper})), ['STR'],
msg='Transformation function should not raise')
self.assertEqual(traverse_obj(_TEST_DATA, (..., {str_or_none})),
[item for item in map(str_or_none, _TEST_DATA.values()) if item is not None],
msg='Function in set should be a transformation')
if __debug__:
with self.assertRaises(Exception, msg='Sets with length != 1 should raise in debug'):
traverse_obj(_TEST_DATA, set())
with self.assertRaises(Exception, msg='Sets with length != 1 should raise in debug'):
traverse_obj(_TEST_DATA, {str.upper, str})
headers.update({'X-dlp': 'data'})
self.assertEqual(set(headers.items()), {('Ytdl-Test', '2'), ('X-Dlp', 'data')})
self.assertEqual(dict(headers), {'Ytdl-Test': '2', 'X-Dlp': 'data'})
self.assertEqual(len(headers), 2)
self.assertEqual(headers.copy(), headers)
headers2 = HTTPHeaderDict({'X-dlp': 'data3'}, **headers, **{'X-dlp': 'data2'})
self.assertEqual(set(headers2.items()), {('Ytdl-Test', '2'), ('X-Dlp', 'data2')})
self.assertEqual(len(headers2), 2)
headers2.clear()
self.assertEqual(len(headers2), 0)
# Test alternative paths
self.assertEqual(traverse_obj(_TEST_DATA, 'fail', 'str'), 'str',
msg='multiple `paths` should be treated as alternative paths')
self.assertEqual(traverse_obj(_TEST_DATA, 'str', 100), 'str',
msg='alternatives should exit early')
self.assertEqual(traverse_obj(_TEST_DATA, 'fail', 'fail'), None,
msg='alternatives should return `default` if exhausted')
self.assertEqual(traverse_obj(_TEST_DATA, (..., 'fail'), 100), 100,
msg='alternatives should track their own branching return')
self.assertEqual(traverse_obj(_TEST_DATA, ('dict', ...), ('data', ...)), list(_TEST_DATA['data']),
msg='alternatives on empty objects should search further')
# ensure we prefer latter headers
headers3 = HTTPHeaderDict({'Ytdl-TeSt': 1}, {'Ytdl-test': 2})
self.assertEqual(set(headers3.items()), {('Ytdl-Test', '2')})
del headers3['ytdl-tesT']
self.assertEqual(dict(headers3), {})
# Test branch and path nesting
self.assertEqual(traverse_obj(_TEST_DATA, ('urls', (3, 0), 'url')), ['https://www.example.com/0'],
msg='tuple as key should be treated as branches')
self.assertEqual(traverse_obj(_TEST_DATA, ('urls', [3, 0], 'url')), ['https://www.example.com/0'],
msg='list as key should be treated as branches')
self.assertEqual(traverse_obj(_TEST_DATA, ('urls', ((1, 'fail'), (0, 'url')))), ['https://www.example.com/0'],
msg='double nesting in path should be treated as paths')
self.assertEqual(traverse_obj(['0', [1, 2]], [(0, 1), 0]), [1],
msg='do not fail early on branching')
self.assertCountEqual(traverse_obj(_TEST_DATA, ('urls', ((1, ('fail', 'url')), (0, 'url')))),
['https://www.example.com/0', 'https://www.example.com/1'],
msg='triple nesting in path should be treated as branches')
self.assertEqual(traverse_obj(_TEST_DATA, ('urls', ('fail', (..., 'url')))),
['https://www.example.com/0', 'https://www.example.com/1'],
msg='ellipsis as branch path start gets flattened')
headers4 = HTTPHeaderDict({'ytdl-test': 'data;'})
self.assertEqual(set(headers4.items()), {('Ytdl-Test', 'data;')})
# Test dictionary as key
self.assertEqual(traverse_obj(_TEST_DATA, {0: 100, 1: 1.2}), {0: 100, 1: 1.2},
msg='dict key should result in a dict with the same keys')
self.assertEqual(traverse_obj(_TEST_DATA, {0: ('urls', 0, 'url')}),
{0: 'https://www.example.com/0'},
msg='dict key should allow paths')
self.assertEqual(traverse_obj(_TEST_DATA, {0: ('urls', (3, 0), 'url')}),
{0: ['https://www.example.com/0']},
msg='tuple in dict path should be treated as branches')
self.assertEqual(traverse_obj(_TEST_DATA, {0: ('urls', ((1, 'fail'), (0, 'url')))}),
{0: ['https://www.example.com/0']},
msg='double nesting in dict path should be treated as paths')
self.assertEqual(traverse_obj(_TEST_DATA, {0: ('urls', ((1, ('fail', 'url')), (0, 'url')))}),
{0: ['https://www.example.com/1', 'https://www.example.com/0']},
msg='triple nesting in dict path should be treated as branches')
self.assertEqual(traverse_obj(_TEST_DATA, {0: 'fail'}), {},
msg='remove `None` values when top level dict key fails')
self.assertEqual(traverse_obj(_TEST_DATA, {0: 'fail'}, default=...), {0: ...},
msg='use `default` if key fails and `default`')
self.assertEqual(traverse_obj(_TEST_DATA, {0: 'dict'}), {},
msg='remove empty values when dict key')
self.assertEqual(traverse_obj(_TEST_DATA, {0: 'dict'}, default=...), {0: ...},
msg='use `default` when dict key and `default`')
self.assertEqual(traverse_obj(_TEST_DATA, {0: {0: 'fail'}}), {},
msg='remove empty values when nested dict key fails')
self.assertEqual(traverse_obj(None, {0: 'fail'}), {},
msg='default to dict if pruned')
self.assertEqual(traverse_obj(None, {0: 'fail'}, default=...), {0: ...},
msg='default to dict if pruned and default is given')
self.assertEqual(traverse_obj(_TEST_DATA, {0: {0: 'fail'}}, default=...), {0: {0: ...}},
msg='use nested `default` when nested dict key fails and `default`')
self.assertEqual(traverse_obj(_TEST_DATA, {0: ('dict', ...)}), {},
msg='remove key if branch in dict key not successful')
# common mistake: strip whitespace from values
# https://github.com/yt-dlp/yt-dlp/issues/8729
headers5 = HTTPHeaderDict({'ytdl-test': ' data; '})
self.assertEqual(set(headers5.items()), {('Ytdl-Test', 'data;')})
# Testing default parameter behavior
_DEFAULT_DATA = {'None': None, 'int': 0, 'list': []}
self.assertEqual(traverse_obj(_DEFAULT_DATA, 'fail'), None,
msg='default value should be `None`')
self.assertEqual(traverse_obj(_DEFAULT_DATA, 'fail', 'fail', default=...), ...,
msg='chained fails should result in default')
self.assertEqual(traverse_obj(_DEFAULT_DATA, 'None', 'int'), 0,
msg='should not short-circuit on `None`')
self.assertEqual(traverse_obj(_DEFAULT_DATA, 'fail', default=1), 1,
msg='invalid dict key should result in `default`')
self.assertEqual(traverse_obj(_DEFAULT_DATA, 'None', default=1), 1,
msg='`None` is a deliberate sentinel and should become `default`')
self.assertEqual(traverse_obj(_DEFAULT_DATA, ('list', 10)), None,
msg='`IndexError` should result in `default`')
self.assertEqual(traverse_obj(_DEFAULT_DATA, (..., 'fail'), default=1), 1,
msg='if branched but not successful return `default` if defined, not `[]`')
self.assertEqual(traverse_obj(_DEFAULT_DATA, (..., 'fail'), default=None), None,
msg='if branched but not successful return `default` even if `default` is `None`')
self.assertEqual(traverse_obj(_DEFAULT_DATA, (..., 'fail')), [],
msg='if branched but not successful return `[]`, not `default`')
self.assertEqual(traverse_obj(_DEFAULT_DATA, ('list', ...)), [],
msg='if branched but object is empty return `[]`, not `default`')
self.assertEqual(traverse_obj(None, ...), [],
msg='if branched but object is `None` return `[]`, not `default`')
self.assertEqual(traverse_obj({0: None}, (0, ...)), [],
msg='if branched but state is `None` return `[]`, not `default`')
def test_extract_basic_auth(self):
assert extract_basic_auth('http://:foo.bar') == ('http://:foo.bar', None)
assert extract_basic_auth('http://foo.bar') == ('http://foo.bar', None)
assert extract_basic_auth('http://@foo.bar') == ('http://foo.bar', 'Basic Og==')
assert extract_basic_auth('http://:pass@foo.bar') == ('http://foo.bar', 'Basic OnBhc3M=')
assert extract_basic_auth('http://user:@foo.bar') == ('http://foo.bar', 'Basic dXNlcjo=')
assert extract_basic_auth('http://user:pass@foo.bar') == ('http://foo.bar', 'Basic dXNlcjpwYXNz')
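# i.e. the returned header is 'Basic ' + base64('user:pass'):
# base64(':pass') == 'OnBhc3M=', base64('user:') == 'dXNlcjo=', base64('user:pass') == 'dXNlcjpwYXNz'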
branching_paths = [
('fail', ...),
(..., 'fail'),
100 * ('fail',) + (...,),
(...,) + 100 * ('fail',),
]
for branching_path in branching_paths:
self.assertEqual(traverse_obj({}, branching_path), [],
msg='if branched but state is `None`, return `[]` (not `default`)')
self.assertEqual(traverse_obj({}, 'fail', branching_path), [],
msg='if branching in last alternative and previous did not match, return `[]` (not `default`)')
self.assertEqual(traverse_obj({0: 'x'}, 0, branching_path), 'x',
msg='if branching in last alternative and previous did match, return single value')
self.assertEqual(traverse_obj({0: 'x'}, branching_path, 0), 'x',
msg='if branching in first alternative and non-branching path does match, return single value')
self.assertEqual(traverse_obj({}, branching_path, 'fail'), None,
msg='if branching in first alternative and non-branching path does not match, return `default`')
@unittest.skipUnless(compat_os_name == 'nt', 'Only relevant on Windows')
def test_Popen_windows_escaping(self):
def run_shell(args):
stdout, stderr, error = Popen.run(
args, text=True, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
assert not stderr
assert not error
return stdout
# Testing expected_type behavior
_EXPECTED_TYPE_DATA = {'str': 'str', 'int': 0}
self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, 'str', expected_type=str),
'str', msg='accept matching `expected_type` type')
self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, 'str', expected_type=int),
None, msg='reject non matching `expected_type` type')
self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, 'int', expected_type=lambda x: str(x)),
'0', msg='transform type using type function')
self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, 'str', expected_type=lambda _: 1 / 0),
None, msg='wrap expected_type function in try_call')
self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, ..., expected_type=str),
['str'], msg='eliminate items that expected_type fails on')
self.assertEqual(traverse_obj(_TEST_DATA, {0: 100, 1: 1.2}, expected_type=int),
{0: 100}, msg='type as expected_type should filter dict values')
self.assertEqual(traverse_obj(_TEST_DATA, {0: 100, 1: 1.2, 2: 'None'}, expected_type=str_or_none),
{0: '100', 1: '1.2'}, msg='function as expected_type should transform dict values')
self.assertEqual(traverse_obj(_TEST_DATA, ({0: 1.2}, 0, {int_or_none}), expected_type=int),
1, msg='expected_type should not filter non-final dict values')
self.assertEqual(traverse_obj(_TEST_DATA, {0: {0: 100, 1: 'str'}}, expected_type=int),
{0: {0: 100}}, msg='expected_type should transform deep dict values')
self.assertEqual(traverse_obj(_TEST_DATA, [({0: '...'}, {0: '...'})], expected_type=type(...)),
[{0: ...}, {0: ...}], msg='expected_type should transform branched dict values')
self.assertEqual(traverse_obj({1: {3: 4}}, [(1, 2), 3], expected_type=int),
[4], msg='expected_type regression for type matching in tuple branching')
self.assertEqual(traverse_obj(_TEST_DATA, ['data', ...], expected_type=int),
[], msg='expected_type regression for type matching in dict result')
# Test get_all behavior
_GET_ALL_DATA = {'key': [0, 1, 2]}
self.assertEqual(traverse_obj(_GET_ALL_DATA, ('key', ...), get_all=False), 0,
msg='if not `get_all`, return only first matching value')
self.assertEqual(traverse_obj(_GET_ALL_DATA, ..., get_all=False), [0, 1, 2],
msg='do not overflatten if not `get_all`')
# Test casesense behavior
_CASESENSE_DATA = {
'KeY': 'value0',
0: {
'KeY': 'value1',
0: {'KeY': 'value2'},
},
}
self.assertEqual(traverse_obj(_CASESENSE_DATA, 'key'), None,
msg='dict keys should be case sensitive unless `casesense` is disabled')
self.assertEqual(traverse_obj(_CASESENSE_DATA, 'keY',
casesense=False), 'value0',
msg='allow non-matching key case if `casesense` is disabled')
self.assertEqual(traverse_obj(_CASESENSE_DATA, (0, ('keY',)),
casesense=False), ['value1'],
msg='allow non-matching key case in branch if `casesense` is disabled')
self.assertEqual(traverse_obj(_CASESENSE_DATA, (0, ((0, 'keY'),)),
casesense=False), ['value2'],
msg='allow non-matching key case in branch path if `casesense` is disabled')
# Test traverse_string behavior
_TRAVERSE_STRING_DATA = {'str': 'str', 1.2: 1.2}
self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, ('str', 0)), None,
msg='do not traverse into string if not `traverse_string`')
self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, ('str', 0),
traverse_string=True), 's',
msg='traverse into string if `traverse_string`')
self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, (1.2, 1),
traverse_string=True), '.',
msg='traverse into converted data if `traverse_string`')
self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, ('str', ...),
traverse_string=True), 'str',
msg='`...` should result in string (same value) if `traverse_string`')
self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, ('str', slice(0, None, 2)),
traverse_string=True), 'sr',
msg='`slice` should result in string if `traverse_string`')
self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, ('str', lambda i, v: i or v == "s"),
traverse_string=True), 'str',
msg='function should result in string if `traverse_string`')
self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, ('str', (0, 2)),
traverse_string=True), ['s', 'r'],
msg='branching should result in list if `traverse_string`')
# Test is_user_input behavior
_IS_USER_INPUT_DATA = {'range8': list(range(8))}
self.assertEqual(traverse_obj(_IS_USER_INPUT_DATA, ('range8', '3'),
is_user_input=True), 3,
msg='allow for string indexing if `is_user_input`')
self.assertCountEqual(traverse_obj(_IS_USER_INPUT_DATA, ('range8', '3:'),
is_user_input=True), tuple(range(8))[3:],
msg='allow for string slice if `is_user_input`')
self.assertCountEqual(traverse_obj(_IS_USER_INPUT_DATA, ('range8', ':4:2'),
is_user_input=True), tuple(range(8))[:4:2],
msg='allow step in string slice if `is_user_input`')
self.assertCountEqual(traverse_obj(_IS_USER_INPUT_DATA, ('range8', ':'),
is_user_input=True), range(8),
msg='`:` should be treated as `...` if `is_user_input`')
with self.assertRaises(TypeError, msg='too many params should result in error'):
traverse_obj(_IS_USER_INPUT_DATA, ('range8', ':::'), is_user_input=True)
# Test re.Match as input obj
mobj = re.fullmatch(r'0(12)(?P<group>3)(4)?', '0123')
self.assertEqual(traverse_obj(mobj, ...), [x for x in mobj.groups() if x is not None],
msg='`...` on a `re.Match` should give its `groups()`')
self.assertEqual(traverse_obj(mobj, lambda k, _: k in (0, 2)), ['0123', '3'],
msg='function on a `re.Match` should give (group number, value) pairs, starting at 0')
self.assertEqual(traverse_obj(mobj, 'group'), '3',
msg='str key on a `re.Match` should give group with that name')
self.assertEqual(traverse_obj(mobj, 2), '3',
msg='int key on a `re.Match` should give the group at that index')
self.assertEqual(traverse_obj(mobj, 'gRoUp', casesense=False), '3',
msg='str key on a `re.Match` should respect casesense')
self.assertEqual(traverse_obj(mobj, 'fail'), None,
msg='failing str key on a `re.Match` should return `default`')
self.assertEqual(traverse_obj(mobj, 'gRoUpS', casesense=False), None,
msg='failing str key on a `re.Match` should return `default`')
self.assertEqual(traverse_obj(mobj, 8), None,
msg='failing int key on a `re.Match` should return `default`')
self.assertEqual(traverse_obj(mobj, lambda k, _: k in (0, 'group')), ['0123', '3'],
msg='function on a `re.Match` should give group name as well')
# Test escaping
assert run_shell(['echo', 'test"&']) == '"test""&"\n'
assert run_shell(['echo', '%CMDCMDLINE:~-1%&']) == '"%CMDCMDLINE:~-1%&"\n'
assert run_shell(['echo', 'a\nb']) == '"a"\n"b"\n'
assert run_shell(['echo', '"']) == '""""\n'
assert run_shell(['echo', '\\']) == '\\\n'
# Test if delayed expansion is disabled
assert run_shell(['echo', '^!']) == '"^!"\n'
assert run_shell('echo "^!"') == '"^!"\n'
if __name__ == '__main__':

396
test/test_websockets.py Normal file
View File

@ -0,0 +1,396 @@
#!/usr/bin/env python3
# Allow direct execution
import os
import sys
import pytest
from test.helper import verify_address_availability
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
import http.client
import http.cookiejar
import http.server
import json
import random
import ssl
import threading
from yt_dlp import socks
from yt_dlp.cookies import YoutubeDLCookieJar
from yt_dlp.dependencies import websockets
from yt_dlp.networking import Request
from yt_dlp.networking.exceptions import (
CertificateVerifyError,
HTTPError,
ProxyError,
RequestError,
SSLError,
TransportError,
)
from yt_dlp.utils.networking import HTTPHeaderDict
TEST_DIR = os.path.dirname(os.path.abspath(__file__))
def websocket_handler(websocket):
for message in websocket:
if isinstance(message, bytes):
if message == b'bytes':
return websocket.send('2')
elif isinstance(message, str):
if message == 'headers':
return websocket.send(json.dumps(dict(websocket.request.headers)))
elif message == 'path':
return websocket.send(websocket.request.path)
elif message == 'source_address':
return websocket.send(websocket.remote_address[0])
elif message == 'str':
return websocket.send('1')
return websocket.send(message)
def process_request(self, request):
if request.path.startswith('/gen_'):
status = http.HTTPStatus(int(request.path[5:]))
if 300 <= status.value < 400:
return websockets.http11.Response(
status.value, status.phrase, websockets.datastructures.Headers([('Location', '/')]), b'')
return self.protocol.reject(status.value, status.phrase)
return self.protocol.accept(request)
def create_websocket_server(**ws_kwargs):
import websockets.sync.server
wsd = websockets.sync.server.serve(
websocket_handler, '127.0.0.1', 0,
process_request=process_request, open_timeout=2, **ws_kwargs)
ws_port = wsd.socket.getsockname()[1]
ws_server_thread = threading.Thread(target=wsd.serve_forever)
ws_server_thread.daemon = True
ws_server_thread.start()
return ws_server_thread, ws_port
def create_ws_websocket_server():
return create_websocket_server()
def create_wss_websocket_server():
certfn = os.path.join(TEST_DIR, 'testcert.pem')
sslctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
sslctx.load_cert_chain(certfn, None)
return create_websocket_server(ssl_context=sslctx)
MTLS_CERT_DIR = os.path.join(TEST_DIR, 'testdata', 'certificate')
def create_mtls_wss_websocket_server():
certfn = os.path.join(TEST_DIR, 'testcert.pem')
cacertfn = os.path.join(MTLS_CERT_DIR, 'ca.crt')
sslctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
sslctx.verify_mode = ssl.CERT_REQUIRED
sslctx.load_verify_locations(cafile=cacertfn)
sslctx.load_cert_chain(certfn, None)
return create_websocket_server(ssl_context=sslctx)
def ws_validate_and_send(rh, req):
rh.validate(req)
max_tries = 3
for i in range(max_tries):
try:
return rh.send(req)
except TransportError as e:
if i < (max_tries - 1) and 'connection closed during handshake' in str(e):
# websockets server sometimes hangs on new connections
continue
raise
@pytest.mark.skipif(not websockets, reason='websockets must be installed to test websocket request handlers')
class TestWebSocketRequestHandlerConformance:
@classmethod
def setup_class(cls):
cls.ws_thread, cls.ws_port = create_ws_websocket_server()
cls.ws_base_url = f'ws://127.0.0.1:{cls.ws_port}'
cls.wss_thread, cls.wss_port = create_wss_websocket_server()
cls.wss_base_url = f'wss://127.0.0.1:{cls.wss_port}'
cls.bad_wss_thread, cls.bad_wss_port = create_websocket_server(ssl_context=ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER))
cls.bad_wss_host = f'wss://127.0.0.1:{cls.bad_wss_port}'
cls.mtls_wss_thread, cls.mtls_wss_port = create_mtls_wss_websocket_server()
cls.mtls_wss_base_url = f'wss://127.0.0.1:{cls.mtls_wss_port}'
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
def test_basic_websockets(self, handler):
with handler() as rh:
ws = ws_validate_and_send(rh, Request(self.ws_base_url))
assert 'upgrade' in ws.headers
assert ws.status == 101
ws.send('foo')
assert ws.recv() == 'foo'
ws.close()
# https://www.rfc-editor.org/rfc/rfc6455.html#section-5.6
@pytest.mark.parametrize('msg,opcode', [('str', 1), (b'bytes', 2)])
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
def test_send_types(self, handler, msg, opcode):
with handler() as rh:
ws = ws_validate_and_send(rh, Request(self.ws_base_url))
ws.send(msg)
assert int(ws.recv()) == opcode
ws.close()
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
def test_verify_cert(self, handler):
with handler() as rh:
with pytest.raises(CertificateVerifyError):
ws_validate_and_send(rh, Request(self.wss_base_url))
with handler(verify=False) as rh:
ws = ws_validate_and_send(rh, Request(self.wss_base_url))
assert ws.status == 101
ws.close()
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
def test_ssl_error(self, handler):
with handler(verify=False) as rh:
with pytest.raises(SSLError, match=r'ssl(?:v3|/tls) alert handshake failure') as exc_info:
ws_validate_and_send(rh, Request(self.bad_wss_host))
assert not issubclass(exc_info.type, CertificateVerifyError)
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
@pytest.mark.parametrize('path,expected', [
# Unicode characters should be encoded with uppercase percent-encoding
('/中文', '/%E4%B8%AD%E6%96%87'),
# don't normalize existing percent encodings
('/%c7%9f', '/%c7%9f'),
])
def test_percent_encode(self, handler, path, expected):
with handler() as rh:
ws = ws_validate_and_send(rh, Request(f'{self.ws_base_url}{path}'))
ws.send('path')
assert ws.recv() == expected
assert ws.status == 101
ws.close()
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
def test_remove_dot_segments(self, handler):
with handler() as rh:
# This isn't a comprehensive test,
# but it should be enough to check whether the handler is removing dot segments
ws = ws_validate_and_send(rh, Request(f'{self.ws_base_url}/a/b/./../../test'))
assert ws.status == 101
ws.send('path')
assert ws.recv() == '/test'
ws.close()
# We are restricted to known HTTP status codes in http.HTTPStatus
# Redirects are not supported for websockets
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
@pytest.mark.parametrize('status', (200, 204, 301, 302, 303, 400, 500, 511))
def test_raise_http_error(self, handler, status):
with handler() as rh:
with pytest.raises(HTTPError) as exc_info:
ws_validate_and_send(rh, Request(f'{self.ws_base_url}/gen_{status}'))
assert exc_info.value.status == status
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
@pytest.mark.parametrize('params,extensions', [
({'timeout': sys.float_info.min}, {}),
({}, {'timeout': sys.float_info.min}),
])
def test_timeout(self, handler, params, extensions):
with handler(**params) as rh:
with pytest.raises(TransportError):
ws_validate_and_send(rh, Request(self.ws_base_url, extensions=extensions))
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
def test_cookies(self, handler):
cookiejar = YoutubeDLCookieJar()
cookiejar.set_cookie(http.cookiejar.Cookie(
version=0, name='test', value='ytdlp', port=None, port_specified=False,
domain='127.0.0.1', domain_specified=True, domain_initial_dot=False, path='/',
path_specified=True, secure=False, expires=None, discard=False, comment=None,
comment_url=None, rest={}))
with handler(cookiejar=cookiejar) as rh:
ws = ws_validate_and_send(rh, Request(self.ws_base_url))
ws.send('headers')
assert json.loads(ws.recv())['cookie'] == 'test=ytdlp'
ws.close()
with handler() as rh:
ws = ws_validate_and_send(rh, Request(self.ws_base_url))
ws.send('headers')
assert 'cookie' not in json.loads(ws.recv())
ws.close()
ws = ws_validate_and_send(rh, Request(self.ws_base_url, extensions={'cookiejar': cookiejar}))
ws.send('headers')
assert json.loads(ws.recv())['cookie'] == 'test=ytdlp'
ws.close()
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
def test_source_address(self, handler):
source_address = f'127.0.0.{random.randint(5, 255)}'
verify_address_availability(source_address)
with handler(source_address=source_address) as rh:
ws = ws_validate_and_send(rh, Request(self.ws_base_url))
ws.send('source_address')
assert source_address == ws.recv()
ws.close()
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
def test_response_url(self, handler):
with handler() as rh:
url = f'{self.ws_base_url}/something'
ws = ws_validate_and_send(rh, Request(url))
assert ws.url == url
ws.close()
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
def test_request_headers(self, handler):
with handler(headers=HTTPHeaderDict({'test1': 'test', 'test2': 'test2'})) as rh:
# Global Headers
ws = ws_validate_and_send(rh, Request(self.ws_base_url))
ws.send('headers')
headers = HTTPHeaderDict(json.loads(ws.recv()))
assert headers['test1'] == 'test'
ws.close()
# Per request headers, merged with global
ws = ws_validate_and_send(rh, Request(
self.ws_base_url, headers={'test2': 'changed', 'test3': 'test3'}))
ws.send('headers')
headers = HTTPHeaderDict(json.loads(ws.recv()))
assert headers['test1'] == 'test'
assert headers['test2'] == 'changed'
assert headers['test3'] == 'test3'
ws.close()
@pytest.mark.parametrize('client_cert', (
{'client_certificate': os.path.join(MTLS_CERT_DIR, 'clientwithkey.crt')},
{
'client_certificate': os.path.join(MTLS_CERT_DIR, 'client.crt'),
'client_certificate_key': os.path.join(MTLS_CERT_DIR, 'client.key'),
},
{
'client_certificate': os.path.join(MTLS_CERT_DIR, 'clientwithencryptedkey.crt'),
'client_certificate_password': 'foobar',
},
{
'client_certificate': os.path.join(MTLS_CERT_DIR, 'client.crt'),
'client_certificate_key': os.path.join(MTLS_CERT_DIR, 'clientencrypted.key'),
'client_certificate_password': 'foobar',
}
))
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
def test_mtls(self, handler, client_cert):
with handler(
# Disable client-side validation of unacceptable self-signed testcert.pem
# The test is of a check on the server side, so unaffected
verify=False,
client_cert=client_cert
) as rh:
ws_validate_and_send(rh, Request(self.mtls_wss_base_url)).close()
def create_fake_ws_connection(raised):
import websockets.sync.client
class FakeWsConnection(websockets.sync.client.ClientConnection):
def __init__(self, *args, **kwargs):
class FakeResponse:
body = b''
headers = {}
status_code = 101
reason_phrase = 'test'
self.response = FakeResponse()
def send(self, *args, **kwargs):
raise raised()
def recv(self, *args, **kwargs):
raise raised()
def close(self, *args, **kwargs):
return
return FakeWsConnection()
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
class TestWebsocketsRequestHandler:
@pytest.mark.parametrize('raised,expected', [
# https://websockets.readthedocs.io/en/stable/reference/exceptions.html
(lambda: websockets.exceptions.InvalidURI(msg='test', uri='test://'), RequestError),
# Requires a response object. Should be covered by HTTP error tests.
# (lambda: websockets.exceptions.InvalidStatus(), TransportError),
(lambda: websockets.exceptions.InvalidHandshake(), TransportError),
# These are subclasses of InvalidHandshake
(lambda: websockets.exceptions.InvalidHeader(name='test'), TransportError),
(lambda: websockets.exceptions.NegotiationError(), TransportError),
# Catch-all
(lambda: websockets.exceptions.WebSocketException(), TransportError),
(lambda: TimeoutError(), TransportError),
# These may be raised by our create_connection implementation, which should also be caught
(lambda: OSError(), TransportError),
(lambda: ssl.SSLError(), SSLError),
(lambda: ssl.SSLCertVerificationError(), CertificateVerifyError),
(lambda: socks.ProxyError(), ProxyError),
])
def test_request_error_mapping(self, handler, monkeypatch, raised, expected):
import websockets.sync.client
import yt_dlp.networking._websockets
with handler() as rh:
def fake_connect(*args, **kwargs):
raise raised()
monkeypatch.setattr(yt_dlp.networking._websockets, 'create_connection', lambda *args, **kwargs: None)
monkeypatch.setattr(websockets.sync.client, 'connect', fake_connect)
with pytest.raises(expected) as exc_info:
rh.send(Request('ws://fake-url'))
assert exc_info.type is expected
@pytest.mark.parametrize('raised,expected,match', [
# https://websockets.readthedocs.io/en/stable/reference/sync/client.html#websockets.sync.client.ClientConnection.send
(lambda: websockets.exceptions.ConnectionClosed(None, None), TransportError, None),
(lambda: RuntimeError(), TransportError, None),
(lambda: TimeoutError(), TransportError, None),
(lambda: TypeError(), RequestError, None),
(lambda: socks.ProxyError(), ProxyError, None),
# Catch-all
(lambda: websockets.exceptions.WebSocketException(), TransportError, None),
])
def test_ws_send_error_mapping(self, handler, monkeypatch, raised, expected, match):
from yt_dlp.networking._websockets import WebsocketsResponseAdapter
ws = WebsocketsResponseAdapter(create_fake_ws_connection(raised), url='ws://fake-url')
with pytest.raises(expected, match=match) as exc_info:
ws.send('test')
assert exc_info.type is expected
@pytest.mark.parametrize('raised,expected,match', [
# https://websockets.readthedocs.io/en/stable/reference/sync/client.html#websockets.sync.client.ClientConnection.recv
(lambda: websockets.exceptions.ConnectionClosed(None, None), TransportError, None),
(lambda: RuntimeError(), TransportError, None),
(lambda: TimeoutError(), TransportError, None),
(lambda: socks.ProxyError(), ProxyError, None),
# Catch-all
(lambda: websockets.exceptions.WebSocketException(), TransportError, None),
])
def test_ws_recv_error_mapping(self, handler, monkeypatch, raised, expected, match):
from yt_dlp.networking._websockets import WebsocketsResponseAdapter
ws = WebsocketsResponseAdapter(create_fake_ws_connection(raised), url='ws://fake-url')
with pytest.raises(expected, match=match) as exc_info:
ws.recv()
assert exc_info.type is expected

View File

@ -62,7 +62,12 @@ _SIG_TESTS = [
'https://s.ytimg.com/yts/jsbin/html5player-en_US-vflKjOTVq/html5player.js',
'312AA52209E3623129A412D56A40F11CB0AF14AE.3EE09501CB14E3BCDC3B2AE808BF3F1D14E7FBF12',
'112AA5220913623229A412D56A40F11CB0AF14AE.3EE0950FCB14EEBCDC3B2AE808BF331D14E7FBF3',
)
),
(
'https://www.youtube.com/s/player/6ed0d907/player_ias.vflset/en_US/base.js',
'2aq0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpzEICs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA',
'AOq0QJ8wRAIgXmPlOPSBkkUs1bYFYlJCfe29xx8j7v1pDL2QwbdV96sCIEzpWqMGkFR20CFOg51Tp-7vj_EMu-m37KtXJoOySqa0',
),
]
_NSIG_TESTS = [
@ -142,6 +147,22 @@ _NSIG_TESTS = [
'https://www.youtube.com/s/player/dac945fd/player_ias.vflset/en_US/base.js',
'o8BkRxXhuYsBCWi6RplPdP', '3Lx32v_hmzTm6A',
),
(
'https://www.youtube.com/s/player/6f20102c/player_ias.vflset/en_US/base.js',
'lE8DhoDmKqnmJJ', 'pJTTX6XyJP2BYw',
),
(
'https://www.youtube.com/s/player/cfa9e7cb/player_ias.vflset/en_US/base.js',
'aCi3iElgd2kq0bxVbQ', 'QX1y8jGb2IbZ0w',
),
(
'https://www.youtube.com/s/player/8c7583ff/player_ias.vflset/en_US/base.js',
'1wWCVpRR96eAmMI87L', 'KSkWAVv1ZQxC3A',
),
(
'https://www.youtube.com/s/player/b7910ca8/player_ias.vflset/en_US/base.js',
'_hXMCwMt9qE310D', 'LoZMgkkofRMCZQ',
),
]
@ -218,7 +239,7 @@ def n_sig(jscode, sig_input):
make_sig_test = t_factory(
'signature', signature, re.compile(r'.*-(?P<id>[a-zA-Z0-9_-]+)(?:/watch_as3|/html5player)?\.[a-z]+$'))
'signature', signature, re.compile(r'.*(?:-|/player/)(?P<id>[a-zA-Z0-9_-]+)(?:/.+\.js|(?:/watch_as3|/html5player)?\.[a-z]+)$'))
for test_spec in _SIG_TESTS:
make_sig_test(*test_spec)
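# With the widened regex above, modern player URLs also yield a test id, e.g.
# '.../s/player/6ed0d907/player_ias.vflset/en_US/base.js' -> '6ed0d907',
# alongside the legacy '...-vflKjOTVq/html5player.js' style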

View File

@ -1,34 +0,0 @@
{
"latest": "2013.01.06",
"signature": "72158cdba391628569ffdbea259afbcf279bbe3d8aeb7492690735dc1cfa6afa754f55c61196f3871d429599ab22f2667f1fec98865527b32632e7f4b3675a7ef0f0fbe084d359256ae4bba68f0d33854e531a70754712f244be71d4b92e664302aa99653ee4df19800d955b6c4149cd2b3f24288d6e4b40b16126e01f4c8ce6",
"versions": {
"2013.01.02": {
"bin": [
"http://youtube-dl.org/downloads/2013.01.02/youtube-dl",
"f5b502f8aaa77675c4884938b1e4871ebca2611813a0c0e74f60c0fbd6dcca6b"
],
"exe": [
"http://youtube-dl.org/downloads/2013.01.02/youtube-dl.exe",
"75fa89d2ce297d102ff27675aa9d92545bbc91013f52ec52868c069f4f9f0422"
],
"tar": [
"http://youtube-dl.org/downloads/2013.01.02/youtube-dl-2013.01.02.tar.gz",
"6a66d022ac8e1c13da284036288a133ec8dba003b7bd3a5179d0c0daca8c8196"
]
},
"2013.01.06": {
"bin": [
"http://youtube-dl.org/downloads/2013.01.06/youtube-dl",
"64b6ed8865735c6302e836d4d832577321b4519aa02640dc508580c1ee824049"
],
"exe": [
"http://youtube-dl.org/downloads/2013.01.06/youtube-dl.exe",
"58609baf91e4389d36e3ba586e21dab882daaaee537e4448b1265392ae86ff84"
],
"tar": [
"http://youtube-dl.org/downloads/2013.01.06/youtube-dl-2013.01.06.tar.gz",
"fe77ab20a95d980ed17a659aa67e371fdd4d656d19c4c7950e7b720b0c2f1a86"
]
}
}
}

View File

@ -1 +1 @@
@py -bb -Werror -Xdev "%~dp0yt_dlp\__main__.py" %*
@py -Werror -Xdev "%~dp0yt_dlp\__main__.py" %*

View File

@ -1,2 +1,2 @@
#!/usr/bin/env sh
exec "${PYTHON:-python3}" -bb -Werror -Xdev "$(dirname "$(realpath "$0")")/yt_dlp/__main__.py" "$@"
exec "${PYTHON:-python3}" -Werror -Xdev "$(dirname "$(realpath "$0")")/yt_dlp/__main__.py" "$@"

File diff suppressed because it is too large

View File

@ -1,10 +1,10 @@
try:
import contextvars # noqa: F401
except Exception:
raise Exception(
f'You are using an unsupported version of Python. Only Python versions 3.7 and above are supported by yt-dlp') # noqa: F541
import sys
__license__ = 'Public Domain'
if sys.version_info < (3, 8):
raise ImportError(
f'You are using an unsupported version of Python. Only Python versions 3.8 and above are supported by yt-dlp') # noqa: F541
__license__ = 'The Unlicense'
import collections
import getpass
@ -12,13 +12,14 @@ import itertools
import optparse
import os
import re
import sys
import traceback
from .compat import compat_shlex_quote
from .compat import compat_os_name, compat_shlex_quote
from .cookies import SUPPORTED_BROWSERS, SUPPORTED_KEYRINGS
from .downloader.external import get_external_downloader
from .extractor import list_extractor_classes
from .extractor.adobepass import MSO_INFO
from .networking.impersonate import ImpersonateTarget
from .options import parseOpts
from .postprocessor import (
FFmpegExtractAudioPP,
@ -48,6 +49,7 @@ from .utils import (
float_or_none,
format_field,
int_or_none,
join_nonempty,
match_filter_func,
parse_bytes,
parse_duration,
@ -56,11 +58,11 @@ from .utils import (
read_stdin,
render_table,
setproctitle,
std_headers,
traverse_obj,
variadic,
write_string,
)
from .utils.networking import std_headers
from .YoutubeDL import YoutubeDL
_IN_CLI = False
@ -73,14 +75,16 @@ def _exit(status=0, *args):
def get_urls(urls, batchfile, verbose):
# Batch file verification
"""
@param verbose -1: quiet, 0: normal, 1: verbose
"""
batch_urls = []
if batchfile is not None:
try:
batch_urls = read_batch_urls(
read_stdin('URLs') if batchfile == '-'
read_stdin(None if verbose == -1 else 'URLs') if batchfile == '-'
else open(expand_path(batchfile), encoding='utf-8', errors='ignore'))
if verbose:
if verbose == 1:
write_string('[debug] Batch file urls: ' + repr(batch_urls) + '\n')
except OSError:
_exit(f'ERROR: batch file {batchfile} could not be read')
@ -187,8 +191,8 @@ def validate_options(opts):
raise ValueError(f'{max_name} "{max_val}" must be must be greater than or equal to {min_name} "{min_val}"')
# Usernames and passwords
validate(not opts.usenetrc or (opts.username is None and opts.password is None),
'.netrc', msg='using {name} conflicts with giving username/password')
validate(sum(map(bool, (opts.usenetrc, opts.netrc_cmd, opts.username))) <= 1, '.netrc',
msg='{name}, netrc command and username/password are mutually exclusive options')
validate(opts.password is None or opts.username is not None, 'account username', msg='{name} missing')
validate(opts.ap_password is None or opts.ap_username is not None,
'TV Provider account username', msg='{name} missing')
@ -319,26 +323,49 @@ def validate_options(opts):
opts.skip_download = None
del opts.outtmpl['default']
def parse_chapters(name, value):
chapters, ranges = [], []
def parse_chapters(name, value, advanced=False):
parse_timestamp = lambda x: float('inf') if x in ('inf', 'infinite') else parse_duration(x)
for regex in value or []:
if regex.startswith('*'):
for range_ in map(str.strip, regex[1:].split(',')):
mobj = range_ != '-' and re.fullmatch(r'([^-]+)?\s*-\s*([^-]+)?', range_)
dur = mobj and (parse_timestamp(mobj.group(1) or '0'), parse_timestamp(mobj.group(2) or 'inf'))
if None in (dur or [None]):
raise ValueError(f'invalid {name} time range "{regex}". Must be of the form "*start-end"')
ranges.append(dur)
continue
try:
chapters.append(re.compile(regex))
except re.error as err:
raise ValueError(f'invalid {name} regex "{regex}" - {err}')
return chapters, ranges
TIMESTAMP_RE = r'''(?x)(?:
(?P<start_sign>-?)(?P<start>[^-]+)
)?\s*-\s*(?:
(?P<end_sign>-?)(?P<end>[^-]+)
)?'''
opts.remove_chapters, opts.remove_ranges = parse_chapters('--remove-chapters', opts.remove_chapters)
opts.download_ranges = download_range_func(*parse_chapters('--download-sections', opts.download_ranges))
chapters, ranges, from_url = [], [], False
for regex in value or []:
if advanced and regex == '*from-url':
from_url = True
continue
elif not regex.startswith('*'):
try:
chapters.append(re.compile(regex))
except re.error as err:
raise ValueError(f'invalid {name} regex "{regex}" - {err}')
continue
for range_ in map(str.strip, regex[1:].split(',')):
mobj = range_ != '-' and re.fullmatch(TIMESTAMP_RE, range_)
dur = mobj and [parse_timestamp(mobj.group('start') or '0'), parse_timestamp(mobj.group('end') or 'inf')]
signs = mobj and (mobj.group('start_sign'), mobj.group('end_sign'))
err = None
if None in (dur or [None]):
err = 'Must be of the form "*start-end"'
elif not advanced and any(signs):
err = 'Negative timestamps are not allowed'
else:
dur[0] *= -1 if signs[0] else 1
dur[1] *= -1 if signs[1] else 1
if dur[1] == float('-inf'):
err = '"-inf" is not a valid end'
if err:
raise ValueError(f'invalid {name} time range "{regex}". {err}')
ranges.append(dur)
return chapters, ranges, from_url
opts.remove_chapters, opts.remove_ranges, _ = parse_chapters('--remove-chapters', opts.remove_chapters)
opts.download_ranges = download_range_func(*parse_chapters('--download-sections', opts.download_ranges, True))
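# A sketch of the expected parse results (illustrative values, not asserted anywhere in this diff):
#   parse_chapters('--download-sections', ['*1:00-2:00', '*-30-'], advanced=True)
#   -> ([], [[60.0, 120.0], [-30.0, inf]], False)  # negative start counts from the end
#   parse_chapters('--download-sections', ['*from-url'], advanced=True)
#   -> ([], [], True)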
# Cookies from browser
if opts.cookiesfrombrowser:
@ -363,6 +390,9 @@ def validate_options(opts):
f'Supported keyrings are: {", ".join(sorted(SUPPORTED_KEYRINGS))}')
opts.cookiesfrombrowser = (browser_name, profile, keyring, container)
if opts.impersonate is not None:
opts.impersonate = ImpersonateTarget.from_str(opts.impersonate.lower())
# MetadataParser
def metadataparser_actions(f):
if isinstance(f, str):
@ -396,12 +426,17 @@ def validate_options(opts):
except Exception as err:
raise ValueError(f'Invalid playlist-items {opts.playlist_items!r}: {err}')
geo_bypass_code = opts.geo_bypass_ip_block or opts.geo_bypass_country
if geo_bypass_code is not None:
opts.geo_bypass_country, opts.geo_bypass_ip_block = None, None
if opts.geo_bypass.lower() not in ('default', 'never'):
try:
GeoUtils.random_ipv4(geo_bypass_code)
GeoUtils.random_ipv4(opts.geo_bypass)
except Exception:
raise ValueError('unsupported geo-bypass country or ip-block')
raise ValueError(f'Unsupported --xff "{opts.geo_bypass}"')
if len(opts.geo_bypass) == 2:
opts.geo_bypass_country = opts.geo_bypass
else:
opts.geo_bypass_ip_block = opts.geo_bypass
opts.geo_bypass = opts.geo_bypass.lower() != 'never'
opts.match_filter = match_filter_func(opts.match_filter, opts.breaking_match_filter)
@ -430,6 +465,10 @@ def validate_options(opts):
elif ed and proto == 'default':
default_downloader = ed.get_basename()
for policy in opts.color.values():
if policy not in ('always', 'auto', 'no_color', 'never'):
raise ValueError(f'"{policy}" is not a valid color policy')
warnings, deprecation_warnings = [], []
# Common mistake: -f best
@ -689,7 +728,7 @@ ParsedOptions = collections.namedtuple('ParsedOptions', ('parser', 'options', 'u
def parse_options(argv=None):
"""@returns ParsedOptions(parser, opts, urls, ydl_opts)"""
parser, opts, urls = parseOpts(argv)
urls = get_urls(urls, opts.batchfile, opts.verbose)
urls = get_urls(urls, opts.batchfile, -1 if opts.quiet and not opts.verbose else opts.verbose)
set_compat_opts(opts)
try:
@ -704,7 +743,8 @@ def parse_options(argv=None):
'dumpjson', 'dump_single_json', 'getdescription', 'getduration', 'getfilename',
'getformat', 'getid', 'getthumbnail', 'gettitle', 'geturl'
))
opts.quiet = opts.quiet or any_getting or opts.print_json or bool(opts.forceprint)
if opts.quiet is None:
opts.quiet = any_getting or opts.print_json or bool(opts.forceprint)
playlist_pps = [pp for pp in postprocessors if pp.get('when') == 'playlist']
write_playlist_infojson = (opts.writeinfojson and not opts.clean_infojson
@ -730,6 +770,7 @@ def parse_options(argv=None):
return ParsedOptions(parser, opts, urls, {
'usenetrc': opts.usenetrc,
'netrc_location': opts.netrc_location,
'netrc_cmd': opts.netrc_cmd,
'username': opts.username,
'password': opts.password,
'twofactor': opts.twofactor,
@ -795,6 +836,7 @@ def parse_options(argv=None):
'noprogress': opts.quiet if opts.noprogress is None else opts.noprogress,
'progress_with_newline': opts.progress_with_newline,
'progress_template': opts.progress_template,
'progress_delta': opts.progress_delta,
'playliststart': opts.playliststart,
'playlistend': opts.playlistend,
'playlistreverse': opts.playlist_reverse,
@ -875,6 +917,7 @@ def parse_options(argv=None):
'postprocessors': postprocessors,
'fixup': opts.fixup,
'source_address': opts.source_address,
'impersonate': opts.impersonate,
'call_home': opts.call_home,
'sleep_interval_requests': opts.sleep_interval_requests,
'sleep_interval': opts.sleep_interval,
@ -887,7 +930,7 @@ def parse_options(argv=None):
'playlist_items': opts.playlist_items,
'xattr_set_filesize': opts.xattr_set_filesize,
'match_filter': opts.match_filter,
'no_color': opts.no_color,
'color': opts.color,
'ffmpeg_location': opts.ffmpeg_location,
'hls_prefer_native': opts.hls_prefer_native,
'hls_use_mpegts': opts.hls_use_mpegts,
@ -931,20 +974,80 @@ def _real_main(argv=None):
if opts.rm_cachedir:
ydl.cache.remove()
updater = Updater(ydl, opts.update_self if isinstance(opts.update_self, str) else None)
if opts.update_self and updater.update() and actual_use:
if updater.cmd:
return updater.restart()
# This code is reachable only for zip variant in py < 3.10
# It makes sense to exit here, but the old behavior is to continue
ydl.report_warning('Restart yt-dlp to use the updated version')
# return 100, 'ERROR: The program must exit for the update to complete'
try:
updater = Updater(ydl, opts.update_self)
if opts.update_self and updater.update() and actual_use:
if updater.cmd:
return updater.restart()
# This code is reachable only for zip variant in py < 3.10
# It makes sense to exit here, but the old behavior is to continue
ydl.report_warning('Restart yt-dlp to use the updated version')
# return 100, 'ERROR: The program must exit for the update to complete'
except Exception:
traceback.print_exc()
ydl._download_retcode = 100
if opts.list_impersonate_targets:
known_targets = [
# List of simplified targets we know are supported,
# to help users know what dependencies may be required.
(ImpersonateTarget('chrome'), 'curl_cffi'),
(ImpersonateTarget('edge'), 'curl_cffi'),
(ImpersonateTarget('safari'), 'curl_cffi'),
]
available_targets = ydl._get_available_impersonate_targets()
def make_row(target, handler):
return [
join_nonempty(target.client.title(), target.version, delim='-') or '-',
join_nonempty((target.os or "").title(), target.os_version, delim='-') or '-',
handler,
]
rows = [make_row(target, handler) for target, handler in available_targets]
for known_target, known_handler in known_targets:
if not any(
known_target in target and handler == known_handler
for target, handler in available_targets
):
rows.append([
ydl._format_out(text, ydl.Styles.SUPPRESS)
for text in make_row(known_target, f'{known_handler} (not available)')
])
ydl.to_screen('[info] Available impersonate targets')
ydl.to_stdout(render_table(['Client', 'OS', 'Source'], rows, extra_gap=2, delim='-'))
return
if not actual_use:
if pre_process:
return ydl._download_retcode
ydl.warn_if_short_id(sys.argv[1:] if argv is None else argv)
args = sys.argv[1:] if argv is None else argv
ydl.warn_if_short_id(args)
# Show a useful error message and wait for keypress if not launched from shell on Windows
if not args and compat_os_name == 'nt' and getattr(sys, 'frozen', False):
import ctypes.wintypes
import msvcrt
kernel32 = ctypes.WinDLL('Kernel32')
buffer = (1 * ctypes.wintypes.DWORD)()
attached_processes = kernel32.GetConsoleProcessList(buffer, 1)
# If we only have a single process attached, then the executable was double clicked
# When using `pyinstaller` with `--onefile`, two processes get attached
is_onefile = hasattr(sys, '_MEIPASS') and os.path.basename(sys._MEIPASS).startswith('_MEI')
if attached_processes == 1 or is_onefile and attached_processes == 2:
print(parser._generate_error_message(
'Do not double-click the executable, instead call it from a command line.\n'
'Please read the README for further information on how to use yt-dlp: '
'https://github.com/yt-dlp/yt-dlp#readme'))
msvcrt.getch()
_exit(2)
parser.error(
'You must provide at least one URL.\n'
'Type yt-dlp --help to see a list of all options.')
@ -952,6 +1055,8 @@ def _real_main(argv=None):
parser.destroy()
try:
if opts.load_info_filename is not None:
if all_urls:
ydl.report_warning('URLs are ignored due to --load-info-json')
return ydl.download_with_info_file(expand_path(opts.load_info_filename))
else:
return ydl.download(all_urls)

View File

@ -1,7 +1,7 @@
#!/usr/bin/env python3
# Execute with
# $ python -m yt_dlp
# $ python3 -m yt_dlp
import sys

View File

@ -1,6 +1,6 @@
import sys
from PyInstaller.utils.hooks import collect_submodules
from PyInstaller.utils.hooks import collect_submodules, collect_data_files
def pycryptodome_module():
@ -10,7 +10,7 @@ def pycryptodome_module():
try:
import Crypto # noqa: F401
print('WARNING: Using Crypto since Cryptodome is not available. '
'Install with: pip install pycryptodomex', file=sys.stderr)
'Install with: python3 -m pip install pycryptodomex', file=sys.stderr)
return 'Crypto'
except ImportError:
pass
@ -18,14 +18,19 @@ def pycryptodome_module():
def get_hidden_imports():
yield 'yt_dlp.compat._legacy'
yield from ('yt_dlp.compat._legacy', 'yt_dlp.compat._deprecated')
yield from ('yt_dlp.utils._legacy', 'yt_dlp.utils._deprecated')
yield pycryptodome_module()
yield from collect_submodules('websockets')
# Only `websockets` is required, others are collected just in case
for module in ('websockets', 'requests', 'urllib3'):
yield from collect_submodules(module)
# These are auto-detected, but explicitly add them just in case
yield from ('mutagen', 'brotli', 'certifi')
yield from ('mutagen', 'brotli', 'certifi', 'secretstorage', 'curl_cffi')
hiddenimports = list(get_hidden_imports())
print(f'Adding imports: {hiddenimports}')
excludedimports = ['youtube_dl', 'youtube_dlc', 'test', 'ytdlp_plugins', 'devscripts']
excludedimports = ['youtube_dl', 'youtube_dlc', 'test', 'ytdlp_plugins', 'devscripts', 'bundle']
datas = collect_data_files('curl_cffi', includes=['cacert.pem'])

View File

@ -1,14 +1,11 @@
import os
import sys
import warnings
import xml.etree.ElementTree as etree
from ._deprecated import * # noqa: F401, F403
from .compat_utils import passthrough_module
# XXX: Implement this the same way as other DeprecationWarnings without circular import
passthrough_module(__name__, '._legacy', callback=lambda attr: warnings.warn(
DeprecationWarning(f'{__name__}.{attr} is deprecated'), stacklevel=5))
passthrough_module(__name__, '._deprecated')
del passthrough_module
# HTMLParseError has been deprecated in Python 3.3 and removed in
@ -30,12 +27,9 @@ def compat_etree_fromstring(text):
compat_os_name = os._name if os.name == 'java' else os.name
if compat_os_name == 'nt':
def compat_shlex_quote(s):
import re
return s if re.match(r'^[-_\w./]+$', s) else '"%s"' % s.replace('"', '\\"')
else:
from shlex import quote as compat_shlex_quote # noqa: F401
def compat_shlex_quote(s):
from ..utils import shell_quote
return shell_quote(s)
def compat_ord(c):
@ -70,3 +64,13 @@ if compat_os_name in ('nt', 'ce'):
return userhome + path[i:]
else:
compat_expanduser = os.path.expanduser
def urllib_req_to_req(urllib_request):
"""Convert urllib Request to a networking Request"""
from ..networking import Request
from ..utils.networking import HTTPHeaderDict
return Request(
urllib_request.get_full_url(), data=urllib_request.data, method=urllib_request.get_method(),
headers=HTTPHeaderDict(urllib_request.headers, urllib_request.unredirected_hdrs),
extensions={'timeout': urllib_request.timeout} if hasattr(urllib_request, 'timeout') else None)
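# A minimal usage sketch (illustrative only):
#   legacy = urllib.request.Request('https://example.com', headers={'X-Test': '1'})
#   req = urllib_req_to_req(legacy)
#   # -> networking Request carrying the URL, method, data, the merged
#   #    (regular + unredirected) headers and, if set, the timeout extension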

View File

@ -1,4 +1,12 @@
"""Deprecated - New code should avoid these"""
import warnings
from .compat_utils import passthrough_module
# XXX: Implement this the same way as other DeprecationWarnings without circular import
passthrough_module(__name__, '.._legacy', callback=lambda attr: warnings.warn(
DeprecationWarning(f'{__name__}.{attr} is deprecated'), stacklevel=6))
del passthrough_module
import base64
import urllib.error
@ -8,7 +16,6 @@ compat_str = str
compat_b64decode = base64.b64decode
compat_HTTPError = urllib.error.HTTPError
compat_urlparse = urllib.parse
compat_parse_qs = urllib.parse.parse_qs
compat_urllib_parse_unquote = urllib.parse.unquote

View File

@ -16,12 +16,12 @@ import shlex
import shutil
import socket
import struct
import subprocess
import tokenize
import urllib.error
import urllib.parse
import urllib.request
import xml.etree.ElementTree as etree
from subprocess import DEVNULL
# isort: split
import asyncio # noqa: F401
@ -35,6 +35,7 @@ from .compat_utils import passthrough_module
from ..dependencies import brotli as compat_brotli # noqa: F401
from ..dependencies import websockets as compat_websockets # noqa: F401
from ..dependencies.Cryptodome import AES as compat_pycrypto_AES # noqa: F401
from ..networking.exceptions import HTTPError as compat_HTTPError # noqa: F401
passthrough_module(__name__, '...utils', ('WINDOWS_VT_MODE', 'windows_enable_vt_mode'))
@ -84,10 +85,10 @@ compat_socket_create_connection = socket.create_connection
compat_Struct = struct.Struct
compat_struct_pack = struct.pack
compat_struct_unpack = struct.unpack
compat_subprocess_get_DEVNULL = lambda: DEVNULL
compat_subprocess_get_DEVNULL = lambda: subprocess.DEVNULL
compat_tokenize_tokenize = tokenize.tokenize
compat_urllib_error = urllib.error
compat_urllib_HTTPError = urllib.error.HTTPError
compat_urllib_HTTPError = compat_HTTPError
compat_urllib_parse = urllib.parse
compat_urllib_parse_parse_qs = urllib.parse.parse_qs
compat_urllib_parse_quote = urllib.parse.quote

View File

@ -15,7 +15,7 @@ def get_package_info(module):
name=getattr(module, '_yt_dlp__identifier', module.__name__),
version=str(next(filter(None, (
getattr(module, attr, None)
for attr in ('__version__', 'version_string', 'version')
for attr in ('_yt_dlp__version', '__version__', 'version_string', 'version')
)), None)))

View File

@ -10,17 +10,3 @@ try:
cache # >= 3.9
except NameError:
cache = lru_cache(maxsize=None)
try:
cached_property # >= 3.8
except NameError:
class cached_property:
def __init__(self, func):
update_wrapper(self, func)
self.func = func
def __get__(self, instance, _):
if instance is None:
return self
setattr(instance, self.func.__name__, self.func(instance))
return getattr(instance, self.func.__name__)

13
yt_dlp/compat/types.py Normal file
View File

@ -0,0 +1,13 @@
# flake8: noqa: F405
from types import * # noqa: F403
from .compat_utils import passthrough_module
passthrough_module(__name__, 'types')
del passthrough_module
try:
# NB: pypy has builtin NoneType, so checking NameError won't work
from types import NoneType # >= 3.10
except ImportError:
NoneType = type(None)

View File

@ -0,0 +1,10 @@
# flake8: noqa: F405
from urllib import * # noqa: F403
del request # noqa: F821
from . import request # noqa: F401
from ..compat_utils import passthrough_module
passthrough_module(__name__, 'urllib')
del passthrough_module

View File

@ -0,0 +1,40 @@
# flake8: noqa: F405
from urllib.request import * # noqa: F403
from ..compat_utils import passthrough_module
passthrough_module(__name__, 'urllib.request')
del passthrough_module
from .. import compat_os_name
if compat_os_name == 'nt':
# On older Python versions, proxies are extracted from Windows registry erroneously. [1]
# If the https proxy in the registry does not have a scheme, urllib will incorrectly add https:// to it. [2]
# It is unlikely that the user has actually set it to be https, so we should be fine to safely downgrade
# it to http on these older Python versions to avoid issues
# This also applies for ftp proxy type, as ftp:// proxy scheme is not supported.
# 1: https://github.com/python/cpython/issues/86793
# 2: https://github.com/python/cpython/blob/51f1ae5ceb0673316c4e4b0175384e892e33cc6e/Lib/urllib/request.py#L2683-L2698
import sys
from urllib.request import getproxies_environment, getproxies_registry
def getproxies_registry_patched():
proxies = getproxies_registry()
if (
sys.version_info >= (3, 10, 5) # https://docs.python.org/3.10/whatsnew/changelog.html#python-3-10-5-final
or (3, 9, 13) <= sys.version_info < (3, 10) # https://docs.python.org/3.9/whatsnew/changelog.html#python-3-9-13-final
):
return proxies
for scheme in ('https', 'ftp'):
if scheme in proxies and proxies[scheme].startswith(f'{scheme}://'):
proxies[scheme] = 'http' + proxies[scheme][len(scheme):]
return proxies
def getproxies():
return getproxies_environment() or getproxies_registry_patched()
del compat_os_name
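# Illustrative effect on an affected Python version (assumed registry value):
#   getproxies_registry()         -> {'https': 'https://127.0.0.1:8080'}
#   getproxies_registry_patched() -> {'https': 'http://127.0.0.1:8080'}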

View File

@ -1,7 +1,11 @@
import base64
import collections
import contextlib
import datetime as dt
import glob
import http.cookiejar
import http.cookies
import io
import json
import os
import re
@ -11,7 +15,7 @@ import subprocess
import sys
import tempfile
import time
from datetime import datetime, timedelta, timezone
import urllib.request
from enum import Enum, auto
from hashlib import pbkdf2_hmac
@ -20,7 +24,8 @@ from .aes import (
aes_gcm_decrypt_and_verify_bytes,
unpad_pkcs7,
)
from .compat import functools
from .compat import functools # isort: split
from .compat import compat_os_name
from .dependencies import (
_SECRETSTORAGE_UNAVAILABLE_REASON,
secretstorage,
@ -28,37 +33,26 @@ from .dependencies import (
)
from .minicurses import MultilinePrinter, QuietMultilinePrinter
from .utils import (
DownloadError,
Popen,
YoutubeDLCookieJar,
error_to_str,
expand_path,
is_path_like,
sanitize_url,
str_or_none,
try_call,
write_string,
)
from .utils._utils import _YDLLogger
from .utils.networking import normalize_url
CHROMIUM_BASED_BROWSERS = {'brave', 'chrome', 'chromium', 'edge', 'opera', 'vivaldi'}
SUPPORTED_BROWSERS = CHROMIUM_BASED_BROWSERS | {'firefox', 'safari'}
class YDLLogger:
def __init__(self, ydl=None):
self._ydl = ydl
def debug(self, message):
if self._ydl:
self._ydl.write_debug(message)
def info(self, message):
if self._ydl:
self._ydl.to_screen(f'[Cookies] {message}')
def warning(self, message, only_once=False):
if self._ydl:
self._ydl.report_warning(message, only_once)
def error(self, message):
if self._ydl:
self._ydl.report_error(message)
class YDLLogger(_YDLLogger):
def warning(self, message, only_once=False): # compat
return super().warning(message, once=only_once)
class ProgressBar(MultilinePrinter):
_DELAY, _timer = 0.1, 0
@ -106,7 +100,7 @@ def load_cookies(cookie_file, browser_specification, ydl):
jar = YoutubeDLCookieJar(cookie_file)
if not is_filename or os.access(cookie_file, os.R_OK):
jar.load(ignore_discard=True, ignore_expires=True)
jar.load()
cookie_jars.append(jar)
return _merge_cookie_jars(cookie_jars)
@ -127,17 +121,18 @@ def _extract_firefox_cookies(profile, container, logger):
logger.info('Extracting cookies from firefox')
if not sqlite3:
logger.warning('Cannot extract cookies from firefox without sqlite3 support. '
'Please use a python interpreter compiled with sqlite3 support')
'Please use a Python interpreter compiled with sqlite3 support')
return YoutubeDLCookieJar()
if profile is None:
search_root = _firefox_browser_dir()
search_roots = list(_firefox_browser_dirs())
elif _is_path(profile):
search_root = profile
search_roots = [profile]
else:
search_root = os.path.join(_firefox_browser_dir(), profile)
search_roots = [os.path.join(path, profile) for path in _firefox_browser_dirs()]
search_root = ', '.join(map(repr, search_roots))
cookie_database_path = _find_most_recently_used_file(search_root, 'cookies.sqlite', logger)
cookie_database_path = _newest(_firefox_cookie_dbs(search_roots))
if cookie_database_path is None:
raise FileNotFoundError(f'could not find firefox cookies database in {search_root}')
logger.debug(f'Extracting cookies from: "{cookie_database_path}"')
@ -147,7 +142,7 @@ def _extract_firefox_cookies(profile, container, logger):
containers_path = os.path.join(os.path.dirname(cookie_database_path), 'containers.json')
if not os.path.isfile(containers_path) or not os.access(containers_path, os.R_OK):
raise FileNotFoundError(f'could not read containers.json in {search_root}')
with open(containers_path) as containers:
with open(containers_path, encoding='utf8') as containers:
identities = json.load(containers).get('identities', [])
container_id = next((context.get('userContextId') for context in identities if container in (
context.get('name'),
@ -191,12 +186,25 @@ def _extract_firefox_cookies(profile, container, logger):
cursor.connection.close()
def _firefox_browser_dir():
def _firefox_browser_dirs():
if sys.platform in ('cygwin', 'win32'):
return os.path.expandvars(R'%APPDATA%\Mozilla\Firefox\Profiles')
yield os.path.expandvars(R'%APPDATA%\Mozilla\Firefox\Profiles')
elif sys.platform == 'darwin':
return os.path.expanduser('~/Library/Application Support/Firefox')
return os.path.expanduser('~/.mozilla/firefox')
yield os.path.expanduser('~/Library/Application Support/Firefox/Profiles')
else:
yield from map(os.path.expanduser, (
'~/.mozilla/firefox',
'~/snap/firefox/common/.mozilla/firefox',
'~/.var/app/org.mozilla.firefox/.mozilla/firefox',
))
def _firefox_cookie_dbs(roots):
for root in map(os.path.abspath, roots):
for pattern in ('', '*/', 'Profiles/*/'):
yield from glob.iglob(os.path.join(root, pattern, 'cookies.sqlite'))
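
The two helpers above compose into the lookup used by _extract_firefox_cookies(); a standalone sketch (the roots normally come from _firefox_browser_dirs()):

    import glob
    import os

    def _newest(files):
        # most recently modified match, or None if there were none
        return max(files, key=lambda path: os.lstat(path).st_mtime, default=None)

    def _firefox_cookie_dbs(roots):
        # search each root at the top level, one level down, and under Profiles/
        for root in map(os.path.abspath, roots):
            for pattern in ('', '*/', 'Profiles/*/'):
                yield from glob.iglob(os.path.join(root, pattern, 'cookies.sqlite'))

    roots = [os.path.expanduser('~/.mozilla/firefox')]
    print(_newest(_firefox_cookie_dbs(roots)))  # a path, or None if nothing matched
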
def _get_chromium_based_browser_settings(browser_name):
@ -260,7 +268,7 @@ def _extract_chrome_cookies(browser_name, profile, keyring, logger):
if not sqlite3:
logger.warning(f'Cannot extract cookies from {browser_name} without sqlite3 support. '
'Please use a python interpreter compiled with sqlite3 support')
'Please use a Python interpreter compiled with sqlite3 support')
return YoutubeDLCookieJar()
config = _get_chromium_based_browser_settings(browser_name)
@ -277,7 +285,7 @@ def _extract_chrome_cookies(browser_name, profile, keyring, logger):
logger.error(f'{browser_name} does not support profiles')
search_root = config['browser_dir']
cookie_database_path = _find_most_recently_used_file(search_root, 'Cookies', logger)
cookie_database_path = _newest(_find_files(search_root, 'Cookies', logger))
if cookie_database_path is None:
raise FileNotFoundError(f'could not find {browser_name} cookies database in "{search_root}"')
logger.debug(f'Extracting cookies from: "{cookie_database_path}"')
@ -316,6 +324,12 @@ def _extract_chrome_cookies(browser_name, profile, keyring, logger):
counts['unencrypted'] = unencrypted_cookies
logger.debug(f'cookie version breakdown: {counts}')
return jar
except PermissionError as error:
if compat_os_name == 'nt' and error.errno == 13:
message = 'Could not copy Chrome cookie database. See https://github.com/yt-dlp/yt-dlp/issues/7271 for more info'
logger.error(message)
raise DownloadError(message) # force exit
raise
finally:
if cursor is not None:
cursor.connection.close()
@ -347,7 +361,9 @@ class ChromeCookieDecryptor:
Linux:
- cookies are either v10 or v11
- v10: AES-CBC encrypted with a fixed key
- also attempts empty password if decryption fails
- v11: AES-CBC encrypted with an OS protected key (keyring)
- also attempts empty password if decryption fails
- v11 keys can be stored in various places depending on the active desktop environment [2]
Mac:
@ -362,7 +378,7 @@ class ChromeCookieDecryptor:
Sources:
- [1] https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/
- [2] https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/key_storage_linux.cc
- [2] https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/key_storage_linux.cc
- KeyStorageLinux::CreateService
"""
@ -384,6 +400,7 @@ class LinuxChromeCookieDecryptor(ChromeCookieDecryptor):
def __init__(self, browser_keyring_name, logger, *, keyring=None):
self._logger = logger
self._v10_key = self.derive_key(b'peanuts')
self._empty_key = self.derive_key(b'')
self._cookie_counts = {'v10': 0, 'v11': 0, 'other': 0}
self._browser_keyring_name = browser_keyring_name
self._keyring = keyring
@ -396,25 +413,36 @@ class LinuxChromeCookieDecryptor(ChromeCookieDecryptor):
@staticmethod
def derive_key(password):
# values from
# https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/os_crypt_linux.cc
# https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/os_crypt_linux.cc
return pbkdf2_sha1(password, salt=b'saltysalt', iterations=1, key_length=16)
def decrypt(self, encrypted_value):
"""
Following the same approach as the fix in [1]: if cookies fail to decrypt, attempt to decrypt
them with an empty password. The failure detection is not the same as what Chromium uses, so
the results won't be perfect.
References:
- [1] https://chromium.googlesource.com/chromium/src/+/bbd54702284caca1f92d656fdcadf2ccca6f4165%5E%21/
- a bugfix to try an empty password as a fallback
"""
version = encrypted_value[:3]
ciphertext = encrypted_value[3:]
if version == b'v10':
self._cookie_counts['v10'] += 1
return _decrypt_aes_cbc(ciphertext, self._v10_key, self._logger)
return _decrypt_aes_cbc_multi(ciphertext, (self._v10_key, self._empty_key), self._logger)
elif version == b'v11':
self._cookie_counts['v11'] += 1
if self._v11_key is None:
self._logger.warning('cannot decrypt v11 cookies: no key found', only_once=True)
return None
return _decrypt_aes_cbc(ciphertext, self._v11_key, self._logger)
return _decrypt_aes_cbc_multi(ciphertext, (self._v11_key, self._empty_key), self._logger)
else:
self._logger.warning(f'unknown cookie version: "{version}"', only_once=True)
self._cookie_counts['other'] += 1
return None
@ -429,7 +457,7 @@ class MacChromeCookieDecryptor(ChromeCookieDecryptor):
@staticmethod
def derive_key(password):
# values from
# https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/os_crypt_mac.mm
# https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/os_crypt_mac.mm
return pbkdf2_sha1(password, salt=b'saltysalt', iterations=1003, key_length=16)
def decrypt(self, encrypted_value):
@ -442,12 +470,12 @@ class MacChromeCookieDecryptor(ChromeCookieDecryptor):
self._logger.warning('cannot decrypt v10 cookies: no key found', only_once=True)
return None
return _decrypt_aes_cbc(ciphertext, self._v10_key, self._logger)
return _decrypt_aes_cbc_multi(ciphertext, (self._v10_key,), self._logger)
else:
self._cookie_counts['other'] += 1
# other prefixes are considered 'old data' which were stored as plaintext
# https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/os_crypt_mac.mm
# https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/os_crypt_mac.mm
return encrypted_value
@ -467,7 +495,7 @@ class WindowsChromeCookieDecryptor(ChromeCookieDecryptor):
self._logger.warning('cannot decrypt v10 cookies: no key found', only_once=True)
return None
# https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/os_crypt_win.cc
# https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/os_crypt_win.cc
# kNonceLength
nonce_length = 96 // 8
# boringssl
@ -484,23 +512,27 @@ class WindowsChromeCookieDecryptor(ChromeCookieDecryptor):
else:
self._cookie_counts['other'] += 1
# any other prefix means the data is DPAPI encrypted
# https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/os_crypt_win.cc
# https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/os_crypt_win.cc
return _decrypt_windows_dpapi(encrypted_value, self._logger).decode()
def _extract_safari_cookies(profile, logger):
if profile is not None:
logger.error('safari does not support profiles')
if sys.platform != 'darwin':
raise ValueError(f'unsupported platform: {sys.platform}')
cookies_path = os.path.expanduser('~/Library/Cookies/Cookies.binarycookies')
if not os.path.isfile(cookies_path):
logger.debug('Trying secondary cookie location')
cookies_path = os.path.expanduser('~/Library/Containers/com.apple.Safari/Data/Library/Cookies/Cookies.binarycookies')
if profile:
cookies_path = os.path.expanduser(profile)
if not os.path.isfile(cookies_path):
raise FileNotFoundError('could not find safari cookies database')
raise FileNotFoundError('custom safari cookies database not found')
else:
cookies_path = os.path.expanduser('~/Library/Cookies/Cookies.binarycookies')
if not os.path.isfile(cookies_path):
logger.debug('Trying secondary cookie location')
cookies_path = os.path.expanduser('~/Library/Containers/com.apple.Safari/Data/Library/Cookies/Cookies.binarycookies')
if not os.path.isfile(cookies_path):
raise FileNotFoundError('could not find safari cookies database')
with open(cookies_path, 'rb') as f:
cookies_data = f.read()
@ -566,7 +598,7 @@ class DataParser:
def _mac_absolute_time_to_posix(timestamp):
return int((datetime(2001, 1, 1, 0, 0, tzinfo=timezone.utc) + timedelta(seconds=timestamp)).timestamp())
return int((dt.datetime(2001, 1, 1, 0, 0, tzinfo=dt.timezone.utc) + dt.timedelta(seconds=timestamp)).timestamp())
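
A worked check of the conversion above: the Apple epoch (2001-01-01 UTC) sits exactly 978307200 seconds after the POSIX epoch, so a zero timestamp maps to that offset:

    import datetime as dt

    def _mac_absolute_time_to_posix(timestamp):
        # Safari stores cookie times as seconds since 2001-01-01 00:00 UTC
        return int((dt.datetime(2001, 1, 1, 0, 0, tzinfo=dt.timezone.utc)
                    + dt.timedelta(seconds=timestamp)).timestamp())

    assert _mac_absolute_time_to_posix(0) == 978307200
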
def _parse_safari_cookies_header(data, logger):
@ -663,19 +695,27 @@ class _LinuxDesktopEnvironment(Enum):
"""
OTHER = auto()
CINNAMON = auto()
DEEPIN = auto()
GNOME = auto()
KDE = auto()
KDE3 = auto()
KDE4 = auto()
KDE5 = auto()
KDE6 = auto()
PANTHEON = auto()
UKUI = auto()
UNITY = auto()
XFCE = auto()
LXQT = auto()
class _LinuxKeyring(Enum):
"""
https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/key_storage_util_linux.h
https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/key_storage_util_linux.h
SelectedLinuxBackend
"""
KWALLET = auto()
KWALLET = auto() # KDE4
KWALLET5 = auto()
KWALLET6 = auto()
GNOMEKEYRING = auto()
BASICTEXT = auto()
@ -683,7 +723,7 @@ class _LinuxKeyring(Enum):
SUPPORTED_KEYRINGS = _LinuxKeyring.__members__.keys()
def _get_linux_desktop_environment(env):
def _get_linux_desktop_environment(env, logger):
"""
https://chromium.googlesource.com/chromium/src/+/refs/heads/main/base/nix/xdg_util.cc
GetDesktopEnvironment
@ -698,51 +738,97 @@ def _get_linux_desktop_environment(env):
return _LinuxDesktopEnvironment.GNOME
else:
return _LinuxDesktopEnvironment.UNITY
elif xdg_current_desktop == 'Deepin':
return _LinuxDesktopEnvironment.DEEPIN
elif xdg_current_desktop == 'GNOME':
return _LinuxDesktopEnvironment.GNOME
elif xdg_current_desktop == 'X-Cinnamon':
return _LinuxDesktopEnvironment.CINNAMON
elif xdg_current_desktop == 'KDE':
return _LinuxDesktopEnvironment.KDE
kde_version = env.get('KDE_SESSION_VERSION', None)
if kde_version == '5':
return _LinuxDesktopEnvironment.KDE5
elif kde_version == '6':
return _LinuxDesktopEnvironment.KDE6
elif kde_version == '4':
return _LinuxDesktopEnvironment.KDE4
else:
logger.info(f'unknown KDE version: "{kde_version}". Assuming KDE4')
return _LinuxDesktopEnvironment.KDE4
elif xdg_current_desktop == 'Pantheon':
return _LinuxDesktopEnvironment.PANTHEON
elif xdg_current_desktop == 'XFCE':
return _LinuxDesktopEnvironment.XFCE
elif xdg_current_desktop == 'UKUI':
return _LinuxDesktopEnvironment.UKUI
elif xdg_current_desktop == 'LXQt':
return _LinuxDesktopEnvironment.LXQT
else:
logger.info(f'XDG_CURRENT_DESKTOP is set to an unknown value: "{xdg_current_desktop}"')
elif desktop_session is not None:
if desktop_session in ('mate', 'gnome'):
if desktop_session == 'deepin':
return _LinuxDesktopEnvironment.DEEPIN
elif desktop_session in ('mate', 'gnome'):
return _LinuxDesktopEnvironment.GNOME
elif 'kde' in desktop_session:
return _LinuxDesktopEnvironment.KDE
elif 'xfce' in desktop_session:
elif desktop_session in ('kde4', 'kde-plasma'):
return _LinuxDesktopEnvironment.KDE4
elif desktop_session == 'kde':
if 'KDE_SESSION_VERSION' in env:
return _LinuxDesktopEnvironment.KDE4
else:
return _LinuxDesktopEnvironment.KDE3
elif 'xfce' in desktop_session or desktop_session == 'xubuntu':
return _LinuxDesktopEnvironment.XFCE
elif desktop_session == 'ukui':
return _LinuxDesktopEnvironment.UKUI
else:
logger.info(f'DESKTOP_SESSION is set to an unknown value: "{desktop_session}"')
else:
if 'GNOME_DESKTOP_SESSION_ID' in env:
return _LinuxDesktopEnvironment.GNOME
elif 'KDE_FULL_SESSION' in env:
return _LinuxDesktopEnvironment.KDE
if 'KDE_SESSION_VERSION' in env:
return _LinuxDesktopEnvironment.KDE4
else:
return _LinuxDesktopEnvironment.KDE3
return _LinuxDesktopEnvironment.OTHER
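
A usage sketch of the detection order above, assuming the helpers defined earlier in this file (a YDLLogger constructed without a ydl discards its output):

    env = {'XDG_CURRENT_DESKTOP': 'KDE', 'KDE_SESSION_VERSION': '5'}
    assert _get_linux_desktop_environment(env, YDLLogger()) is _LinuxDesktopEnvironment.KDE5
    # with no desktop hints at all, detection falls through to OTHER
    assert _get_linux_desktop_environment({}, YDLLogger()) is _LinuxDesktopEnvironment.OTHER
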
def _choose_linux_keyring(logger):
"""
https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/key_storage_util_linux.cc
SelectBackend
SelectBackend in [1]
There is currently support for forcing Chromium to use BASIC_TEXT by creating a file called
`Disable Local Encryption` [1] in the user data dir. The function that writes this file
(`WriteBackendUse()` [1]) does not appear to be called anywhere other than in tests, so the user
would have to create this file manually and would therefore be aware enough to tell yt-dlp to use
the BASIC_TEXT keyring.
References:
- [1] https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/key_storage_util_linux.cc
"""
desktop_environment = _get_linux_desktop_environment(os.environ)
desktop_environment = _get_linux_desktop_environment(os.environ, logger)
logger.debug(f'detected desktop environment: {desktop_environment.name}')
if desktop_environment == _LinuxDesktopEnvironment.KDE:
if desktop_environment == _LinuxDesktopEnvironment.KDE4:
linux_keyring = _LinuxKeyring.KWALLET
elif desktop_environment == _LinuxDesktopEnvironment.OTHER:
elif desktop_environment == _LinuxDesktopEnvironment.KDE5:
linux_keyring = _LinuxKeyring.KWALLET5
elif desktop_environment == _LinuxDesktopEnvironment.KDE6:
linux_keyring = _LinuxKeyring.KWALLET6
elif desktop_environment in (
_LinuxDesktopEnvironment.KDE3, _LinuxDesktopEnvironment.LXQT, _LinuxDesktopEnvironment.OTHER
):
linux_keyring = _LinuxKeyring.BASICTEXT
else:
linux_keyring = _LinuxKeyring.GNOMEKEYRING
return linux_keyring
def _get_kwallet_network_wallet(logger):
def _get_kwallet_network_wallet(keyring, logger):
""" The name of the wallet used to store network passwords.
https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/kwallet_dbus.cc
https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/kwallet_dbus.cc
KWalletDBus::NetworkWallet
which does a dbus call to the following function:
https://api.kde.org/frameworks/kwallet/html/classKWallet_1_1Wallet.html
@ -750,10 +836,22 @@ def _get_kwallet_network_wallet(logger):
"""
default_wallet = 'kdewallet'
try:
if keyring == _LinuxKeyring.KWALLET:
service_name = 'org.kde.kwalletd'
wallet_path = '/modules/kwalletd'
elif keyring == _LinuxKeyring.KWALLET5:
service_name = 'org.kde.kwalletd5'
wallet_path = '/modules/kwalletd5'
elif keyring == _LinuxKeyring.KWALLET6:
service_name = 'org.kde.kwalletd6'
wallet_path = '/modules/kwalletd6'
else:
raise ValueError(keyring)
stdout, _, returncode = Popen.run([
'dbus-send', '--session', '--print-reply=literal',
'--dest=org.kde.kwalletd5',
'/modules/kwalletd5',
f'--dest={service_name}',
wallet_path,
'org.kde.KWallet.networkWallet'
], text=True, stdout=subprocess.PIPE, stderr=subprocess.DEVNULL)
@ -768,8 +866,8 @@ def _get_kwallet_network_wallet(logger):
return default_wallet
def _get_kwallet_password(browser_keyring_name, logger):
logger.debug('using kwallet-query to obtain password from kwallet')
def _get_kwallet_password(browser_keyring_name, keyring, logger):
logger.debug(f'using kwallet-query to obtain password from {keyring.name}')
if shutil.which('kwallet-query') is None:
logger.error('kwallet-query command not found. KWallet and kwallet-query '
@ -777,7 +875,7 @@ def _get_kwallet_password(browser_keyring_name, logger):
'included in the kwallet package for your distribution')
return b''
network_wallet = _get_kwallet_network_wallet(logger)
network_wallet = _get_kwallet_network_wallet(keyring, logger)
try:
stdout, _, returncode = Popen.run([
@ -799,8 +897,9 @@ def _get_kwallet_password(browser_keyring_name, logger):
# checks hasEntry. To verify this:
# dbus-monitor "interface='org.kde.KWallet'" "type=method_return"
# while starting chrome.
# this may be a bug as the intended behaviour is to generate a random password and store
# it, but that doesn't matter here.
# this was identified as a bug later and fixed in
# https://chromium.googlesource.com/chromium/src/+/bbd54702284caca1f92d656fdcadf2ccca6f4165%5E%21/#F0
# https://chromium.googlesource.com/chromium/src/+/5463af3c39d7f5b6d11db7fbd51e38cc1974d764
return b''
else:
logger.debug('password found')
@ -838,8 +937,8 @@ def _get_linux_keyring_password(browser_keyring_name, keyring, logger):
keyring = _LinuxKeyring[keyring] if keyring else _choose_linux_keyring(logger)
logger.debug(f'Chosen keyring: {keyring.name}')
if keyring == _LinuxKeyring.KWALLET:
return _get_kwallet_password(browser_keyring_name, logger)
if keyring in (_LinuxKeyring.KWALLET, _LinuxKeyring.KWALLET5, _LinuxKeyring.KWALLET6):
return _get_kwallet_password(browser_keyring_name, keyring, logger)
elif keyring == _LinuxKeyring.GNOMEKEYRING:
return _get_gnome_keyring_password(browser_keyring_name, logger)
elif keyring == _LinuxKeyring.BASICTEXT:
@ -867,7 +966,11 @@ def _get_mac_keyring_password(browser_keyring_name, logger):
def _get_windows_v10_key(browser_root, logger):
path = _find_most_recently_used_file(browser_root, 'Local State', logger)
"""
References:
- [1] https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/os_crypt_win.cc
"""
path = _newest(_find_files(browser_root, 'Local State', logger))
if path is None:
logger.error('could not find local state file')
return None
@ -875,11 +978,13 @@ def _get_windows_v10_key(browser_root, logger):
with open(path, encoding='utf8') as f:
data = json.load(f)
try:
# kOsCryptEncryptedKeyPrefName in [1]
base64_key = data['os_crypt']['encrypted_key']
except KeyError:
logger.error('no encrypted key in Local State')
return None
encrypted_key = base64.b64decode(base64_key)
# kDPAPIKeyPrefix in [1]
prefix = b'DPAPI'
if not encrypted_key.startswith(prefix):
logger.error('invalid key')
@ -891,13 +996,15 @@ def pbkdf2_sha1(password, salt, iterations, key_length):
return pbkdf2_hmac('sha1', password, salt, iterations, key_length)
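
For example, the static v10 parameters used by LinuxChromeCookieDecryptor.derive_key() above yield a 16-byte AES key:

    from hashlib import pbkdf2_hmac

    def pbkdf2_sha1(password, salt, iterations, key_length):
        return pbkdf2_hmac('sha1', password, salt, iterations, key_length)

    v10_key = pbkdf2_sha1(b'peanuts', salt=b'saltysalt', iterations=1, key_length=16)
    assert len(v10_key) == 16
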
def _decrypt_aes_cbc(ciphertext, key, logger, initialization_vector=b' ' * 16):
plaintext = unpad_pkcs7(aes_cbc_decrypt_bytes(ciphertext, key, initialization_vector))
try:
return plaintext.decode()
except UnicodeDecodeError:
logger.warning('failed to decrypt cookie (AES-CBC) because UTF-8 decoding failed. Possibly the key is wrong?', only_once=True)
return None
def _decrypt_aes_cbc_multi(ciphertext, keys, logger, initialization_vector=b' ' * 16):
for key in keys:
plaintext = unpad_pkcs7(aes_cbc_decrypt_bytes(ciphertext, key, initialization_vector))
try:
return plaintext.decode()
except UnicodeDecodeError:
pass
logger.warning('failed to decrypt cookie (AES-CBC) because UTF-8 decoding failed. Possibly the key is wrong?', only_once=True)
return None
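
A round-trip sketch of the multi-key fallback, assuming aes_cbc_encrypt_bytes from yt_dlp.aes (PKCS#7 padding by default) and the YDLLogger defined earlier in this file:

    from hashlib import pbkdf2_hmac
    from yt_dlp.aes import aes_cbc_encrypt_bytes  # assumed available, as in yt-dlp's tests

    iv = b' ' * 16  # the callers above always pass a 16-space IV
    v10_key = pbkdf2_hmac('sha1', b'peanuts', b'saltysalt', 1, 16)
    empty_key = pbkdf2_hmac('sha1', b'', b'saltysalt', 1, 16)
    ciphertext = aes_cbc_encrypt_bytes(b'session-token', v10_key, iv)
    # keys are tried in order; the first plaintext that decodes as UTF-8 wins
    assert _decrypt_aes_cbc_multi(ciphertext, (v10_key, empty_key), YDLLogger()) == 'session-token'
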
def _decrypt_aes_gcm(ciphertext, key, nonce, authentication_tag, logger):
@ -965,17 +1072,20 @@ def _get_column_names(cursor, table_name):
return [row[1].decode() for row in table_info]
def _find_most_recently_used_file(root, filename, logger):
def _newest(files):
return max(files, key=lambda path: os.lstat(path).st_mtime, default=None)
def _find_files(root, filename, logger):
# if there are multiple browser profiles, _newest() takes the most recently used of the files yielded here
i, paths = 0, []
i = 0
with _create_progress_bar(logger) as progress_bar:
for curr_root, dirs, files in os.walk(root):
for curr_root, _, files in os.walk(root):
for file in files:
i += 1
progress_bar.print(f'Searching for "{filename}": {i: 6d} files searched')
if file == filename:
paths.append(os.path.join(curr_root, file))
return None if not paths else max(paths, key=lambda path: os.lstat(path).st_mtime)
yield os.path.join(curr_root, file)
def _merge_cookie_jars(jars):
@ -989,7 +1099,7 @@ def _merge_cookie_jars(jars):
def _is_path(value):
return os.path.sep in value
return any(sep in value for sep in (os.path.sep, os.path.altsep) if sep)
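
The altsep form matters on Windows, where both separators are valid; a quick check with the separators passed in explicitly (the real helper reads them from os.path):

    import ntpath
    import posixpath

    def _is_path(value, sep, altsep):
        return any(s in value for s in (sep, altsep) if s)

    assert _is_path(r'C:\Users\me\cookies.txt', ntpath.sep, ntpath.altsep)  # backslashes
    assert _is_path('C:/Users/me/cookies.txt', ntpath.sep, ntpath.altsep)   # forward slashes
    assert not _is_path('firefox', posixpath.sep, posixpath.altsep)         # a profile name
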
def _parse_browser_specification(browser_name, profile=None, keyring=None, container=None):
@ -1091,3 +1201,150 @@ class LenientSimpleCookie(http.cookies.SimpleCookie):
else:
morsel = None
class YoutubeDLCookieJar(http.cookiejar.MozillaCookieJar):
"""
See [1] for cookie file format.
1. https://curl.haxx.se/docs/http-cookies.html
"""
_HTTPONLY_PREFIX = '#HttpOnly_'
_ENTRY_LEN = 7
_HEADER = '''# Netscape HTTP Cookie File
# This file is generated by yt-dlp. Do not edit.
'''
_CookieFileEntry = collections.namedtuple(
'CookieFileEntry',
('domain_name', 'include_subdomains', 'path', 'https_only', 'expires_at', 'name', 'value'))
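
One tab-separated line of a Netscape-format file maps onto these seven fields; a quick illustration with made-up values:

    import collections

    _CookieFileEntry = collections.namedtuple(
        'CookieFileEntry',
        ('domain_name', 'include_subdomains', 'path', 'https_only', 'expires_at', 'name', 'value'))

    line = '.example.com\tTRUE\t/\tFALSE\t2147483647\tSESSIONID\tdeadbeef'
    entry = _CookieFileEntry(*line.split('\t'))
    assert entry.domain_name == '.example.com' and entry.name == 'SESSIONID'
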
def __init__(self, filename=None, *args, **kwargs):
super().__init__(None, *args, **kwargs)
if is_path_like(filename):
filename = os.fspath(filename)
self.filename = filename
@staticmethod
def _true_or_false(cndn):
return 'TRUE' if cndn else 'FALSE'
@contextlib.contextmanager
def open(self, file, *, write=False):
if is_path_like(file):
with open(file, 'w' if write else 'r', encoding='utf-8') as f:
yield f
else:
if write:
file.truncate(0)
yield file
def _really_save(self, f, ignore_discard, ignore_expires):
now = time.time()
for cookie in self:
if (not ignore_discard and cookie.discard
or not ignore_expires and cookie.is_expired(now)):
continue
name, value = cookie.name, cookie.value
if value is None:
# cookies.txt regards 'Set-Cookie: foo' as a cookie
# with no name, whereas http.cookiejar regards it as a
# cookie with no value.
name, value = '', name
f.write('%s\n' % '\t'.join((
cookie.domain,
self._true_or_false(cookie.domain.startswith('.')),
cookie.path,
self._true_or_false(cookie.secure),
str_or_none(cookie.expires, default=''),
name, value
)))
def save(self, filename=None, ignore_discard=True, ignore_expires=True):
"""
Save cookies to a file.
Code is taken from CPython 3.6
https://github.com/python/cpython/blob/8d999cbf4adea053be6dbb612b9844635c4dfb8e/Lib/http/cookiejar.py#L2091-L2117 """
if filename is None:
if self.filename is not None:
filename = self.filename
else:
raise ValueError(http.cookiejar.MISSING_FILENAME_TEXT)
# Store session cookies with `expires` set to 0 instead of an empty string
for cookie in self:
if cookie.expires is None:
cookie.expires = 0
with self.open(filename, write=True) as f:
f.write(self._HEADER)
self._really_save(f, ignore_discard, ignore_expires)
def load(self, filename=None, ignore_discard=True, ignore_expires=True):
"""Load cookies from a file."""
if filename is None:
if self.filename is not None:
filename = self.filename
else:
raise ValueError(http.cookiejar.MISSING_FILENAME_TEXT)
def prepare_line(line):
if line.startswith(self._HTTPONLY_PREFIX):
line = line[len(self._HTTPONLY_PREFIX):]
# comments and empty lines are fine
if line.startswith('#') or not line.strip():
return line
cookie_list = line.split('\t')
if len(cookie_list) != self._ENTRY_LEN:
raise http.cookiejar.LoadError('invalid length %d' % len(cookie_list))
cookie = self._CookieFileEntry(*cookie_list)
if cookie.expires_at and not cookie.expires_at.isdigit():
raise http.cookiejar.LoadError('invalid expires at %s' % cookie.expires_at)
return line
cf = io.StringIO()
with self.open(filename) as f:
for line in f:
try:
cf.write(prepare_line(line))
except http.cookiejar.LoadError as e:
if f'{line.strip()} '[0] in '[{"':
raise http.cookiejar.LoadError(
'Cookies file must be Netscape formatted, not JSON. See '
'https://github.com/yt-dlp/yt-dlp/wiki/FAQ#how-do-i-pass-cookies-to-yt-dlp')
write_string(f'WARNING: skipping cookie file entry due to {e}: {line!r}\n')
continue
cf.seek(0)
self._really_load(cf, filename, ignore_discard, ignore_expires)
# Session cookies are denoted by either the `expires` field set to
# an empty string or to 0. MozillaCookieJar only recognizes the former
# (see [1]), so we need to force the latter to be recognized as
# session cookies on our own.
# Session cookies may be important for cookie-based authentication,
# e.g. usually, when the user does not check the 'Remember me' checkbox
# while logging in on a site, some important cookies are stored as session
# cookies, and failing to recognize them would result in a failed login.
# 1. https://bugs.python.org/issue17164
for cookie in self:
# Treat `expires=0` cookies as session cookies
if cookie.expires == 0:
cookie.expires = None
cookie.discard = True
def get_cookie_header(self, url):
"""Generate a Cookie HTTP header for a given url"""
cookie_req = urllib.request.Request(normalize_url(sanitize_url(url)))
self.add_cookie_header(cookie_req)
return cookie_req.get_header('Cookie')
def get_cookies_for_url(self, url):
"""Generate a list of Cookie objects for a given url"""
# Policy `_now` attribute must be set before calling `_cookies_for_request`
# Ref: https://github.com/python/cpython/blob/3.7/Lib/http/cookiejar.py#L1360
self._policy._now = self._now = int(time.time())
return self._cookies_for_request(urllib.request.Request(normalize_url(sanitize_url(url))))
def clear(self, *args, **kwargs):
with contextlib.suppress(KeyError):
return super().clear(*args, **kwargs)
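
A usage sketch of the jar as a whole: open() accepts file-like objects, so a Netscape-format blob can be loaded from memory and turned into a Cookie header (hypothetical domain and values):

    import io

    blob = io.StringIO(
        '# Netscape HTTP Cookie File\n'
        '.example.com\tTRUE\t/\tFALSE\t0\tSESSIONID\tdeadbeef\n')
    jar = YoutubeDLCookieJar()
    jar.load(blob)  # the expires=0 entry is normalized into a session cookie, as above
    print(jar.get_cookie_header('https://www.example.com/watch'))  # SESSIONID=deadbeef
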

View File

@ -1,4 +1,4 @@
import types
from ..compat.compat_utils import passthrough_module
try:
import Cryptodome as _parent
@ -6,9 +6,11 @@ except ImportError:
try:
import Crypto as _parent
except (ImportError, SyntaxError): # Old Crypto gives SyntaxError in newer Python
_parent = types.ModuleType('no_Cryptodome')
_parent = passthrough_module(__name__, 'no_Cryptodome')
__bool__ = lambda: False
del passthrough_module
__version__ = ''
AES = PKCS1_v1_5 = Blowfish = PKCS1_OAEP = SHA1 = CMAC = RSA = None
try:

View File

@ -43,19 +43,28 @@ except Exception as _err:
try:
import sqlite3
# We need to get the underlying `sqlite` version, see https://github.com/yt-dlp/yt-dlp/issues/8152
sqlite3._yt_dlp__version = sqlite3.sqlite_version
except ImportError:
# although sqlite3 is part of the standard library, it is possible to compile python without
# although sqlite3 is part of the standard library, it is possible to compile Python without
# sqlite support. See: https://github.com/yt-dlp/yt-dlp/issues/544
sqlite3 = None
try:
import websockets
except (ImportError, SyntaxError):
# websockets 3.10 on python 3.6 causes SyntaxError
# See https://github.com/yt-dlp/yt-dlp/issues/2633
except ImportError:
websockets = None
try:
import urllib3
except ImportError:
urllib3 = None
try:
import requests
except ImportError:
requests = None
try:
import xattr # xattr or pyxattr
@ -65,6 +74,10 @@ else:
if hasattr(xattr, 'set'): # pyxattr
xattr._yt_dlp__identifier = 'pyxattr'
try:
import curl_cffi
except ImportError:
curl_cffi = None
from . import Cryptodome

View File

@ -30,7 +30,7 @@ from .hls import HlsFD
from .http import HttpFD
from .ism import IsmFD
from .mhtml import MhtmlFD
from .niconico import NiconicoDmcFD
from .niconico import NiconicoDmcFD, NiconicoLiveFD
from .rtmp import RtmpFD
from .rtsp import RtspFD
from .websocket import WebSocketFragmentFD
@ -50,6 +50,7 @@ PROTOCOL_MAP = {
'ism': IsmFD,
'mhtml': MhtmlFD,
'niconico_dmc': NiconicoDmcFD,
'niconico_live': NiconicoLiveFD,
'fc2_live': FC2LiveFD,
'websocket_frag': WebSocketFragmentFD,
'youtube_live_chat': YoutubeLiveChatFD,

View File

@ -4,6 +4,7 @@ import functools
import os
import random
import re
import threading
import time
from ..minicurses import (
@ -49,10 +50,10 @@ class FileDownloader:
verbose: Print additional info to stdout.
quiet: Do not print messages to stdout.
ratelimit: Download speed limit, in bytes/sec.
continuedl: Attempt to continue downloads if possible
throttledratelimit: Assume the download is being throttled below this speed (bytes/sec)
retries: Number of times to retry for HTTP error 5xx
file_access_retries: Number of times to retry on file access error
retries: Number of times to retry for expected network errors.
Default is 0 for API, but 10 for CLI
file_access_retries: Number of times to retry on file access error (default: 3)
buffersize: Size of download buffer in bytes.
noresizebuffer: Do not automatically resize the download buffer.
continuedl: Try to continue downloads if possible.
@ -63,6 +64,7 @@ class FileDownloader:
min_filesize: Skip files smaller than this size
max_filesize: Skip files larger than this size
xattr_set_filesize: Set ytdl.filesize user xattribute with expected size.
progress_delta: The minimum time between progress output, in seconds
external_downloader_args: A dictionary of downloader keys (in lower case)
and a list of additional command-line arguments for the
executable. Use 'default' as the name for arguments to be
@ -88,6 +90,9 @@ class FileDownloader:
self.params = params
self._prepare_multiline_status()
self.add_progress_hook(self.report_progress)
if self.params.get('progress_delta'):
self._progress_delta_lock = threading.Lock()
self._progress_delta_time = time.monotonic()
def _set_ydl(self, ydl):
self.ydl = ydl
@ -138,17 +143,21 @@ class FileDownloader:
def format_percent(percent):
return ' N/A%' if percent is None else f'{percent:>5.1f}%'
@staticmethod
def calc_eta(start, now, total, current):
@classmethod
def calc_eta(cls, start_or_rate, now_or_remaining, total=NO_DEFAULT, current=NO_DEFAULT):
if total is NO_DEFAULT:
rate, remaining = start_or_rate, now_or_remaining
if None in (rate, remaining):
return None
return int(float(remaining) / rate)
start, now = start_or_rate, now_or_remaining
if total is None:
return None
if now is None:
now = time.time()
dif = now - start
if current == 0 or dif < 0.001: # One millisecond
return None
rate = float(current) / dif
return int((float(total) - float(current)) / rate)
rate = cls.calc_speed(start, now, current)
return rate and int((float(total) - float(current)) / rate)
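
The reworked classmethod accepts both calling conventions; a short sketch of each:

    # two-argument form: (rate in bytes/sec, remaining bytes)
    assert FileDownloader.calc_eta(1024, 10240) == 10
    # four-argument form: (start time, now, total bytes, downloaded bytes)
    assert FileDownloader.calc_eta(0, 10, 2048, 1024) == 10
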
@staticmethod
def calc_speed(start, now, bytes):
@ -165,6 +174,12 @@ class FileDownloader:
def format_retries(retries):
return 'inf' if retries == float('inf') else int(retries)
@staticmethod
def filesize_or_none(unencoded_filename):
if os.path.isfile(unencoded_filename):
return os.path.getsize(unencoded_filename)
return 0
@staticmethod
def best_block_size(elapsed_time, bytes):
new_min = max(bytes / 2.0, 1.0)
@ -225,7 +240,7 @@ class FileDownloader:
sleep_func=fd.params.get('retry_sleep_functions', {}).get('file_access'))
def wrapper(self, func, *args, **kwargs):
for retry in RetryManager(self.params.get('file_access_retries'), error_callback, fd=self):
for retry in RetryManager(self.params.get('file_access_retries', 3), error_callback, fd=self):
try:
return func(self, *args, **kwargs)
except OSError as err:
@ -245,7 +260,8 @@ class FileDownloader:
@wrap_file_access('remove')
def try_remove(self, filename):
os.remove(filename)
if os.path.isfile(filename):
os.remove(filename)
@wrap_file_access('rename')
def try_rename(self, old_filename, new_filename):
@ -285,7 +301,8 @@ class FileDownloader:
self._multiline = BreaklineStatusPrinter(self.ydl._out_files.out, lines)
else:
self._multiline = MultilinePrinter(self.ydl._out_files.out, lines, not self.params.get('quiet'))
self._multiline.allow_colors = self._multiline._HAVE_FULLCAP and not self.params.get('no_color')
self._multiline.allow_colors = self.ydl._allow_colors.out and self.ydl._allow_colors.out != 'no_color'
self._multiline._HAVE_FULLCAP = self.ydl._allow_colors.out
def _finish_multiline_status(self):
self._multiline.end()
@ -354,6 +371,12 @@ class FileDownloader:
if s['status'] != 'downloading':
return
if update_delta := self.params.get('progress_delta'):
with self._progress_delta_lock:
if time.monotonic() < self._progress_delta_time:
return
self._progress_delta_time += update_delta
s.update({
'_eta_str': self.format_eta(s.get('eta')).strip(),
'_speed_str': self.format_speed(s.get('speed')),
@ -407,7 +430,6 @@ class FileDownloader:
"""Download to a filename using the info from info_dict
Return True on success and False otherwise
"""
nooverwrites_and_exists = (
not self.params.get('overwrites', True)
and os.path.exists(encodeFilename(filename))

View File

@ -15,12 +15,15 @@ class DashSegmentsFD(FragmentFD):
FD_NAME = 'dashsegments'
def real_download(self, filename, info_dict):
if info_dict.get('is_live') and set(info_dict['protocol'].split('+')) != {'http_dash_segments_generator'}:
self.report_error('Live DASH videos are not supported')
if 'http_dash_segments_generator' in info_dict['protocol'].split('+'):
real_downloader = None # No external FD can support --live-from-start
else:
if info_dict.get('is_live'):
self.report_error('Live DASH videos are not supported')
real_downloader = get_suitable_downloader(
info_dict, self.params, None, protocol='dash_frag_urls', to_stdout=(filename == '-'))
real_start = time.time()
real_downloader = get_suitable_downloader(
info_dict, self.params, None, protocol='dash_frag_urls', to_stdout=(filename == '-'))
requested_formats = [{**info_dict, **fmt} for fmt in info_dict.get('requested_formats', [])]
args = []

Some files were not shown because too many files have changed in this diff.