Mirror of https://github.com/yt-dlp/yt-dlp.git (synced 2024-11-02 06:27:21 +00:00)

Merge branch 'master' into radioradicale

commit 052534bcc2
.github/ISSUE_TEMPLATE/1_broken_site.yml (vendored, 37 changed lines)
@@ -1,5 +1,5 @@
-name: Broken site
-description: Report broken or misfunctioning site
+name: Broken site support
+description: Report issue with yt-dlp on a supported site
 labels: [triage, site-bug]
 body:
   - type: checkboxes
@@ -7,7 +7,7 @@ body:
       label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
       description: Fill all fields even if you think it is irrelevant for the issue
       options:
-        - label: I understand that I will be **blocked** if I remove or skip any mandatory\* field
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
           required: true
   - type: checkboxes
     id: checklist
@@ -16,15 +16,15 @@
       description: |
         Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
       options:
-        - label: I'm reporting a broken site
+        - label: I'm reporting that yt-dlp is broken on a **supported** site
           required: true
-        - label: I've verified that I'm running yt-dlp version **2022.11.11** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
         - label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
@@ -50,6 +50,8 @@
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
           required: true
+        - label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
+          required: false
         - label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
           required: true
   - type: textarea
@@ -59,20 +61,27 @@
       description: |
         It should start like this:
       placeholder: |
-        [debug] Command-line config: ['-vU', 'test:youtube']
-        [debug] Portable config "yt-dlp.conf": ['-i']
+        [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2022.11.11 [9d339c4] (win32_exe)
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
-        [debug] Checking exe version: ffmpeg -bsfs
-        [debug] Checking exe version: ffprobe -bsfs
         [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2022.11.11, Current version: 2022.11.11
-        yt-dlp is up to date (2022.11.11)
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
+        yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
+        [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
     validations:
       required: true
+  - type: markdown
+    attributes:
+      value: |
+        > [!CAUTION]
+        > ### GitHub is experiencing a high volume of malicious spam comments.
+        > ### If you receive any replies asking you download a file, do NOT follow the download links!
+        >
+        > Note that this issue may be temporarily locked as an anti-spam measure after it is opened.
.github/ISSUE_TEMPLATE/2_site_support_request.yml (vendored)
@@ -7,7 +7,7 @@ body:
       label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
       description: Fill all fields even if you think it is irrelevant for the issue
       options:
-        - label: I understand that I will be **blocked** if I remove or skip any mandatory\* field
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
           required: true
   - type: checkboxes
     id: checklist
@@ -18,13 +18,13 @@
       options:
         - label: I'm reporting a new site support request
           required: true
-        - label: I've verified that I'm running yt-dlp version **2022.11.11** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
         - label: I've checked that none of provided URLs [violate any copyrights](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-website-primarily-used-for-piracy) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
@@ -62,6 +62,8 @@
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
           required: true
+        - label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
+          required: false
         - label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
           required: true
   - type: textarea
@@ -71,20 +73,27 @@
       description: |
         It should start like this:
       placeholder: |
-        [debug] Command-line config: ['-vU', 'test:youtube']
-        [debug] Portable config "yt-dlp.conf": ['-i']
+        [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2022.11.11 [9d339c4] (win32_exe)
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
-        [debug] Checking exe version: ffmpeg -bsfs
-        [debug] Checking exe version: ffprobe -bsfs
         [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2022.11.11, Current version: 2022.11.11
-        yt-dlp is up to date (2022.11.11)
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
+        yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
+        [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
     validations:
       required: true
+  - type: markdown
+    attributes:
+      value: |
+        > [!CAUTION]
+        > ### GitHub is experiencing a high volume of malicious spam comments.
+        > ### If you receive any replies asking you download a file, do NOT follow the download links!
+        >
+        > Note that this issue may be temporarily locked as an anti-spam measure after it is opened.
.github/ISSUE_TEMPLATE/3_site_feature_request.yml (vendored)
@@ -7,7 +7,7 @@ body:
       label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
       description: Fill all fields even if you think it is irrelevant for the issue
      options:
-        - label: I understand that I will be **blocked** if I remove or skip any mandatory\* field
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
           required: true
   - type: checkboxes
     id: checklist
@@ -18,11 +18,11 @@
       options:
         - label: I'm requesting a site-specific feature
           required: true
-        - label: I've verified that I'm running yt-dlp version **2022.11.11** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
@@ -58,6 +58,8 @@
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
           required: true
+        - label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
+          required: false
         - label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
           required: true
   - type: textarea
@@ -67,20 +69,27 @@
       description: |
         It should start like this:
       placeholder: |
-        [debug] Command-line config: ['-vU', 'test:youtube']
-        [debug] Portable config "yt-dlp.conf": ['-i']
+        [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2022.11.11 [9d339c4] (win32_exe)
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
-        [debug] Checking exe version: ffmpeg -bsfs
-        [debug] Checking exe version: ffprobe -bsfs
         [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2022.11.11, Current version: 2022.11.11
-        yt-dlp is up to date (2022.11.11)
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
+        yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
+        [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
     validations:
       required: true
+  - type: markdown
+    attributes:
+      value: |
+        > [!CAUTION]
+        > ### GitHub is experiencing a high volume of malicious spam comments.
+        > ### If you receive any replies asking you download a file, do NOT follow the download links!
+        >
+        > Note that this issue may be temporarily locked as an anti-spam measure after it is opened.
.github/ISSUE_TEMPLATE/4_bug_report.yml (vendored, 33 changed lines)
@@ -1,4 +1,4 @@
-name: Bug report
+name: Core bug report
 description: Report a bug unrelated to any particular site or extractor
 labels: [triage, bug]
 body:
@@ -7,7 +7,7 @@ body:
       label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
       description: Fill all fields even if you think it is irrelevant for the issue
       options:
-        - label: I understand that I will be **blocked** if I remove or skip any mandatory\* field
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
           required: true
   - type: checkboxes
     id: checklist
@@ -18,13 +18,13 @@
       options:
         - label: I'm reporting a bug unrelated to a specific site
           required: true
-        - label: I've verified that I'm running yt-dlp version **2022.11.11** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
         - label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
@@ -43,6 +43,8 @@
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
           required: true
+        - label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
+          required: false
         - label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
           required: true
   - type: textarea
@@ -52,20 +54,27 @@
       description: |
         It should start like this:
       placeholder: |
-        [debug] Command-line config: ['-vU', 'test:youtube']
-        [debug] Portable config "yt-dlp.conf": ['-i']
+        [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2022.11.11 [9d339c4] (win32_exe)
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
-        [debug] Checking exe version: ffmpeg -bsfs
-        [debug] Checking exe version: ffprobe -bsfs
         [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2022.11.11, Current version: 2022.11.11
-        yt-dlp is up to date (2022.11.11)
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
+        yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
+        [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
     validations:
       required: true
+  - type: markdown
+    attributes:
+      value: |
+        > [!CAUTION]
+        > ### GitHub is experiencing a high volume of malicious spam comments.
+        > ### If you receive any replies asking you download a file, do NOT follow the download links!
+        >
+        > Note that this issue may be temporarily locked as an anti-spam measure after it is opened.
.github/ISSUE_TEMPLATE/5_feature_request.yml (vendored, 31 changed lines)
@@ -7,7 +7,7 @@ body:
       label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
       description: Fill all fields even if you think it is irrelevant for the issue
       options:
-        - label: I understand that I will be **blocked** if I remove or skip any mandatory\* field
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
           required: true
   - type: checkboxes
     id: checklist
@@ -20,9 +20,9 @@
           required: true
         - label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
           required: true
-        - label: I've verified that I'm running yt-dlp version **2022.11.11** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
@@ -40,6 +40,8 @@
       label: Provide verbose output that clearly demonstrates the problem
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
+        - label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
+          required: false
         - label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
   - type: textarea
     id: log
@@ -48,18 +50,25 @@
       description: |
         It should start like this:
       placeholder: |
-        [debug] Command-line config: ['-vU', 'test:youtube']
-        [debug] Portable config "yt-dlp.conf": ['-i']
+        [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2022.11.11 [9d339c4] (win32_exe)
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
-        [debug] Checking exe version: ffmpeg -bsfs
-        [debug] Checking exe version: ffprobe -bsfs
         [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2022.11.11, Current version: 2022.11.11
-        yt-dlp is up to date (2022.11.11)
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
+        yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
+        [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
+  - type: markdown
+    attributes:
+      value: |
+        > [!CAUTION]
+        > ### GitHub is experiencing a high volume of malicious spam comments.
+        > ### If you receive any replies asking you download a file, do NOT follow the download links!
+        >
+        > Note that this issue may be temporarily locked as an anti-spam measure after it is opened.
.github/ISSUE_TEMPLATE/6_question.yml (vendored, 31 changed lines)
@@ -7,7 +7,7 @@ body:
       label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
       description: Fill all fields even if you think it is irrelevant for the issue
       options:
-        - label: I understand that I will be **blocked** if I remove or skip any mandatory\* field
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
           required: true
   - type: markdown
     attributes:
@@ -26,9 +26,9 @@
           required: true
         - label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
           required: true
-        - label: I've verified that I'm running yt-dlp version **2022.11.11** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
@@ -46,6 +46,8 @@
       label: Provide verbose output that clearly demonstrates the problem
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
+        - label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
+          required: false
         - label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
   - type: textarea
     id: log
@@ -54,18 +56,25 @@
       description: |
         It should start like this:
       placeholder: |
-        [debug] Command-line config: ['-vU', 'test:youtube']
-        [debug] Portable config "yt-dlp.conf": ['-i']
+        [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2022.11.11 [9d339c4] (win32_exe)
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
        [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
-        [debug] Checking exe version: ffmpeg -bsfs
-        [debug] Checking exe version: ffprobe -bsfs
         [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2022.11.11, Current version: 2022.11.11
-        yt-dlp is up to date (2022.11.11)
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
+        yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
+        [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
+  - type: markdown
+    attributes:
+      value: |
+        > [!CAUTION]
+        > ### GitHub is experiencing a high volume of malicious spam comments.
+        > ### If you receive any replies asking you download a file, do NOT follow the download links!
+        >
+        > Note that this issue may be temporarily locked as an anti-spam measure after it is opened.
.github/ISSUE_TEMPLATE_tmpl/1_broken_site.yml (vendored, 10 changed lines)
@@ -1,5 +1,5 @@
-name: Broken site
-description: Report broken or misfunctioning site
+name: Broken site support
+description: Report issue with yt-dlp on a supported site
 labels: [triage, site-bug]
 body:
   %(no_skip)s
@@ -10,15 +10,15 @@ body:
       description: |
         Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
       options:
-        - label: I'm reporting a broken site
+        - label: I'm reporting that yt-dlp is broken on a **supported** site
           required: true
-        - label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
         - label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
.github/ISSUE_TEMPLATE_tmpl/2_site_support_request.yml (vendored)
@@ -12,13 +12,13 @@ body:
       options:
         - label: I'm reporting a new site support request
           required: true
-        - label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
         - label: I've checked that none of provided URLs [violate any copyrights](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-website-primarily-used-for-piracy) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
.github/ISSUE_TEMPLATE_tmpl/3_site_feature_request.yml (vendored)
@@ -12,11 +12,11 @@ body:
       options:
         - label: I'm requesting a site-specific feature
           required: true
-        - label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
.github/ISSUE_TEMPLATE_tmpl/4_bug_report.yml (vendored, 6 changed lines)
@@ -1,4 +1,4 @@
-name: Bug report
+name: Core bug report
 description: Report a bug unrelated to any particular site or extractor
 labels: [triage, bug]
 body:
@@ -12,13 +12,13 @@ body:
       options:
         - label: I'm reporting a bug unrelated to a specific site
           required: true
-        - label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
         - label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
.github/ISSUE_TEMPLATE_tmpl/5_feature_request.yml (vendored)
@@ -14,9 +14,9 @@ body:
           required: true
         - label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
           required: true
-        - label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
.github/ISSUE_TEMPLATE_tmpl/6_question.yml (vendored, 4 changed lines)
@@ -20,9 +20,9 @@ body:
           required: true
         - label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
           required: true
-        - label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
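A note on the ISSUE_TEMPLATE_tmpl files above: they are the sources from which the final .github/ISSUE_TEMPLATE/ files are generated, and the printf-style %(version)s and %(no_skip)s fields are filled in at generation time (the removed "Bump version" step in build.yml below still shows the `make issuetemplates` invocation that drove this). As a sketch of the substitution, the before/after pair below is taken directly from the 1_broken_site diffs above; only the placeholder differs between template and rendered file:

# Line in .github/ISSUE_TEMPLATE_tmpl/1_broken_site.yml (template, before this change):
- label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)

# Same line as rendered into .github/ISSUE_TEMPLATE/1_broken_site.yml, with %(version)s expanded:
- label: I've verified that I'm running yt-dlp version **2022.11.11** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)

After this change the checklist label no longer embeds a release version at all, since it now asks for nightly or master instead of a pinned version.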
.github/PULL_REQUEST_TEMPLATE.md (vendored, 7 changed lines)
@@ -2,8 +2,6 @@
 
 ### Description of your *pull request* and other information
 
-</details>
-
 <!--
 
 Explanation of your *pull request* in arbitrary form goes here. Please **make sure the description explains the purpose and effect** of your *pull request* and is worded well enough to be understood. Provide as much **context and examples** as possible
@@ -30,9 +28,8 @@ # PLEASE FOLLOW THE GUIDE BELOW
 ### Before submitting a *pull request* make sure you have:
 - [ ] At least skimmed through [contributing guidelines](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#developer-instructions) including [yt-dlp coding conventions](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#yt-dlp-coding-conventions)
 - [ ] [Searched](https://github.com/yt-dlp/yt-dlp/search?q=is%3Apr&type=Issues) the bugtracker for similar pull requests
-- [ ] Checked the code with [flake8](https://pypi.python.org/pypi/flake8) and [ran relevant tests](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#developer-instructions)
 
-### In order to be accepted and merged into yt-dlp each piece of code must be in public domain or released under [Unlicense](http://unlicense.org/). Check one of the following options:
+### In order to be accepted and merged into yt-dlp each piece of code must be in public domain or released under [Unlicense](http://unlicense.org/). Check all of the following options that apply:
 - [ ] I am the original author of this code and I am willing to release it under [Unlicense](http://unlicense.org/)
 - [ ] I am not the original author of this code but it is in public domain or released under [Unlicense](http://unlicense.org/) (provide reliable evidence)
 
@@ -41,3 +38,5 @@ ### What is the purpose of your *pull request*?
 - [ ] New extractor ([Piracy websites will not be accepted](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-website-primarily-used-for-piracy))
 - [ ] Core bug fix/improvement
 - [ ] New feature (It is strongly [recommended to open an issue first](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#adding-new-feature-or-making-overarching-changes))
+
+</details>
.github/banner.svg (vendored, 10 changed lines)
File diff suppressed because one or more lines are too long. Size before: 24 KiB; after: 15 KiB.
.github/workflows/build.yml (vendored, 839 changed lines)
|
@ -1,393 +1,574 @@
|
|||
name: Build
|
||||
on: workflow_dispatch
|
||||
name: Build Artifacts
|
||||
on:
|
||||
workflow_call:
|
||||
inputs:
|
||||
version:
|
||||
required: true
|
||||
type: string
|
||||
channel:
|
||||
required: false
|
||||
default: stable
|
||||
type: string
|
||||
unix:
|
||||
default: true
|
||||
type: boolean
|
||||
linux_static:
|
||||
default: true
|
||||
type: boolean
|
||||
linux_arm:
|
||||
default: true
|
||||
type: boolean
|
||||
macos:
|
||||
default: true
|
||||
type: boolean
|
||||
macos_legacy:
|
||||
default: true
|
||||
type: boolean
|
||||
windows:
|
||||
default: true
|
||||
type: boolean
|
||||
windows32:
|
||||
default: true
|
||||
type: boolean
|
||||
origin:
|
||||
required: false
|
||||
default: ''
|
||||
type: string
|
||||
secrets:
|
||||
GPG_SIGNING_KEY:
|
||||
required: false
|
||||
|
||||
workflow_dispatch:
|
||||
inputs:
|
||||
version:
|
||||
description: |
|
||||
VERSION: yyyy.mm.dd[.rev] or rev
|
||||
required: true
|
||||
type: string
|
||||
channel:
|
||||
description: |
|
||||
SOURCE of this build's updates: stable/nightly/master/<repo>
|
||||
required: true
|
||||
default: stable
|
||||
type: string
|
||||
unix:
|
||||
description: yt-dlp, yt-dlp.tar.gz
|
||||
default: true
|
||||
type: boolean
|
||||
linux_static:
|
||||
description: yt-dlp_linux
|
||||
default: true
|
||||
type: boolean
|
||||
linux_arm:
|
||||
description: yt-dlp_linux_aarch64, yt-dlp_linux_armv7l
|
||||
default: true
|
||||
type: boolean
|
||||
macos:
|
||||
description: yt-dlp_macos, yt-dlp_macos.zip
|
||||
default: true
|
||||
type: boolean
|
||||
macos_legacy:
|
||||
description: yt-dlp_macos_legacy
|
||||
default: true
|
||||
type: boolean
|
||||
windows:
|
||||
description: yt-dlp.exe, yt-dlp_win.zip
|
||||
default: true
|
||||
type: boolean
|
||||
windows32:
|
||||
description: yt-dlp_x86.exe
|
||||
default: true
|
||||
type: boolean
|
||||
origin:
|
||||
description: Origin
|
||||
required: false
|
||||
default: 'current repo'
|
||||
type: choice
|
||||
options:
|
||||
- 'current repo'
|
||||
|
||||
permissions:
|
||||
contents: read
|
||||
|
||||
jobs:
|
||||
prepare:
|
||||
permissions:
|
||||
contents: write # for push_release
|
||||
process:
|
||||
runs-on: ubuntu-latest
|
||||
outputs:
|
||||
version_suffix: ${{ steps.version_suffix.outputs.version_suffix }}
|
||||
ytdlp_version: ${{ steps.bump_version.outputs.ytdlp_version }}
|
||||
head_sha: ${{ steps.push_release.outputs.head_sha }}
|
||||
origin: ${{ steps.process_origin.outputs.origin }}
|
||||
steps:
|
||||
- uses: actions/checkout@v3
|
||||
with:
|
||||
fetch-depth: 0
|
||||
- uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: '3.10'
|
||||
- name: Process origin
|
||||
id: process_origin
|
||||
run: |
|
||||
echo "origin=${{ inputs.origin == 'current repo' && github.repository || inputs.origin }}" | tee "$GITHUB_OUTPUT"
|
||||
|
||||
- name: Set version suffix
|
||||
id: version_suffix
|
||||
env:
|
||||
PUSH_VERSION_COMMIT: ${{ secrets.PUSH_VERSION_COMMIT }}
|
||||
if: "env.PUSH_VERSION_COMMIT == ''"
|
||||
run: echo "version_suffix=$(date -u +"%H%M%S")" >> "$GITHUB_OUTPUT"
|
||||
- name: Bump version
|
||||
id: bump_version
|
||||
run: |
|
||||
python devscripts/update-version.py ${{ steps.version_suffix.outputs.version_suffix }}
|
||||
make issuetemplates
|
||||
|
||||
- name: Push to release
|
||||
id: push_release
|
||||
run: |
|
||||
git config --global user.name github-actions
|
||||
git config --global user.email github-actions@example.com
|
||||
git add -u
|
||||
git commit -m "[version] update" -m "Created by: ${{ github.event.sender.login }}" -m ":ci skip all :ci run dl"
|
||||
git push origin --force ${{ github.event.ref }}:release
|
||||
echo "head_sha=$(git rev-parse HEAD)" >> "$GITHUB_OUTPUT"
|
||||
- name: Update master
|
||||
env:
|
||||
PUSH_VERSION_COMMIT: ${{ secrets.PUSH_VERSION_COMMIT }}
|
||||
if: "env.PUSH_VERSION_COMMIT != ''"
|
||||
run: git push origin ${{ github.event.ref }}
|
||||
|
||||
|
||||
build_unix:
|
||||
needs: prepare
|
||||
unix:
|
||||
needs: process
|
||||
if: inputs.unix
|
||||
runs-on: ubuntu-latest
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v3
|
||||
- uses: actions/setup-python@v4
|
||||
with:
|
||||
python-version: '3.10'
|
||||
- uses: conda-incubator/setup-miniconda@v2
|
||||
with:
|
||||
miniforge-variant: Mambaforge
|
||||
use-mamba: true
|
||||
channels: conda-forge
|
||||
auto-update-conda: true
|
||||
activate-environment: ''
|
||||
auto-activate-base: false
|
||||
- name: Install Requirements
|
||||
run: |
|
||||
sudo apt-get -y install zip pandoc man sed
|
||||
python -m pip install -U pip setuptools wheel twine
|
||||
python -m pip install -U Pyinstaller -r requirements.txt
|
||||
reqs=$(mktemp)
|
||||
echo -e 'python=3.10.*\npyinstaller' >$reqs
|
||||
sed 's/^brotli.*/brotli-python/' <requirements.txt >>$reqs
|
||||
mamba create -n build --file $reqs
|
||||
|
||||
- name: Prepare
|
||||
run: |
|
||||
python devscripts/update-version.py ${{ needs.prepare.outputs.version_suffix }}
|
||||
- uses: actions/checkout@v4
|
||||
with:
|
||||
fetch-depth: 0 # Needed for changelog
|
||||
- uses: actions/setup-python@v5
|
||||
with:
|
||||
python-version: "3.10"
|
||||
- name: Install Requirements
|
||||
run: |
|
||||
sudo apt -y install zip pandoc man sed
|
||||
- name: Prepare
|
||||
run: |
|
||||
python devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
|
||||
python devscripts/update_changelog.py -vv
|
||||
python devscripts/make_lazy_extractors.py
|
||||
- name: Build Unix platform-independent binary
|
||||
run: |
|
||||
- name: Build Unix platform-independent binary
|
||||
run: |
|
||||
make all tar
|
||||
- name: Build Unix standalone binary
|
||||
shell: bash -l {0}
|
||||
run: |
|
||||
unset LD_LIBRARY_PATH # Harmful; set by setup-python
|
||||
conda activate build
|
||||
python pyinst.py --onedir
|
||||
(cd ./dist/yt-dlp_linux && zip -r ../yt-dlp_linux.zip .)
|
||||
python pyinst.py
|
||||
- name: Verify --update-to
|
||||
if: vars.UPDATE_TO_VERIFICATION
|
||||
run: |
|
||||
chmod +x ./yt-dlp
|
||||
cp ./yt-dlp ./yt-dlp_downgraded
|
||||
version="$(./yt-dlp --version)"
|
||||
./yt-dlp_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
|
||||
downgraded_version="$(./yt-dlp_downgraded --version)"
|
||||
[[ "$version" != "$downgraded_version" ]]
|
||||
- name: Upload artifacts
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: build-bin-${{ github.job }}
|
||||
path: |
|
||||
yt-dlp
|
||||
yt-dlp.tar.gz
|
||||
compression-level: 0
|
||||
|
||||
- name: Upload artifacts
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
path: |
|
||||
yt-dlp
|
||||
yt-dlp.tar.gz
|
||||
dist/yt-dlp_linux
|
||||
dist/yt-dlp_linux.zip
|
||||
|
||||
- name: Build and publish on PyPi
|
||||
env:
|
||||
TWINE_USERNAME: __token__
|
||||
TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
|
||||
if: "env.TWINE_PASSWORD != ''"
|
||||
run: |
|
||||
rm -rf dist/*
|
||||
python devscripts/set-variant.py pip -M "You installed yt-dlp with pip or using the wheel from PyPi; Use that to update"
|
||||
python setup.py sdist bdist_wheel
|
||||
twine upload dist/*
|
||||
|
||||
- name: Install SSH private key for Homebrew
|
||||
env:
|
||||
BREW_TOKEN: ${{ secrets.BREW_TOKEN }}
|
||||
if: "env.BREW_TOKEN != ''"
|
||||
uses: yt-dlp/ssh-agent@v0.5.3
|
||||
with:
|
||||
ssh-private-key: ${{ env.BREW_TOKEN }}
|
||||
- name: Update Homebrew Formulae
|
||||
env:
|
||||
BREW_TOKEN: ${{ secrets.BREW_TOKEN }}
|
||||
if: "env.BREW_TOKEN != ''"
|
||||
run: |
|
||||
git clone git@github.com:yt-dlp/homebrew-taps taps/
|
||||
python devscripts/update-formulae.py taps/Formula/yt-dlp.rb "${{ needs.prepare.outputs.ytdlp_version }}"
|
||||
git -C taps/ config user.name github-actions
|
||||
git -C taps/ config user.email github-actions@example.com
|
||||
git -C taps/ commit -am 'yt-dlp: ${{ needs.prepare.outputs.ytdlp_version }}'
|
||||
git -C taps/ push
|
||||
|
||||
|
||||
build_linux_arm:
|
||||
permissions:
|
||||
packages: write # for Creating cache
|
||||
linux_static:
|
||||
needs: process
|
||||
if: inputs.linux_static
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
- name: Build static executable
|
||||
env:
|
||||
channel: ${{ inputs.channel }}
|
||||
origin: ${{ needs.process.outputs.origin }}
|
||||
version: ${{ inputs.version }}
|
||||
run: |
|
||||
mkdir ~/build
|
||||
cd bundle/docker
|
||||
docker compose up --build static
|
||||
sudo chown "${USER}:docker" ~/build/yt-dlp_linux
|
||||
- name: Verify --update-to
|
||||
if: vars.UPDATE_TO_VERIFICATION
|
||||
run: |
|
||||
chmod +x ~/build/yt-dlp_linux
|
||||
cp ~/build/yt-dlp_linux ~/build/yt-dlp_linux_downgraded
|
||||
version="$(~/build/yt-dlp_linux --version)"
|
||||
~/build/yt-dlp_linux_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
|
||||
downgraded_version="$(~/build/yt-dlp_linux_downgraded --version)"
|
||||
[[ "$version" != "$downgraded_version" ]]
|
||||
- name: Upload artifacts
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: build-bin-${{ github.job }}
|
||||
path: |
|
||||
~/build/yt-dlp_linux
|
||||
compression-level: 0
|
||||
|
||||
linux_arm:
|
||||
needs: process
|
||||
if: inputs.linux_arm
|
||||
permissions:
|
||||
contents: read
|
||||
packages: write # for creating cache
|
||||
runs-on: ubuntu-latest
|
||||
needs: prepare
|
||||
strategy:
|
||||
matrix:
|
||||
architecture:
|
||||
- armv7
|
||||
- aarch64
|
||||
- armv7
|
||||
- aarch64
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v3
|
||||
with:
|
||||
path: ./repo
|
||||
- name: Virtualized Install, Prepare & Build
|
||||
uses: yt-dlp/run-on-arch-action@v2
|
||||
with:
|
||||
githubToken: ${{ github.token }} # To cache image
|
||||
arch: ${{ matrix.architecture }}
|
||||
distro: ubuntu18.04 # Standalone executable should be built on minimum supported OS
|
||||
dockerRunArgs: --volume "${PWD}/repo:/repo"
|
||||
install: | # Installing Python 3.10 from the Deadsnakes repo raises errors
|
||||
apt update
|
||||
apt -y install zlib1g-dev python3.8 python3.8-dev python3.8-distutils python3-pip
|
||||
python3.8 -m pip install -U pip setuptools wheel
|
||||
# Cannot access requirements.txt from the repo directory at this stage
|
||||
python3.8 -m pip install -U Pyinstaller mutagen pycryptodomex websockets brotli certifi
|
||||
- uses: actions/checkout@v4
|
||||
with:
|
||||
path: ./repo
|
||||
- name: Virtualized Install, Prepare & Build
|
||||
uses: yt-dlp/run-on-arch-action@v2
|
||||
with:
|
||||
# Ref: https://github.com/uraimo/run-on-arch-action/issues/55
|
||||
env: |
|
||||
GITHUB_WORKFLOW: build
|
||||
githubToken: ${{ github.token }} # To cache image
|
||||
arch: ${{ matrix.architecture }}
|
||||
distro: ubuntu20.04 # Standalone executable should be built on minimum supported OS
|
||||
dockerRunArgs: --volume "${PWD}/repo:/repo"
|
||||
install: | # Installing Python 3.10 from the Deadsnakes repo raises errors
|
||||
apt update
|
||||
apt -y install zlib1g-dev libffi-dev python3.9 python3.9-dev python3.9-distutils python3-pip \
|
||||
python3-secretstorage # Cannot build cryptography wheel in virtual armv7 environment
|
||||
python3.9 -m pip install -U pip wheel 'setuptools>=71.0.2'
|
||||
# XXX: Keep this in sync with pyproject.toml (it can't be accessed at this stage) and exclude secretstorage
|
||||
python3.9 -m pip install -U Pyinstaller mutagen pycryptodomex brotli certifi cffi \
|
||||
'requests>=2.32.2,<3' 'urllib3>=1.26.17,<3' 'websockets>=13.0'
|
||||
|
||||
run: |
|
||||
cd repo
|
||||
python3.9 devscripts/install_deps.py -o --include build
|
||||
python3.9 devscripts/install_deps.py --include pyinstaller # Cached versions may be out of date
|
||||
python3.9 devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
|
||||
python3.9 devscripts/make_lazy_extractors.py
|
||||
python3.9 -m bundle.pyinstaller
|
||||
|
||||
if ${{ vars.UPDATE_TO_VERIFICATION && 'true' || 'false' }}; then
|
||||
arch="${{ (matrix.architecture == 'armv7' && 'armv7l') || matrix.architecture }}"
|
||||
chmod +x ./dist/yt-dlp_linux_${arch}
|
||||
cp ./dist/yt-dlp_linux_${arch} ./dist/yt-dlp_linux_${arch}_downgraded
|
||||
version="$(./dist/yt-dlp_linux_${arch} --version)"
|
||||
./dist/yt-dlp_linux_${arch}_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
|
||||
downgraded_version="$(./dist/yt-dlp_linux_${arch}_downgraded --version)"
|
||||
[[ "$version" != "$downgraded_version" ]]
|
||||
fi
|
||||
|
||||
- name: Upload artifacts
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: build-bin-linux_${{ matrix.architecture }}
|
||||
path: | # run-on-arch-action designates armv7l as armv7
|
||||
repo/dist/yt-dlp_linux_${{ (matrix.architecture == 'armv7' && 'armv7l') || matrix.architecture }}
|
||||
compression-level: 0
|
||||
|
||||
macos:
|
||||
needs: process
|
||||
if: inputs.macos
|
||||
permissions:
|
||||
contents: read
|
||||
actions: write # For cleaning up cache
|
||||
runs-on: macos-13
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
# NB: Building universal2 does not work with python from actions/setup-python
|
||||
|
||||
- name: Restore cached requirements
|
||||
id: restore-cache
|
||||
uses: actions/cache/restore@v4
|
||||
env:
|
||||
SEGMENT_DOWNLOAD_TIMEOUT_MINS: 1
|
||||
with:
|
||||
path: |
|
||||
~/yt-dlp-build-venv
|
||||
key: cache-reqs-${{ github.job }}
|
||||
|
||||
- name: Install Requirements
|
||||
run: |
|
||||
cd repo
|
||||
python3.8 -m pip install -U Pyinstaller -r requirements.txt # Cached version may be out of date
|
||||
python3.8 devscripts/update-version.py ${{ needs.prepare.outputs.version_suffix }}
|
||||
python3.8 devscripts/make_lazy_extractors.py
|
||||
python3.8 pyinst.py
|
||||
|
||||
- name: Upload artifacts
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
path: | # run-on-arch-action designates armv7l as armv7
|
||||
repo/dist/yt-dlp_linux_${{ (matrix.architecture == 'armv7' && 'armv7l') || matrix.architecture }}
|
||||
|
||||
|
||||
build_macos:
|
||||
runs-on: macos-11
|
||||
needs: prepare
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v3
|
||||
# NB: In order to create a universal2 application, the version of python3 in /usr/bin has to be used
|
||||
- name: Install Requirements
|
||||
run: |
|
||||
brew install coreutils
|
||||
/usr/bin/python3 -m pip install -U --user pip Pyinstaller -r requirements.txt
|
||||
python3 -m venv ~/yt-dlp-build-venv
|
||||
source ~/yt-dlp-build-venv/bin/activate
|
||||
python3 devscripts/install_deps.py -o --include build
|
||||
python3 devscripts/install_deps.py --print --include pyinstaller > requirements.txt
|
||||
# We need to ignore wheels otherwise we break universal2 builds
|
||||
python3 -m pip install -U --no-binary :all: -r requirements.txt
|
||||
# We need to fuse our own universal2 wheels for curl_cffi
|
||||
python3 -m pip install -U 'delocate==0.11.0'
|
||||
mkdir curl_cffi_whls curl_cffi_universal2
|
||||
python3 devscripts/install_deps.py --print -o --include curl-cffi > requirements.txt
|
||||
for platform in "macosx_11_0_arm64" "macosx_11_0_x86_64"; do
|
||||
python3 -m pip download \
|
||||
--only-binary=:all: \
|
||||
--platform "${platform}" \
|
||||
-d curl_cffi_whls \
|
||||
-r requirements.txt
|
||||
done
|
||||
( # Overwrite x86_64-only libs with fat/universal2 libs or else Pyinstaller will do the opposite
|
||||
# See https://github.com/yt-dlp/yt-dlp/pull/10069
|
||||
cd curl_cffi_whls
|
||||
mkdir -p curl_cffi/.dylibs
|
||||
python_libdir=$(python3 -c 'import sys; from pathlib import Path; print(Path(sys.path[1]).parent)')
|
||||
for dylib in lib{ssl,crypto}.3.dylib; do
|
||||
cp "${python_libdir}/${dylib}" "curl_cffi/.dylibs/${dylib}"
|
||||
for wheel in curl_cffi*macos*x86_64.whl; do
|
||||
zip "${wheel}" "curl_cffi/.dylibs/${dylib}"
|
||||
done
|
||||
done
|
||||
)
|
||||
python3 -m delocate.cmd.delocate_fuse curl_cffi_whls/curl_cffi*.whl -w curl_cffi_universal2
|
||||
python3 -m delocate.cmd.delocate_fuse curl_cffi_whls/cffi*.whl -w curl_cffi_universal2
|
||||
for wheel in curl_cffi_universal2/*cffi*.whl; do
|
||||
mv -n -- "${wheel}" "${wheel/x86_64/universal2}"
|
||||
done
|
||||
python3 -m pip install --force-reinstall -U curl_cffi_universal2/*cffi*.whl
|
||||
|
||||
- name: Prepare
|
||||
run: |
|
||||
/usr/bin/python3 devscripts/update-version.py ${{ needs.prepare.outputs.version_suffix }}
|
||||
/usr/bin/python3 devscripts/make_lazy_extractors.py
|
||||
- name: Build
|
||||
run: |
|
||||
/usr/bin/python3 pyinst.py --target-architecture universal2 --onedir
|
||||
- name: Prepare
|
||||
run: |
|
||||
python3 devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
|
||||
python3 devscripts/make_lazy_extractors.py
|
||||
- name: Build
|
||||
run: |
|
||||
source ~/yt-dlp-build-venv/bin/activate
|
||||
python3 -m bundle.pyinstaller --target-architecture universal2 --onedir
|
||||
(cd ./dist/yt-dlp_macos && zip -r ../yt-dlp_macos.zip .)
|
||||
/usr/bin/python3 pyinst.py --target-architecture universal2
|
||||
python3 -m bundle.pyinstaller --target-architecture universal2
|
||||
|
||||
- name: Upload artifacts
|
||||
uses: actions/upload-artifact@v3
|
||||
with:
|
||||
path: |
|
||||
dist/yt-dlp_macos
|
||||
dist/yt-dlp_macos.zip
|
||||
- name: Verify --update-to
|
||||
if: vars.UPDATE_TO_VERIFICATION
|
||||
run: |
|
||||
chmod +x ./dist/yt-dlp_macos
|
||||
cp ./dist/yt-dlp_macos ./dist/yt-dlp_macos_downgraded
|
||||
version="$(./dist/yt-dlp_macos --version)"
|
||||
./dist/yt-dlp_macos_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
|
||||
downgraded_version="$(./dist/yt-dlp_macos_downgraded --version)"
|
||||
[[ "$version" != "$downgraded_version" ]]
|
||||
|
||||
- name: Upload artifacts
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: build-bin-${{ github.job }}
|
||||
path: |
|
||||
dist/yt-dlp_macos
|
||||
dist/yt-dlp_macos.zip
|
||||
compression-level: 0
|
||||
|
||||
build_macos_legacy:
|
||||
runs-on: macos-latest
|
||||
needs: prepare
|
||||
- name: Cleanup cache
|
||||
if: steps.restore-cache.outputs.cache-hit == 'true'
|
||||
env:
|
||||
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
cache_key: cache-reqs-${{ github.job }}
|
||||
repository: ${{ github.repository }}
|
||||
branch: ${{ github.ref }}
|
||||
run: |
|
||||
gh extension install actions/gh-actions-cache
gh actions-cache delete "${cache_key}" -R "${repository}" -B "${branch}" --confirm

- name: Cache requirements
uses: actions/cache/save@v4
with:
path: |
~/yt-dlp-build-venv
key: cache-reqs-${{ github.job }}

macos_legacy:
needs: process
if: inputs.macos_legacy
runs-on: macos-13

steps:
- uses: actions/checkout@v3
- name: Install Python
# We need the official Python, because the GA ones only support newer macOS versions
env:
PYTHON_VERSION: 3.10.5
MACOSX_DEPLOYMENT_TARGET: 10.9 # Used up by the Python build tools
run: |
- uses: actions/checkout@v4
- name: Install Python
# We need the official Python, because the GA ones only support newer macOS versions
env:
PYTHON_VERSION: 3.10.5
MACOSX_DEPLOYMENT_TARGET: 10.9 # Used up by the Python build tools
run: |
# Hack to get the latest patch version. Uncomment if needed
#brew install python@3.10
#export PYTHON_VERSION=$( $(brew --prefix)/opt/python@3.10/bin/python3 --version | cut -d ' ' -f 2 )
curl https://www.python.org/ftp/python/${PYTHON_VERSION}/python-${PYTHON_VERSION}-macos11.pkg -o "python.pkg"
curl "https://www.python.org/ftp/python/${PYTHON_VERSION}/python-${PYTHON_VERSION}-macos11.pkg" -o "python.pkg"
sudo installer -pkg python.pkg -target /
python3 --version
- name: Install Requirements
run: |
- name: Install Requirements
run: |
brew install coreutils
python3 -m pip install -U --user pip Pyinstaller -r requirements.txt
python3 devscripts/install_deps.py --user -o --include build
python3 devscripts/install_deps.py --user --include pyinstaller

- name: Prepare
run: |
python3 devscripts/update-version.py ${{ needs.prepare.outputs.version_suffix }}
- name: Prepare
run: |
python3 devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
python3 devscripts/make_lazy_extractors.py
- name: Build
run: |
python3 pyinst.py
- name: Build
run: |
python3 -m bundle.pyinstaller
mv dist/yt-dlp_macos dist/yt-dlp_macos_legacy

- name: Upload artifacts
uses: actions/upload-artifact@v3
with:
path: |
dist/yt-dlp_macos_legacy
- name: Verify --update-to
if: vars.UPDATE_TO_VERIFICATION
run: |
chmod +x ./dist/yt-dlp_macos_legacy
cp ./dist/yt-dlp_macos_legacy ./dist/yt-dlp_macos_legacy_downgraded
version="$(./dist/yt-dlp_macos_legacy --version)"
./dist/yt-dlp_macos_legacy_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
downgraded_version="$(./dist/yt-dlp_macos_legacy_downgraded --version)"
[[ "$version" != "$downgraded_version" ]]

- name: Upload artifacts
uses: actions/upload-artifact@v4
with:
name: build-bin-${{ github.job }}
path: |
dist/yt-dlp_macos_legacy
compression-level: 0

build_windows:
windows:
needs: process
if: inputs.windows
runs-on: windows-latest
needs: prepare

steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with: # 3.8 is used for Win7 support
python-version: '3.8'
- name: Install Requirements
run: | # Custom pyinstaller built with https://github.com/yt-dlp/pyinstaller-builds
python -m pip install -U pip setuptools wheel py2exe
pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/x86_64/pyinstaller-5.3-py3-none-any.whl" -r requirements.txt
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Install Requirements
run: | # Custom pyinstaller built with https://github.com/yt-dlp/pyinstaller-builds
python devscripts/install_deps.py -o --include build
python devscripts/install_deps.py --include curl-cffi
python -m pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/x86_64/pyinstaller-6.10.0-py3-none-any.whl"

- name: Prepare
run: |
python devscripts/update-version.py ${{ needs.prepare.outputs.version_suffix }}
- name: Prepare
run: |
python devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
python devscripts/make_lazy_extractors.py
- name: Build
run: |
python setup.py py2exe
Move-Item ./dist/yt-dlp.exe ./dist/yt-dlp_min.exe
python pyinst.py
python pyinst.py --onedir
- name: Build
run: |
python -m bundle.pyinstaller
python -m bundle.pyinstaller --onedir
Compress-Archive -Path ./dist/yt-dlp/* -DestinationPath ./dist/yt-dlp_win.zip

- name: Upload artifacts
uses: actions/upload-artifact@v3
with:
path: |
dist/yt-dlp.exe
dist/yt-dlp_min.exe
dist/yt-dlp_win.zip
- name: Verify --update-to
if: vars.UPDATE_TO_VERIFICATION
run: |
foreach ($name in @("yt-dlp")) {
Copy-Item "./dist/${name}.exe" "./dist/${name}_downgraded.exe"
$version = & "./dist/${name}.exe" --version
& "./dist/${name}_downgraded.exe" -v --update-to yt-dlp/yt-dlp@2023.03.04
$downgraded_version = & "./dist/${name}_downgraded.exe" --version
if ($version -eq $downgraded_version) {
exit 1
}
}
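# note: a false PowerShell comparison does not fail the step by itself,
# so `exit 1` is raised explicitly when the version is unchanged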

- name: Upload artifacts
uses: actions/upload-artifact@v4
with:
name: build-bin-${{ github.job }}
path: |
dist/yt-dlp.exe
dist/yt-dlp_win.zip
compression-level: 0

build_windows32:
windows32:
needs: process
if: inputs.windows32
runs-on: windows-latest
needs: prepare

steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with: # 3.7 is used for Vista support. See https://github.com/yt-dlp/yt-dlp/issues/390
python-version: '3.7'
architecture: 'x86'
- name: Install Requirements
run: |
python -m pip install -U pip setuptools wheel
pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/i686/pyinstaller-5.3-py3-none-any.whl" -r requirements.txt
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: "3.10"
architecture: "x86"
- name: Install Requirements
run: |
python devscripts/install_deps.py -o --include build
python devscripts/install_deps.py
python -m pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/i686/pyinstaller-6.10.0-py3-none-any.whl"

- name: Prepare
run: |
python devscripts/update-version.py ${{ needs.prepare.outputs.version_suffix }}
- name: Prepare
run: |
python devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
python devscripts/make_lazy_extractors.py
- name: Build
run: |
python pyinst.py
- name: Build
run: |
python -m bundle.pyinstaller

- name: Upload artifacts
uses: actions/upload-artifact@v3
with:
path: |
dist/yt-dlp_x86.exe
- name: Verify --update-to
if: vars.UPDATE_TO_VERIFICATION
run: |
foreach ($name in @("yt-dlp_x86")) {
Copy-Item "./dist/${name}.exe" "./dist/${name}_downgraded.exe"
$version = & "./dist/${name}.exe" --version
& "./dist/${name}_downgraded.exe" -v --update-to yt-dlp/yt-dlp@2023.03.04
$downgraded_version = & "./dist/${name}_downgraded.exe" --version
if ($version -eq $downgraded_version) {
exit 1
}
}

- name: Upload artifacts
uses: actions/upload-artifact@v4
with:
name: build-bin-${{ github.job }}
path: |
dist/yt-dlp_x86.exe
compression-level: 0

publish_release:
permissions:
contents: write # for action-gh-release
meta_files:
if: always() && !cancelled()
needs:
- process
- unix
- linux_static
- linux_arm
- macos
- macos_legacy
- windows
- windows32
runs-on: ubuntu-latest
needs: [prepare, build_unix, build_linux_arm, build_windows, build_windows32, build_macos, build_macos_legacy]

steps:
- uses: actions/checkout@v3
- uses: actions/download-artifact@v3
- uses: actions/download-artifact@v4
with:
path: artifact
pattern: build-bin-*
merge-multiple: true

- name: Get Changelog
run: |
changelog=$(grep -oPz '(?s)(?<=### ${{ needs.prepare.outputs.ytdlp_version }}\n{2}).+?(?=\n{2,3}###)' Changelog.md) || true
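# note: -z treats the file as a single NUL-terminated record so the (?s)
# lookbehind/lookahead can span lines; `|| true` keeps the step from failing
# when no matching changelog section exists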
echo "changelog<<EOF" >> $GITHUB_ENV
echo "$changelog" >> $GITHUB_ENV
echo "EOF" >> $GITHUB_ENV
- name: Make Update spec
run: |
echo "# This file is used for regulating self-update" >> _update_spec
echo "lock 2022.07.18 .+ Python 3.6" >> _update_spec
- name: Make SHA2-SUMS files
run: |
sha256sum artifact/yt-dlp | awk '{print $1 " yt-dlp"}' >> SHA2-256SUMS
sha256sum artifact/yt-dlp.tar.gz | awk '{print $1 " yt-dlp.tar.gz"}' >> SHA2-256SUMS
sha256sum artifact/yt-dlp.exe | awk '{print $1 " yt-dlp.exe"}' >> SHA2-256SUMS
sha256sum artifact/yt-dlp_win.zip | awk '{print $1 " yt-dlp_win.zip"}' >> SHA2-256SUMS
sha256sum artifact/yt-dlp_min.exe | awk '{print $1 " yt-dlp_min.exe"}' >> SHA2-256SUMS
sha256sum artifact/yt-dlp_x86.exe | awk '{print $1 " yt-dlp_x86.exe"}' >> SHA2-256SUMS
sha256sum artifact/yt-dlp_macos | awk '{print $1 " yt-dlp_macos"}' >> SHA2-256SUMS
sha256sum artifact/yt-dlp_macos.zip | awk '{print $1 " yt-dlp_macos.zip"}' >> SHA2-256SUMS
sha256sum artifact/yt-dlp_macos_legacy | awk '{print $1 " yt-dlp_macos_legacy"}' >> SHA2-256SUMS
sha256sum artifact/yt-dlp_linux_armv7l | awk '{print $1 " yt-dlp_linux_armv7l"}' >> SHA2-256SUMS
sha256sum artifact/yt-dlp_linux_aarch64 | awk '{print $1 " yt-dlp_linux_aarch64"}' >> SHA2-256SUMS
sha256sum artifact/dist/yt-dlp_linux | awk '{print $1 " yt-dlp_linux"}' >> SHA2-256SUMS
sha256sum artifact/dist/yt-dlp_linux.zip | awk '{print $1 " yt-dlp_linux.zip"}' >> SHA2-256SUMS
sha512sum artifact/yt-dlp | awk '{print $1 " yt-dlp"}' >> SHA2-512SUMS
sha512sum artifact/yt-dlp.tar.gz | awk '{print $1 " yt-dlp.tar.gz"}' >> SHA2-512SUMS
sha512sum artifact/yt-dlp.exe | awk '{print $1 " yt-dlp.exe"}' >> SHA2-512SUMS
sha512sum artifact/yt-dlp_win.zip | awk '{print $1 " yt-dlp_win.zip"}' >> SHA2-512SUMS
sha512sum artifact/yt-dlp_min.exe | awk '{print $1 " yt-dlp_min.exe"}' >> SHA2-512SUMS
sha512sum artifact/yt-dlp_x86.exe | awk '{print $1 " yt-dlp_x86.exe"}' >> SHA2-512SUMS
sha512sum artifact/yt-dlp_macos | awk '{print $1 " yt-dlp_macos"}' >> SHA2-512SUMS
sha512sum artifact/yt-dlp_macos.zip | awk '{print $1 " yt-dlp_macos.zip"}' >> SHA2-512SUMS
sha512sum artifact/yt-dlp_macos_legacy | awk '{print $1 " yt-dlp_macos_legacy"}' >> SHA2-512SUMS
sha512sum artifact/yt-dlp_linux_armv7l | awk '{print $1 " yt-dlp_linux_armv7l"}' >> SHA2-512SUMS
sha512sum artifact/yt-dlp_linux_aarch64 | awk '{print $1 " yt-dlp_linux_aarch64"}' >> SHA2-512SUMS
sha512sum artifact/dist/yt-dlp_linux | awk '{print $1 " yt-dlp_linux"}' >> SHA2-512SUMS
sha512sum artifact/dist/yt-dlp_linux.zip | awk '{print $1 " yt-dlp_linux.zip"}' >> SHA2-512SUMS
- name: Make SHA2-SUMS files
run: |
cd ./artifact/
# make sure SHA sums are also printed to stdout
sha256sum -- * | tee ../SHA2-256SUMS
sha512sum -- * | tee ../SHA2-512SUMS
# also print as permanent annotations to the summary page
while read -r shasum; do
echo "::notice title=${shasum##* }::sha256: ${shasum% *}"
done < ../SHA2-256SUMS

- name: Publish Release
uses: yt-dlp/action-gh-release@v1
with:
tag_name: ${{ needs.prepare.outputs.ytdlp_version }}
name: yt-dlp ${{ needs.prepare.outputs.ytdlp_version }}
target_commitish: ${{ needs.prepare.outputs.head_sha }}
body: |
#### [A description of the various files]((https://github.com/yt-dlp/yt-dlp#release-files)) are in the README
- name: Make Update spec
run: |
cat >> _update_spec << EOF
# This file is used for regulating self-update
lock 2022.08.18.36 .+ Python 3\.6
lock 2023.11.16 (?!win_x86_exe).+ Python 3\.7
lock 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
lock 2024.10.22 py2exe .+
lock 2024.10.22 linux_(?:armv7l|aarch64)_exe .+-glibc2\.(?:[12]?\d|30)\b
lock 2024.10.22 (?!\w+_exe).+ Python 3\.8
lock 2024.10.22 win(?:_x86)?_exe Python 3\.[78].+ Windows-(?:7-|2008ServerR2)
lockV2 yt-dlp/yt-dlp 2022.08.18.36 .+ Python 3\.6
lockV2 yt-dlp/yt-dlp 2023.11.16 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 yt-dlp/yt-dlp 2024.10.22 py2exe .+
lockV2 yt-dlp/yt-dlp 2024.10.22 linux_(?:armv7l|aarch64)_exe .+-glibc2\.(?:[12]?\d|30)\b
lockV2 yt-dlp/yt-dlp 2024.10.22 (?!\w+_exe).+ Python 3\.8
lockV2 yt-dlp/yt-dlp 2024.10.22 win(?:_x86)?_exe Python 3\.[78].+ Windows-(?:7-|2008ServerR2)
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 yt-dlp/yt-dlp-nightly-builds 2024.10.22.051025 py2exe .+
lockV2 yt-dlp/yt-dlp-nightly-builds 2024.10.22.051025 linux_(?:armv7l|aarch64)_exe .+-glibc2\.(?:[12]?\d|30)\b
lockV2 yt-dlp/yt-dlp-nightly-builds 2024.10.22.051025 (?!\w+_exe).+ Python 3\.8
lockV2 yt-dlp/yt-dlp-nightly-builds 2024.10.22.051025 win(?:_x86)?_exe Python 3\.[78].+ Windows-(?:7-|2008ServerR2)
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 yt-dlp/yt-dlp-master-builds 2024.10.22.045052 py2exe .+
lockV2 yt-dlp/yt-dlp-master-builds 2024.10.22.060347 linux_(?:armv7l|aarch64)_exe .+-glibc2\.(?:[12]?\d|30)\b
lockV2 yt-dlp/yt-dlp-master-builds 2024.10.22.060347 (?!\w+_exe).+ Python 3\.8
lockV2 yt-dlp/yt-dlp-master-builds 2024.10.22.060347 win(?:_x86)?_exe Python 3\.[78].+ Windows-(?:7-|2008ServerR2)
EOF

---
<details open><summary><h3>Changelog</summary>
<p>
- name: Sign checksum files
env:
GPG_SIGNING_KEY: ${{ secrets.GPG_SIGNING_KEY }}
if: env.GPG_SIGNING_KEY != ''
run: |
gpg --batch --import <<< "${{ secrets.GPG_SIGNING_KEY }}"
for signfile in ./SHA*SUMS; do
gpg --batch --detach-sign "$signfile"
done

${{ env.changelog }}

</p>
</details>
files: |
SHA2-256SUMS
SHA2-512SUMS
artifact/yt-dlp
artifact/yt-dlp.tar.gz
artifact/yt-dlp.exe
artifact/yt-dlp_win.zip
artifact/yt-dlp_min.exe
artifact/yt-dlp_x86.exe
artifact/yt-dlp_macos
artifact/yt-dlp_macos.zip
artifact/yt-dlp_macos_legacy
artifact/yt-dlp_linux_armv7l
artifact/yt-dlp_linux_aarch64
artifact/dist/yt-dlp_linux
artifact/dist/yt-dlp_linux.zip
_update_spec
- name: Upload artifacts
uses: actions/upload-artifact@v4
with:
name: build-${{ github.job }}
path: |
_update_spec
SHA*SUMS*
compression-level: 0
overwrite: true
65
.github/workflows/codeql.yml
vendored
Normal file
@@ -0,0 +1,65 @@
name: "CodeQL"

on:
push:
branches: [ 'master', 'gh-pages', 'release' ]
pull_request:
# The branches below must be a subset of the branches above
branches: [ 'master' ]
schedule:
- cron: '59 11 * * 5'

jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
permissions:
actions: read
contents: read
security-events: write

strategy:
fail-fast: false
matrix:
language: [ 'python' ]
# CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]
# Use only 'java' to analyze code written in Java, Kotlin or both
# Use only 'javascript' to analyze code written in JavaScript, TypeScript or both
# Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support

steps:
- name: Checkout repository
uses: actions/checkout@v4

# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v2
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.

# For more details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
# queries: security-extended,security-and-quality


# Autobuild attempts to build any compiled languages (C/C++, C#, Go, Java, or Swift).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v2

# ℹ️ Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun

# If the Autobuild fails above, remove it and uncomment the following three lines.
# modify them (or add more) to build your code if your project, please refer to the EXAMPLE below for guidance.

# - run: |
#   echo "Run, Build Application using script"
#   ./location_of_script_within_repo/buildscript.sh

- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v2
with:
category: "/language:${{matrix.language}}"
55
.github/workflows/core.yml
vendored
@@ -1,8 +1,32 @@
name: Core Tests
on: [push, pull_request]
on:
push:
paths:
- .github/**
- devscripts/**
- test/**
- yt_dlp/**.py
- '!yt_dlp/extractor/*.py'
- yt_dlp/extractor/__init__.py
- yt_dlp/extractor/common.py
- yt_dlp/extractor/extractors.py
pull_request:
paths:
- .github/**
- devscripts/**
- test/**
- yt_dlp/**.py
- '!yt_dlp/extractor/*.py'
- yt_dlp/extractor/__init__.py
- yt_dlp/extractor/common.py
- yt_dlp/extractor/extractors.py
permissions:
contents: read

concurrency:
group: core-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: ${{ github.event_name == 'pull_request' }}

jobs:
tests:
name: Core Tests

@@ -13,25 +37,30 @@ jobs:
matrix:
os: [ubuntu-latest]
# CPython 3.9 is in quick-test
python-version: ['3.7', '3.10', 3.11-dev, pypy-3.7, pypy-3.8]
run-tests-ext: [sh]
python-version: ['3.10', '3.11', '3.12', '3.13', pypy-3.10]
include:
# atleast one of each CPython/PyPy tests must be in windows
- os: windows-latest
python-version: '3.8'
run-tests-ext: bat
python-version: '3.9'
- os: windows-latest
python-version: pypy-3.9
run-tests-ext: bat
python-version: '3.10'
- os: windows-latest
python-version: '3.12'
- os: windows-latest
python-version: '3.13'
- os: windows-latest
python-version: pypy-3.10
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: Install pytest
run: pip install pytest
- name: Install test requirements
run: python3 ./devscripts/install_deps.py --include test --include curl-cffi
- name: Run tests
timeout-minutes: 15
continue-on-error: False
run: ./devscripts/run_tests.${{ matrix.run-tests-ext }} core
# Linter is in quick-test
run: |
python3 -m yt_dlp -v || true # Print debug head
python3 ./devscripts/run_tests.py --pytest-args '--reruns 2 --reruns-delay 3.0' core
27
.github/workflows/download.yml
vendored
@@ -9,16 +9,16 @@ jobs:
if: "contains(github.event.head_commit.message, 'ci run dl')"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: 3.9
- name: Install test requirements
run: pip install pytest
run: python3 ./devscripts/install_deps.py --include dev
- name: Run tests
continue-on-error: true
run: ./devscripts/run_tests.sh download
run: python3 ./devscripts/run_tests.py download

full:
name: Full Download Tests

@@ -28,24 +28,21 @@ jobs:
fail-fast: true
matrix:
os: [ubuntu-latest]
python-version: ['3.7', '3.10', 3.11-dev, pypy-3.7, pypy-3.8]
run-tests-ext: [sh]
python-version: ['3.10', '3.11', '3.12', '3.13', pypy-3.10]
include:
# atleast one of each CPython/PyPy tests must be in windows
- os: windows-latest
python-version: '3.8'
run-tests-ext: bat
python-version: '3.9'
- os: windows-latest
python-version: pypy-3.9
run-tests-ext: bat
python-version: pypy-3.10
steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
- name: Install pytest
run: pip install pytest
- name: Install test requirements
run: python3 ./devscripts/install_deps.py --include dev
- name: Run tests
continue-on-error: true
run: ./devscripts/run_tests.${{ matrix.run-tests-ext }} download
run: python3 ./devscripts/run_tests.py download
21
.github/workflows/issue-lockdown.yml
vendored
Normal file
@@ -0,0 +1,21 @@
name: Issue Lockdown
on:
issues:
types: [opened]

permissions:
issues: write

jobs:
lockdown:
name: Issue Lockdown
if: vars.ISSUE_LOCKDOWN
runs-on: ubuntu-latest
steps:
- name: "Lock new issue"
env:
GH_TOKEN: ${{ github.token }}
ISSUE_NUMBER: ${{ github.event.issue.number }}
REPOSITORY: ${{ github.repository }}
run: |
gh issue lock "${ISSUE_NUMBER}" -R "${REPOSITORY}"
38
.github/workflows/quick-test.yml
vendored
@@ -9,28 +9,32 @@ jobs:
if: "!contains(github.event.head_commit.message, 'ci skip all')"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
- uses: actions/checkout@v4
- name: Set up Python 3.9
uses: actions/setup-python@v5
with:
python-version: 3.9
python-version: '3.9'
- name: Install test requirements
run: pip install pytest pycryptodomex
run: python3 ./devscripts/install_deps.py -o --include test
- name: Run tests
run: ./devscripts/run_tests.sh core
flake8:
name: Linter
timeout-minutes: 15
run: |
python3 -m yt_dlp -v || true
python3 ./devscripts/run_tests.py --pytest-args '--reruns 2 --reruns-delay 3.0' core
check:
name: Code check
if: "!contains(github.event.head_commit.message, 'ci skip all')"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
- uses: actions/checkout@v4
- uses: actions/setup-python@v5
with:
python-version: 3.9
- name: Install flake8
run: pip install flake8
python-version: '3.9'
- name: Install dev dependencies
run: python3 ./devscripts/install_deps.py -o --include static-analysis
- name: Make lazy extractors
run: python devscripts/make_lazy_extractors.py
- name: Run flake8
run: flake8 .
run: python3 ./devscripts/make_lazy_extractors.py
- name: Run ruff
run: ruff check --output-format github .
- name: Run autopep8
run: autopep8 --diff .
30
.github/workflows/release-master.yml
vendored
Normal file
@@ -0,0 +1,30 @@
name: Release (master)
on:
push:
branches:
- master
paths:
- "yt_dlp/**.py"
- "!yt_dlp/version.py"
- "bundle/*.py"
- "pyproject.toml"
- "Makefile"
- ".github/workflows/build.yml"
concurrency:
group: release-master
permissions:
contents: read

jobs:
release:
if: vars.BUILD_MASTER != ''
uses: ./.github/workflows/release.yml
with:
prerelease: true
source: master
permissions:
contents: write
packages: write # For package cache
actions: write # For cleaning up cache
id-token: write # mandatory for trusted publishing
secrets: inherit
43
.github/workflows/release-nightly.yml
vendored
Normal file
@@ -0,0 +1,43 @@
name: Release (nightly)
on:
schedule:
- cron: '23 23 * * *'
permissions:
contents: read

jobs:
check_nightly:
if: vars.BUILD_NIGHTLY != ''
runs-on: ubuntu-latest
outputs:
commit: ${{ steps.check_for_new_commits.outputs.commit }}
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Check for new commits
id: check_for_new_commits
run: |
relevant_files=(
"yt_dlp/*.py"
':!yt_dlp/version.py'
"bundle/*.py"
"pyproject.toml"
"Makefile"
".github/workflows/build.yml"
)
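# note: `git log -1 --since="24 hours ago"` prints the newest qualifying
# commit hash, or nothing; an empty output makes the release job below skip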
echo "commit=$(git log --format=%H -1 --since="24 hours ago" -- "${relevant_files[@]}")" | tee "$GITHUB_OUTPUT"

release:
needs: [check_nightly]
if: ${{ needs.check_nightly.outputs.commit }}
uses: ./.github/workflows/release.yml
with:
prerelease: true
source: nightly
permissions:
contents: write
packages: write # For package cache
actions: write # For cleaning up cache
id-token: write # mandatory for trusted publishing
secrets: inherit
384
.github/workflows/release.yml
vendored
Normal file
@@ -0,0 +1,384 @@
name: Release
on:
workflow_call:
inputs:
prerelease:
required: false
default: true
type: boolean
source:
required: false
default: ''
type: string
target:
required: false
default: ''
type: string
version:
required: false
default: ''
type: string
workflow_dispatch:
inputs:
source:
description: |
SOURCE of this release's updates:
channel, repo, tag, or channel/repo@tag
(default: <current_repo>)
required: false
default: ''
type: string
target:
description: |
TARGET to publish this release to:
channel, tag, or channel@tag
(default: <source> if writable else <current_repo>[@source_tag])
required: false
default: ''
type: string
version:
description: |
VERSION: yyyy.mm.dd[.rev] or rev
(default: auto-generated)
required: false
default: ''
type: string
prerelease:
description: Pre-release
default: false
type: boolean

permissions:
contents: read

jobs:
prepare:
permissions:
contents: write
runs-on: ubuntu-latest
outputs:
channel: ${{ steps.setup_variables.outputs.channel }}
version: ${{ steps.setup_variables.outputs.version }}
target_repo: ${{ steps.setup_variables.outputs.target_repo }}
target_repo_token: ${{ steps.setup_variables.outputs.target_repo_token }}
target_tag: ${{ steps.setup_variables.outputs.target_tag }}
pypi_project: ${{ steps.setup_variables.outputs.pypi_project }}
pypi_suffix: ${{ steps.setup_variables.outputs.pypi_suffix }}
head_sha: ${{ steps.get_target.outputs.head_sha }}

steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0

- uses: actions/setup-python@v5
with:
python-version: "3.10"

- name: Process inputs
id: process_inputs
run: |
cat << EOF
::group::Inputs
prerelease=${{ inputs.prerelease }}
source=${{ inputs.source }}
target=${{ inputs.target }}
version=${{ inputs.version }}
::endgroup::
EOF
IFS='@' read -r source_repo source_tag <<<"${{ inputs.source }}"
IFS='@' read -r target_repo target_tag <<<"${{ inputs.target }}"
cat << EOF >> "$GITHUB_OUTPUT"
source_repo=${source_repo}
source_tag=${source_tag}
target_repo=${target_repo}
target_tag=${target_tag}
EOF

- name: Setup variables
id: setup_variables
env:
source_repo: ${{ steps.process_inputs.outputs.source_repo }}
source_tag: ${{ steps.process_inputs.outputs.source_tag }}
target_repo: ${{ steps.process_inputs.outputs.target_repo }}
target_tag: ${{ steps.process_inputs.outputs.target_tag }}
run: |
# unholy bash monstrosity (sincere apologies)
fallback_token () {
if ${{ !secrets.ARCHIVE_REPO_TOKEN }}; then
echo "::error::Repository access secret ${target_repo_token^^} not found"
exit 1
fi
target_repo_token=ARCHIVE_REPO_TOKEN
return 0
}
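# note: fallback_token() substitutes the shared ARCHIVE_REPO_TOKEN secret
# when no repo-specific token secret exists, and errors out if that is also unset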

source_is_channel=0
[[ "${source_repo}" == 'stable' ]] && source_repo='yt-dlp/yt-dlp'
if [[ -z "${source_repo}" ]]; then
source_repo='${{ github.repository }}'
elif [[ '${{ vars[format('{0}_archive_repo', env.source_repo)] }}' ]]; then
source_is_channel=1
source_channel='${{ vars[format('{0}_archive_repo', env.source_repo)] }}'
elif [[ -z "${source_tag}" && "${source_repo}" != */* ]]; then
source_tag="${source_repo}"
source_repo='${{ github.repository }}'
fi
resolved_source="${source_repo}"
if [[ "${source_tag}" ]]; then
resolved_source="${resolved_source}@${source_tag}"
elif [[ "${source_repo}" == 'yt-dlp/yt-dlp' ]]; then
resolved_source='stable'
fi

revision="${{ (inputs.prerelease || !vars.PUSH_VERSION_COMMIT) && '$(date -u +"%H%M%S")' || '' }}"
version="$(
python devscripts/update-version.py \
-c "${resolved_source}" -r "${{ github.repository }}" ${{ inputs.version || '$revision' }} | \
grep -Po "version=\K\d+\.\d+\.\d+(\.\d+)?")"

if [[ "${target_repo}" ]]; then
if [[ -z "${target_tag}" ]]; then
if [[ '${{ vars[format('{0}_archive_repo', env.target_repo)] }}' ]]; then
target_tag="${source_tag:-${version}}"
else
target_tag="${target_repo}"
target_repo='${{ github.repository }}'
fi
fi
if [[ "${target_repo}" != '${{ github.repository}}' ]]; then
target_repo='${{ vars[format('{0}_archive_repo', env.target_repo)] }}'
target_repo_token='${{ env.target_repo }}_archive_repo_token'
${{ !!secrets[format('{0}_archive_repo_token', env.target_repo)] }} || fallback_token
pypi_project='${{ vars[format('{0}_pypi_project', env.target_repo)] }}'
pypi_suffix='${{ vars[format('{0}_pypi_suffix', env.target_repo)] }}'
fi
else
target_tag="${source_tag:-${version}}"
if ((source_is_channel)); then
target_repo="${source_channel}"
target_repo_token='${{ env.source_repo }}_archive_repo_token'
${{ !!secrets[format('{0}_archive_repo_token', env.source_repo)] }} || fallback_token
pypi_project='${{ vars[format('{0}_pypi_project', env.source_repo)] }}'
pypi_suffix='${{ vars[format('{0}_pypi_suffix', env.source_repo)] }}'
else
target_repo='${{ github.repository }}'
fi
fi

if [[ "${target_repo}" == '${{ github.repository }}' ]] && ${{ !inputs.prerelease }}; then
pypi_project='${{ vars.PYPI_PROJECT }}'
fi

echo "::group::Output variables"
cat << EOF | tee -a "$GITHUB_OUTPUT"
channel=${resolved_source}
version=${version}
target_repo=${target_repo}
target_repo_token=${target_repo_token}
target_tag=${target_tag}
pypi_project=${pypi_project}
pypi_suffix=${pypi_suffix}
EOF
echo "::endgroup::"

- name: Update documentation
env:
version: ${{ steps.setup_variables.outputs.version }}
target_repo: ${{ steps.setup_variables.outputs.target_repo }}
if: |
!inputs.prerelease && env.target_repo == github.repository
run: |
python devscripts/update_changelog.py -vv
make doc

- name: Push to release
id: push_release
env:
version: ${{ steps.setup_variables.outputs.version }}
target_repo: ${{ steps.setup_variables.outputs.target_repo }}
if: |
!inputs.prerelease && env.target_repo == github.repository
run: |
git config --global user.name "github-actions[bot]"
git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
git add -u
git commit -m "Release ${{ env.version }}" \
-m "Created by: ${{ github.event.sender.login }}" -m ":ci skip all"
git push origin --force ${{ github.event.ref }}:release

- name: Get target commitish
id: get_target
run: |
echo "head_sha=$(git rev-parse HEAD)" >> "$GITHUB_OUTPUT"

- name: Update master
env:
target_repo: ${{ steps.setup_variables.outputs.target_repo }}
if: |
vars.PUSH_VERSION_COMMIT != '' && !inputs.prerelease && env.target_repo == github.repository
run: git push origin ${{ github.event.ref }}

build:
needs: prepare
uses: ./.github/workflows/build.yml
with:
version: ${{ needs.prepare.outputs.version }}
channel: ${{ needs.prepare.outputs.channel }}
origin: ${{ needs.prepare.outputs.target_repo }}
permissions:
contents: read
packages: write # For package cache
actions: write # For cleaning up cache
secrets:
GPG_SIGNING_KEY: ${{ secrets.GPG_SIGNING_KEY }}

publish_pypi:
needs: [prepare, build]
if: ${{ needs.prepare.outputs.pypi_project }}
runs-on: ubuntu-latest
permissions:
id-token: write # mandatory for trusted publishing

steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- uses: actions/setup-python@v5
with:
python-version: "3.10"

- name: Install Requirements
run: |
sudo apt -y install pandoc man
python devscripts/install_deps.py -o --include build

- name: Prepare
env:
version: ${{ needs.prepare.outputs.version }}
suffix: ${{ needs.prepare.outputs.pypi_suffix }}
channel: ${{ needs.prepare.outputs.channel }}
target_repo: ${{ needs.prepare.outputs.target_repo }}
pypi_project: ${{ needs.prepare.outputs.pypi_project }}
run: |
python devscripts/update-version.py -c "${{ env.channel }}" -r "${{ env.target_repo }}" -s "${{ env.suffix }}" "${{ env.version }}"
python devscripts/update_changelog.py -vv
python devscripts/make_lazy_extractors.py
sed -i -E '0,/(name = ")[^"]+(")/s//\1${{ env.pypi_project }}\2/' pyproject.toml
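# note: the `0,/regex/` address makes sed rewrite only the first
# `name = "..."` assignment in pyproject.toml to the target PyPI project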

- name: Build
run: |
rm -rf dist/*
make pypi-files
printf '%s\n\n' \
'Official repository: <https://github.com/yt-dlp/yt-dlp>' \
'**PS**: Some links in this document will not work since this is a copy of the README.md from Github' > ./README.md.new
cat ./README.md >> ./README.md.new && mv -f ./README.md.new ./README.md
python devscripts/set-variant.py pip -M "You installed yt-dlp with pip or using the wheel from PyPi; Use that to update"
make clean-cache
python -m build --no-isolation .

- name: Publish to PyPI
uses: pypa/gh-action-pypi-publish@release/v1
with:
verbose: true

publish:
needs: [prepare, build]
permissions:
contents: write
runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- uses: actions/download-artifact@v4
with:
path: artifact
pattern: build-*
merge-multiple: true
- uses: actions/setup-python@v5
with:
python-version: "3.10"

- name: Generate release notes
env:
head_sha: ${{ needs.prepare.outputs.head_sha }}
target_repo: ${{ needs.prepare.outputs.target_repo }}
target_tag: ${{ needs.prepare.outputs.target_tag }}
run: |
printf '%s' \
'[![Installation](https://img.shields.io/badge/-Which%20file%20to%20download%3F-white.svg?style=for-the-badge)]' \
'(https://github.com/${{ github.repository }}#installation "Installation instructions") ' \
'[![Discord](https://img.shields.io/discord/807245652072857610?color=blue&labelColor=555555&label=&logo=discord&style=for-the-badge)]' \
'(https://discord.gg/H5MNcFW63r "Discord") ' \
'[![Donate](https://img.shields.io/badge/_-Donate-red.svg?logo=githubsponsors&labelColor=555555&style=for-the-badge)]' \
'(https://github.com/yt-dlp/yt-dlp/blob/master/Collaborators.md#collaborators "Donate") ' \
'[![Documentation](https://img.shields.io/badge/-Docs-brightgreen.svg?style=for-the-badge&logo=GitBook&labelColor=555555)]' \
'(https://github.com/${{ github.repository }}' \
'${{ env.target_repo == github.repository && format('/tree/{0}', env.target_tag) || '' }}#readme "Documentation") ' \
${{ env.target_repo == 'yt-dlp/yt-dlp' && '\
"[![Nightly](https://img.shields.io/badge/Nightly%20builds-purple.svg?style=for-the-badge)]" \
"(https://github.com/yt-dlp/yt-dlp-nightly-builds/releases/latest \"Nightly builds\") " \
"[![Master](https://img.shields.io/badge/Master%20builds-lightblue.svg?style=for-the-badge)]" \
"(https://github.com/yt-dlp/yt-dlp-master-builds/releases/latest \"Master builds\")"' || '' }} > ./RELEASE_NOTES
printf '\n\n' >> ./RELEASE_NOTES
cat >> ./RELEASE_NOTES << EOF
#### A description of the various files is in the [README](https://github.com/${{ github.repository }}#release-files)
---
$(python ./devscripts/make_changelog.py -vv --collapsible)
EOF
printf '%s\n\n' '**This is a pre-release build**' >> ./PRERELEASE_NOTES
cat ./RELEASE_NOTES >> ./PRERELEASE_NOTES
printf '%s\n\n' 'Generated from: https://github.com/${{ github.repository }}/commit/${{ env.head_sha }}' >> ./ARCHIVE_NOTES
cat ./RELEASE_NOTES >> ./ARCHIVE_NOTES

- name: Publish to archive repo
env:
GH_TOKEN: ${{ secrets[needs.prepare.outputs.target_repo_token] }}
GH_REPO: ${{ needs.prepare.outputs.target_repo }}
version: ${{ needs.prepare.outputs.version }}
channel: ${{ needs.prepare.outputs.channel }}
if: |
inputs.prerelease && env.GH_TOKEN != '' && env.GH_REPO != '' && env.GH_REPO != github.repository
run: |
title="${{ startswith(env.GH_REPO, 'yt-dlp/') && 'yt-dlp ' || '' }}${{ env.channel }}"
gh release create \
--notes-file ARCHIVE_NOTES \
--title "${title} ${{ env.version }}" \
${{ env.version }} \
artifact/*

- name: Prune old release
env:
GH_TOKEN: ${{ github.token }}
version: ${{ needs.prepare.outputs.version }}
target_repo: ${{ needs.prepare.outputs.target_repo }}
target_tag: ${{ needs.prepare.outputs.target_tag }}
if: |
env.target_repo == github.repository && env.target_tag != env.version
run: |
gh release delete --yes --cleanup-tag "${{ env.target_tag }}" || true
git tag --delete "${{ env.target_tag }}" || true
sleep 5 # Enough time to cover deletion race condition

- name: Publish release
env:
GH_TOKEN: ${{ github.token }}
version: ${{ needs.prepare.outputs.version }}
target_repo: ${{ needs.prepare.outputs.target_repo }}
target_tag: ${{ needs.prepare.outputs.target_tag }}
head_sha: ${{ needs.prepare.outputs.head_sha }}
if: |
env.target_repo == github.repository
run: |
title="${{ github.repository == 'yt-dlp/yt-dlp' && 'yt-dlp ' || '' }}"
title+="${{ env.target_tag != env.version && format('{0} ', env.target_tag) || '' }}"
gh release create \
--notes-file ${{ inputs.prerelease && 'PRERELEASE_NOTES' || 'RELEASE_NOTES' }} \
--target ${{ env.head_sha }} \
--title "${title}${{ env.version }}" \
${{ inputs.prerelease && '--prerelease' || '' }} \
${{ env.target_tag }} \
artifact/*
17
.github/workflows/sanitize-comment.yml
vendored
Normal file
@@ -0,0 +1,17 @@
name: Sanitize comment

on:
issue_comment:
types: [created, edited]

permissions:
issues: write

jobs:
sanitize-comment:
name: Sanitize comment
if: vars.SANITIZE_COMMENT && !github.event.issue.pull_request
runs-on: ubuntu-latest
steps:
- name: Sanitize comment
uses: yt-dlp/sanitize-comment@v1
17
.gitignore
vendored
@@ -30,8 +30,10 @@ cookies
*.f4v
*.flac
*.flv
*.gif
*.jpeg
*.jpg
*.lrc
*.m4a
*.m4v
*.mhtml
@@ -39,6 +41,7 @@ cookies
*.mov
*.mp3
*.mp4
*.mpg
*.mpga
*.oga
*.ogg
@@ -46,8 +49,8 @@ cookies
*.png
*.sbv
*.srt
*.ssa
*.swf
*.swp
*.tt
*.ttml
*.url
@@ -63,7 +66,7 @@ cookies
# Python
*.pyc
*.pyo
.pytest_cache
.*_cache
wine-py2exe/
py2exe.log
build/
@@ -71,6 +74,7 @@ dist/
zip/
tmp/
venv/
.venv/
completions/

# Misc
@@ -114,14 +118,11 @@ yt-dlp.zip
.vscode
*.sublime-*
*.code-workspace
*.swp

# Lazy extractors
*/extractor/lazy_extractors.py

# Plugins
ytdlp_plugins/extractor/*
!ytdlp_plugins/extractor/__init__.py
!ytdlp_plugins/extractor/sample.py
ytdlp_plugins/postprocessor/*
!ytdlp_plugins/postprocessor/__init__.py
!ytdlp_plugins/postprocessor/sample.py
ytdlp_plugins/
yt-dlp-plugins
14
.pre-commit-config.yaml
Normal file
@@ -0,0 +1,14 @@
repos:
- repo: local
hooks:
- id: linter
name: Apply linter fixes
entry: ruff check --fix .
language: system
types: [python]
require_serial: true
- id: format
name: Apply formatting fixes
entry: autopep8 --in-place .
language: system
types: [python]
9
.pre-commit-hatch.yaml
Normal file
@@ -0,0 +1,9 @@
repos:
- repo: local
hooks:
- id: fix
name: Apply code fixes
entry: hatch fmt
language: system
types: [python]
require_serial: true
167
CONTRIBUTING.md
@@ -37,14 +37,18 @@ # OPENING AN ISSUE
**Please include the full output of yt-dlp when run with `-vU`**, i.e. **add** `-vU` flag to **your command line**, copy the **whole** output and post it in the issue body wrapped in \`\`\` for better formatting. It should look similar to this:
```
$ yt-dlp -vU <your command line>
[debug] Command-line config: ['-v', 'demo.com']
[debug] Encodings: locale UTF-8, fs utf-8, out utf-8, pref UTF-8
[debug] yt-dlp version 2021.09.25 (zip)
[debug] Python version 3.8.10 (CPython 64bit) - Linux-5.4.0-74-generic-x86_64-with-glibc2.29
[debug] exe versions: ffmpeg 4.2.4, ffprobe 4.2.4
[debug] Command-line config: ['-vU', 'https://www.example.com/']
[debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
[debug] yt-dlp version nightly@... from yt-dlp/yt-dlp-nightly-builds [1a176d874] (win_exe)
[debug] Python 3.10.11 (CPython AMD64 64bit) - Windows-10-10.0.20348-SP0 (OpenSSL 1.1.1t 7 Feb 2023)
[debug] exe versions: ffmpeg 7.0.2 (setts), ffprobe 7.0.2
[debug] Optional libraries: Cryptodome-3.21.0, brotli-1.1.0, certifi-2024.08.30, curl_cffi-0.5.10, mutagen-1.47.0, requests-2.32.3, sqlite3-3.40.1, urllib3-2.2.3, websockets-13.1
[debug] Proxy map: {}
Current Build Hash 25cc412d1d3c0725a1f2f5b7e4682f6fb40e6d15f7024e96f7afd572e9919535
yt-dlp is up to date (2021.09.25)
[debug] Request Handlers: urllib, requests, websockets, curl_cffi
[debug] Loaded 1838 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
Latest version: nightly@... from yt-dlp/yt-dlp-nightly-builds
yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
...
```
**Do not post screenshots of verbose logs; only plain text is acceptable.**

@@ -79,7 +83,7 @@ ### Are you using the latest version?

### Is the issue already documented?

Make sure that someone has not already opened the issue you're trying to open. Search at the top of the window or browse the [GitHub Issues](https://github.com/yt-dlp/yt-dlp/search?type=Issues) of this repository. If there is an issue, feel free to write something along the lines of "This affects me as well, with version 2021.01.01. Here is some more information on the issue: ...". While some issues may be old, a new post into them often spurs rapid activity.
Make sure that someone has not already opened the issue you're trying to open. Search at the top of the window or browse the [GitHub Issues](https://github.com/yt-dlp/yt-dlp/search?type=Issues) of this repository. If there is an issue, subscribe to it to be notified when there is any progress. Unless you have something useful to add to the conversation, please refrain from commenting.

Additionally, it is also helpful to see if the issue has already been documented in the [youtube-dl issue tracker](https://github.com/ytdl-org/youtube-dl/issues). If similar issues have already been reported in youtube-dl (but not in our issue tracker), links to them can be included in your issue report here.

@@ -127,34 +131,66 @@ ### Are you willing to share account details if needed?

### Is the website primarily used for piracy?

We follow [youtube-dl's policy](https://github.com/ytdl-org/youtube-dl#can-you-add-support-for-this-anime-video-site-or-site-which-shows-current-movies-for-free) to not support services that is primarily used for infringing copyright. Additionally, it has been decided to not to support porn sites that specialize in deep fake. We also cannot support any service that serves only [DRM protected content](https://en.wikipedia.org/wiki/Digital_rights_management).
We follow [youtube-dl's policy](https://github.com/ytdl-org/youtube-dl#can-you-add-support-for-this-anime-video-site-or-site-which-shows-current-movies-for-free) to not support services that is primarily used for infringing copyright. Additionally, it has been decided to not to support porn sites that specialize in fakes. We also cannot support any service that serves only [DRM protected content](https://en.wikipedia.org/wiki/Digital_rights_management).


# DEVELOPER INSTRUCTIONS

Most users do not need to build yt-dlp and can [download the builds](https://github.com/yt-dlp/yt-dlp/releases) or get them via [the other installation methods](README.md#installation).
Most users do not need to build yt-dlp and can [download the builds](https://github.com/yt-dlp/yt-dlp/releases), get them via [the other installation methods](README.md#installation) or directly run it using `python -m yt_dlp`.

To run yt-dlp as a developer, you don't need to build anything either. Simply execute
`yt-dlp` uses [`hatch`](<https://hatch.pypa.io>) as a project management tool.
You can easily install it using [`pipx`](<https://pipx.pypa.io>) via `pipx install hatch`, or else via `pip` or your package manager of choice. Make sure you are using at least version `1.10.0`, otherwise some functionality might not work as expected.

python -m yt_dlp
If you plan on contributing to `yt-dlp`, best practice is to start by running the following command:

To run the test, simply invoke your favorite test runner, or execute a test file directly; any of the following work:
```shell
$ hatch run setup
```

python -m unittest discover
python test/test_download.py
nosetests
pytest
The above command will install a `pre-commit` hook so that required checks/fixes (linting, formatting) will run automatically before each commit. If any code needs to be linted or formatted, then the commit will be blocked and the necessary changes will be made; you should review all edits and re-commit the fixed version.

After this you can use `hatch shell` to enable a virtual environment that has `yt-dlp` and its development dependencies installed.
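
For instance, a quick sanity check from inside the environment might look like this (a sketch; the `(yt-dlp)` prompt prefix is illustrative and depends on your shell):

```shell
$ hatch shell
(yt-dlp) $ python -m yt_dlp --version
```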

In addition, the following script commands can be used to run simple tasks such as linting or testing (without having to run `hatch shell` first):
* `hatch fmt`: Automatically fix linter violations and apply required code formatting changes
    * See `hatch fmt --help` for more info
* `hatch test`: Run extractor or core tests
    * See `hatch test --help` for more info

See item 6 of [new extractor tutorial](#adding-support-for-a-new-site) for how to run extractor specific test cases.

While it is strongly recommended to use `hatch` for yt-dlp development, if you are unable to do so, alternatively you can manually create a virtual environment and use the following commands:

```shell
# To only install development dependencies:
$ python -m devscripts.install_deps --include dev

# Or, for an editable install plus dev dependencies:
$ python -m pip install -e ".[default,dev]"

# To setup the pre-commit hook:
$ pre-commit install

# To be used in place of `hatch test`:
$ python -m devscripts.run_tests

# To be used in place of `hatch fmt`:
$ ruff check --fix .
$ autopep8 --in-place .

# To only check code instead of applying fixes:
$ ruff check .
$ autopep8 --diff .
```
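
For reference, creating and activating that virtual environment yourself might look like the following on a POSIX shell (a sketch; Windows users would run the `Scripts\activate` equivalent):

```shell
$ python3 -m venv .venv
$ source .venv/bin/activate
```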
|
||||
|
||||
If you want to create a build of yt-dlp yourself, you can follow the instructions [here](README.md#compile).
|
||||
|
||||
|
||||
## Adding new feature or making overarching changes
|
||||
|
||||
Before you start writing code for implementing a new feature, open an issue explaining your feature request and atleast one use case. This allows the maintainers to decide whether such a feature is desired for the project in the first place, and will provide an avenue to discuss some implementation details. If you open a pull request for a new feature without discussing with us first, do not be surprised when we ask for large changes to the code, or even reject it outright.
|
||||
Before you start writing code for implementing a new feature, open an issue explaining your feature request and at least one use case. This allows the maintainers to decide whether such a feature is desired for the project in the first place, and will provide an avenue to discuss some implementation details. If you open a pull request for a new feature without discussing with us first, do not be surprised when we ask for large changes to the code, or even reject it outright.
|
||||
|
||||
The same applies for changes to the documentation, code style, or overarching changes to the architecture
|
||||
|
||||
|
@ -168,12 +204,16 @@ ## Adding support for a new site
|
|||
1. [Fork this repository](https://github.com/yt-dlp/yt-dlp/fork)
|
||||
1. Check out the source code with:
|
||||
|
||||
git clone git@github.com:YOUR_GITHUB_USERNAME/yt-dlp.git
|
||||
```shell
|
||||
$ git clone git@github.com:YOUR_GITHUB_USERNAME/yt-dlp.git
|
||||
```
|
||||
|
||||
1. Start a new git branch with
|
||||
|
||||
cd yt-dlp
|
||||
git checkout -b yourextractor
|
||||
```shell
|
||||
$ cd yt-dlp
|
||||
$ git checkout -b yourextractor
|
||||
```
|
||||
|
||||
1. Start with this simple template and save it to `yt_dlp/extractor/yourextractor.py`:
|
||||
|
||||
|
@ -187,15 +227,21 @@ ## Adding support for a new site
|
|||
'url': 'https://yourextractor.com/watch/42',
|
||||
'md5': 'TODO: md5 sum of the first 10241 bytes of the video file (use --test)',
|
||||
'info_dict': {
|
||||
# For videos, only the 'id' and 'ext' fields are required to RUN the test:
|
||||
'id': '42',
|
||||
'ext': 'mp4',
|
||||
'title': 'Video title goes here',
|
||||
'thumbnail': r're:^https?://.*\.jpg$',
|
||||
# TODO more properties, either as:
|
||||
# * A value
|
||||
# * MD5 checksum; start the string with md5:
|
||||
# * A regular expression; start the string with re:
|
||||
# * Any Python type, e.g. int or float
|
||||
# Then if the test run fails, it will output the missing/incorrect fields.
|
||||
# Properties can be added as:
|
||||
# * A value, e.g.
|
||||
# 'title': 'Video title goes here',
|
||||
# * MD5 checksum; start the string with 'md5:', e.g.
|
||||
# 'description': 'md5:098f6bcd4621d373cade4e832627b4f6',
|
||||
# * A regular expression; start the string with 're:', e.g.
|
||||
# 'thumbnail': r're:https?://.*\.jpg$',
|
||||
# * A count of elements in a list; start the string with 'count:', e.g.
|
||||
# 'tags': 'count:10',
|
||||
# * Any Python type, e.g.
|
||||
# 'view_count': int,
|
||||
}
|
||||
}]
|
||||
|
||||
|
@ -214,27 +260,33 @@ ## Adding support for a new site
|
|||
# TODO more properties (see yt_dlp/extractor/common.py)
|
||||
}
|
||||
```
|
||||
1. Add an import in [`yt_dlp/extractor/_extractors.py`](yt_dlp/extractor/_extractors.py). Note that the class name must end with `IE`.
|
||||
1. Run `python test/test_download.py TestDownload.test_YourExtractor` (note that `YourExtractor` doesn't end with `IE`). This *should fail* at first, but you can continually re-run it until you're done. If you decide to add more than one test, the tests will then be named `TestDownload.test_YourExtractor`, `TestDownload.test_YourExtractor_1`, `TestDownload.test_YourExtractor_2`, etc. Note that tests with `only_matching` key in test's dict are not counted in. You can also run all the tests in one go with `TestDownload.test_YourExtractor_all`
|
||||
1. Make sure you have atleast one test for your extractor. Even if all videos covered by the extractor are expected to be inaccessible for automated testing, tests should still be added with a `skip` parameter indicating why the particular test is disabled from running.
|
||||
1. Have a look at [`yt_dlp/extractor/common.py`](yt_dlp/extractor/common.py) for possible helper methods and a [detailed description of what your extractor should and may return](yt_dlp/extractor/common.py#L91-L426). Add tests and code for as many as you want.
|
||||
1. Make sure your code follows [yt-dlp coding conventions](#yt-dlp-coding-conventions) and check the code with [flake8](https://flake8.pycqa.org/en/latest/index.html#quickstart):
|
||||
1. Add an import in [`yt_dlp/extractor/_extractors.py`](yt_dlp/extractor/_extractors.py). Note that the class name must end with `IE`. Also note that when adding a parenthesized import group, the last import in the group must have a trailing comma in order for this formatting to be respected by our code formatter.
|
||||
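    As an illustration, a parenthesized import group in `_extractors.py` would look like this (`YourExtractorPlaylistIE` is a hypothetical second class):
    ```python
    from .yourextractor import (
        YourExtractorIE,
        YourExtractorPlaylistIE,  # hypothetical; note the trailing comma on the last import
    )
    ```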
1. Run `hatch test YourExtractor`. This *may fail* at first, but you can continually re-run it until you're done. Upon failure, it will output the missing fields and/or correct values which you can copy. If you decide to add more than one test, the tests will then be named `YourExtractor`, `YourExtractor_1`, `YourExtractor_2`, etc. Note that tests with an `only_matching` key in the test's dict are not included in the count. You can also run all the tests in one go with `YourExtractor_all`
|
||||
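    For example, the test invocations described above would look like:
    ```shell
    $ hatch test YourExtractor       # first test case
    $ hatch test YourExtractor_1     # second test case, if defined
    $ hatch test YourExtractor_all   # all test cases for the extractor
    ```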
1. Make sure you have at least one test for your extractor. Even if all videos covered by the extractor are expected to be inaccessible for automated testing, tests should still be added with a `skip` parameter indicating why the particular test is disabled from running.
|
||||
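    A minimal sketch of such a skipped test (the URL and skip reason are illustrative):
    ```python
    _TESTS = [{
        'url': 'https://yourextractor.com/watch/42',
        'info_dict': {
            'id': '42',
            'ext': 'mp4',
            'title': 'Video title goes here',
        },
        'skip': 'Requires account credentials; not accessible to automated testing',
    }]
    ```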
1. Have a look at [`yt_dlp/extractor/common.py`](yt_dlp/extractor/common.py) for possible helper methods and a [detailed description of what your extractor should and may return](yt_dlp/extractor/common.py#L119-L440). Add tests and code for as many as you want.
|
||||
1. Make sure your code follows [yt-dlp coding conventions](#yt-dlp-coding-conventions), passes [ruff](https://docs.astral.sh/ruff/tutorial/#getting-started) code checks and is properly formatted:
|
||||
|
||||
$ flake8 yt_dlp/extractor/yourextractor.py
|
||||
```shell
|
||||
$ hatch fmt --check
|
||||
```
|
||||
|
||||
1. Make sure your code works under all [Python](https://www.python.org/) versions supported by yt-dlp, namely CPython and PyPy for Python 3.7 and above. Backward compatibility is not required for even older versions of Python.
|
||||
You can use `hatch fmt` to automatically fix problems. Rules that the linter/formatter enforces should not be disabled with `# noqa` unless a maintainer requests it. The only exception allowed is for old/printf-style string formatting in GraphQL query templates (use `# noqa: UP031`).
|
||||
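    A sketch of the one allowed exception (the GraphQL query and `video_id` are illustrative):
    ```python
    query = 'query { video(id: "%s") { title } }' % video_id  # noqa: UP031
    ```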
|
||||
1. Make sure your code works under all [Python](https://www.python.org/) versions supported by yt-dlp, namely CPython >=3.9 and PyPy >=3.10. Backward compatibility is not required for even older versions of Python.
|
||||
1. When the tests pass, [add](https://git-scm.com/docs/git-add) the new files, [commit](https://git-scm.com/docs/git-commit) them and [push](https://git-scm.com/docs/git-push) the result, like this:
|
||||
|
||||
$ git add yt_dlp/extractor/_extractors.py
|
||||
$ git add yt_dlp/extractor/yourextractor.py
|
||||
$ git commit -m '[yourextractor] Add extractor'
|
||||
$ git push origin yourextractor
|
||||
```shell
|
||||
$ git add yt_dlp/extractor/_extractors.py
|
||||
$ git add yt_dlp/extractor/yourextractor.py
|
||||
$ git commit -m '[yourextractor] Add extractor'
|
||||
$ git push origin yourextractor
|
||||
```
|
||||
|
||||
1. Finally, [create a pull request](https://help.github.com/articles/creating-a-pull-request). We'll then review and merge it.
|
||||
|
||||
In any case, thank you very much for your contributions!
|
||||
|
||||
**Tip:** To test extractors that require login information, create a file `test/local_parameters.json` and add `"usenetrc": true` or your username and password in it:
|
||||
**Tip:** To test extractors that require login information, create a file `test/local_parameters.json` and add `"usenetrc": true`, your `username` and `password`, or `cookiefile`/`cookiesfrombrowser` to it:
|
||||
```json
|
||||
{
|
||||
"username": "your user name",
|
||||
|
@ -246,22 +298,21 @@ ## yt-dlp coding conventions
|
|||
|
||||
This section introduces guidelines for writing idiomatic, robust and future-proof extractor code.
|
||||
|
||||
Extractors are very fragile by nature since they depend on the layout of the source data provided by 3rd party media hosters out of your control and this layout tends to change. As an extractor implementer your task is not only to write code that will extract media links and metadata correctly but also to minimize dependency on the source's layout and even to make the code foresee potential future changes and be ready for that. This is important because it will allow the extractor not to break on minor layout changes thus keeping old yt-dlp versions working. Even though this breakage issue may be easily fixed by a new version of yt-dlp, this could take some time, during which the the extractor will remain broken.
|
||||
Extractors are very fragile by nature since they depend on the layout of the source data provided by 3rd party media hosters out of your control and this layout tends to change. As an extractor implementer your task is not only to write code that will extract media links and metadata correctly but also to minimize dependency on the source's layout and even to make the code foresee potential future changes and be ready for that. This is important because it will allow the extractor not to break on minor layout changes thus keeping old yt-dlp versions working. Even though this breakage issue may be easily fixed by a new version of yt-dlp, this could take some time, during which the extractor will remain broken.
|
||||
|
||||
|
||||
### Mandatory and optional metafields
|
||||
|
||||
For extraction to work yt-dlp relies on metadata your extractor extracts and provides to yt-dlp expressed by an [information dictionary](yt_dlp/extractor/common.py#L91-L426) or simply *info dict*. Only the following meta fields in the *info dict* are considered mandatory for a successful extraction process by yt-dlp:
|
||||
For extraction to work yt-dlp relies on metadata your extractor extracts and provides to yt-dlp expressed by an [information dictionary](yt_dlp/extractor/common.py#L119-L440) or simply *info dict*. Only the following meta fields in the *info dict* are considered mandatory for a successful extraction process by yt-dlp:
|
||||
|
||||
- `id` (media identifier)
|
||||
- `title` (media title)
|
||||
- `url` (media download URL) or `formats`
|
||||
|
||||
The aforementioned metafields are the critical data that the extraction does not make any sense without and if any of them fail to be extracted then the extractor is considered completely broken. While all extractors must return a `title`, they must also allow it's extraction to be non-fatal.
|
||||
The aforementioned metadata fields are the critical data without which extraction does not make any sense. If any of them fail to be extracted, then the extractor is considered broken. All other metadata extraction should be completely non-fatal.
|
||||
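A minimal sketch of an info dict satisfying these requirements (values are illustrative):

```python
info_dict = {
    'id': '42',                                    # mandatory
    'title': 'Video title goes here',              # mandatory, but extract it non-fatally
    'url': 'https://yourextractor.com/video.mp4',  # or provide 'formats' instead
}
```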
|
||||
For pornographic sites, appropriate `age_limit` must also be returned.
|
||||
|
||||
The extractor is allowed to return the info dict without url or formats in some special cases if it allows the user to extract usefull information with `--ignore-no-formats-error` - e.g. when the video is a live stream that has not started yet.
|
||||
The extractor is allowed to return the info dict without url or formats in some special cases if it allows the user to extract useful information with `--ignore-no-formats-error` - e.g. when the video is a live stream that has not started yet.
|
||||
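A sketch of that special case (field values are illustrative; `live_status` and `release_timestamp` are standard info-dict fields):

```python
info_dict = {
    'id': '42',
    'title': 'Upcoming broadcast',
    'live_status': 'is_upcoming',
    'release_timestamp': 1735689600,  # hypothetical scheduled start time
    # No 'url' or 'formats': extraction fails unless --ignore-no-formats-error is used
}
```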
|
||||
[Any field](yt_dlp/extractor/common.py#L219-L426) apart from the aforementioned ones is considered **optional**. That means that extraction should be **tolerant** of situations when sources for these fields can potentially be unavailable (even if they are always available at the moment) and **future-proof** in order not to break the extraction of general-purpose mandatory fields.
|
||||
|
||||
|
@ -351,8 +402,9 @@ #### Example
|
|||
```python
|
||||
thumbnail_data = data.get('thumbnails') or []
|
||||
thumbnails = [{
|
||||
'url': item['url']
|
||||
} for item in thumbnail_data] # correct
|
||||
'url': item['url'],
|
||||
'height': item.get('h'),
|
||||
} for item in thumbnail_data if item.get('url')] # correct
|
||||
```
|
||||
|
||||
and not like:
|
||||
|
@ -360,12 +412,27 @@ #### Example
|
|||
```python
|
||||
thumbnail_data = data.get('thumbnails')
|
||||
thumbnails = [{
|
||||
'url': item['url']
|
||||
'url': item['url'],
|
||||
'height': item.get('h'),
|
||||
} for item in thumbnail_data] # incorrect
|
||||
```
|
||||
|
||||
In this case, `thumbnail_data` will be `None` if the field was not found and this will cause the loop `for item in thumbnail_data` to raise a fatal error. Using `or []` avoids this error and instead sets `thumbnails` to an empty list.
|
||||
|
||||
Alternatively, this can be further simplified by using `traverse_obj`:
|
||||
|
||||
```python
|
||||
thumbnails = [{
|
||||
'url': item['url'],
|
||||
'height': item.get('h'),
|
||||
} for item in traverse_obj(data, ('thumbnails', lambda _, v: v['url']))]
|
||||
```
|
||||
|
||||
or, even better,
|
||||
|
||||
```python
|
||||
thumbnails = traverse_obj(data, ('thumbnails', ..., {'url': 'url', 'height': 'h'}))
|
||||
```
|
||||
|
||||
### Provide fallbacks
|
||||
|
||||
|
@ -680,7 +747,7 @@ #### Examples
|
|||
|
||||
### Use convenience conversion and parsing functions
|
||||
|
||||
Wrap all extracted numeric data into safe functions from [`yt_dlp/utils.py`](yt_dlp/utils.py): `int_or_none`, `float_or_none`. Use them for string to number conversions as well.
|
||||
Wrap all extracted numeric data into safe functions from [`yt_dlp/utils/`](yt_dlp/utils/): `int_or_none`, `float_or_none`. Use them for string to number conversions as well.
|
||||
|
||||
Use `url_or_none` for safe URL processing.
|
||||
|
||||
|
@ -688,7 +755,7 @@ ### Use convenience conversion and parsing functions
|
|||
|
||||
Use `unified_strdate` for uniform `upload_date` or any `YYYYMMDD` meta field extraction, `unified_timestamp` for uniform `timestamp` extraction, `parse_filesize` for `filesize` extraction, `parse_count` for count meta fields extraction, `parse_resolution`, `parse_duration` for `duration` extraction, `parse_age_limit` for `age_limit` extraction.
|
||||
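A sketch of these helpers in action (input values are illustrative):

```python
from yt_dlp.utils import (
    int_or_none,
    parse_duration,
    unified_strdate,
    url_or_none,
)

view_count = int_or_none('12345')            # 12345; returns None on bad input
duration = parse_duration('1:05:30')         # 3930.0 seconds
upload_date = unified_strdate('2023-01-02')  # '20230102'
thumbnail = url_or_none('not a valid url')   # None instead of a garbage value
```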
|
||||
Explore [`yt_dlp/utils.py`](yt_dlp/utils.py) for more useful convenience functions.
|
||||
Explore [`yt_dlp/utils/`](yt_dlp/utils/) for more useful convenience functions.
|
||||
|
||||
#### Examples
|
||||
|
||||
|
|
337
CONTRIBUTORS
|
@ -2,7 +2,8 @@ pukkandan (owner)
|
|||
shirt-dev (collaborator)
|
||||
coletdjnz/colethedj (collaborator)
|
||||
Ashish0804 (collaborator)
|
||||
nao20010128nao/Lesmiscore (collaborator)
|
||||
bashonly (collaborator)
|
||||
Grub4K (collaborator)
|
||||
h-h-h-h
|
||||
pauldubois98
|
||||
nixxo
|
||||
|
@ -295,7 +296,6 @@ Mehavoid
|
|||
winterbird-code
|
||||
yashkc2025
|
||||
aldoridhoni
|
||||
bashonly
|
||||
jacobtruman
|
||||
masta79
|
||||
palewire
|
||||
|
@ -319,7 +319,6 @@ columndeeply
|
|||
DoubleCouponDay
|
||||
Fabi019
|
||||
GautamMKGarg
|
||||
Grub4K
|
||||
itachi-19
|
||||
jeroenj
|
||||
josanabr
|
||||
|
@ -357,3 +356,335 @@ SG5
|
|||
the-marenga
|
||||
tkgmomosheep
|
||||
vitkhab
|
||||
glensc
|
||||
synthpop123
|
||||
tntmod54321
|
||||
milkknife
|
||||
Bnyro
|
||||
CapacitorSet
|
||||
stelcodes
|
||||
skbeh
|
||||
muddi900
|
||||
digitall
|
||||
chengzhicn
|
||||
mexus
|
||||
JChris246
|
||||
redraskal
|
||||
Spicadox
|
||||
barsnick
|
||||
docbender
|
||||
KurtBestor
|
||||
Chrissi2812
|
||||
FrederikNS
|
||||
gschizas
|
||||
JC-Chung
|
||||
mzhou
|
||||
OndrejBakan
|
||||
ab4cbef
|
||||
aionescu
|
||||
amra
|
||||
ByteDream
|
||||
carusocr
|
||||
chexxor
|
||||
felixonmars
|
||||
FrankZ85
|
||||
FriedrichRehren
|
||||
gregsadetsky
|
||||
LeoniePhiline
|
||||
LowSuggestion912
|
||||
Matumo
|
||||
OIRNOIR
|
||||
OMEGARAZER
|
||||
oxamun
|
||||
pmitchell86
|
||||
qbnu
|
||||
qulaz
|
||||
rebane2001
|
||||
road-master
|
||||
rohieb
|
||||
sdht0
|
||||
seproDev
|
||||
Hill-98
|
||||
LXYan2333
|
||||
mushbite
|
||||
venkata-krishnas
|
||||
7vlad7
|
||||
alexklapheke
|
||||
arobase-che
|
||||
bepvte
|
||||
bergoid
|
||||
blmarket
|
||||
brandon-dacrib
|
||||
c-basalt
|
||||
CoryTibbettsDev
|
||||
Cyberes
|
||||
D0LLYNH0
|
||||
danog
|
||||
DataGhost
|
||||
falbrechtskirchinger
|
||||
foreignBlade
|
||||
garret1317
|
||||
hasezoey
|
||||
hoaluvn
|
||||
ItzMaxTV
|
||||
ivanskodje
|
||||
jo-nike
|
||||
kangalio
|
||||
linsui
|
||||
makew0rld
|
||||
menschel
|
||||
mikf
|
||||
mrscrapy
|
||||
NDagestad
|
||||
Neurognostic
|
||||
NextFire
|
||||
nick-cd
|
||||
permunkle
|
||||
pzhlkj6612
|
||||
ringus1
|
||||
rjy
|
||||
Schmoaaaaah
|
||||
sjthespian
|
||||
theperfectpunk
|
||||
toomyzoom
|
||||
truedread
|
||||
TxI5
|
||||
unbeatable-101
|
||||
vampirefrog
|
||||
vidiot720
|
||||
viktor-enzell
|
||||
zhgwn
|
||||
barthelmannk
|
||||
berkanteber
|
||||
OverlordQ
|
||||
rexlambert22
|
||||
Ti4eeT4e
|
||||
AmanSal1
|
||||
bbilly1
|
||||
meliber
|
||||
nnoboa
|
||||
rdamas
|
||||
RfadnjdExt
|
||||
urectanc
|
||||
nao20010128nao/Lesmiscore
|
||||
04-pasha-04
|
||||
aaruni96
|
||||
aky-01
|
||||
AmirAflak
|
||||
ApoorvShah111
|
||||
at-wat
|
||||
davinkevin
|
||||
demon071
|
||||
denhotte
|
||||
FinnRG
|
||||
fireattack
|
||||
Frankgoji
|
||||
GD-Slime
|
||||
hatsomatt
|
||||
ifan-t
|
||||
kshitiz305
|
||||
kylegustavo
|
||||
mabdelfattah
|
||||
nathantouze
|
||||
niemands
|
||||
Rajeshwaran2001
|
||||
RedDeffender
|
||||
Rohxn16
|
||||
sb0stn
|
||||
SevenLives
|
||||
simon300000
|
||||
snixon
|
||||
soundchaser128
|
||||
szabyg
|
||||
trainman261
|
||||
trislee
|
||||
wader
|
||||
Yalab7
|
||||
zhallgato
|
||||
zhong-yiyu
|
||||
Zprokkel
|
||||
AS6939
|
||||
drzraf
|
||||
handlerug
|
||||
jiru
|
||||
madewokherd
|
||||
xofe
|
||||
awalgarg
|
||||
midnightveil
|
||||
naginatana
|
||||
Riteo
|
||||
1100101
|
||||
aniolpages
|
||||
bartbroere
|
||||
CrendKing
|
||||
Esokrates
|
||||
HitomaruKonpaku
|
||||
LoserFox
|
||||
peci1
|
||||
saintliao
|
||||
shubhexists
|
||||
SirElderling
|
||||
almx
|
||||
elivinsky
|
||||
starius
|
||||
TravisDupes
|
||||
amir16yp
|
||||
Fymyte
|
||||
Ganesh910
|
||||
hashFactory
|
||||
kclauhk
|
||||
Kyraminol
|
||||
lstrojny
|
||||
middlingphys
|
||||
NickCis
|
||||
nicodato
|
||||
prettykool
|
||||
S-Aarab
|
||||
sonmezberkay
|
||||
TSRBerry
|
||||
114514ns
|
||||
agibson-fl
|
||||
alard
|
||||
alien-developers
|
||||
antonkesy
|
||||
ArnauvGilotra
|
||||
Arthurszzz
|
||||
Bibhav48
|
||||
Bl4Cc4t
|
||||
boredzo
|
||||
Caesim404
|
||||
chkuendig
|
||||
chtk
|
||||
Danish-H
|
||||
dasidiot
|
||||
diman8
|
||||
divStar
|
||||
DmitryScaletta
|
||||
feederbox826
|
||||
gmes78
|
||||
gonzalezjo
|
||||
hui1601
|
||||
infanf
|
||||
jazz1611
|
||||
jingtra
|
||||
jkmartindale
|
||||
johnvictorfs
|
||||
llistochek
|
||||
marcdumais
|
||||
martinxyz
|
||||
michal-repo
|
||||
mrmedieval
|
||||
nbr23
|
||||
Nicals
|
||||
Noor-5
|
||||
NurTasin
|
||||
pompos02
|
||||
Pranaxcau
|
||||
pwaldhauer
|
||||
RaduManole
|
||||
RalphORama
|
||||
rrgomes
|
||||
ruiminggu
|
||||
rvsit
|
||||
sefidel
|
||||
shmohawk
|
||||
Snack-X
|
||||
src-tinkerer
|
||||
stilor
|
||||
syntaxsurge
|
||||
t-nil
|
||||
ufukk
|
||||
vista-narvas
|
||||
x11x
|
||||
xpadev-net
|
||||
Xpl0itU
|
||||
YoshichikaAAA
|
||||
zhijinwuu
|
||||
alb
|
||||
hruzgar
|
||||
kasper93
|
||||
leoheitmannruiz
|
||||
luiso1979
|
||||
nipotan
|
||||
Offert4324
|
||||
sta1us
|
||||
Tomoka1
|
||||
trwstin
|
||||
alexhuot1
|
||||
clienthax
|
||||
DaPotato69
|
||||
emqi
|
||||
hugohaa
|
||||
imanoreotwe
|
||||
JakeFinley96
|
||||
lostfictions
|
||||
minamotorin
|
||||
ocococococ
|
||||
Podiumnoche
|
||||
RasmusAntons
|
||||
roeniss
|
||||
shoxie007
|
||||
Szpachlarz
|
||||
The-MAGI
|
||||
TuxCoder
|
||||
voidful
|
||||
vtexier
|
||||
WyohKnott
|
||||
trueauracoral
|
||||
ASertacAkkaya
|
||||
axpauls
|
||||
chilinux
|
||||
hafeoz
|
||||
JSubelj
|
||||
jucor
|
||||
megumintyan
|
||||
mgedmin
|
||||
Niluge-KiWi
|
||||
peisenwang
|
||||
TheZ3ro
|
||||
tippfehlr
|
||||
varunchopra
|
||||
DrakoCpp
|
||||
PatrykMis
|
||||
DinhHuy2010
|
||||
exterrestris
|
||||
harbhim
|
||||
LeSuisse
|
||||
DunnesH
|
||||
iancmy
|
||||
mokrueger
|
||||
luvyana
|
||||
szantnerb
|
||||
hugepower
|
||||
scribblemaniac
|
||||
Codenade
|
||||
Demon000
|
||||
Deukhoofd
|
||||
grqz
|
||||
hibes
|
||||
Khaoklong51
|
||||
kieraneglin
|
||||
lengzuo
|
||||
naglis
|
||||
ndyanx
|
||||
otovalek
|
||||
quad
|
||||
rakslice
|
||||
sahilsinghss73
|
||||
tony-hn
|
||||
xingchensong
|
||||
BallzCrasher
|
||||
coreywright
|
||||
eric321
|
||||
poyhen
|
||||
tetra-fox
|
||||
444995
|
||||
63427083
|
||||
allendema
|
||||
DarkZeros
|
||||
DTrombett
|
||||
imranh2
|
||||
KarboniteKream
|
||||
mikkovedru
|
||||
pktiuk
|
||||
rubyevadestaxes
|
||||
|
|
1968
Changelog.md
File diff suppressed because it is too large
|
@ -8,6 +8,7 @@ # Collaborators
|
|||
## [pukkandan](https://github.com/pukkandan)
|
||||
|
||||
[![ko-fi](https://img.shields.io/badge/_-Ko--fi-red.svg?logo=kofi&labelColor=555555&style=for-the-badge)](https://ko-fi.com/pukkandan)
|
||||
[![gh-sponsor](https://img.shields.io/badge/_-Github-white.svg?logo=github&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/pukkandan)
|
||||
|
||||
* Owner of the fork
|
||||
|
||||
|
@ -25,8 +26,10 @@ ## [shirt](https://github.com/shirt-dev)
|
|||
|
||||
## [coletdjnz](https://github.com/coletdjnz)
|
||||
|
||||
[![gh-sponsor](https://img.shields.io/badge/_-Sponsor-red.svg?logo=githubsponsors&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/coletdjnz)
|
||||
[![gh-sponsor](https://img.shields.io/badge/_-Github-white.svg?logo=github&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/coletdjnz)
|
||||
|
||||
* Improved plugin architecture
|
||||
* Rewrote the networking infrastructure, implemented support for `requests`
|
||||
* YouTube improvements including: age-gate bypass, private playlists, multiple-clients (to avoid throttling) and a lot of under-the-hood improvements
|
||||
* Added support for new websites YoutubeWebArchive, MainStreaming, PRX, nzherald, Mediaklikk, StarTV etc
|
||||
* Improved/fixed support for Patreon, panopto, gfycat, itv, pbs, SouthParkDE etc
|
||||
|
@ -42,11 +45,26 @@ ## [Ashish0804](https://github.com/Ashish0804) <sub><sup>[Inactive]</sup></sub>
|
|||
* Improved/fixed support for HiDive, HotStar, Hungama, LBRY, LinkedInLearning, Mxplayer, SonyLiv, TV2, Vimeo, VLive etc
|
||||
|
||||
|
||||
## [Lesmiscore](https://github.com/Lesmiscore) (nao20010128nao)
|
||||
## [bashonly](https://github.com/bashonly)
|
||||
|
||||
**Bitcoin**: bc1qfd02r007cutfdjwjmyy9w23rjvtls6ncve7r3s
|
||||
**Monacoin**: mona1q3tf7dzvshrhfe3md379xtvt2n22duhglv5dskr
|
||||
* `--update-to`, self-updater rewrite, automated/nightly/master releases
|
||||
* `--cookies-from-browser` support for Firefox containers, external downloader cookie handling overhaul
|
||||
* Added support for new websites like Dacast, Kick, NBCStations, Triller, VideoKen, Weverse, WrestleUniverse etc
|
||||
* Improved/fixed support for Anvato, Brightcove, Reddit, SlidesLive, TikTok, Twitter, Vimeo etc
|
||||
|
||||
* Download live from start to end for YouTube
|
||||
* Added support for new websites AbemaTV, mildom, PixivSketch, skeb, radiko, voicy, mirrativ, openrec, whowatch, damtomo, 17.live, mixch etc
|
||||
* Improved/fixed support for fc2, YahooJapanNews, tver, iwara etc
|
||||
|
||||
## [Grub4K](https://github.com/Grub4K)
|
||||
|
||||
[![gh-sponsor](https://img.shields.io/badge/_-Github-white.svg?logo=github&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/Grub4K) [![ko-fi](https://img.shields.io/badge/_-Ko--fi-red.svg?logo=kofi&labelColor=555555&style=for-the-badge)](https://ko-fi.com/Grub4K)
|
||||
|
||||
* `--update-to`, self-updater rewrite, automated/nightly/master releases
|
||||
* Reworked internals like `traverse_obj`, various core refactors and bugs fixes
|
||||
* Implemented proper progress reporting for parallel downloads
|
||||
* Improved/fixed/added Bundestag, crunchyroll, pr0gramm, Twitter, WrestleUniverse etc
|
||||
|
||||
|
||||
## [sepro](https://github.com/seproDev)
|
||||
|
||||
* UX improvements: Warn when ffmpeg is missing, warn when double-clicking exe
|
||||
* Code cleanup: Remove dead extractors, mark extractors as broken, enable/apply ruff rules
|
||||
* Improved/fixed/added ArdMediathek, DRTV, Floatplane, MagentaMusik, Naver, Nebula, OnDemandKorea, Vbox7 etc
|
||||
|
|
10
MANIFEST.in
|
@ -1,10 +0,0 @@
|
|||
include AUTHORS
|
||||
include Changelog.md
|
||||
include LICENSE
|
||||
include README.md
|
||||
include completions/*/*
|
||||
include supportedsites.md
|
||||
include yt-dlp.1
|
||||
include requirements.txt
|
||||
recursive-include devscripts *
|
||||
recursive-include test *
|
90
Makefile
|
@ -2,29 +2,32 @@ all: lazy-extractors yt-dlp doc pypi-files
|
|||
clean: clean-test clean-dist
|
||||
clean-all: clean clean-cache
|
||||
completions: completion-bash completion-fish completion-zsh
|
||||
doc: README.md CONTRIBUTING.md issuetemplates supportedsites
|
||||
doc: README.md CONTRIBUTING.md CONTRIBUTORS issuetemplates supportedsites
|
||||
ot: offlinetest
|
||||
tar: yt-dlp.tar.gz
|
||||
|
||||
# Keep this list in sync with MANIFEST.in
|
||||
# Keep this list in sync with pyproject.toml includes/artifacts
|
||||
# intended use: when building a source distribution,
|
||||
# make pypi-files && python setup.py sdist
|
||||
# make pypi-files && python3 -m build -sn .
|
||||
pypi-files: AUTHORS Changelog.md LICENSE README.md README.txt supportedsites \
|
||||
completions yt-dlp.1 requirements.txt setup.cfg devscripts/* test/*
|
||||
completions yt-dlp.1 pyproject.toml setup.cfg devscripts/* test/*
|
||||
|
||||
.PHONY: all clean install test tar pypi-files completions ot offlinetest codetest supportedsites
|
||||
.PHONY: all clean clean-all clean-test clean-dist clean-cache \
|
||||
completions completion-bash completion-fish completion-zsh \
|
||||
doc issuetemplates supportedsites ot offlinetest codetest test \
|
||||
tar pypi-files lazy-extractors install uninstall
|
||||
|
||||
clean-test:
|
||||
rm -rf test/testdata/sigs/player-*.js tmp/ *.annotations.xml *.aria2 *.description *.dump *.frag \
|
||||
*.frag.aria2 *.frag.urls *.info.json *.live_chat.json *.meta *.part* *.tmp *.temp *.unknown_video *.ytdl \
|
||||
*.3gp *.ape *.ass *.avi *.desktop *.f4v *.flac *.flv *.jpeg *.jpg *.m4a *.m4v *.mhtml *.mkv *.mov *.mp3 *.mp4 \
|
||||
*.mpga *.oga *.ogg *.opus *.png *.sbv *.srt *.swf *.swp *.tt *.ttml *.url *.vtt *.wav *.webloc *.webm *.webp
|
||||
*.3gp *.ape *.ass *.avi *.desktop *.f4v *.flac *.flv *.gif *.jpeg *.jpg *.lrc *.m4a *.m4v *.mhtml *.mkv *.mov *.mp3 *.mp4 \
|
||||
*.mpg *.mpga *.oga *.ogg *.opus *.png *.sbv *.srt *.ssa *.swf *.tt *.ttml *.url *.vtt *.wav *.webloc *.webm *.webp
|
||||
clean-dist:
|
||||
rm -rf yt-dlp.1.temp.md yt-dlp.1 README.txt MANIFEST build/ dist/ .coverage cover/ yt-dlp.tar.gz completions/ \
|
||||
yt_dlp/extractor/lazy_extractors.py *.spec CONTRIBUTING.md.tmp yt-dlp yt-dlp.exe yt_dlp.egg-info/ AUTHORS .mailmap
|
||||
yt_dlp/extractor/lazy_extractors.py *.spec CONTRIBUTING.md.tmp yt-dlp yt-dlp.exe yt_dlp.egg-info/ AUTHORS
|
||||
clean-cache:
|
||||
find . \( \
|
||||
-type d -name .pytest_cache -o -type d -name __pycache__ -o -name "*.pyc" -o -name "*.class" \
|
||||
-type d -name ".*_cache" -o -type d -name __pycache__ -o -name "*.pyc" -o -name "*.class" \
|
||||
\) -prune -exec rm -rf {} \;
|
||||
|
||||
completion-bash: completions/bash/yt-dlp
|
||||
|
@ -37,12 +40,15 @@ BINDIR ?= $(PREFIX)/bin
|
|||
MANDIR ?= $(PREFIX)/man
|
||||
SHAREDIR ?= $(PREFIX)/share
|
||||
PYTHON ?= /usr/bin/env python3
|
||||
GNUTAR ?= tar
|
||||
|
||||
# set SYSCONFDIR to /etc if PREFIX=/usr or PREFIX=/usr/local
|
||||
SYSCONFDIR = $(shell if [ $(PREFIX) = /usr -o $(PREFIX) = /usr/local ]; then echo /etc; else echo $(PREFIX)/etc; fi)
|
||||
|
||||
# set markdown input format to "markdown-smart" for pandoc version 2 and to "markdown" for pandoc prior to version 2
|
||||
MARKDOWN = $(shell if [ `pandoc -v | head -n1 | cut -d" " -f2 | head -c1` = "2" ]; then echo markdown-smart; else echo markdown; fi)
|
||||
# set markdown input format to "markdown-smart" for pandoc version 2+ and to "markdown" for pandoc prior to version 2
|
||||
PANDOC_VERSION_CMD = pandoc -v 2>/dev/null | head -n1 | cut -d' ' -f2 | head -c1
|
||||
PANDOC_VERSION != $(PANDOC_VERSION_CMD)
|
||||
PANDOC_VERSION ?= $(shell $(PANDOC_VERSION_CMD))
|
||||
MARKDOWN_CMD = if [ "$(PANDOC_VERSION)" = "1" -o "$(PANDOC_VERSION)" = "0" ]; then echo markdown; else echo markdown-smart; fi
|
||||
MARKDOWN != $(MARKDOWN_CMD)
|
||||
MARKDOWN ?= $(shell $(MARKDOWN_CMD))
|
||||
|
||||
install: lazy-extractors yt-dlp yt-dlp.1 completions
|
||||
mkdir -p $(DESTDIR)$(BINDIR)
|
||||
|
@ -64,33 +70,38 @@ uninstall:
|
|||
rm -f $(DESTDIR)$(SHAREDIR)/fish/vendor_completions.d/yt-dlp.fish
|
||||
|
||||
codetest:
|
||||
flake8 .
|
||||
ruff check .
|
||||
autopep8 --diff .
|
||||
|
||||
test:
|
||||
$(PYTHON) -m pytest
|
||||
$(PYTHON) -m pytest -Werror
|
||||
$(MAKE) codetest
|
||||
|
||||
offlinetest: codetest
|
||||
$(PYTHON) -m pytest -k "not download"
|
||||
$(PYTHON) -m pytest -Werror -m "not download"
|
||||
|
||||
# XXX: This is hard to maintain
|
||||
CODE_FOLDERS = yt_dlp yt_dlp/downloader yt_dlp/extractor yt_dlp/postprocessor yt_dlp/compat
|
||||
yt-dlp: yt_dlp/*.py yt_dlp/*/*.py
|
||||
CODE_FOLDERS_CMD = find yt_dlp -type f -name '__init__.py' | sed 's,/__init__.py,,' | grep -v '/__' | sort
|
||||
CODE_FOLDERS != $(CODE_FOLDERS_CMD)
|
||||
CODE_FOLDERS ?= $(shell $(CODE_FOLDERS_CMD))
|
||||
CODE_FILES_CMD = for f in $(CODE_FOLDERS) ; do echo "$$f" | sed 's,$$,/*.py,' ; done
|
||||
CODE_FILES != $(CODE_FILES_CMD)
|
||||
CODE_FILES ?= $(shell $(CODE_FILES_CMD))
|
||||
yt-dlp: $(CODE_FILES)
|
||||
mkdir -p zip
|
||||
for d in $(CODE_FOLDERS) ; do \
|
||||
mkdir -p zip/$$d ;\
|
||||
cp -pPR $$d/*.py zip/$$d/ ;\
|
||||
done
|
||||
touch -t 200001010101 zip/yt_dlp/*.py zip/yt_dlp/*/*.py
|
||||
(cd zip && touch -t 200001010101 $(CODE_FILES))
|
||||
mv zip/yt_dlp/__main__.py zip/
|
||||
cd zip ; zip -q ../yt-dlp yt_dlp/*.py yt_dlp/*/*.py __main__.py
|
||||
(cd zip && zip -q ../yt-dlp $(CODE_FILES) __main__.py)
|
||||
rm -rf zip
|
||||
echo '#!$(PYTHON)' > yt-dlp
|
||||
cat yt-dlp.zip >> yt-dlp
|
||||
rm yt-dlp.zip
|
||||
chmod a+x yt-dlp
|
||||
|
||||
README.md: yt_dlp/*.py yt_dlp/*/*.py devscripts/make_readme.py
|
||||
README.md: $(CODE_FILES) devscripts/make_readme.py
|
||||
COLUMNS=80 $(PYTHON) yt_dlp/__main__.py --ignore-config --help | $(PYTHON) devscripts/make_readme.py
|
||||
|
||||
CONTRIBUTING.md: README.md devscripts/make_contributing.py
|
||||
|
@ -115,41 +126,48 @@ yt-dlp.1: README.md devscripts/prepare_manpage.py
|
|||
pandoc -s -f $(MARKDOWN) -t man yt-dlp.1.temp.md -o yt-dlp.1
|
||||
rm -f yt-dlp.1.temp.md
|
||||
|
||||
completions/bash/yt-dlp: yt_dlp/*.py yt_dlp/*/*.py devscripts/bash-completion.in
|
||||
completions/bash/yt-dlp: $(CODE_FILES) devscripts/bash-completion.in
|
||||
mkdir -p completions/bash
|
||||
$(PYTHON) devscripts/bash-completion.py
|
||||
|
||||
completions/zsh/_yt-dlp: yt_dlp/*.py yt_dlp/*/*.py devscripts/zsh-completion.in
|
||||
completions/zsh/_yt-dlp: $(CODE_FILES) devscripts/zsh-completion.in
|
||||
mkdir -p completions/zsh
|
||||
$(PYTHON) devscripts/zsh-completion.py
|
||||
|
||||
completions/fish/yt-dlp.fish: yt_dlp/*.py yt_dlp/*/*.py devscripts/fish-completion.in
|
||||
completions/fish/yt-dlp.fish: $(CODE_FILES) devscripts/fish-completion.in
|
||||
mkdir -p completions/fish
|
||||
$(PYTHON) devscripts/fish-completion.py
|
||||
|
||||
_EXTRACTOR_FILES = $(shell find yt_dlp/extractor -name '*.py' -and -not -name 'lazy_extractors.py')
|
||||
_EXTRACTOR_FILES_CMD = find yt_dlp/extractor -name '*.py' -and -not -name 'lazy_extractors.py'
|
||||
_EXTRACTOR_FILES != $(_EXTRACTOR_FILES_CMD)
|
||||
_EXTRACTOR_FILES ?= $(shell $(_EXTRACTOR_FILES_CMD))
|
||||
yt_dlp/extractor/lazy_extractors.py: devscripts/make_lazy_extractors.py devscripts/lazy_load_template.py $(_EXTRACTOR_FILES)
|
||||
$(PYTHON) devscripts/make_lazy_extractors.py $@
|
||||
|
||||
yt-dlp.tar.gz: all
|
||||
@tar -czf yt-dlp.tar.gz --transform "s|^|yt-dlp/|" --owner 0 --group 0 \
|
||||
@$(GNUTAR) -czf yt-dlp.tar.gz --transform "s|^|yt-dlp/|" --owner 0 --group 0 \
|
||||
--exclude '*.DS_Store' \
|
||||
--exclude '*.kate-swp' \
|
||||
--exclude '*.pyc' \
|
||||
--exclude '*.pyo' \
|
||||
--exclude '*~' \
|
||||
--exclude '__pycache__' \
|
||||
--exclude '.pytest_cache' \
|
||||
--exclude '.*_cache' \
|
||||
--exclude '.git' \
|
||||
-- \
|
||||
README.md supportedsites.md Changelog.md LICENSE \
|
||||
CONTRIBUTING.md Collaborators.md CONTRIBUTORS AUTHORS \
|
||||
Makefile MANIFEST.in yt-dlp.1 README.txt completions \
|
||||
setup.py setup.cfg yt-dlp yt_dlp requirements.txt \
|
||||
devscripts test
|
||||
Makefile yt-dlp.1 README.txt completions .gitignore \
|
||||
setup.cfg yt-dlp yt_dlp pyproject.toml devscripts test
|
||||
|
||||
AUTHORS: .mailmap
|
||||
git shortlog -s -n | cut -f2 | sort > AUTHORS
|
||||
AUTHORS: Changelog.md
|
||||
@if [ -d '.git' ] && command -v git > /dev/null ; then \
|
||||
echo 'Generating $@ from git commit history' ; \
|
||||
git shortlog -s -n HEAD | cut -f2 | sort > $@ ; \
|
||||
fi
|
||||
|
||||
.mailmap:
|
||||
git shortlog -s -e -n | awk '!(out[$$NF]++) { $$1="";sub(/^[ \t]+/,""); print}' > .mailmap
|
||||
CONTRIBUTORS: Changelog.md
|
||||
@if [ -d '.git' ] && command -v git > /dev/null ; then \
|
||||
echo 'Updating $@ from git commit history' ; \
|
||||
$(PYTHON) devscripts/make_changelog.py -v -c > /dev/null ; \
|
||||
fi
|
||||
|
|
0
bundle/__init__.py
Normal file
10
bundle/docker/compose.yml
Normal file
|
@ -0,0 +1,10 @@
|
|||
services:
|
||||
static:
|
||||
build: static
|
||||
environment:
|
||||
channel: ${channel}
|
||||
origin: ${origin}
|
||||
version: ${version}
|
||||
volumes:
|
||||
- ~/build:/build
|
||||
- ../..:/yt-dlp
|
21
bundle/docker/static/Dockerfile
Normal file
|
@ -0,0 +1,21 @@
|
|||
FROM alpine:3.19 as base
|
||||
|
||||
RUN apk --update add --no-cache \
|
||||
build-base \
|
||||
python3 \
|
||||
pipx \
|
||||
;
|
||||
|
||||
RUN pipx install pyinstaller
|
||||
# Requires above step to prepare the shared venv
|
||||
RUN ~/.local/share/pipx/shared/bin/python -m pip install -U wheel
|
||||
RUN apk --update add --no-cache \
|
||||
scons \
|
||||
patchelf \
|
||||
binutils \
|
||||
;
|
||||
RUN pipx install staticx
|
||||
|
||||
WORKDIR /yt-dlp
|
||||
COPY entrypoint.sh /entrypoint.sh
|
||||
ENTRYPOINT /entrypoint.sh
|
13
bundle/docker/static/entrypoint.sh
Executable file
|
@ -0,0 +1,13 @@
|
|||
#!/bin/ash
|
||||
set -e
|
||||
|
||||
source ~/.local/share/pipx/venvs/pyinstaller/bin/activate
|
||||
python -m devscripts.install_deps --include secretstorage --include curl-cffi
|
||||
python -m devscripts.make_lazy_extractors
|
||||
python devscripts/update-version.py -c "${channel}" -r "${origin}" "${version}"
|
||||
python -m bundle.pyinstaller
|
||||
deactivate
|
||||
|
||||
source ~/.local/share/pipx/venvs/staticx/bin/activate
|
||||
staticx /yt-dlp/dist/yt-dlp_linux /build/yt-dlp_linux
|
||||
deactivate
|
42
pyinst.py → bundle/pyinstaller.py
Normal file → Executable file
|
@ -4,7 +4,7 @@
|
|||
import os
|
||||
import sys
|
||||
|
||||
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))
|
||||
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
||||
|
||||
import platform
|
||||
|
||||
|
@ -37,7 +37,7 @@ def main():
|
|||
'--icon=devscripts/logo.ico',
|
||||
'--upx-exclude=vcruntime140.dll',
|
||||
'--noconfirm',
|
||||
*dependency_options(),
|
||||
'--additional-hooks-dir=yt_dlp/__pyinstaller',
|
||||
*opts,
|
||||
'yt_dlp/__main__.py',
|
||||
]
|
||||
|
@ -68,7 +68,7 @@ def exe(onedir):
|
|||
'dist/',
|
||||
onedir and f'{name}/',
|
||||
name,
|
||||
OS_NAME == 'win32' and '.exe'
|
||||
OS_NAME == 'win32' and '.exe',
|
||||
)))
|
||||
|
||||
|
||||
|
@ -77,30 +77,6 @@ def version_to_list(version):
|
|||
return list(map(int, version_list)) + [0] * (4 - len(version_list))
|
||||
|
||||
|
||||
def dependency_options():
|
||||
# Due to the current implementation, these are auto-detected, but explicitly add them just in case
|
||||
dependencies = [pycryptodome_module(), 'mutagen', 'brotli', 'certifi', 'websockets']
|
||||
excluded_modules = ('youtube_dl', 'youtube_dlc', 'test', 'ytdlp_plugins', 'devscripts')
|
||||
|
||||
yield from (f'--hidden-import={module}' for module in dependencies)
|
||||
yield '--collect-submodules=websockets'
|
||||
yield from (f'--exclude-module={module}' for module in excluded_modules)
|
||||
|
||||
|
||||
def pycryptodome_module():
|
||||
try:
|
||||
import Cryptodome # noqa: F401
|
||||
except ImportError:
|
||||
try:
|
||||
import Crypto # noqa: F401
|
||||
print('WARNING: Using Crypto since Cryptodome is not available. '
|
||||
'Install with: pip install pycryptodomex', file=sys.stderr)
|
||||
return 'Crypto'
|
||||
except ImportError:
|
||||
pass
|
||||
return 'Cryptodome'
|
||||
|
||||
|
||||
def set_version_info(exe, version):
|
||||
if OS_NAME == 'win32':
|
||||
windows_set_version(exe, version)
|
||||
|
@ -109,7 +85,6 @@ def set_version_info(exe, version):
|
|||
def windows_set_version(exe, version):
|
||||
from PyInstaller.utils.win32.versioninfo import (
|
||||
FixedFileInfo,
|
||||
SetVersion,
|
||||
StringFileInfo,
|
||||
StringStruct,
|
||||
StringTable,
|
||||
|
@ -118,6 +93,11 @@ def windows_set_version(exe, version):
|
|||
VSVersionInfo,
|
||||
)
|
||||
|
||||
try:
|
||||
from PyInstaller.utils.win32.versioninfo import SetVersion
|
||||
except ImportError: # Pyinstaller >= 5.8
|
||||
from PyInstaller.utils.win32.versioninfo import write_version_info_to_executable as SetVersion
|
||||
|
||||
version_list = version_to_list(version)
|
||||
suffix = MACHINE and f'_{MACHINE}'
|
||||
SetVersion(exe, VSVersionInfo(
|
||||
|
@ -133,7 +113,7 @@ def windows_set_version(exe, version):
|
|||
),
|
||||
kids=[
|
||||
StringFileInfo([StringTable('040904B0', [
|
||||
StringStruct('Comments', 'yt-dlp%s Command Line Interface' % suffix),
|
||||
StringStruct('Comments', f'yt-dlp{suffix} Command Line Interface'),
|
||||
StringStruct('CompanyName', 'https://github.com/yt-dlp'),
|
||||
StringStruct('FileDescription', 'yt-dlp%s' % (MACHINE and f' ({MACHINE})')),
|
||||
StringStruct('FileVersion', version),
|
||||
|
@ -143,8 +123,8 @@ def windows_set_version(exe, version):
|
|||
StringStruct('ProductName', f'yt-dlp{suffix}'),
|
||||
StringStruct(
|
||||
'ProductVersion', f'{version}{suffix} on Python {platform.python_version()}'),
|
||||
])]), VarFileInfo([VarStruct('Translation', [0, 1200])])
|
||||
]
|
||||
])]), VarFileInfo([VarStruct('Translation', [0, 1200])]),
|
||||
],
|
||||
))
|
||||
|
||||
|
Binary file not shown.
|
@ -1 +0,0 @@
|
|||
# Empty file needed to make devscripts.utils properly importable from outside
|
|
@ -9,8 +9,8 @@
|
|||
|
||||
import yt_dlp
|
||||
|
||||
BASH_COMPLETION_FILE = "completions/bash/yt-dlp"
|
||||
BASH_COMPLETION_TEMPLATE = "devscripts/bash-completion.in"
|
||||
BASH_COMPLETION_FILE = 'completions/bash/yt-dlp'
|
||||
BASH_COMPLETION_TEMPLATE = 'devscripts/bash-completion.in'
|
||||
|
||||
|
||||
def build_completion(opt_parser):
|
||||
|
@ -21,9 +21,9 @@ def build_completion(opt_parser):
|
|||
opts_flag.append(option.get_opt_string())
|
||||
with open(BASH_COMPLETION_TEMPLATE) as f:
|
||||
template = f.read()
|
||||
with open(BASH_COMPLETION_FILE, "w") as f:
|
||||
with open(BASH_COMPLETION_FILE, 'w') as f:
|
||||
# just using the special char
|
||||
filled_template = template.replace("{{flags}}", " ".join(opts_flag))
|
||||
filled_template = template.replace('{{flags}}', ' '.join(opts_flag))
|
||||
f.write(filled_template)
|
||||
|
||||
|
||||
|
|
220
devscripts/changelog_override.json
Normal file
|
@ -0,0 +1,220 @@
|
|||
[
|
||||
{
|
||||
"action": "add",
|
||||
"when": "29cb20bd563c02671b31dd840139e93dd37150a1",
|
||||
"short": "[priority] **A new release type has been added!**\n * [`nightly`](https://github.com/yt-dlp/yt-dlp/releases/tag/nightly) builds will be made after each push, containing the latest fixes (but also possibly bugs).\n * When using `--update`/`-U`, a release binary will only update to its current channel (either `stable` or `nightly`).\n * The `--update-to` option has been added allowing the user more control over program upgrades (or downgrades).\n * `--update-to` can change the release channel (`stable`, `nightly`) and also upgrade or downgrade to specific tags.\n * **Usage**: `--update-to CHANNEL`, `--update-to TAG`, `--update-to CHANNEL@TAG`"
|
||||
},
|
||||
{
|
||||
"action": "add",
|
||||
"when": "5038f6d713303e0967d002216e7a88652401c22a",
|
||||
"short": "[priority] **YouTube throttling fixes!**"
|
||||
},
|
||||
{
|
||||
"action": "remove",
|
||||
"when": "2e023649ea4e11151545a34dc1360c114981a236"
|
||||
},
|
||||
{
|
||||
"action": "add",
|
||||
"when": "01aba2519a0884ef17d5f85608dbd2a455577147",
|
||||
"short": "[priority] YouTube: Improved throttling and signature fixes"
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "c86e433c35fe5da6cb29f3539eef97497f84ed38",
|
||||
"short": "[extractor/niconico:series] Fix extraction (#6898)",
|
||||
"authors": ["sqrtNOT"]
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "69a40e4a7f6caa5662527ebd2f3c4e8aa02857a2",
|
||||
"short": "[extractor/youtube:music_search_url] Extract title (#7102)",
|
||||
"authors": ["kangalio"]
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "8417f26b8a819cd7ffcd4e000ca3e45033e670fb",
|
||||
"short": "Add option `--color` (#6904)",
|
||||
"authors": ["Grub4K"]
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "b4e0d75848e9447cee2cd3646ce54d4744a7ff56",
|
||||
"short": "Improve `--download-sections`\n - Support negative time-ranges\n - Add `*from-url` to obey time-ranges in URL",
|
||||
"authors": ["pukkandan"]
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "1e75d97db21152acc764b30a688e516f04b8a142",
|
||||
"short": "[extractor/youtube] Add `ios` to default clients used\n - IOS is affected neither by 403 nor by nsig so helps mitigate them preemptively\n - IOS also has higher bit-rate 'premium' formats though they are not labeled as such",
|
||||
"authors": ["pukkandan"]
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "f2ff0f6f1914b82d4a51681a72cc0828115dcb4a",
|
||||
"short": "[extractor/motherless] Add gallery support, fix groups (#7211)",
|
||||
"authors": ["rexlambert22", "Ti4eeT4e"]
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "a4486bfc1dc7057efca9dd3fe70d7fa25c56f700",
|
||||
"short": "[misc] Revert \"Add automatic duplicate issue detection\"",
|
||||
"authors": ["pukkandan"]
|
||||
},
|
||||
{
|
||||
"action": "add",
|
||||
"when": "1ceb657bdd254ad961489e5060f2ccc7d556b729",
|
||||
"short": "[priority] Security: [[CVE-2023-35934](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-35934)] Fix [Cookie leak](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-v8mc-9377-rwjj)\n - `--add-header Cookie:` is deprecated and auto-scoped to input URL domains\n - Cookies are scoped when passed to external downloaders\n - Add `cookies` field to info.json and deprecate `http_headers.Cookie`"
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "b03fa7834579a01cc5fba48c0e73488a16683d48",
|
||||
"short": "[ie/twitter] Revert 92315c03774cfabb3a921884326beb4b981f786b",
|
||||
"authors": ["pukkandan"]
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "fcd6a76adc49d5cd8783985c7ce35384b72e545f",
|
||||
"short": "[test] Add tests for socks proxies (#7908)",
|
||||
"authors": ["coletdjnz"]
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "4bf912282a34b58b6b35d8f7e6be535770c89c76",
|
||||
"short": "[rh:urllib] Remove dot segments during URL normalization (#7662)",
|
||||
"authors": ["coletdjnz"]
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "59e92b1f1833440bb2190f847eb735cf0f90bc85",
|
||||
"short": "[rh:urllib] Simplify gzip decoding (#7611)",
|
||||
"authors": ["Grub4K"]
|
||||
},
|
||||
{
|
||||
"action": "add",
|
||||
"when": "c1d71d0d9f41db5e4306c86af232f5f6220a130b",
|
||||
"short": "[priority] **The minimum *recommended* Python version has been raised to 3.8**\nSince Python 3.7 has reached end-of-life, support for it will be dropped soon. [Read more](https://github.com/yt-dlp/yt-dlp/issues/7803)"
|
||||
},
|
||||
{
|
||||
"action": "add",
|
||||
"when": "61bdf15fc7400601c3da1aa7a43917310a5bf391",
|
||||
"short": "[priority] Security: [[CVE-2023-40581](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-40581)] [Prevent RCE when using `--exec` with `%q` on Windows](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-42h4-v29r-42qg)\n - The shell escape function is now using `\"\"` instead of `\\\"`.\n - `utils.Popen` has been patched to properly quote commands."
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "8a8b54523addf46dfd50ef599761a81bc22362e6",
|
||||
"short": "[rh:requests] Add handler for `requests` HTTP library (#3668)\n\n\tAdds support for HTTPS proxies and persistent connections (keep-alive)",
|
||||
"authors": ["bashonly", "coletdjnz", "Grub4K"]
|
||||
},
|
||||
{
|
||||
"action": "add",
|
||||
"when": "1d03633c5a1621b9f3a756f0a4f9dc61fab3aeaa",
|
||||
"short": "[priority] **The release channels have been adjusted!**\n\t* [`master`](https://github.com/yt-dlp/yt-dlp-master-builds) builds are made after each push, containing the latest fixes (but also possibly bugs). This was previously the `nightly` channel.\n\t* [`nightly`](https://github.com/yt-dlp/yt-dlp-nightly-builds) builds are now made once a day, if there were any changes."
|
||||
},
|
||||
{
|
||||
"action": "add",
|
||||
"when": "f04b5bedad7b281bee9814686bba1762bae092eb",
|
||||
"short": "[priority] Security: [[CVE-2023-46121](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-46121)] Patch [Generic Extractor MITM Vulnerability via Arbitrary Proxy Injection](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-3ch3-jhc6-5r8x)\n\t- Disallow smuggling of arbitrary `http_headers`; extractors now only use specific headers"
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "15f22b4880b6b3f71f350c64d70976ae65b9f1ca",
|
||||
"short": "[webvtt] Allow spaces before newlines for CueBlock (#7681)",
|
||||
"authors": ["TSRBerry"]
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "4ce57d3b873c2887814cbec03d029533e82f7db5",
|
||||
"short": "[ie] Support multi-period MPD streams (#6654)",
|
||||
"authors": ["alard", "pukkandan"]
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "aa7e9ae4f48276bd5d0173966c77db9484f65a0a",
|
||||
"short": "[ie/xvideos] Support new URL format (#9502)",
|
||||
"authors": ["sta1us"]
|
||||
},
|
||||
{
|
||||
"action": "remove",
|
||||
"when": "22e4dfacb61f62dfbb3eb41b31c7b69ba1059b80"
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "e3a3ed8a981d9395c4859b6ef56cd02bc3148db2",
|
||||
"short": "[cleanup:ie] No `from` stdlib imports in extractors",
|
||||
"authors": ["pukkandan"]
|
||||
},
|
||||
{
|
||||
"action": "add",
|
||||
"when": "9590cc6b4768e190183d7d071a6c78170889116a",
|
||||
"short": "[priority] Security: [[CVE-2024-22423](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2024-22423)] [Prevent RCE when using `--exec` with `%q` on Windows](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-hjq6-52gw-2g7p)\n - The shell escape function now properly escapes `%`, `\\` and `\\n`.\n - `utils.Popen` has been patched accordingly."
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "41ba4a808b597a3afed78c89675a30deb6844450",
|
||||
"short": "[ie/tiktok] Extract via mobile API only if extractor-arg is passed (#9938)",
|
||||
"authors": ["bashonly"]
|
||||
},
|
||||
{
|
||||
"action": "remove",
|
||||
"when": "6e36d17f404556f0e3a43f441c477a71a91877d9"
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "beaf832c7a9d57833f365ce18f6115b88071b296",
|
||||
"short": "[ie/soundcloud] Add `formats` extractor-arg (#10004)",
|
||||
"authors": ["bashonly", "Grub4K"]
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "5c019f6328ad40d66561eac3c4de0b3cd070d0f6",
|
||||
"short": "[cleanup] Misc (#9765)",
|
||||
"authors": ["bashonly", "Grub4K", "seproDev"]
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "e6a22834df1776ec4e486526f6df2bf53cb7e06f",
|
||||
"short": "[ie/orf:on] Add `prefer_segments_playlist` extractor-arg (#10314)",
|
||||
"authors": ["seproDev"]
|
||||
},
|
||||
{
|
||||
"action": "add",
|
||||
"when": "6aaf96a3d6e7d0d426e97e11a2fcf52fda00e733",
|
||||
"short": "[priority] Security: [[CVE-2024-38519](https://nvd.nist.gov/vuln/detail/CVE-2024-38519)] [Properly sanitize file-extension to prevent file system modification and RCE](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-79w7-vh3h-8g4j)\n - Unsafe extensions are now blocked from being downloaded"
|
||||
},
|
||||
{
|
||||
"action": "add",
|
||||
"when": "6075a029dba70a89675ae1250e7cdfd91f0eba41",
|
||||
"short": "[priority] Security: [[ie/douyutv] Do not use dangerous javascript source/URL](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-3v33-3wmw-3785)\n - A dependency on potentially malicious third-party JavaScript code has been removed from the Douyu extractors"
|
||||
},
|
||||
{
|
||||
"action": "add",
|
||||
"when": "fb8b7f226d251e521a89b23c415e249e5b788e5c",
|
||||
"short": "[priority] **The minimum *recommended* Python version has been raised to 3.9**\nSince Python 3.8 will reach end-of-life in October 2024, support for it will be dropped soon. [Read more](https://github.com/yt-dlp/yt-dlp/issues/10086)"
|
||||
},
|
||||
{
|
||||
"action": "change",
|
||||
"when": "b31b81d85f00601710d4fac590c3e4efb4133283",
|
||||
"short": "[ci] Rerun failed tests (#11143)",
|
||||
"authors": ["Grub4K"]
|
||||
},
|
||||
{
|
||||
"action": "add",
|
||||
"when": "a886cf3e900f4a2ec00af705f883539269545609",
|
||||
"short": "[priority] **py2exe is no longer supported**\nThis release's `yt-dlp_min.exe` will be the last, and it's actually a PyInstaller-bundled executable so that yt-dlp users updating their py2exe build with `-U` will be automatically migrated. [Read more](https://github.com/yt-dlp/yt-dlp/issues/10087)"
|
||||
},
|
||||
{
|
||||
"action": "add",
|
||||
"when": "a886cf3e900f4a2ec00af705f883539269545609",
|
||||
"short": "[priority] **Following this release, yt-dlp's Python dependencies *must* be installed using the `default` group**\nIf you're installing yt-dlp with pip/pipx or requiring yt-dlp in your own Python project, you'll need to specify `yt-dlp[default]` if you want to also install yt-dlp's optional dependencies (which were previously included by default). [Read more](https://github.com/yt-dlp/yt-dlp/pull/11255)"
|
||||
},
|
||||
{
|
||||
"action": "add",
|
||||
"when": "87884f15580910e4e0fe0e1db73508debc657471",
|
||||
"short": "[priority] **Beginning with this release, yt-dlp's Python dependencies *must* be installed using the `default` group**\nIf you're installing yt-dlp with pip/pipx or requiring yt-dlp in your own Python project, you'll need to specify `yt-dlp[default]` if you want to also install yt-dlp's optional dependencies (which were previously included by default). [Read more](https://github.com/yt-dlp/yt-dlp/pull/11255)"
|
||||
},
|
||||
{
|
||||
"action": "add",
|
||||
"when": "d784464399b600ba9516bbcec6286f11d68974dd",
|
||||
"short": "[priority] **The minimum *required* Python version has been raised to 3.9**\nPython 3.8 reached its end-of-life on 2024.10.07, and yt-dlp has now removed support for it. As an unfortunate side effect, the official `yt-dlp.exe` and `yt-dlp_x86.exe` binaries are no longer supported on Windows 7. [Read more](https://github.com/yt-dlp/yt-dlp/issues/10086)"
|
||||
}
|
||||
]
|
96
devscripts/changelog_override.schema.json
Normal file
|
@ -0,0 +1,96 @@
|
|||
{
|
||||
"$schema": "http://json-schema.org/draft/2020-12/schema",
|
||||
"type": "array",
|
||||
"uniqueItems": true,
|
||||
"items": {
|
||||
"type": "object",
|
||||
"oneOf": [
|
||||
{
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"action": {
|
||||
"enum": [
|
||||
"add"
|
||||
]
|
||||
},
|
||||
"when": {
|
||||
"type": "string",
|
||||
"pattern": "^([0-9a-f]{40}|\\d{4}\\.\\d{2}\\.\\d{2})$"
|
||||
},
|
||||
"hash": {
|
||||
"type": "string",
|
||||
"pattern": "^[0-9a-f]{40}$"
|
||||
},
|
||||
"short": {
|
||||
"type": "string"
|
||||
},
|
||||
"authors": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"action",
|
||||
"short"
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"action": {
|
||||
"enum": [
|
||||
"remove"
|
||||
]
|
||||
},
|
||||
"when": {
|
||||
"type": "string",
|
||||
"pattern": "^([0-9a-f]{40}|\\d{4}\\.\\d{2}\\.\\d{2})$"
|
||||
},
|
||||
"hash": {
|
||||
"type": "string",
|
||||
"pattern": "^[0-9a-f]{40}$"
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"action",
|
||||
"hash"
|
||||
]
|
||||
},
|
||||
{
|
||||
"type": "object",
|
||||
"properties": {
|
||||
"action": {
|
||||
"enum": [
|
||||
"change"
|
||||
]
|
||||
},
|
||||
"when": {
|
||||
"type": "string",
|
||||
"pattern": "^([0-9a-f]{40}|\\d{4}\\.\\d{2}\\.\\d{2})$"
|
||||
},
|
||||
"hash": {
|
||||
"type": "string",
|
||||
"pattern": "^[0-9a-f]{40}$"
|
||||
},
|
||||
"short": {
|
||||
"type": "string"
|
||||
},
|
||||
"authors": {
|
||||
"type": "array",
|
||||
"items": {
|
||||
"type": "string"
|
||||
}
|
||||
}
|
||||
},
|
||||
"required": [
|
||||
"action",
|
||||
"hash",
|
||||
"short",
|
||||
"authors"
|
||||
]
|
||||
}
|
||||
]
|
||||
}
|
||||
}
|
50
devscripts/cli_to_api.py
Executable file
|
@ -0,0 +1,50 @@
|
|||
#!/usr/bin/env python3
|
||||
|
||||
# Allow direct execution
|
||||
import os
|
||||
import sys
|
||||
|
||||
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
||||
|
||||
import yt_dlp
|
||||
import yt_dlp.options
|
||||
|
||||
create_parser = yt_dlp.options.create_parser
|
||||
|
||||
|
||||
def parse_patched_options(opts):
|
||||
patched_parser = create_parser()
|
||||
patched_parser.defaults.update({
|
||||
'ignoreerrors': False,
|
||||
'retries': 0,
|
||||
'fragment_retries': 0,
|
||||
'extract_flat': False,
|
||||
'concat_playlist': 'never',
|
||||
})
|
||||
yt_dlp.options.create_parser = lambda: patched_parser
|
||||
try:
|
||||
return yt_dlp.parse_options(opts)
|
||||
finally:
|
||||
yt_dlp.options.create_parser = create_parser
|
||||
|
||||
|
||||
default_opts = parse_patched_options([]).ydl_opts
|
||||
|
||||
|
||||
def cli_to_api(opts, cli_defaults=False):
|
||||
opts = (yt_dlp.parse_options if cli_defaults else parse_patched_options)(opts).ydl_opts
|
||||
|
||||
diff = {k: v for k, v in opts.items() if default_opts[k] != v}
|
||||
if 'postprocessors' in diff:
|
||||
diff['postprocessors'] = [pp for pp in diff['postprocessors']
|
||||
if pp not in default_opts['postprocessors']]
|
||||
return diff
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
from pprint import pprint
|
||||
|
||||
print('\nThe arguments passed translate to:\n')
|
||||
pprint(cli_to_api(sys.argv[1:]))
|
||||
print('\nCombining these with the CLI defaults gives:\n')
|
||||
pprint(cli_to_api(sys.argv[1:], True))
|
81
devscripts/install_deps.py
Executable file
|
@ -0,0 +1,81 @@
|
|||
#!/usr/bin/env python3
|
||||
|
||||
# Allow execution from anywhere
|
||||
import os
|
||||
import sys
|
||||
|
||||
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
||||
|
||||
import argparse
|
||||
import re
|
||||
import subprocess
|
||||
|
||||
from pathlib import Path
|
||||
|
||||
from devscripts.tomlparse import parse_toml
|
||||
from devscripts.utils import read_file
|
||||
|
||||
|
||||
def parse_args():
|
||||
parser = argparse.ArgumentParser(description='Install dependencies for yt-dlp')
|
||||
parser.add_argument(
|
||||
'input', nargs='?', metavar='TOMLFILE', default=Path(__file__).parent.parent / 'pyproject.toml',
|
||||
help='input file (default: %(default)s)')
|
||||
parser.add_argument(
|
||||
'-e', '--exclude', metavar='DEPENDENCY', action='append',
|
||||
help='exclude a dependency')
|
||||
parser.add_argument(
|
||||
'-i', '--include', metavar='GROUP', action='append',
|
||||
help='include an optional dependency group')
|
||||
parser.add_argument(
|
||||
'-o', '--only-optional', action='store_true',
|
||||
help='only install optional dependencies')
|
||||
parser.add_argument(
|
||||
'-p', '--print', action='store_true',
|
||||
help='only print requirements to stdout')
|
||||
parser.add_argument(
|
||||
'-u', '--user', action='store_true',
|
||||
help='install with pip as --user')
|
||||
return parser.parse_args()
|
||||
|
||||
|
||||
def main():
|
||||
args = parse_args()
|
||||
project_table = parse_toml(read_file(args.input))['project']
|
||||
recursive_pattern = re.compile(rf'{project_table["name"]}\[(?P<group_name>[\w-]+)\]')
|
||||
optional_groups = project_table['optional-dependencies']
|
||||
excludes = args.exclude or []
|
||||
|
||||
def yield_deps(group):
|
||||
for dep in group:
|
||||
if mobj := recursive_pattern.fullmatch(dep):
|
||||
yield from optional_groups.get(mobj.group('group_name'), [])
|
||||
else:
|
||||
yield dep
|
||||
|
||||
targets = []
|
||||
if not args.only_optional: # `-o` should exclude 'dependencies' and the 'default' group
|
||||
targets.extend(project_table['dependencies'])
|
||||
if 'default' not in excludes: # `--exclude default` should exclude entire 'default' group
|
||||
targets.extend(yield_deps(optional_groups['default']))
|
||||
|
||||
for include in filter(None, map(optional_groups.get, args.include or [])):
|
||||
targets.extend(yield_deps(include))
|
||||
|
||||
targets = [t for t in targets if re.match(r'[\w-]+', t).group(0).lower() not in excludes]
|
||||
|
||||
if args.print:
|
||||
for target in targets:
|
||||
print(target)
|
||||
return
|
||||
|
||||
pip_args = [sys.executable, '-m', 'pip', 'install', '-U']
|
||||
if args.user:
|
||||
pip_args.append('--user')
|
||||
pip_args.extend(targets)
|
||||
|
||||
return subprocess.call(pip_args)
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
sys.exit(main())
|
|
@ -6,6 +6,7 @@
|
|||
age_restricted,
|
||||
bug_reports_message,
|
||||
classproperty,
|
||||
variadic,
|
||||
write_string,
|
||||
)
|
||||
|
||||
|
|
510
devscripts/make_changelog.py
Normal file
|
@ -0,0 +1,510 @@
from __future__ import annotations

# Allow direct execution
import os
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

import enum
import itertools
import json
import logging
import re
from collections import defaultdict
from dataclasses import dataclass
from functools import lru_cache
from pathlib import Path

from devscripts.utils import read_file, run_process, write_file

BASE_URL = 'https://github.com'
LOCATION_PATH = Path(__file__).parent
HASH_LENGTH = 7

logger = logging.getLogger(__name__)


class CommitGroup(enum.Enum):
    PRIORITY = 'Important'
    CORE = 'Core'
    EXTRACTOR = 'Extractor'
    DOWNLOADER = 'Downloader'
    POSTPROCESSOR = 'Postprocessor'
    NETWORKING = 'Networking'
    MISC = 'Misc.'

    @classmethod
    @lru_cache
    def subgroup_lookup(cls):
        return {
            name: group
            for group, names in {
                cls.MISC: {
                    'build',
                    'ci',
                    'cleanup',
                    'devscripts',
                    'docs',
                    'test',
                },
                cls.NETWORKING: {
                    'rh',
                },
            }.items()
            for name in names
        }

    @classmethod
    @lru_cache
    def group_lookup(cls):
        result = {
            'fd': cls.DOWNLOADER,
            'ie': cls.EXTRACTOR,
            'pp': cls.POSTPROCESSOR,
            'upstream': cls.CORE,
        }
        result.update({item.name.lower(): item for item in iter(cls)})
        return result

    @classmethod
    def get(cls, value: str) -> tuple[CommitGroup | None, str | None]:
        group, _, subgroup = (group.strip().lower() for group in value.partition('/'))

        result = cls.group_lookup().get(group)
        if not result:
            if subgroup:
                return None, value
            subgroup = group
            result = cls.subgroup_lookup().get(subgroup)

        return result, subgroup or None


@dataclass
class Commit:
    hash: str | None
    short: str
    authors: list[str]

    def __str__(self):
        result = f'{self.short!r}'

        if self.hash:
            result += f' ({self.hash[:HASH_LENGTH]})'

        if self.authors:
            authors = ', '.join(self.authors)
            result += f' by {authors}'

        return result


@dataclass
class CommitInfo:
    details: str | None
    sub_details: tuple[str, ...]
    message: str
    issues: list[str]
    commit: Commit
    fixes: list[Commit]

    def key(self):
        return ((self.details or '').lower(), self.sub_details, self.message)


def unique(items):
    return sorted({item.strip().lower(): item for item in items if item}.values())


class Changelog:
    MISC_RE = re.compile(r'(?:^|\b)(?:lint(?:ing)?|misc|format(?:ting)?|fixes)(?:\b|$)', re.IGNORECASE)
    ALWAYS_SHOWN = (CommitGroup.PRIORITY,)

    def __init__(self, groups, repo, collapsible=False):
        self._groups = groups
        self._repo = repo
        self._collapsible = collapsible

    def __str__(self):
        return '\n'.join(self._format_groups(self._groups)).replace('\t', '    ')

    def _format_groups(self, groups):
        first = True
        for item in CommitGroup:
            if self._collapsible and item not in self.ALWAYS_SHOWN and first:
                first = False
                yield '\n<details><summary><h3>Changelog</h3></summary>\n'

            group = groups[item]
            if group:
                yield self.format_module(item.value, group)

        if self._collapsible:
            yield '\n</details>'

    def format_module(self, name, group):
        result = f'\n#### {name} changes\n' if name else '\n'
        return result + '\n'.join(self._format_group(group))

    def _format_group(self, group):
        sorted_group = sorted(group, key=CommitInfo.key)
        detail_groups = itertools.groupby(sorted_group, lambda item: (item.details or '').lower())
        for _, items in detail_groups:
            items = list(items)
            details = items[0].details

            if details == 'cleanup':
                items = self._prepare_cleanup_misc_items(items)

            prefix = '-'
            if details:
                if len(items) == 1:
                    prefix = f'- **{details}**:'
                else:
                    yield f'- **{details}**'
                    prefix = '\t-'

            sub_detail_groups = itertools.groupby(items, lambda item: tuple(map(str.lower, item.sub_details)))
            for sub_details, entries in sub_detail_groups:
                if not sub_details:
                    for entry in entries:
                        yield f'{prefix} {self.format_single_change(entry)}'
                    continue

                entries = list(entries)
                sub_prefix = f'{prefix} {", ".join(entries[0].sub_details)}'
                if len(entries) == 1:
                    yield f'{sub_prefix}: {self.format_single_change(entries[0])}'
                    continue

                yield sub_prefix
                for entry in entries:
                    yield f'\t{prefix} {self.format_single_change(entry)}'

    def _prepare_cleanup_misc_items(self, items):
        cleanup_misc_items = defaultdict(list)
        sorted_items = []
        for item in items:
            if self.MISC_RE.search(item.message):
                cleanup_misc_items[tuple(item.commit.authors)].append(item)
            else:
                sorted_items.append(item)

        for commit_infos in cleanup_misc_items.values():
            sorted_items.append(CommitInfo(
                'cleanup', ('Miscellaneous',), ', '.join(
                    self._format_message_link(None, info.commit.hash)
                    for info in sorted(commit_infos, key=lambda item: item.commit.hash or '')),
                [], Commit(None, '', commit_infos[0].commit.authors), []))

        return sorted_items

    def format_single_change(self, info: CommitInfo):
        message, sep, rest = info.message.partition('\n')
        if '[' not in message:
            # If the message doesn't already contain markdown links, try to add a link to the commit
            message = self._format_message_link(message, info.commit.hash)

        if info.issues:
            message = f'{message} ({self._format_issues(info.issues)})'

        if info.commit.authors:
            message = f'{message} by {self._format_authors(info.commit.authors)}'

        if info.fixes:
            fix_message = ', '.join(f'{self._format_message_link(None, fix.hash)}' for fix in info.fixes)

            authors = sorted({author for fix in info.fixes for author in fix.authors}, key=str.casefold)
            if authors != info.commit.authors:
                fix_message = f'{fix_message} by {self._format_authors(authors)}'

            message = f'{message} (With fixes in {fix_message})'

        return message if not sep else f'{message}{sep}{rest}'

    def _format_message_link(self, message, commit_hash):
        assert message or commit_hash, 'Improperly defined commit message or override'
        message = message if message else commit_hash[:HASH_LENGTH]
        return f'[{message}]({self.repo_url}/commit/{commit_hash})' if commit_hash else message

    def _format_issues(self, issues):
        return ', '.join(f'[#{issue}]({self.repo_url}/issues/{issue})' for issue in issues)

    @staticmethod
    def _format_authors(authors):
        return ', '.join(f'[{author}]({BASE_URL}/{author})' for author in authors)

    @property
    def repo_url(self):
        return f'{BASE_URL}/{self._repo}'


class CommitRange:
    COMMAND = 'git'
    COMMIT_SEPARATOR = '-----'

    AUTHOR_INDICATOR_RE = re.compile(r'Authored by:? ', re.IGNORECASE)
    MESSAGE_RE = re.compile(r'''
        (?:\[(?P<prefix>[^\]]+)\]\ )?
        (?:(?P<sub_details>`?[\w.-]+`?): )?
        (?P<message>.+?)
        (?:\ \((?P<issues>\#\d+(?:,\ \#\d+)*)\))?
        ''', re.VERBOSE | re.DOTALL)
    EXTRACTOR_INDICATOR_RE = re.compile(r'(?:Fix|Add)\s+Extractors?', re.IGNORECASE)
    REVERT_RE = re.compile(r'(?:\[[^\]]+\]\s+)?(?i:Revert)\s+([\da-f]{40})')
    FIXES_RE = re.compile(r'(?i:Fix(?:es)?(?:\s+bugs?)?(?:\s+in|\s+for)?|Revert|Improve)\s+([\da-f]{40})')
    UPSTREAM_MERGE_RE = re.compile(r'Update to ytdl-commit-([\da-f]+)')

    def __init__(self, start, end, default_author=None):
        self._start, self._end = start, end
        self._commits, self._fixes = self._get_commits_and_fixes(default_author)
        self._commits_added = []

    def __iter__(self):
        return iter(itertools.chain(self._commits.values(), self._commits_added))

    def __len__(self):
        return len(self._commits) + len(self._commits_added)

    def __contains__(self, commit):
        if isinstance(commit, Commit):
            if not commit.hash:
                return False
            commit = commit.hash

        return commit in self._commits

    def _get_commits_and_fixes(self, default_author):
        result = run_process(
            self.COMMAND, 'log', f'--format=%H%n%s%n%b%n{self.COMMIT_SEPARATOR}',
            f'{self._start}..{self._end}' if self._start else self._end).stdout

        commits, reverts = {}, {}
        fixes = defaultdict(list)
        lines = iter(result.splitlines(False))
        for i, commit_hash in enumerate(lines):
            short = next(lines)
            skip = short.startswith('Release ') or short == '[version] update'

            authors = [default_author] if default_author else []
            for line in iter(lambda: next(lines), self.COMMIT_SEPARATOR):
                match = self.AUTHOR_INDICATOR_RE.match(line)
                if match:
                    authors = sorted(map(str.strip, line[match.end():].split(',')), key=str.casefold)

            commit = Commit(commit_hash, short, authors)
            if skip and (self._start or not i):
                logger.debug(f'Skipped commit: {commit}')
                continue
            elif skip:
                logger.debug(f'Reached Release commit, breaking: {commit}')
                break

            revert_match = self.REVERT_RE.fullmatch(commit.short)
            if revert_match:
                reverts[revert_match.group(1)] = commit
                continue

            fix_match = self.FIXES_RE.search(commit.short)
            if fix_match:
                commitish = fix_match.group(1)
                fixes[commitish].append(commit)

            commits[commit.hash] = commit

        for commitish, revert_commit in reverts.items():
            reverted = commits.pop(commitish, None)
            if reverted:
                logger.debug(f'{commitish} fully reverted {reverted}')
            else:
                commits[revert_commit.hash] = revert_commit

        for commitish, fix_commits in fixes.items():
            if commitish in commits:
                hashes = ', '.join(commit.hash[:HASH_LENGTH] for commit in fix_commits)
                logger.info(f'Found fix(es) for {commitish[:HASH_LENGTH]}: {hashes}')
                for fix_commit in fix_commits:
                    del commits[fix_commit.hash]
            else:
                logger.debug(f'Commit with fixes not in changes: {commitish[:HASH_LENGTH]}')

        return commits, fixes

    def apply_overrides(self, overrides):
        for override in overrides:
            when = override.get('when')
            if when and when not in self and when != self._start:
                logger.debug(f'Ignored {when!r} override')
                continue

            override_hash = override.get('hash') or when
            if override['action'] == 'add':
                commit = Commit(override.get('hash'), override['short'], override.get('authors') or [])
                logger.info(f'ADD {commit}')
                self._commits_added.append(commit)

            elif override['action'] == 'remove':
                if override_hash in self._commits:
                    logger.info(f'REMOVE {self._commits[override_hash]}')
                    del self._commits[override_hash]

            elif override['action'] == 'change':
                if override_hash not in self._commits:
                    continue
                commit = Commit(override_hash, override['short'], override.get('authors') or [])
                logger.info(f'CHANGE {self._commits[commit.hash]} -> {commit}')
                self._commits[commit.hash] = commit

        self._commits = dict(reversed(self._commits.items()))

    def groups(self):
        group_dict = defaultdict(list)
        for commit in self:
            upstream_re = self.UPSTREAM_MERGE_RE.search(commit.short)
            if upstream_re:
                commit.short = f'[upstream] Merged with youtube-dl {upstream_re.group(1)}'

            match = self.MESSAGE_RE.fullmatch(commit.short)
            if not match:
                logger.error(f'Error parsing short commit message: {commit.short!r}')
                continue

            prefix, sub_details_alt, message, issues = match.groups()
            issues = [issue.strip()[1:] for issue in issues.split(',')] if issues else []

            if prefix:
                groups, details, sub_details = zip(*map(self.details_from_prefix, prefix.split(',')))
                group = next(iter(filter(None, groups)), None)
                details = ', '.join(unique(details))
                sub_details = list(itertools.chain.from_iterable(sub_details))
            else:
                group = CommitGroup.CORE
                details = None
                sub_details = []

            if sub_details_alt:
                sub_details.append(sub_details_alt)
            sub_details = tuple(unique(sub_details))

            if not group:
                if self.EXTRACTOR_INDICATOR_RE.search(commit.short):
                    group = CommitGroup.EXTRACTOR
                    logger.error(f'Assuming [ie] group for {commit.short!r}')
                else:
                    group = CommitGroup.CORE

            commit_info = CommitInfo(
                details, sub_details, message.strip(),
                issues, commit, self._fixes[commit.hash])

            logger.debug(f'Resolved {commit.short!r} to {commit_info!r}')
            group_dict[group].append(commit_info)

        return group_dict

    @staticmethod
    def details_from_prefix(prefix):
        if not prefix:
            return CommitGroup.CORE, None, ()

        prefix, *sub_details = prefix.split(':')

        group, details = CommitGroup.get(prefix)
        if group is CommitGroup.PRIORITY and details:
            details = details.partition('/')[2].strip()

        if details and '/' in details:
            logger.error(f'Prefix is overnested, using first part: {prefix}')
            details = details.partition('/')[0].strip()

        if details == 'common':
            details = None
        elif group is CommitGroup.NETWORKING and details == 'rh':
            details = 'Request Handler'

        return group, details, sub_details


def get_new_contributors(contributors_path, commits):
    contributors = set()
    if contributors_path.exists():
        for line in read_file(contributors_path).splitlines():
            author, _, _ = line.strip().partition(' (')
            authors = author.split('/')
            contributors.update(map(str.casefold, authors))

    new_contributors = set()
    for commit in commits:
        for author in commit.authors:
            author_folded = author.casefold()
            if author_folded not in contributors:
                contributors.add(author_folded)
                new_contributors.add(author)

    return sorted(new_contributors, key=str.casefold)


def create_changelog(args):
    logging.basicConfig(
        datefmt='%Y-%m-%d %H-%M-%S', format='{asctime} | {levelname:<8} | {message}',
        level=logging.WARNING - 10 * args.verbosity, style='{', stream=sys.stderr)

    commits = CommitRange(None, args.commitish, args.default_author)

    if not args.no_override:
        if args.override_path.exists():
            overrides = json.loads(read_file(args.override_path))
            commits.apply_overrides(overrides)
        else:
            logger.warning(f'File {args.override_path.as_posix()} does not exist')

    logger.info(f'Loaded {len(commits)} commits')

    new_contributors = get_new_contributors(args.contributors_path, commits)
    if new_contributors:
        if args.contributors:
            write_file(args.contributors_path, '\n'.join(new_contributors) + '\n', mode='a')
        logger.info(f'New contributors: {", ".join(new_contributors)}')

    return Changelog(commits.groups(), args.repo, args.collapsible)


def create_parser():
    import argparse

    parser = argparse.ArgumentParser(
        description='Create a changelog markdown from a git commit range')
    parser.add_argument(
        'commitish', default='HEAD', nargs='?',
        help='The commitish to create the range from (default: %(default)s)')
    parser.add_argument(
        '-v', '--verbosity', action='count', default=0,
        help='increase verbosity (can be used twice)')
    parser.add_argument(
        '-c', '--contributors', action='store_true',
        help='update CONTRIBUTORS file (default: %(default)s)')
    parser.add_argument(
        '--contributors-path', type=Path, default=LOCATION_PATH.parent / 'CONTRIBUTORS',
        help='path to the CONTRIBUTORS file')
    parser.add_argument(
        '--no-override', action='store_true',
        help='skip override json in commit generation (default: %(default)s)')
    parser.add_argument(
        '--override-path', type=Path, default=LOCATION_PATH / 'changelog_override.json',
        help='path to the changelog_override.json file')
    parser.add_argument(
        '--default-author', default='pukkandan',
        help='the author to use without an author indicator (default: %(default)s)')
    parser.add_argument(
        '--repo', default='yt-dlp/yt-dlp',
        help='the github repository to use for the operations (default: %(default)s)')
    parser.add_argument(
        '--collapsible', action='store_true',
        help='make changelog collapsible (default: %(default)s)')

    return parser


if __name__ == '__main__':
    print(create_changelog(create_parser().parse_args()))
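To make the commit-subject grammar above concrete, here is a small sketch of what `MESSAGE_RE` extracts; the subject line is made up:

# Illustrative sketch (made-up subject): the four pieces MESSAGE_RE captures
# from a '[prefix] sub_detail: message (#issues)' commit subject.
import re

MESSAGE_RE = re.compile(r'''
    (?:\[(?P<prefix>[^\]]+)\]\ )?
    (?:(?P<sub_details>`?[\w.-]+`?): )?
    (?P<message>.+?)
    (?:\ \((?P<issues>\#\d+(?:,\ \#\d+)*)\))?
    ''', re.VERBOSE | re.DOTALL)

mobj = MESSAGE_RE.fullmatch('[ie/youtube] formats: Fix extraction (#1234)')
print(mobj.group('prefix', 'sub_details', 'message', 'issues'))
# ('ie/youtube', 'formats', 'Fix extraction', '#1234')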
@ -9,12 +9,7 @@

import re

from devscripts.utils import (
    get_filename_args,
    read_file,
    read_version,
    write_file,
)
from devscripts.utils import get_filename_args, read_file, write_file

VERBOSE_TMPL = '''
  - type: checkboxes

@ -24,6 +19,8 @@
    options:
      - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
        required: true
      - label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
        required: false
      - label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
        required: true
  - type: textarea

@ -33,23 +30,31 @@
      description: |
        It should start like this:
      placeholder: |
        [debug] Command-line config: ['-vU', 'test:youtube']
        [debug] Portable config "yt-dlp.conf": ['-i']
        [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
        [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
        [debug] yt-dlp version %(version)s [9d339c4] (win32_exe)
        [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
        [debug] Checking exe version: ffmpeg -bsfs
        [debug] Checking exe version: ffprobe -bsfs
        [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
        [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp-nightly-builds [1a176d874] (win_exe)
        [debug] Python 3.10.11 (CPython AMD64 64bit) - Windows-10-10.0.20348-SP0 (OpenSSL 1.1.1t 7 Feb 2023)
        [debug] exe versions: ffmpeg 7.0.2 (setts), ffprobe 7.0.2
        [debug] Optional libraries: Cryptodome-3.21.0, brotli-1.1.0, certifi-2024.08.30, curl_cffi-0.5.10, mutagen-1.47.0, requests-2.32.3, sqlite3-3.40.1, urllib3-2.2.3, websockets-13.1
        [debug] Proxy map: {}
        [debug] Request Handlers: urllib, requests, websockets, curl_cffi
        [debug] Loaded 1838 extractors
        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
        Latest version: %(version)s, Current version: %(version)s
        yt-dlp is up to date (%(version)s)
        Latest version: nightly@... from yt-dlp/yt-dlp-nightly-builds
        yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
        [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
        <more lines>
      render: shell
    validations:
      required: true
  - type: markdown
    attributes:
      value: |
        > [!CAUTION]
        > ### GitHub is experiencing a high volume of malicious spam comments.
        > ### If you receive any replies asking you download a file, do NOT follow the download links!
        >
        > Note that this issue may be temporarily locked as an anti-spam measure after it is opened.
'''.strip()

NO_SKIP = '''

@ -58,13 +63,13 @@
    label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
    description: Fill all fields even if you think it is irrelevant for the issue
    options:
      - label: I understand that I will be **blocked** if I remove or skip any mandatory\\* field
      - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\\* field
        required: true
'''.strip()


def main():
    fields = {'version': read_version(), 'no_skip': NO_SKIP}
    fields = {'no_skip': NO_SKIP}
    fields['verbose'] = VERBOSE_TMPL % fields
    fields['verbose_optional'] = re.sub(r'(\n\s+validations:)?\n\s+required: true', '', fields['verbose'])
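The `verbose_optional` substitution at the end of the hunk is easy to miss: it turns the required verbose-log checkboxes into optional ones by deleting their `required: true` lines (and any preceding `validations:` block). A sketch of the same idea on a toy snippet:

# Illustrative sketch (toy YAML): the re.sub used for 'verbose_optional' above.
import re

yaml = '''  - type: textarea
    validations:
      required: true
'''
print(re.sub(r'(\n\s+validations:)?\n\s+required: true', '', yaml))
# prints '  - type: textarea' (the trailing newline survives)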
@ -2,7 +2,6 @@

# Allow direct execution
import os
import shutil
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

@ -34,12 +33,12 @@ class {name}({bases}):


def main():
    os.environ['YTDLP_NO_PLUGINS'] = 'true'
    os.environ['YTDLP_NO_LAZY_EXTRACTORS'] = 'true'

    lazy_extractors_filename = get_filename_args(default_outfile='yt_dlp/extractor/lazy_extractors.py')
    if os.path.exists(lazy_extractors_filename):
        os.remove(lazy_extractors_filename)

    _ALL_CLASSES = get_all_ies()  # Must be before import

    from yt_dlp.extractor.extractors import _ALL_CLASSES
    from yt_dlp.extractor.common import InfoExtractor, SearchInfoExtractor

    DummyInfoExtractor = type('InfoExtractor', (InfoExtractor,), {'IE_NAME': NO_ATTR})

@ -54,20 +53,6 @@ def main():
    write_file(lazy_extractors_filename, f'{module_src}\n')


def get_all_ies():
    PLUGINS_DIRNAME = 'ytdlp_plugins'
    BLOCKED_DIRNAME = f'{PLUGINS_DIRNAME}_blocked'
    if os.path.exists(PLUGINS_DIRNAME):
        # os.rename cannot be used, e.g. in Docker. See https://github.com/yt-dlp/yt-dlp/pull/4958
        shutil.move(PLUGINS_DIRNAME, BLOCKED_DIRNAME)
    try:
        from yt_dlp.extractor.extractors import _ALL_CLASSES
    finally:
        if os.path.exists(BLOCKED_DIRNAME):
            shutil.move(BLOCKED_DIRNAME, PLUGINS_DIRNAME)
    return _ALL_CLASSES


def extra_ie_code(ie, base=None):
    for var in STATIC_CLASS_PROPERTIES:
        val = getattr(ie, var)
@ -45,32 +45,42 @@ def apply_patch(text, patch):
delim = f'\n{" " * switch_col_width}'

PATCHES = (
    (  # Standardize update message
    (  # Standardize `--update` message
        r'(?m)^(  -U, --update\s+).+(\n \s.+)*$',
        r'\1Update this program to the latest version',
    ),
    (  # Headings
    (  # Headings
        r'(?m)^ (\w.+\n)(    (?=\w))?',
        r'## \1'
        r'## \1',
    ),
    (  # Do not split URLs
    (  # Fixup `--date` formatting
        rf'(?m)( --date DATE.+({delim}[^\[]+)*)\[.+({delim}.+)*$',
        (rf'\1[now|today|yesterday][-N[day|week|month|year]].{delim}'
         f'E.g. "--date today-2weeks" downloads only{delim}'
         'videos uploaded on the same day two weeks ago'),
    ),
    (  # Do not split URLs
        rf'({delim[:-1]})? (?P<label>\[\S+\] )?(?P<url>https?({delim})?:({delim})?/({delim})?/(({delim})?\S+)+)\s',
        lambda mobj: ''.join((delim, mobj.group('label') or '', re.sub(r'\s+', '', mobj.group('url')), '\n'))
        lambda mobj: ''.join((delim, mobj.group('label') or '', re.sub(r'\s+', '', mobj.group('url')), '\n')),
    ),
    (  # Do not split "words"
    (  # Do not split "words"
        rf'(?m)({delim}\S+)+$',
        lambda mobj: ''.join((delim, mobj.group(0).replace(delim, '')))
        lambda mobj: ''.join((delim, mobj.group(0).replace(delim, ''))),
    ),
    (  # Allow overshooting last line
    (  # Allow overshooting last line
        rf'(?m)^(?P<prev>.+)${delim}(?P<current>.+)$(?!{delim})',
        lambda mobj: (mobj.group().replace(delim, ' ')
                      if len(mobj.group()) - len(delim) + 1 <= max_width + ALLOWED_OVERSHOOT
                      else mobj.group())
                      else mobj.group()),
    ),
    (  # Avoid newline when a space is available b/w switch and description
    (  # Avoid newline when a space is available b/w switch and description
        DISABLE_PATCH,  # This creates issues with prepare_manpage
        r'(?m)^(\s{4}-.{%d})(%s)' % (switch_col_width - 6, delim),
        r'\1 '
        r'\1 ',
    ),
    (  # Replace brackets with a Markdown link
        r'SponsorBlock API \((http.+)\)',
        r'[SponsorBlock API](\1)',
    ),
)
@ -24,7 +24,7 @@

# NAME

yt\-dlp \- A youtube-dl fork with additional features and patches
yt\-dlp \- A feature\-rich command\-line audio/video downloader

# SYNOPSIS
@ -43,6 +43,27 @@ def filter_excluded_sections(readme):
    '', readme)


def _convert_code_blocks(readme):
    current_code_block = None

    for line in readme.splitlines(True):
        if current_code_block:
            if line == current_code_block:
                current_code_block = None
                yield '\n'
            else:
                yield f'    {line}'
        elif line.startswith('```'):
            current_code_block = line.count('`') * '`' + '\n'
            yield '\n'
        else:
            yield line


def convert_code_blocks(readme):
    return ''.join(_convert_code_blocks(readme))


def move_sections(readme):
    MOVE_TAG_TEMPLATE = '<!-- MANPAGE: MOVE "%s" SECTION HERE -->'
    sections = re.findall(r'(?m)^%s$' % (

@ -65,8 +86,10 @@ def move_sections(readme):

def filter_options(readme):
    section = re.search(r'(?sm)^# USAGE AND OPTIONS\n.+?(?=^# )', readme).group(0)
    section_new = section.replace('*', R'\*')

    options = '# OPTIONS\n'
    for line in section.split('\n')[1:]:
    for line in section_new.split('\n')[1:]:
        mobj = re.fullmatch(r'''(?x)
            \s{4}(?P<opt>-(?:,\s|[^\s])+)
            (?:\s(?P<meta>(?:[^\s]|\s(?!\s))+))?

@ -86,7 +109,7 @@ def filter_options(readme):
    return readme.replace(section, options, 1)


TRANSFORM = compose_functions(filter_excluded_sections, move_sections, filter_options)
TRANSFORM = compose_functions(filter_excluded_sections, convert_code_blocks, move_sections, filter_options)


def main():
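Since `_convert_code_blocks` streams raw README lines, a small round trip shows its effect: fenced blocks become indented blocks, which is the form the man-page conversion expects. The input markdown here is invented:

# Illustrative sketch (invented input): fenced code blocks are re-emitted
# as 4-space-indented blocks, duplicating the generator from the hunk above.
def _convert_code_blocks(readme):
    current_code_block = None
    for line in readme.splitlines(True):
        if current_code_block:
            if line == current_code_block:
                current_code_block = None
                yield '\n'
            else:
                yield f'    {line}'
        elif line.startswith('```'):
            current_code_block = line.count('`') * '`' + '\n'
            yield '\n'
        else:
            yield line

print(''.join(_convert_code_blocks('text\n```\ncode\n```\n')))
# 'text\n\n    code\n\n'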
@ -1,17 +0,0 @@
@setlocal
@echo off
cd /d %~dp0..

if ["%~1"]==[""] (
    set "test_set="test""
) else if ["%~1"]==["core"] (
    set "test_set="-m not download""
) else if ["%~1"]==["download"] (
    set "test_set="-m "download""
) else (
    echo.Invalid test type "%~1". Use "core" ^| "download"
    exit /b 1
)

set PYTHONWARNINGS=error
pytest %test_set%
76 devscripts/run_tests.py Executable file
@ -0,0 +1,76 @@
#!/usr/bin/env python3

import argparse
import functools
import os
import re
import shlex
import subprocess
import sys
from pathlib import Path


fix_test_name = functools.partial(re.compile(r'IE(_all|_\d+)?$').sub, r'\1')


def parse_args():
    parser = argparse.ArgumentParser(description='Run selected yt-dlp tests')
    parser.add_argument(
        'test', help='an extractor test, test path, or one of "core" or "download"', nargs='*')
    parser.add_argument(
        '-k', help='run a test matching EXPRESSION. Same as "pytest -k"', metavar='EXPRESSION')
    parser.add_argument(
        '--pytest-args', help='arguments to passthrough to pytest')
    return parser.parse_args()


def run_tests(*tests, pattern=None, ci=False):
    run_core = 'core' in tests or (not pattern and not tests)
    run_download = 'download' in tests

    pytest_args = args.pytest_args or os.getenv('HATCH_TEST_ARGS', '')
    arguments = ['pytest', '-Werror', '--tb=short', *shlex.split(pytest_args)]
    if ci:
        arguments.append('--color=yes')
    if pattern:
        arguments.extend(['-k', pattern])
    if run_core:
        arguments.extend(['-m', 'not download'])
    elif run_download:
        arguments.extend(['-m', 'download'])
    else:
        arguments.extend(
            test if '/' in test
            else f'test/test_download.py::TestDownload::test_{fix_test_name(test)}'
            for test in tests)

    print(f'Running {arguments}', flush=True)
    try:
        return subprocess.call(arguments)
    except FileNotFoundError:
        pass

    arguments = [sys.executable, '-Werror', '-m', 'unittest']
    if pattern:
        arguments.extend(['-k', pattern])
    if run_core:
        print('"pytest" needs to be installed to run core tests', file=sys.stderr, flush=True)
        return 1
    elif run_download:
        arguments.append('test.test_download')
    else:
        arguments.extend(
            f'test.test_download.TestDownload.test_{test}' for test in tests)

    print(f'Running {arguments}', flush=True)
    return subprocess.call(arguments)


if __name__ == '__main__':
    try:
        args = parse_args()

        os.chdir(Path(__file__).parent.parent)
        sys.exit(run_tests(*args.test, pattern=args.k, ci=bool(os.getenv('CI'))))
    except KeyboardInterrupt:
        pass
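Note that `run_tests` reads the module-level `args` set in the `__main__` block below it. For illustration, a sketch of how bare extractor test names are mapped onto `test_download` node ids (the names are examples):

# Illustrative sketch (example names): the node-id mapping used by run_tests().
import functools
import re

fix_test_name = functools.partial(re.compile(r'IE(_all|_\d+)?$').sub, r'\1')

for name in ('YoutubeIE', 'YoutubeIE_1', 'test/test_utils.py'):
    print(name if '/' in name
          else f'test/test_download.py::TestDownload::test_{fix_test_name(name)}')
# test/test_download.py::TestDownload::test_Youtube
# test/test_download.py::TestDownload::test_Youtube_1
# test/test_utils.py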
@ -1,14 +0,0 @@
#!/usr/bin/env sh

if [ -z "$1" ]; then
    test_set='test'
elif [ "$1" = 'core' ]; then
    test_set="-m not download"
elif [ "$1" = 'download' ]; then
    test_set="-m download"
else
    echo 'Invalid test type "'"$1"'". Use "core" | "download"'
    exit 1
fi

python3 -bb -Werror -m pytest "$test_set"
@ -30,7 +30,7 @@ def property_setter(name, value):
opts = parse_options()
transform = compose_functions(
    property_setter('VARIANT', opts.variant),
    property_setter('UPDATE_HINT', opts.update_message)
    property_setter('UPDATE_HINT', opts.update_message),
)

write_file(VERSION_FILE, transform(read_file(VERSION_FILE)))
189 devscripts/tomlparse.py Executable file
@ -0,0 +1,189 @@
#!/usr/bin/env python3

"""
Simple parser for spec compliant toml files

A simple toml parser for files that comply with the spec.
Should only be used to parse `pyproject.toml` for `install_deps.py`.

IMPORTANT: INVALID FILES OR MULTILINE STRINGS ARE NOT SUPPORTED!
"""

from __future__ import annotations

import datetime as dt
import json
import re

WS = r'(?:[\ \t]*)'
STRING_RE = re.compile(r'"(?:\\.|[^\\"\n])*"|\'[^\'\n]*\'')
SINGLE_KEY_RE = re.compile(rf'{STRING_RE.pattern}|[A-Za-z0-9_-]+')
KEY_RE = re.compile(rf'{WS}(?:{SINGLE_KEY_RE.pattern}){WS}(?:\.{WS}(?:{SINGLE_KEY_RE.pattern}){WS})*')
EQUALS_RE = re.compile(rf'={WS}')
WS_RE = re.compile(WS)

_SUBTABLE = rf'(?P<subtable>^\[(?P<is_list>\[)?(?P<path>{KEY_RE.pattern})\]\]?)'
EXPRESSION_RE = re.compile(rf'^(?:{_SUBTABLE}|{KEY_RE.pattern}=)', re.MULTILINE)

LIST_WS_RE = re.compile(rf'{WS}((#[^\n]*)?\n{WS})*')
LEFTOVER_VALUE_RE = re.compile(r'[^,}\]\t\n#]+')


def parse_key(value: str):
    for match in SINGLE_KEY_RE.finditer(value):
        if match[0][0] == '"':
            yield json.loads(match[0])
        elif match[0][0] == '\'':
            yield match[0][1:-1]
        else:
            yield match[0]


def get_target(root: dict, paths: list[str], is_list=False):
    target = root

    for index, key in enumerate(paths, 1):
        use_list = is_list and index == len(paths)
        result = target.get(key)
        if result is None:
            result = [] if use_list else {}
            target[key] = result

        if isinstance(result, dict):
            target = result
        elif use_list:
            target = {}
            result.append(target)
        else:
            target = result[-1]

    assert isinstance(target, dict)
    return target


def parse_enclosed(data: str, index: int, end: str, ws_re: re.Pattern):
    index += 1

    if match := ws_re.match(data, index):
        index = match.end()

    while data[index] != end:
        index = yield True, index

        if match := ws_re.match(data, index):
            index = match.end()

        if data[index] == ',':
            index += 1

        if match := ws_re.match(data, index):
            index = match.end()

    assert data[index] == end
    yield False, index + 1


def parse_value(data: str, index: int):
    if data[index] == '[':
        result = []

        indices = parse_enclosed(data, index, ']', LIST_WS_RE)
        valid, index = next(indices)
        while valid:
            index, value = parse_value(data, index)
            result.append(value)
            valid, index = indices.send(index)

        return index, result

    if data[index] == '{':
        result = {}

        indices = parse_enclosed(data, index, '}', WS_RE)
        valid, index = next(indices)
        while valid:
            valid, index = indices.send(parse_kv_pair(data, index, result))

        return index, result

    if match := STRING_RE.match(data, index):
        return match.end(), json.loads(match[0]) if match[0][0] == '"' else match[0][1:-1]

    match = LEFTOVER_VALUE_RE.match(data, index)
    assert match
    value = match[0].strip()
    for func in [
        int,
        float,
        dt.time.fromisoformat,
        dt.date.fromisoformat,
        dt.datetime.fromisoformat,
        {'true': True, 'false': False}.get,
    ]:
        try:
            value = func(value)
            break
        except Exception:
            pass

    return match.end(), value


def parse_kv_pair(data: str, index: int, target: dict):
    match = KEY_RE.match(data, index)
    if not match:
        return None

    *keys, key = parse_key(match[0])

    match = EQUALS_RE.match(data, match.end())
    assert match
    index = match.end()

    index, value = parse_value(data, index)
    get_target(target, keys)[key] = value
    return index


def parse_toml(data: str):
    root = {}
    target = root

    index = 0
    while True:
        match = EXPRESSION_RE.search(data, index)
        if not match:
            break

        if match.group('subtable'):
            index = match.end()
            path, is_list = match.group('path', 'is_list')
            target = get_target(root, list(parse_key(path)), bool(is_list))
            continue

        index = parse_kv_pair(data, match.start(), target)
        assert index is not None

    return root


def main():
    import argparse
    from pathlib import Path

    parser = argparse.ArgumentParser()
    parser.add_argument('infile', type=Path, help='The TOML file to read as input')
    args = parser.parse_args()

    with args.infile.open('r', encoding='utf-8') as file:
        data = file.read()

    def default(obj):
        if isinstance(obj, (dt.date, dt.time, dt.datetime)):
            return obj.isoformat()

    print(json.dumps(parse_toml(data), default=default))


if __name__ == '__main__':
    main()
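A quick smoke test of the parser (the TOML snippet is arbitrary, but spec-compliant and free of multiline strings, per the caveat in the docstring):

# Illustrative sketch (arbitrary snippet): exercising parse_toml from above.
from devscripts.tomlparse import parse_toml

data = '''
[project]
name = "yt-dlp"
requires-python = ">=3.9"

[project.optional-dependencies]
test = ["pytest~=8.1"]
'''
print(parse_toml(data))
# {'project': {'name': 'yt-dlp', 'requires-python': '>=3.9',
#              'optional-dependencies': {'test': ['pytest~=8.1']}}}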
@ -1,39 +0,0 @@
#!/usr/bin/env python3

"""
Usage: python3 ./devscripts/update-formulae.py <path-to-formulae-rb> <version>
version can be either 0-aligned (yt-dlp version) or normalized (PyPi version)
"""

# Allow direct execution
import os
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))


import json
import re
import urllib.request

from devscripts.utils import read_file, write_file

filename, version = sys.argv[1:]

normalized_version = '.'.join(str(int(x)) for x in version.split('.'))

pypi_release = json.loads(urllib.request.urlopen(
    'https://pypi.org/pypi/yt-dlp/%s/json' % normalized_version
).read().decode())

tarball_file = next(x for x in pypi_release['urls'] if x['filename'].endswith('.tar.gz'))

sha256sum = tarball_file['digests']['sha256']
url = tarball_file['url']

formulae_text = read_file(filename)

formulae_text = re.sub(r'sha256 "[0-9a-f]*?"', 'sha256 "%s"' % sha256sum, formulae_text, count=1)
formulae_text = re.sub(r'url "[^"]*?"', 'url "%s"' % url, formulae_text, count=1)

write_file(filename, formulae_text)
@ -7,50 +7,76 @@
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))


import argparse
import contextlib
import subprocess
import datetime as dt
import sys
from datetime import datetime

from devscripts.utils import read_version, write_file
from devscripts.utils import read_version, run_process, write_file


def get_new_version(revision):
    version = datetime.utcnow().strftime('%Y.%m.%d')
def get_new_version(version, revision):
    if not version:
        version = dt.datetime.now(dt.timezone.utc).strftime('%Y.%m.%d')

    if revision:
        assert revision.isdigit(), 'Revision must be a number'
        assert revision.isdecimal(), 'Revision must be a number'
    else:
        old_version = read_version().split('.')
        if version.split('.') == old_version[:3]:
            revision = str(int((old_version + [0])[3]) + 1)
            revision = str(int(([*old_version, 0])[3]) + 1)

    return f'{version}.{revision}' if revision else version


def get_git_head():
    with contextlib.suppress(Exception):
        sp = subprocess.Popen(['git', 'rev-parse', '--short', 'HEAD'], stdout=subprocess.PIPE)
        return sp.communicate()[0].decode().strip() or None
        return run_process('git', 'rev-parse', 'HEAD').stdout.strip()


VERSION = get_new_version((sys.argv + [''])[1])
GIT_HEAD = get_git_head()

VERSION_FILE = f'''\
VERSION_TEMPLATE = '''\
# Autogenerated by devscripts/update-version.py

__version__ = {VERSION!r}
__version__ = {version!r}

RELEASE_GIT_HEAD = {GIT_HEAD!r}
RELEASE_GIT_HEAD = {git_head!r}

VARIANT = None

UPDATE_HINT = None

CHANNEL = {channel!r}

ORIGIN = {origin!r}

_pkg_version = {package_version!r}
'''

write_file('yt_dlp/version.py', VERSION_FILE)
github_output = os.getenv('GITHUB_OUTPUT')
if github_output:
    write_file(github_output, f'ytdlp_version={VERSION}\n', 'a')
print(f'\nVersion = {VERSION}, Git HEAD = {GIT_HEAD}')
if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Update the version.py file')
    parser.add_argument(
        '-c', '--channel', default='stable',
        help='Select update channel (default: %(default)s)')
    parser.add_argument(
        '-r', '--origin', default='local',
        help='Select origin/repository (default: %(default)s)')
    parser.add_argument(
        '-s', '--suffix', default='',
        help='Add an alphanumeric suffix to the package version, e.g. "dev"')
    parser.add_argument(
        '-o', '--output', default='yt_dlp/version.py',
        help='The output file to write to (default: %(default)s)')
    parser.add_argument(
        'version', nargs='?', default=None,
        help='A version or revision to use instead of generating one')
    args = parser.parse_args()

    git_head = get_git_head()
    version = (
        args.version if args.version and '.' in args.version
        else get_new_version(None, args.version))
    write_file(args.output, VERSION_TEMPLATE.format(
        version=version, git_head=git_head, channel=args.channel, origin=args.origin,
        package_version=f'{version}{args.suffix}'))

    print(f'version={version} ({args.channel}), head={git_head}')
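The revision handling in `get_new_version` is subtle: a second release on the same calendar day gains a fourth `.N` component. A standalone sketch of that arithmetic (dates are hypothetical):

# Illustrative sketch (hypothetical dates): the same-day revision bump.
def next_version(today, old_version):
    old = old_version.split('.')
    revision = None
    if today.split('.') == old[:3]:
        revision = str(int(([*old, 0])[3]) + 1)
    return f'{today}.{revision}' if revision else today

print(next_version('2024.03.10', '2024.03.09'))    # 2024.03.10
print(next_version('2024.03.10', '2024.03.10'))    # 2024.03.10.1
print(next_version('2024.03.10', '2024.03.10.1'))  # 2024.03.10.2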
26 devscripts/update_changelog.py Executable file
@ -0,0 +1,26 @@
#!/usr/bin/env python3

# Allow direct execution
import os
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

from pathlib import Path

from devscripts.make_changelog import create_changelog, create_parser
from devscripts.utils import read_file, read_version, write_file

# Always run after devscripts/update-version.py, and run before `make doc|pypi-files|tar|all`

if __name__ == '__main__':
    parser = create_parser()
    parser.description = 'Update an existing changelog file with an entry for a new release'
    parser.add_argument(
        '--changelog-path', type=Path, default=Path(__file__).parent.parent / 'Changelog.md',
        help='path to the Changelog file')
    args = parser.parse_args()
    new_entry = create_changelog(args)

    header, sep, changelog = read_file(args.changelog_path).partition('\n### ')
    write_file(args.changelog_path, f'{header}{sep}{read_version()}\n{new_entry}\n{sep}{changelog}')
@ -1,5 +1,6 @@
import argparse
import functools
import subprocess


def read_file(fname):

@ -12,10 +13,11 @@ def write_file(fname, content, mode='w'):
    return f.write(content)


# Get the version without importing the package
def read_version(fname='yt_dlp/version.py'):
    exec(compile(read_file(fname), fname, 'exec'))
    return locals()['__version__']
def read_version(fname='yt_dlp/version.py', varname='__version__'):
    """Get the version without importing the package"""
    items = {}
    exec(compile(read_file(fname), fname, 'exec'), items)
    return items[varname]


def get_filename_args(has_infile=False, default_outfile=None):

@ -33,3 +35,13 @@ def get_filename_args(has_infile=False, default_outfile=None):

def compose_functions(*functions):
    return lambda x: functools.reduce(lambda y, f: f(y), functions, x)


def run_process(*args, **kwargs):
    kwargs.setdefault('text', True)
    kwargs.setdefault('check', True)
    kwargs.setdefault('capture_output', True)
    if kwargs['text']:
        kwargs.setdefault('encoding', 'utf-8')
        kwargs.setdefault('errors', 'replace')
    return subprocess.run(args, **kwargs)
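The new `run_process` helper is `subprocess.run` with safer defaults: text mode with explicit encoding, `check=True` so failures raise, and captured output. Typical use, as in `get_git_head` above (assuming a git checkout):

# Illustrative sketch (assumes a git checkout): run_process raises on failure
# and returns decoded text, so .stdout is a str.
from devscripts.utils import run_process

head = run_process('git', 'rev-parse', 'HEAD').stdout.strip()
print(head)  # full 40-character hash of the current HEAD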
@ -9,15 +9,15 @@

import yt_dlp

ZSH_COMPLETION_FILE = "completions/zsh/_yt-dlp"
ZSH_COMPLETION_TEMPLATE = "devscripts/zsh-completion.in"
ZSH_COMPLETION_FILE = 'completions/zsh/_yt-dlp'
ZSH_COMPLETION_TEMPLATE = 'devscripts/zsh-completion.in'


def build_completion(opt_parser):
    opts = [opt for group in opt_parser.option_groups
            for opt in group.option_list]
    opts_file = [opt for opt in opts if opt.metavar == "FILE"]
    opts_dir = [opt for opt in opts if opt.metavar == "DIR"]
    opts_file = [opt for opt in opts if opt.metavar == 'FILE']
    opts_dir = [opt for opt in opts if opt.metavar == 'DIR']

    fileopts = []
    for opt in opts_file:

@ -38,11 +38,11 @@ def build_completion(opt_parser):
    with open(ZSH_COMPLETION_TEMPLATE) as f:
        template = f.read()

    template = template.replace("{{fileopts}}", "|".join(fileopts))
    template = template.replace("{{diropts}}", "|".join(diropts))
    template = template.replace("{{flags}}", " ".join(flags))
    template = template.replace('{{fileopts}}', '|'.join(fileopts))
    template = template.replace('{{diropts}}', '|'.join(diropts))
    template = template.replace('{{flags}}', ' '.join(flags))

    with open(ZSH_COMPLETION_FILE, "w") as f:
    with open(ZSH_COMPLETION_FILE, 'w') as f:
        f.write(template)
29 public.key Normal file
@ -0,0 +1,29 @@
-----BEGIN PGP PUBLIC KEY BLOCK-----

mQINBGP78C4BEAD0rF9zjGPAt0thlt5C1ebzccAVX7Nb1v+eqQjk+WEZdTETVCg3
WAM5ngArlHdm/fZqzUgO+pAYrB60GKeg7ffUDf+S0XFKEZdeRLYeAaqqKhSibVal
DjvOBOztu3W607HLETQAqA7wTPuIt2WqmpL60NIcyr27LxqmgdN3mNvZ2iLO+bP0
nKR/C+PgE9H4ytywDa12zMx6PmZCnVOOOu6XZEFmdUxxdQ9fFDqd9LcBKY2LDOcS
Yo1saY0YWiZWHtzVoZu1kOzjnS5Fjq/yBHJLImDH7pNxHm7s/PnaurpmQFtDFruk
t+2lhDnpKUmGr/I/3IHqH/X+9nPoS4uiqQ5HpblB8BK+4WfpaiEg75LnvuOPfZIP
KYyXa/0A7QojMwgOrD88ozT+VCkKkkJ+ijXZ7gHNjmcBaUdKK7fDIEOYI63Lyc6Q
WkGQTigFffSUXWHDCO9aXNhP3ejqFWgGMtCUsrbkcJkWuWY7q5ARy/05HbSM3K4D
U9eqtnxmiV1WQ8nXuI9JgJQRvh5PTkny5LtxqzcmqvWO9TjHBbrs14BPEO9fcXxK
L/CFBbzXDSvvAgArdqqlMoncQ/yicTlfL6qzJ8EKFiqW14QMTdAn6SuuZTodXCTi
InwoT7WjjuFPKKdvfH1GP4bnqdzTnzLxCSDIEtfyfPsIX+9GI7Jkk/zZjQARAQAB
tDdTaW1vbiBTYXdpY2tpICh5dC1kbHAgc2lnbmluZyBrZXkpIDxjb250YWN0QGdy
dWI0ay54eXo+iQJOBBMBCgA4FiEErAy75oSNaoc0ZK9OV89lkztadYEFAmP78C4C
GwMFCwkIBwIGFQoJCAsCBBYCAwECHgECF4AACgkQV89lkztadYEVqQ//cW7TxhXg
7Xbh2EZQzXml0egn6j8QaV9KzGragMiShrlvTO2zXfLXqyizrFP4AspgjSn/4NrI
8mluom+Yi+qr7DXT4BjQqIM9y3AjwZPdywe912Lxcw52NNoPZCm24I9T7ySc8lmR
FQvZC0w4H/VTNj/2lgJ1dwMflpwvNRiWa5YzcFGlCUeDIPskLx9++AJE+xwU3LYm
jQQsPBqpHHiTBEJzMLl+rfd9Fg4N+QNzpFkTDW3EPerLuvJniSBBwZthqxeAtw4M
UiAXh6JvCc2hJkKCoygRfM281MeolvmsGNyQm+axlB0vyldiPP6BnaRgZlx+l6MU
cPqgHblb7RW5j9lfr6OYL7SceBIHNv0CFrt1OnkGo/tVMwcs8LH3Ae4a7UJlIceL
V54aRxSsZU7w4iX+PB79BWkEsQzwKrUuJVOeL4UDwWajp75OFaUqbS/slDDVXvK5
OIeuth3mA/adjdvgjPxhRQjA3l69rRWIJDrqBSHldmRsnX6cvXTDy8wSXZgy51lP
m4IVLHnCy9m4SaGGoAsfTZS0cC9FgjUIyTyrq9M67wOMpUxnuB0aRZgJE1DsI23E
qdvcSNVlO+39xM/KPWUEh6b83wMn88QeW+DCVGWACQq5N3YdPnAJa50617fGbY6I
gXIoRHXkDqe23PZ/jURYCv0sjVtjPoVC+bg=
=bJkn
-----END PGP PUBLIC KEY BLOCK-----
383 pyproject.toml Normal file
@ -0,0 +1,383 @@
|
|||
[build-system]
|
||||
requires = ["hatchling"]
|
||||
build-backend = "hatchling.build"
|
||||
|
||||
[project]
|
||||
name = "yt-dlp"
|
||||
maintainers = [
|
||||
{name = "pukkandan", email = "pukkandan.ytdlp@gmail.com"},
|
||||
{name = "Grub4K", email = "contact@grub4k.xyz"},
|
||||
{name = "bashonly", email = "bashonly@protonmail.com"},
|
||||
{name = "coletdjnz", email = "coletdjnz@protonmail.com"},
|
||||
{name = "sepro", email = "sepro@sepr0.com"},
|
||||
]
|
||||
description = "A feature-rich command-line audio/video downloader"
|
||||
readme = "README.md"
|
||||
requires-python = ">=3.9"
|
||||
keywords = [
|
||||
"youtube-dl",
|
||||
"video-downloader",
|
||||
"youtube-downloader",
|
||||
"sponsorblock",
|
||||
"youtube-dlc",
|
||||
"yt-dlp",
|
||||
]
|
||||
license = {file = "LICENSE"}
|
||||
classifiers = [
|
||||
"Topic :: Multimedia :: Video",
|
||||
"Development Status :: 5 - Production/Stable",
|
||||
"Environment :: Console",
|
||||
"Programming Language :: Python",
|
||||
"Programming Language :: Python :: 3 :: Only",
|
||||
"Programming Language :: Python :: 3.9",
|
||||
"Programming Language :: Python :: 3.10",
|
||||
"Programming Language :: Python :: 3.11",
|
||||
"Programming Language :: Python :: 3.12",
|
||||
"Programming Language :: Python :: 3.13",
|
||||
"Programming Language :: Python :: Implementation",
|
||||
"Programming Language :: Python :: Implementation :: CPython",
|
||||
"Programming Language :: Python :: Implementation :: PyPy",
|
||||
"License :: OSI Approved :: The Unlicense (Unlicense)",
|
||||
"Operating System :: OS Independent",
|
||||
]
|
||||
dynamic = ["version"]
|
||||
dependencies = []
|
||||
|
||||
[project.optional-dependencies]
|
||||
default = [
|
||||
"brotli; implementation_name=='cpython'",
|
||||
"brotlicffi; implementation_name!='cpython'",
|
||||
"certifi",
|
||||
"mutagen",
|
||||
"pycryptodomex",
|
||||
"requests>=2.32.2,<3",
|
||||
"urllib3>=1.26.17,<3",
|
||||
"websockets>=13.0",
|
||||
]
|
||||
curl-cffi = [
|
||||
"curl-cffi==0.5.10; os_name=='nt' and implementation_name=='cpython'",
|
||||
"curl-cffi>=0.5.10,!=0.6.*,<0.7.2; os_name!='nt' and implementation_name=='cpython'",
|
||||
]
|
||||
secretstorage = [
|
||||
"cffi",
|
||||
"secretstorage",
|
||||
]
|
||||
build = [
|
||||
"build",
|
||||
"hatchling",
|
||||
"pip",
|
||||
"setuptools>=71.0.2", # 71.0.0 broke pyinstaller
|
||||
"wheel",
|
||||
]
|
||||
dev = [
|
||||
"pre-commit",
|
||||
"yt-dlp[static-analysis]",
|
||||
"yt-dlp[test]",
|
||||
]
|
||||
static-analysis = [
|
||||
"autopep8~=2.0",
|
||||
"ruff~=0.7.0",
|
||||
]
|
||||
test = [
|
||||
"pytest~=8.1",
|
||||
"pytest-rerunfailures~=14.0",
|
||||
]
|
||||
pyinstaller = [
|
||||
"pyinstaller>=6.10.0", # Windows temp cleanup fixed in 6.10.0
|
||||
]
|
||||
|
||||
[project.urls]
|
||||
Documentation = "https://github.com/yt-dlp/yt-dlp#readme"
|
||||
Repository = "https://github.com/yt-dlp/yt-dlp"
|
||||
Tracker = "https://github.com/yt-dlp/yt-dlp/issues"
|
||||
Funding = "https://github.com/yt-dlp/yt-dlp/blob/master/Collaborators.md#collaborators"
|
||||
|
||||
[project.scripts]
|
||||
yt-dlp = "yt_dlp:main"
|
||||
|
||||
[project.entry-points.pyinstaller40]
|
||||
hook-dirs = "yt_dlp.__pyinstaller:get_hook_dirs"
|
||||
|
||||
[tool.hatch.build.targets.sdist]
|
||||
include = [
|
||||
"/yt_dlp",
|
||||
"/devscripts",
|
||||
"/test",
|
||||
"/.gitignore", # included by default, needed for auto-excludes
|
||||
"/Changelog.md",
|
||||
"/LICENSE", # included as license
|
||||
"/pyproject.toml", # included by default
|
||||
"/README.md", # included as readme
|
||||
"/setup.cfg",
|
||||
"/supportedsites.md",
|
||||
]
|
||||
artifacts = [
|
||||
"/yt_dlp/extractor/lazy_extractors.py",
|
||||
"/completions",
|
||||
"/AUTHORS", # included by default
|
||||
"/README.txt",
|
||||
"/yt-dlp.1",
|
||||
]
|
||||
|
||||
[tool.hatch.build.targets.wheel]
|
||||
packages = ["yt_dlp"]
|
||||
artifacts = ["/yt_dlp/extractor/lazy_extractors.py"]
|
||||
|
||||
[tool.hatch.build.targets.wheel.shared-data]
|
||||
"completions/bash/yt-dlp" = "share/bash-completion/completions/yt-dlp"
|
||||
"completions/zsh/_yt-dlp" = "share/zsh/site-functions/_yt-dlp"
|
||||
"completions/fish/yt-dlp.fish" = "share/fish/vendor_completions.d/yt-dlp.fish"
|
||||
"README.txt" = "share/doc/yt_dlp/README.txt"
|
||||
"yt-dlp.1" = "share/man/man1/yt-dlp.1"
|
||||
|
||||
[tool.hatch.version]
|
||||
path = "yt_dlp/version.py"
|
||||
pattern = "_pkg_version = '(?P<version>[^']+)'"
|
||||
|
||||
[tool.hatch.envs.default]
|
||||
features = ["curl-cffi", "default"]
|
||||
dependencies = ["pre-commit"]
|
||||
path = ".venv"
|
||||
installer = "uv"
|
||||
|
||||
[tool.hatch.envs.default.scripts]
|
||||
setup = "pre-commit install --config .pre-commit-hatch.yaml"
|
||||
yt-dlp = "python -Werror -Xdev -m yt_dlp {args}"
|
||||
|
||||
[tool.hatch.envs.hatch-static-analysis]
|
||||
detached = true
|
||||
features = ["static-analysis"]
|
||||
dependencies = [] # override hatch ruff version
|
||||
config-path = "pyproject.toml"
|
||||
|
||||
[tool.hatch.envs.hatch-static-analysis.scripts]
|
||||
format-check = "autopep8 --diff {args:.}"
|
||||
format-fix = "autopep8 --in-place {args:.}"
|
||||
lint-check = "ruff check {args:.}"
|
||||
lint-fix = "ruff check --fix {args:.}"
|
||||
|
||||
[tool.hatch.envs.hatch-test]
|
||||
features = ["test"]
|
||||
dependencies = [
|
||||
"pytest-randomly~=3.15",
|
||||
"pytest-xdist[psutil]~=3.5",
|
||||
]
|
||||
|
||||
[tool.hatch.envs.hatch-test.scripts]
|
||||
run = "python -m devscripts.run_tests {args}"
|
||||
run-cov = "echo Code coverage not implemented && exit 1"
|
||||
|
||||
[[tool.hatch.envs.hatch-test.matrix]]
|
||||
python = [
|
||||
"3.9",
|
||||
"3.10",
|
||||
"3.11",
|
||||
"3.12",
|
||||
"3.13",
|
||||
"pypy3.10",
|
||||
]
|
||||
|
||||
[tool.ruff]
|
||||
line-length = 120
|
||||
|
||||
[tool.ruff.lint]
ignore = [
    "E402",     # module-import-not-at-top-of-file
    "E501",     # line-too-long
    "E731",     # lambda-assignment
    "E741",     # ambiguous-variable-name
    "UP036",    # outdated-version-block
    "B006",     # mutable-argument-default
    "B008",     # function-call-in-default-argument
    "B011",     # assert-false
    "B017",     # assert-raises-exception
    "B023",     # function-uses-loop-variable (false positives)
    "B028",     # no-explicit-stacklevel
    "B904",     # raise-without-from-inside-except
    "C401",     # unnecessary-generator-set
    "C402",     # unnecessary-generator-dict
    "PIE790",   # unnecessary-placeholder
    "SIM102",   # collapsible-if
    "SIM108",   # if-else-block-instead-of-if-exp
    "SIM112",   # uncapitalized-environment-variables
    "SIM113",   # enumerate-for-loop
    "SIM114",   # if-with-same-arms
    "SIM115",   # open-file-with-context-handler
    "SIM117",   # multiple-with-statements
    "SIM223",   # expr-and-false
    "SIM300",   # yoda-conditions
    "TD001",    # invalid-todo-tag
    "TD002",    # missing-todo-author
    "TD003",    # missing-todo-link
    "PLE0604",  # invalid-all-object (false positives)
    "PLE0643",  # potential-index-error (false positives)
    "PLW0603",  # global-statement
    "PLW1510",  # subprocess-run-without-check
    "PLW2901",  # redefined-loop-name
    "RUF001",   # ambiguous-unicode-character-string
    "RUF012",   # mutable-class-default
    "RUF100",   # unused-noqa (flake8 has slightly different behavior)
]
select = [
    "E",       # pycodestyle Error
    "W",       # pycodestyle Warning
    "F",       # Pyflakes
    "I",       # isort
    "Q",       # flake8-quotes
    "N803",    # invalid-argument-name
    "N804",    # invalid-first-argument-name-for-class-method
    "UP",      # pyupgrade
    "B",       # flake8-bugbear
    "A",       # flake8-builtins
    "COM",     # flake8-commas
    "C4",      # flake8-comprehensions
    "FA",      # flake8-future-annotations
    "ISC",     # flake8-implicit-str-concat
    "ICN003",  # banned-import-from
    "PIE",     # flake8-pie
    "T20",     # flake8-print
    "RSE",     # flake8-raise
    "RET504",  # unnecessary-assign
    "SIM",     # flake8-simplify
    "TID251",  # banned-api
    "TD",      # flake8-todos
    "PLC",     # Pylint Convention
    "PLE",     # Pylint Error
    "PLW",     # Pylint Warning
    "RUF",     # Ruff-specific rules
]

[tool.ruff.lint.per-file-ignores]
"devscripts/lazy_load_template.py" = [
    "F401",  # unused-import
]
"!yt_dlp/extractor/**.py" = [
    "I",       # isort
    "ICN003",  # banned-import-from
    "T20",     # flake8-print
    "A002",    # builtin-argument-shadowing
    "C408",    # unnecessary-collection-call
]
"yt_dlp/jsinterp.py" = [
    "UP031",  # printf-string-formatting
]

[tool.ruff.lint.isort]
known-first-party = [
    "bundle",
    "devscripts",
    "test",
]
relative-imports-order = "closest-to-furthest"

[tool.ruff.lint.flake8-quotes]
docstring-quotes = "double"
multiline-quotes = "single"
inline-quotes = "single"
avoid-escape = false

[tool.ruff.lint.pep8-naming]
classmethod-decorators = [
    "yt_dlp.utils.classproperty",
]

[tool.ruff.lint.flake8-import-conventions]
banned-from = [
    "base64",
    "datetime",
    "functools",
    "glob",
    "hashlib",
    "itertools",
    "json",
    "math",
    "os",
    "pathlib",
    "random",
    "re",
    "string",
    "sys",
    "time",
    "urllib.parse",
    "uuid",
    "xml",
]

[tool.ruff.lint.flake8-tidy-imports.banned-api]
"yt_dlp.compat.compat_str".msg = "Use `str` instead."
"yt_dlp.compat.compat_b64decode".msg = "Use `base64.b64decode` instead."
"yt_dlp.compat.compat_urlparse".msg = "Use `urllib.parse` instead."
"yt_dlp.compat.compat_parse_qs".msg = "Use `urllib.parse.parse_qs` instead."
"yt_dlp.compat.compat_urllib_parse_unquote".msg = "Use `urllib.parse.unquote` instead."
"yt_dlp.compat.compat_urllib_parse_urlencode".msg = "Use `urllib.parse.urlencode` instead."
"yt_dlp.compat.compat_urllib_parse_urlparse".msg = "Use `urllib.parse.urlparse` instead."
"yt_dlp.compat.compat_shlex_quote".msg = "Use `yt_dlp.utils.shell_quote` instead."
"yt_dlp.utils.error_to_compat_str".msg = "Use `str` instead."

[tool.autopep8]
max_line_length = 120
recursive = true
exit-code = true
jobs = 0
select = [
    "E101",
    "E112",
    "E113",
    "E115",
    "E116",
    "E117",
    "E121",
    "E122",
    "E123",
    "E124",
    "E125",
    "E126",
    "E127",
    "E128",
    "E129",
    "E131",
    "E201",
    "E202",
    "E203",
    "E211",
    "E221",
    "E222",
    "E223",
    "E224",
    "E225",
    "E226",
    "E227",
    "E228",
    "E231",
    "E241",
    "E242",
    "E251",
    "E252",
    "E261",
    "E262",
    "E265",
    "E266",
    "E271",
    "E272",
    "E273",
    "E274",
    "E275",
    "E301",
    "E302",
    "E303",
    "E304",
    "E305",
    "E306",
    "E502",
    "E701",
    "E702",
    "E704",
    "W391",
    "W504",
]

[tool.pytest.ini_options]
addopts = "-ra -v --strict-markers"
markers = [
    "download",
]
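
For orientation, a minimal sketch (hypothetical test, not part of this commit) of how the `download` marker registered above is consumed; `test/helper.py` further below applies the same marker via `is_download_test`:

    import pytest

    @pytest.mark.download  # deselected by `pytest -m "not download"`
    def test_real_download():
        ...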

requirements.txt

@@ -1,6 +0,0 @@
mutagen
pycryptodomex
websockets
brotli; platform_python_implementation=='CPython'
brotlicffi; platform_python_implementation!='CPython'
certifi

16 setup.cfg

@@ -1,7 +1,3 @@
[wheel]
universal = true

[flake8]
exclude = build,venv,.tox,.git,.pytest_cache
ignore = E402,E501,E731,E741,W503

@@ -18,20 +14,14 @@ remove-duplicate-keys = true
remove-unused-variables = true

[tool:pytest]
addopts = -ra -v --strict-markers
markers =
    download

[tox:tox]
skipsdist = true
envlist = py{36,37,38,39,310},pypy{36,37,38,39}
envlist = py{39,310,311,312,313},pypy310
skip_missing_interpreters = true

[testenv]  # tox
deps =
    pytest
commands = pytest {posargs:"-m not download"}
passenv = HOME  # For test_compat_expanduser
setenv =

@@ -39,7 +29,7 @@ setenv =

[isort]
py_version = 37
py_version = 39
multi_line_output = VERTICAL_HANGING_INDENT
line_length = 80
reverse_relative = true

168 setup.py

@@ -1,168 +0,0 @@
#!/usr/bin/env python3

import os.path
import subprocess
import sys
import warnings

try:
    from setuptools import Command, find_packages, setup
    setuptools_available = True
except ImportError:
    from distutils.core import Command, setup
    setuptools_available = False

from devscripts.utils import read_file, read_version

VERSION = read_version()

DESCRIPTION = 'A youtube-dl fork with additional features and patches'

LONG_DESCRIPTION = '\n\n'.join((
    'Official repository: <https://github.com/yt-dlp/yt-dlp>',
    '**PS**: Some links in this document will not work since this is a copy of the README.md from Github',
    read_file('README.md')))

REQUIREMENTS = read_file('requirements.txt').splitlines()


def packages():
    if setuptools_available:
        return find_packages(exclude=('youtube_dl', 'youtube_dlc', 'test', 'ytdlp_plugins', 'devscripts'))

    return [
        'yt_dlp', 'yt_dlp.extractor', 'yt_dlp.downloader', 'yt_dlp.postprocessor', 'yt_dlp.compat',
    ]


def py2exe_params():
    warnings.warn(
        'py2exe builds do not support pycryptodomex and needs VC++14 to run. '
        'It is recommended to run "pyinst.py" to build using pyinstaller instead')

    return {
        'console': [{
            'script': './yt_dlp/__main__.py',
            'dest_base': 'yt-dlp',
            'icon_resources': [(1, 'devscripts/logo.ico')],
        }],
        'version_info': {
            'version': VERSION,
            'description': DESCRIPTION,
            'comments': LONG_DESCRIPTION.split('\n')[0],
            'product_name': 'yt-dlp',
            'product_version': VERSION,
        },
        'options': {
            'bundle_files': 0,
            'compressed': 1,
            'optimize': 2,
            'dist_dir': './dist',
            'excludes': ['Crypto', 'Cryptodome'],  # py2exe cannot import Crypto
            'dll_excludes': ['w9xpopen.exe', 'crypt32.dll'],
            # Modules that are only imported dynamically must be added here
            'includes': ['yt_dlp.compat._legacy'],
        },
        'zipfile': None,
    }


def build_params():
    files_spec = [
        ('share/bash-completion/completions', ['completions/bash/yt-dlp']),
        ('share/zsh/site-functions', ['completions/zsh/_yt-dlp']),
        ('share/fish/vendor_completions.d', ['completions/fish/yt-dlp.fish']),
        ('share/doc/yt_dlp', ['README.txt']),
        ('share/man/man1', ['yt-dlp.1'])
    ]
    data_files = []
    for dirname, files in files_spec:
        resfiles = []
        for fn in files:
            if not os.path.exists(fn):
                warnings.warn(f'Skipping file {fn} since it is not present. Try running " make pypi-files " first')
            else:
                resfiles.append(fn)
        data_files.append((dirname, resfiles))

    params = {'data_files': data_files}

    if setuptools_available:
        params['entry_points'] = {'console_scripts': ['yt-dlp = yt_dlp:main']}
    else:
        params['scripts'] = ['yt-dlp']
    return params


class build_lazy_extractors(Command):
    description = 'Build the extractor lazy loading module'
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        if self.dry_run:
            print('Skipping build of lazy extractors in dry run mode')
            return
        subprocess.run([sys.executable, 'devscripts/make_lazy_extractors.py'])


def main():
    if sys.argv[1:2] == ['py2exe']:
        params = py2exe_params()
        try:
            from py2exe import freeze
        except ImportError:
            import py2exe  # noqa: F401
            warnings.warn('You are using an outdated version of py2exe. Support for this version will be removed in the future')
            params['console'][0].update(params.pop('version_info'))
            params['options'] = {'py2exe': params.pop('options')}
        else:
            return freeze(**params)
    else:
        params = build_params()

    setup(
        name='yt-dlp',
        version=VERSION,
        maintainer='pukkandan',
        maintainer_email='pukkandan.ytdlp@gmail.com',
        description=DESCRIPTION,
        long_description=LONG_DESCRIPTION,
        long_description_content_type='text/markdown',
        url='https://github.com/yt-dlp/yt-dlp',
        packages=packages(),
        install_requires=REQUIREMENTS,
        python_requires='>=3.7',
        project_urls={
            'Documentation': 'https://github.com/yt-dlp/yt-dlp#readme',
            'Source': 'https://github.com/yt-dlp/yt-dlp',
            'Tracker': 'https://github.com/yt-dlp/yt-dlp/issues',
            'Funding': 'https://github.com/yt-dlp/yt-dlp/blob/master/Collaborators.md#collaborators',
        },
        classifiers=[
            'Topic :: Multimedia :: Video',
            'Development Status :: 5 - Production/Stable',
            'Environment :: Console',
            'Programming Language :: Python',
            'Programming Language :: Python :: 3.7',
            'Programming Language :: Python :: 3.8',
            'Programming Language :: Python :: 3.9',
            'Programming Language :: Python :: 3.10',
            'Programming Language :: Python :: 3.11',
            'Programming Language :: Python :: Implementation',
            'Programming Language :: Python :: Implementation :: CPython',
            'Programming Language :: Python :: Implementation :: PyPy',
            'License :: Public Domain',
            'Operating System :: OS Independent',
        ],
        cmdclass={'build_lazy_extractors': build_lazy_extractors},
        **params
    )


main()

1213 supportedsites.md
File diff suppressed because it is too large

64 test/conftest.py (new file)

@@ -0,0 +1,64 @@
import inspect

import pytest

from yt_dlp.networking import RequestHandler
from yt_dlp.networking.common import _REQUEST_HANDLERS
from yt_dlp.utils._utils import _YDLLogger as FakeLogger


@pytest.fixture
def handler(request):
    RH_KEY = getattr(request, 'param', None)
    if not RH_KEY:
        return
    if inspect.isclass(RH_KEY) and issubclass(RH_KEY, RequestHandler):
        handler = RH_KEY
    elif RH_KEY in _REQUEST_HANDLERS:
        handler = _REQUEST_HANDLERS[RH_KEY]
    else:
        pytest.skip(f'{RH_KEY} request handler is not available')

    class HandlerWrapper(handler):
        RH_KEY = handler.RH_KEY

        def __init__(self, **kwargs):
            super().__init__(logger=FakeLogger, **kwargs)

    return HandlerWrapper


@pytest.fixture(autouse=True)
def skip_handler(request, handler):
    """usage: pytest.mark.skip_handler('my_handler', 'reason')"""
    for marker in request.node.iter_markers('skip_handler'):
        if marker.args[0] == handler.RH_KEY:
            pytest.skip(marker.args[1] if len(marker.args) > 1 else '')


@pytest.fixture(autouse=True)
def skip_handler_if(request, handler):
    """usage: pytest.mark.skip_handler_if('my_handler', lambda request: True, 'reason')"""
    for marker in request.node.iter_markers('skip_handler_if'):
        if marker.args[0] == handler.RH_KEY and marker.args[1](request):
            pytest.skip(marker.args[2] if len(marker.args) > 2 else '')


@pytest.fixture(autouse=True)
def skip_handlers_if(request, handler):
    """usage: pytest.mark.skip_handlers_if(lambda request, handler: True, 'reason')"""
    for marker in request.node.iter_markers('skip_handlers_if'):
        if handler and marker.args[0](request, handler):
            pytest.skip(marker.args[1] if len(marker.args) > 1 else '')


def pytest_configure(config):
    config.addinivalue_line(
        'markers', 'skip_handler(handler): skip test for the given handler',
    )
    config.addinivalue_line(
        'markers', 'skip_handler_if(handler): skip test for the given handler if condition is true',
    )
    config.addinivalue_line(
        'markers', 'skip_handlers_if(handler): skip test for handlers when the condition is true',
    )
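
A minimal usage sketch for the `handler` fixture above (hypothetical test, not part of this commit; 'Urllib' is assumed to be a registered RH_KEY):

    import pytest

    @pytest.mark.parametrize('handler', ['Urllib'], indirect=True)
    def test_handler_available(handler):
        # The fixture returns the logger-wrapped HandlerWrapper class,
        # or skips the test when the requested handler is unavailable
        with handler() as rh:
            assert rh.RH_KEY == 'Urllib'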

test/helper.py

@@ -10,14 +10,14 @@
import yt_dlp.extractor
from yt_dlp import YoutubeDL
from yt_dlp.compat import compat_os_name
from yt_dlp.utils import preferredencoding, write_string
from yt_dlp.utils import preferredencoding, try_call, write_string, find_available_port

if 'pytest' in sys.modules:
    import pytest
    is_download_test = pytest.mark.download
else:
    def is_download_test(testClass):
        return testClass
    def is_download_test(test_class):
        return test_class


def get_params(override=None):

@@ -45,10 +45,10 @@ def try_rm(filename):


def report_warning(message, *args, **kwargs):
    '''
    """
    Print the message to stderr, it will be prefixed with 'WARNING:'
    If stderr is a tty file the 'WARNING:' will be colored
    '''
    """
    if sys.stderr.isatty() and compat_os_name != 'nt':
        _msg_header = '\033[0;33mWARNING:\033[0m'
    else:

@@ -138,15 +138,14 @@ def expect_value(self, got, expected, field):
    elif isinstance(expected, list) and isinstance(got, list):
        self.assertEqual(
            len(expected), len(got),
            'Expect a list of length %d, but got a list of length %d for field %s' % (
                len(expected), len(got), field))
            f'Expect a list of length {len(expected)}, but got a list of length {len(got)} for field {field}')
        for index, (item_got, item_expected) in enumerate(zip(got, expected)):
            type_got = type(item_got)
            type_expected = type(item_expected)
            self.assertEqual(
                type_expected, type_got,
                'Type mismatch for list item at index %d for field %s, expected %r, got %r' % (
                    index, field, type_expected, type_got))
                f'Type mismatch for list item at index {index} for field {field}, '
                f'expected {type_expected!r}, got {type_got!r}')
            expect_value(self, item_got, item_expected, field)
    else:
        if isinstance(expected, str) and expected.startswith('md5:'):

@@ -194,8 +193,8 @@ def sanitize_got_info_dict(got_dict):
        'formats', 'thumbnails', 'subtitles', 'automatic_captions', 'comments', 'entries',

        # Auto-generated
        'autonumber', 'playlist', 'format_index', 'video_ext', 'audio_ext', 'duration_string', 'epoch',
        'fulltitle', 'extractor', 'extractor_key', 'filepath', 'infojson_filename', 'original_url', 'n_entries',
        'autonumber', 'playlist', 'format_index', 'video_ext', 'audio_ext', 'duration_string', 'epoch', 'n_entries',
        'fulltitle', 'extractor', 'extractor_key', 'filename', 'filepath', 'infojson_filename', 'original_url',

        # Only live_status needs to be checked
        'is_live', 'was_live',

@@ -214,14 +213,23 @@ def sanitize(key, value):

    test_info_dict = {
        key: sanitize(key, value) for key, value in got_dict.items()
        if value is not None and key not in IGNORED_FIELDS and not any(
            key.startswith(f'{prefix}_') for prefix in IGNORED_PREFIXES)
        if value is not None and key not in IGNORED_FIELDS and (
            not any(key.startswith(f'{prefix}_') for prefix in IGNORED_PREFIXES)
            or key == '_old_archive_ids')
    }

    # display_id may be generated from id
    if test_info_dict.get('display_id') == test_info_dict.get('id'):
        test_info_dict.pop('display_id')

    # Remove deprecated fields
    for old in YoutubeDL._deprecated_multivalue_fields:
        test_info_dict.pop(old, None)

    # release_year may be generated from release_date
    if try_call(lambda: test_info_dict['release_year'] == int(test_info_dict['release_date'][:4])):
        test_info_dict.pop('release_year')

    # Check url for flat entries
    if got_dict.get('_type', 'video') != 'video' and got_dict.get('url'):
        test_info_dict['url'] = got_dict['url']

@@ -237,11 +245,11 @@ def expect_info_dict(self, got_dict, expected_dict):
    if expected_dict.get('ext'):
        mandatory_fields.extend(('url', 'ext'))
    for key in mandatory_fields:
        self.assertTrue(got_dict.get(key), 'Missing mandatory field %s' % key)
        self.assertTrue(got_dict.get(key), f'Missing mandatory field {key}')
    # Check for mandatory fields that are automatically set by YoutubeDL
    if got_dict.get('_type', 'video') == 'video':
        for key in ['webpage_url', 'extractor', 'extractor_key']:
            self.assertTrue(got_dict.get(key), 'Missing field: %s' % key)
            self.assertTrue(got_dict.get(key), f'Missing field: {key}')

    test_info_dict = sanitize_got_info_dict(got_dict)

@@ -249,7 +257,7 @@ def expect_info_dict(self, got_dict, expected_dict):
    if missing_keys:
        def _repr(v):
            if isinstance(v, str):
                return "'%s'" % v.replace('\\', '\\\\').replace("'", "\\'").replace('\n', '\\n')
                return "'{}'".format(v.replace('\\', '\\\\').replace("'", "\\'").replace('\n', '\\n'))
            elif isinstance(v, type):
                return v.__name__
            else:

@@ -266,8 +274,7 @@ def _repr(v):
        write_string(info_dict_str.replace('\n', '\n '), out=sys.stderr)
        self.assertFalse(
            missing_keys,
            'Missing keys in test definition: %s' % (
                ', '.join(sorted(missing_keys))))
            'Missing keys in test definition: {}'.format(', '.join(sorted(missing_keys))))


def assertRegexpMatches(self, text, regexp, msg=None):

@@ -276,9 +283,9 @@ def assertRegexpMatches(self, text, regexp, msg=None):
    else:
        m = re.match(regexp, text)
        if not m:
            note = 'Regexp didn\'t match: %r not found' % (regexp)
            note = f'Regexp didn\'t match: {regexp!r} not found'
            if len(text) < 1000:
                note += ' in %r' % text
                note += f' in {text!r}'
            if msg is None:
                msg = note
            else:

@@ -301,7 +308,7 @@ def assertLessEqual(self, got, expected, msg=None):


def assertEqual(self, got, expected, msg=None):
    if not (got == expected):
    if got != expected:
        if msg is None:
            msg = f'{got!r} not equal to {expected!r}'
        self.assertTrue(got == expected, msg)

@@ -324,3 +331,13 @@ def http_server_port(httpd):
    else:
        sock = httpd.socket
    return sock.getsockname()[1]


def verify_address_availability(address):
    if find_available_port(address) is None:
        pytest.skip(f'Unable to bind to source address {address} (address may not exist)')


def validate_and_send(rh, req):
    rh.validate(req)
    return rh.send(req)
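
A brief sketch of how `validate_and_send` above pairs with the `handler` fixture from `test/conftest.py` (hypothetical example; the URL is a placeholder):

    from yt_dlp.networking import Request

    def test_send(handler):
        with handler() as rh:
            # validate() raises for unsupported requests before any I/O happens
            response = validate_and_send(rh, Request('http://127.0.0.1:8080/'))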

test/test_InfoExtractor.py

@@ -69,6 +69,7 @@ def test_opengraph(self):
            <meta name="og:test1" content='foo > < bar'/>
            <meta name="og:test2" content="foo >//< bar"/>
            <meta property=og-test3 content='Ill-formatted opengraph'/>
            <meta property=og:test4 content=unquoted-value/>
            '''
        self.assertEqual(ie._og_search_title(html), 'Foo')
        self.assertEqual(ie._og_search_description(html), 'Some video\'s description ')

@@ -81,6 +82,7 @@ def test_opengraph(self):
        self.assertEqual(ie._og_search_property(('test0', 'test1'), html), 'foo > < bar')
        self.assertRaises(RegexNotFoundError, ie._og_search_property, 'test0', html, None, fatal=True)
        self.assertRaises(RegexNotFoundError, ie._og_search_property, ('test0', 'test00'), html, None, fatal=True)
        self.assertEqual(ie._og_search_property('test4', html), 'unquoted-value')

    def test_html_search_meta(self):
        ie = self.ie

@@ -260,19 +262,19 @@ def test_search_json_ld_realworld(self):
            ''',
            {
                'chapters': [
                    {"title": "Explosie Turnhout", "start_time": 70, "end_time": 440},
                    {"title": "Jaarwisseling", "start_time": 440, "end_time": 1179},
                    {"title": "Natuurbranden Colorado", "start_time": 1179, "end_time": 1263},
                    {"title": "Klimaatverandering", "start_time": 1263, "end_time": 1367},
                    {"title": "Zacht weer", "start_time": 1367, "end_time": 1383},
                    {"title": "Financiële balans", "start_time": 1383, "end_time": 1484},
                    {"title": "Club Brugge", "start_time": 1484, "end_time": 1575},
                    {"title": "Mentale gezondheid bij topsporters", "start_time": 1575, "end_time": 1728},
                    {"title": "Olympische Winterspelen", "start_time": 1728, "end_time": 1873},
                    {"title": "Sober oudjaar in Nederland", "start_time": 1873, "end_time": 2079.23}
                    {'title': 'Explosie Turnhout', 'start_time': 70, 'end_time': 440},
                    {'title': 'Jaarwisseling', 'start_time': 440, 'end_time': 1179},
                    {'title': 'Natuurbranden Colorado', 'start_time': 1179, 'end_time': 1263},
                    {'title': 'Klimaatverandering', 'start_time': 1263, 'end_time': 1367},
                    {'title': 'Zacht weer', 'start_time': 1367, 'end_time': 1383},
                    {'title': 'Financiële balans', 'start_time': 1383, 'end_time': 1484},
                    {'title': 'Club Brugge', 'start_time': 1484, 'end_time': 1575},
                    {'title': 'Mentale gezondheid bij topsporters', 'start_time': 1575, 'end_time': 1728},
                    {'title': 'Olympische Winterspelen', 'start_time': 1728, 'end_time': 1873},
                    {'title': 'Sober oudjaar in Nederland', 'start_time': 1873, 'end_time': 2079.23},
                ],
                'title': 'Het journaal - Aflevering 365 (Seizoen 2021)'
            }, {}
                'title': 'Het journaal - Aflevering 365 (Seizoen 2021)',
            }, {},
            ),
            (
                # test multiple thumbnails in a list

@@ -299,13 +301,13 @@ def test_search_json_ld_realworld(self):
                'thumbnails': [{'url': 'https://www.rainews.it/cropgd/640x360/dl/img/2021/12/30/1640886376927_GettyImages.jpg'}],
            },
            {},
            )
            ),
        ]
        for html, expected_dict, search_json_ld_kwargs in _TESTS:
            expect_dict(
                self,
                self.ie._search_json_ld(html, None, **search_json_ld_kwargs),
                expected_dict
                expected_dict,
            )

    def test_download_json(self):

@@ -364,7 +366,7 @@ def test_parse_html5_media_entries(self):
                'height': 740,
                'tbr': 1500,
            }],
            'thumbnail': '//pics.r18.com/digital/amateur/mgmr105/mgmr105jp.jpg'
            'thumbnail': '//pics.r18.com/digital/amateur/mgmr105/mgmr105jp.jpg',
        })

        # from https://www.csfd.cz/

@@ -417,9 +419,9 @@ def test_parse_html5_media_entries(self):
                'height': 1080,
            }],
            'subtitles': {
                'cs': [{'url': 'https://video.csfd.cz/files/subtitles/163/344/163344115_4c388b.srt'}]
                'cs': [{'url': 'https://video.csfd.cz/files/subtitles/163/344/163344115_4c388b.srt'}],
            },
            'thumbnail': 'https://img.csfd.cz/files/images/film/video/preview/163/344/163344118_748d20.png?h360'
            'thumbnail': 'https://img.csfd.cz/files/images/film/video/preview/163/344/163344118_748d20.png?h360',
        })

        # from https://tamasha.com/v/Kkdjw

@@ -450,7 +452,7 @@ def test_parse_html5_media_entries(self):
                'ext': 'mp4',
                'format_id': '144p',
                'height': 144,
            }]
            }],
        })

        # from https://www.directvnow.com

@@ -468,7 +470,7 @@ def test_parse_html5_media_entries(self):
            'formats': [{
                'ext': 'mp4',
                'url': 'https://cdn.directv.com/content/dam/dtv/prod/website_directvnow-international/videos/DTVN_hdr_HBO_v3.mp4',
            }]
            }],
        })

        # from https://www.directvnow.com

@@ -486,7 +488,7 @@ def test_parse_html5_media_entries(self):
            'formats': [{
                'url': 'https://cdn.directv.com/content/dam/dtv/prod/website_directvnow-international/videos/DTVN_hdr_HBO_v3.mp4',
                'ext': 'mp4',
            }]
            }],
        })

        # from https://www.klarna.com/uk/

@@ -545,8 +547,8 @@ def test_extract_jwplayer_data_realworld(self):
            'id': 'XEgvuql4',
            'formats': [{
                'url': 'rtmp://192.138.214.154/live/sjclive',
                'ext': 'flv'
            }]
                'ext': 'flv',
            }],
        })

        # from https://www.pornoxo.com/videos/7564/striptease-from-sexy-secretary/

@@ -586,8 +588,8 @@ def test_extract_jwplayer_data_realworld(self):
            'thumbnail': 'https://t03.vipstreamservice.com/thumbs/pxo-full/2009-12/14/a4b2157147afe5efa93ce1978e0265289c193874e02597.flv-full-13.jpg',
            'formats': [{
                'url': 'https://cdn.pornoxo.com/key=MF+oEbaxqTKb50P-w9G3nA,end=1489689259,ip=104.199.146.27/ip=104.199.146.27/speed=6573765/buffer=3.0/2009-12/4b2157147afe5efa93ce1978e0265289c193874e02597.flv',
                'ext': 'flv'
            }]
                'ext': 'flv',
            }],
        })

        # from http://www.indiedb.com/games/king-machine/videos

@@ -608,12 +610,12 @@ def test_extract_jwplayer_data_realworld(self):
            'formats': [{
                'url': 'http://cdn.dbolical.com/cache/videos/games/1/50/49678/encode_mp4/king-machine-trailer.mp4',
                'height': 360,
                'ext': 'mp4'
                'ext': 'mp4',
            }, {
                'url': 'http://cdn.dbolical.com/cache/videos/games/1/50/49678/encode720p_mp4/king-machine-trailer.mp4',
                'height': 720,
                'ext': 'mp4'
            }]
                'ext': 'mp4',
            }],
        })

    def test_parse_m3u8_formats(self):

@@ -864,7 +866,7 @@ def test_parse_m3u8_formats(self):
                'height': 1080,
                'vcodec': 'avc1.64002a',
            }],
            {}
            {},
        ),
        (
            'bipbop_16x9',

@@ -915,8 +917,6 @@ def test_parse_m3u8_formats(self):
                'acodec': 'mp4a.40.2',
                'video_ext': 'mp4',
                'audio_ext': 'none',
                'vbr': 263.851,
                'abr': 0,
            }, {
                'format_id': '577',
                'format_index': None,

@@ -934,8 +934,6 @@ def test_parse_m3u8_formats(self):
                'acodec': 'mp4a.40.2',
                'video_ext': 'mp4',
                'audio_ext': 'none',
                'vbr': 577.61,
                'abr': 0,
            }, {
                'format_id': '915',
                'format_index': None,

@@ -953,8 +951,6 @@ def test_parse_m3u8_formats(self):
                'acodec': 'mp4a.40.2',
                'video_ext': 'mp4',
                'audio_ext': 'none',
                'vbr': 915.905,
                'abr': 0,
            }, {
                'format_id': '1030',
                'format_index': None,

@@ -972,8 +968,6 @@ def test_parse_m3u8_formats(self):
                'acodec': 'mp4a.40.2',
                'video_ext': 'mp4',
                'audio_ext': 'none',
                'vbr': 1030.138,
                'abr': 0,
            }, {
                'format_id': '1924',
                'format_index': None,

@@ -991,52 +985,50 @@ def test_parse_m3u8_formats(self):
                'acodec': 'mp4a.40.2',
                'video_ext': 'mp4',
                'audio_ext': 'none',
                'vbr': 1924.009,
                'abr': 0,
            }],
            {
                'en': [{
                    'url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/subtitles/eng/prog_index.m3u8',
                    'ext': 'vtt',
                    'protocol': 'm3u8_native'
                    'protocol': 'm3u8_native',
                }, {
                    'url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/subtitles/eng_forced/prog_index.m3u8',
                    'ext': 'vtt',
                    'protocol': 'm3u8_native'
                    'protocol': 'm3u8_native',
                }],
                'fr': [{
                    'url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/subtitles/fra/prog_index.m3u8',
                    'ext': 'vtt',
                    'protocol': 'm3u8_native'
                    'protocol': 'm3u8_native',
                }, {
                    'url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/subtitles/fra_forced/prog_index.m3u8',
                    'ext': 'vtt',
                    'protocol': 'm3u8_native'
                    'protocol': 'm3u8_native',
                }],
                'es': [{
                    'url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/subtitles/spa/prog_index.m3u8',
                    'ext': 'vtt',
                    'protocol': 'm3u8_native'
                    'protocol': 'm3u8_native',
                }, {
                    'url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/subtitles/spa_forced/prog_index.m3u8',
                    'ext': 'vtt',
                    'protocol': 'm3u8_native'
                    'protocol': 'm3u8_native',
                }],
                'ja': [{
                    'url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/subtitles/jpn/prog_index.m3u8',
                    'ext': 'vtt',
                    'protocol': 'm3u8_native'
                    'protocol': 'm3u8_native',
                }, {
                    'url': 'https://devstreaming-cdn.apple.com/videos/streaming/examples/bipbop_16x9/subtitles/jpn_forced/prog_index.m3u8',
                    'ext': 'vtt',
                    'protocol': 'm3u8_native'
                    'protocol': 'm3u8_native',
                }],
            }
            },
        ),
    ]

        for m3u8_file, m3u8_url, expected_formats, expected_subs in _TEST_CASES:
            with open('./test/testdata/m3u8/%s.m3u8' % m3u8_file, encoding='utf-8') as f:
            with open(f'./test/testdata/m3u8/{m3u8_file}.m3u8', encoding='utf-8') as f:
                formats, subs = self.ie._parse_m3u8_formats_and_subtitles(
                    f.read(), m3u8_url, ext='mp4')
                self.ie._sort_formats(formats)

@@ -1374,14 +1366,14 @@ def test_parse_mpd_formats(self):
                    'url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/manifest.mpd',
                    'fragment_base_url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/dash/',
                    'protocol': 'http_dash_segments',
                }
            ]
                },
            ],
            },
            )
            ),
        ]

        for mpd_file, mpd_url, mpd_base_url, expected_formats, expected_subtitles in _TEST_CASES:
            with open('./test/testdata/mpd/%s.mpd' % mpd_file, encoding='utf-8') as f:
            with open(f'./test/testdata/mpd/{mpd_file}.mpd', encoding='utf-8') as f:
                formats, subtitles = self.ie._parse_mpd_formats_and_subtitles(
                    compat_etree_fromstring(f.read().encode()),
                    mpd_base_url=mpd_base_url, mpd_url=mpd_url)

@@ -1404,6 +1396,7 @@ def test_parse_ism_formats(self):
                'vcodec': 'none',
                'acodec': 'AACL',
                'protocol': 'ism',
                'audio_channels': 2,
                '_download_params': {
                    'stream_type': 'audio',
                    'duration': 8880746666,

@@ -1415,11 +1408,8 @@ def test_parse_ism_formats(self):
                    'sampling_rate': 48000,
                    'channels': 2,
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                    'nal_unit_length_field': 4,
                },
                'audio_ext': 'isma',
                'video_ext': 'none',
                'abr': 128,
            }, {
                'format_id': 'video-100',
                'url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/Manifest',

@@ -1441,11 +1431,8 @@ def test_parse_ism_formats(self):
                    'codec_private_data': '00000001674D401FDA0544EFFC2D002CBC40000003004000000C03C60CA80000000168EF32C8',
                    'channels': 2,
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                    'nal_unit_length_field': 4,
                },
                'video_ext': 'ismv',
                'audio_ext': 'none',
                'vbr': 100,
            }, {
                'format_id': 'video-326',
                'url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/Manifest',

@@ -1467,11 +1454,8 @@ def test_parse_ism_formats(self):
                    'codec_private_data': '00000001674D401FDA0241FE23FFC3BC83BA44000003000400000300C03C60CA800000000168EF32C8',
                    'channels': 2,
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                    'nal_unit_length_field': 4,
                },
                'video_ext': 'ismv',
                'audio_ext': 'none',
                'vbr': 326,
            }, {
                'format_id': 'video-698',
                'url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/Manifest',

@@ -1493,11 +1477,8 @@ def test_parse_ism_formats(self):
                    'codec_private_data': '00000001674D401FDA0350BFB97FF06AF06AD1000003000100000300300F1832A00000000168EF32C8',
                    'channels': 2,
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                    'nal_unit_length_field': 4,
                },
                'video_ext': 'ismv',
                'audio_ext': 'none',
                'vbr': 698,
            }, {
                'format_id': 'video-1493',
                'url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/Manifest',

@@ -1519,11 +1500,8 @@ def test_parse_ism_formats(self):
                    'codec_private_data': '00000001674D401FDA011C3DE6FFF0D890D871000003000100000300300F1832A00000000168EF32C8',
                    'channels': 2,
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                    'nal_unit_length_field': 4,
                },
                'video_ext': 'ismv',
                'audio_ext': 'none',
                'vbr': 1493,
            }, {
                'format_id': 'video-4482',
                'url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/Manifest',

@@ -1545,11 +1523,8 @@ def test_parse_ism_formats(self):
                    'codec_private_data': '00000001674D401FDA01A816F97FFC1ABC1AB440000003004000000C03C60CA80000000168EF32C8',
                    'channels': 2,
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                    'nal_unit_length_field': 4,
                },
                'video_ext': 'ismv',
                'audio_ext': 'none',
                'vbr': 4482,
            }],
            {
                'eng': [

@@ -1563,44 +1538,16 @@ def test_parse_ism_formats(self):
                        'duration': 8880746666,
                        'timescale': 10000000,
                        'fourcc': 'TTML',
                        'codec_private_data': ''
                    }
                }
            ]
                        'codec_private_data': '',
                    },
                },
            ],
            },
            ),
            (
                'ec-3_test',
                'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
                [{
                    'format_id': 'audio_deu_1-224',
                    'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
                    'manifest_url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
                    'ext': 'isma',
                    'tbr': 224,
                    'asr': 48000,
                    'vcodec': 'none',
                    'acodec': 'EC-3',
                    'protocol': 'ism',
                    '_download_params':
                    {
                        'stream_type': 'audio',
                        'duration': 370000000,
                        'timescale': 10000000,
                        'width': 0,
                        'height': 0,
                        'fourcc': 'EC-3',
                        'language': 'deu',
                        'codec_private_data': '00063F000000AF87FBA7022DFB42A4D405CD93843BDD0700200F00',
                        'sampling_rate': 48000,
                        'channels': 6,
                        'bits_per_sample': 16,
                        'nal_unit_length_field': 4
                    },
                    'audio_ext': 'isma',
                    'video_ext': 'none',
                    'abr': 224,
                }, {
                    'format_id': 'audio_deu-127',
                    'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
                    'manifest_url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',

@@ -1610,8 +1557,9 @@ def test_parse_ism_formats(self):
                    'vcodec': 'none',
                    'acodec': 'AACL',
                    'protocol': 'ism',
                    '_download_params':
                    {
                    'language': 'deu',
                    'audio_channels': 2,
                    '_download_params': {
                        'stream_type': 'audio',
                        'duration': 370000000,
                        'timescale': 10000000,

@@ -1623,11 +1571,34 @@ def test_parse_ism_formats(self):
                        'sampling_rate': 48000,
                        'channels': 2,
                        'bits_per_sample': 16,
                        'nal_unit_length_field': 4
                        'nal_unit_length_field': 4,
                    },
                }, {
                    'format_id': 'audio_deu_1-224',
                    'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
                    'manifest_url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
                    'ext': 'isma',
                    'tbr': 224,
                    'asr': 48000,
                    'vcodec': 'none',
                    'acodec': 'EC-3',
                    'protocol': 'ism',
                    'language': 'deu',
                    'audio_channels': 6,
                    '_download_params': {
                        'stream_type': 'audio',
                        'duration': 370000000,
                        'timescale': 10000000,
                        'width': 0,
                        'height': 0,
                        'fourcc': 'EC-3',
                        'language': 'deu',
                        'codec_private_data': '00063F000000AF87FBA7022DFB42A4D405CD93843BDD0700200F00',
                        'sampling_rate': 48000,
                        'channels': 6,
                        'bits_per_sample': 16,
                        'nal_unit_length_field': 4,
                    },
                    'audio_ext': 'isma',
                    'video_ext': 'none',
                    'abr': 127,
                }, {
                    'format_id': 'video_deu-23',
                    'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',

@@ -1639,8 +1610,8 @@ def test_parse_ism_formats(self):
                    'vcodec': 'AVC1',
                    'acodec': 'none',
                    'protocol': 'ism',
                    '_download_params':
                    {
                    'language': 'deu',
                    '_download_params': {
                        'stream_type': 'video',
                        'duration': 370000000,
                        'timescale': 10000000,

@@ -1651,11 +1622,8 @@ def test_parse_ism_formats(self):
                        'codec_private_data': '000000016742C00CDB06077E5C05A808080A00000300020000030009C0C02EE0177CC6300F142AE00000000168CA8DC8',
                        'channels': 2,
                        'bits_per_sample': 16,
                        'nal_unit_length_field': 4
                        'nal_unit_length_field': 4,
                    },
                    'video_ext': 'ismv',
                    'audio_ext': 'none',
                    'vbr': 23,
                }, {
                    'format_id': 'video_deu-403',
                    'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',

@@ -1667,8 +1635,8 @@ def test_parse_ism_formats(self):
                    'vcodec': 'AVC1',
                    'acodec': 'none',
                    'protocol': 'ism',
                    '_download_params':
                    {
                    'language': 'deu',
                    '_download_params': {
                        'stream_type': 'video',
                        'duration': 370000000,
                        'timescale': 10000000,

@@ -1679,11 +1647,8 @@ def test_parse_ism_formats(self):
                        'codec_private_data': '00000001674D4014E98323B602D4040405000003000100000300320F1429380000000168EAECF2',
                        'channels': 2,
                        'bits_per_sample': 16,
                        'nal_unit_length_field': 4
                        'nal_unit_length_field': 4,
                    },
                    'video_ext': 'ismv',
                    'audio_ext': 'none',
                    'vbr': 403,
                }, {
                    'format_id': 'video_deu-680',
                    'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',

@@ -1695,8 +1660,8 @@ def test_parse_ism_formats(self):
                    'vcodec': 'AVC1',
                    'acodec': 'none',
                    'protocol': 'ism',
                    '_download_params':
                    {
                    'language': 'deu',
                    '_download_params': {
                        'stream_type': 'video',
                        'duration': 370000000,
                        'timescale': 10000000,

@@ -1707,11 +1672,8 @@ def test_parse_ism_formats(self):
                        'codec_private_data': '00000001674D401EE981405FF2E02D4040405000000300100000030320F162D3800000000168EAECF2',
                        'channels': 2,
                        'bits_per_sample': 16,
                        'nal_unit_length_field': 4
                        'nal_unit_length_field': 4,
                    },
                    'video_ext': 'ismv',
                    'audio_ext': 'none',
                    'vbr': 680,
                }, {
                    'format_id': 'video_deu-1253',
                    'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',

@@ -1723,8 +1685,9 @@ def test_parse_ism_formats(self):
                    'vcodec': 'AVC1',
                    'acodec': 'none',
                    'protocol': 'ism',
                    '_download_params':
                    {
                        'vbr': 1253,
                    'language': 'deu',
                    '_download_params': {
                        'stream_type': 'video',
                        'duration': 370000000,
                        'timescale': 10000000,

@@ -1735,11 +1698,8 @@ def test_parse_ism_formats(self):
                        'codec_private_data': '00000001674D401EE981405FF2E02D4040405000000300100000030320F162D3800000000168EAECF2',
                        'channels': 2,
                        'bits_per_sample': 16,
                        'nal_unit_length_field': 4
                        'nal_unit_length_field': 4,
                    },
                    'video_ext': 'ismv',
                    'audio_ext': 'none',
                    'vbr': 1253,
                }, {
                    'format_id': 'video_deu-2121',
                    'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',

@@ -1751,8 +1711,8 @@ def test_parse_ism_formats(self):
                    'vcodec': 'AVC1',
                    'acodec': 'none',
                    'protocol': 'ism',
                    '_download_params':
                    {
                    'language': 'deu',
                    '_download_params': {
                        'stream_type': 'video',
                        'duration': 370000000,
                        'timescale': 10000000,

@@ -1763,11 +1723,8 @@ def test_parse_ism_formats(self):
                        'codec_private_data': '00000001674D401EECA0601BD80B50101014000003000400000300C83C58B6580000000168E93B3C80',
                        'channels': 2,
                        'bits_per_sample': 16,
                        'nal_unit_length_field': 4
                        'nal_unit_length_field': 4,
                    },
                    'video_ext': 'ismv',
                    'audio_ext': 'none',
                    'vbr': 2121,
                }, {
                    'format_id': 'video_deu-3275',
                    'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',

@@ -1779,8 +1736,8 @@ def test_parse_ism_formats(self):
                    'vcodec': 'AVC1',
                    'acodec': 'none',
                    'protocol': 'ism',
                    '_download_params':
                    {
                    'language': 'deu',
                    '_download_params': {
                        'stream_type': 'video',
                        'duration': 370000000,
                        'timescale': 10000000,

@@ -1791,11 +1748,8 @@ def test_parse_ism_formats(self):
                        'codec_private_data': '00000001674D4020ECA02802DD80B501010140000003004000000C83C60C65800000000168E93B3C80',
                        'channels': 2,
                        'bits_per_sample': 16,
                        'nal_unit_length_field': 4
                        'nal_unit_length_field': 4,
                    },
                    'video_ext': 'ismv',
                    'audio_ext': 'none',
                    'vbr': 3275,
                }, {
                    'format_id': 'video_deu-5300',
                    'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',

@@ -1807,8 +1761,8 @@ def test_parse_ism_formats(self):
                    'vcodec': 'AVC1',
                    'acodec': 'none',
                    'protocol': 'ism',
                    '_download_params':
                    {
                    'language': 'deu',
                    '_download_params': {
                        'stream_type': 'video',
                        'duration': 370000000,
                        'timescale': 10000000,

@@ -1819,11 +1773,8 @@ def test_parse_ism_formats(self):
                        'codec_private_data': '00000001674D4028ECA03C0113F2E02D4040405000000300100000030320F18319600000000168E93B3C80',
                        'channels': 2,
                        'bits_per_sample': 16,
                        'nal_unit_length_field': 4
                        'nal_unit_length_field': 4,
                    },
                    'video_ext': 'ismv',
                    'audio_ext': 'none',
                    'vbr': 5300,
                }, {
                    'format_id': 'video_deu-8079',
                    'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',

@@ -1835,8 +1786,8 @@ def test_parse_ism_formats(self):
                    'vcodec': 'AVC1',
                    'acodec': 'none',
                    'protocol': 'ism',
                    '_download_params':
                    {
                    'language': 'deu',
                    '_download_params': {
                        'stream_type': 'video',
                        'duration': 370000000,
                        'timescale': 10000000,

@@ -1847,18 +1798,15 @@ def test_parse_ism_formats(self):
                        'codec_private_data': '00000001674D4028ECA03C0113F2E02D4040405000000300100000030320F18319600000000168E93B3C80',
                        'channels': 2,
                        'bits_per_sample': 16,
                        'nal_unit_length_field': 4
                        'nal_unit_length_field': 4,
                    },
                    'video_ext': 'ismv',
                    'audio_ext': 'none',
                    'vbr': 8079,
                }],
                {},
            ),
        ]

        for ism_file, ism_url, expected_formats, expected_subtitles in _TEST_CASES:
            with open('./test/testdata/ism/%s.Manifest' % ism_file, encoding='utf-8') as f:
            with open(f'./test/testdata/ism/{ism_file}.Manifest', encoding='utf-8') as f:
                formats, subtitles = self.ie._parse_ism_formats_and_subtitles(
                    compat_etree_fromstring(f.read().encode()), ism_url=ism_url)
                self.ie._sort_formats(formats)

@@ -1879,12 +1827,12 @@ def test_parse_f4m_formats(self):
                'tbr': 2148,
                'width': 1280,
                'height': 720,
            }]
            }],
            ),
        ]

        for f4m_file, f4m_url, expected_formats in _TEST_CASES:
            with open('./test/testdata/f4m/%s.f4m' % f4m_file, encoding='utf-8') as f:
            with open(f'./test/testdata/f4m/{f4m_file}.f4m', encoding='utf-8') as f:
                formats = self.ie._parse_f4m_formats(
                    compat_etree_fromstring(f.read().encode()),
                    f4m_url, None)

@@ -1925,13 +1873,13 @@ def test_parse_xspf(self):
            }, {
                'manifest_url': 'https://example.org/src/foo_xspf.xspf',
                'url': 'https://example.com/track3.mp3',
            }]
        }]
            }],
        }],
            ),
        ]

        for xspf_file, xspf_url, expected_entries in _TEST_CASES:
            with open('./test/testdata/xspf/%s.xspf' % xspf_file, encoding='utf-8') as f:
            with open(f'./test/testdata/xspf/{xspf_file}.xspf', encoding='utf-8') as f:
                entries = self.ie._parse_xspf(
                    compat_etree_fromstring(f.read().encode()),
                    xspf_file, xspf_url=xspf_url, xspf_base_url=xspf_url)

@@ -1954,10 +1902,19 @@ def test_response_with_expected_status_returns_content(self):
        server_thread.start()

        (content, urlh) = self.ie._download_webpage_handle(
            'http://127.0.0.1:%d/teapot' % port, None,
            f'http://127.0.0.1:{port}/teapot', None,
            expected_status=TEAPOT_RESPONSE_STATUS)
        self.assertEqual(content, TEAPOT_RESPONSE_BODY)

    def test_search_nextjs_data(self):
        data = '<script id="__NEXT_DATA__" type="application/json">{"props":{}}</script>'
        self.assertEqual(self.ie._search_nextjs_data(data, None), {'props': {}})
        self.assertEqual(self.ie._search_nextjs_data('', None, fatal=False), {})
        self.assertEqual(self.ie._search_nextjs_data('', None, default=None), None)
        self.assertEqual(self.ie._search_nextjs_data('', None, default={}), {})
        with self.assertWarns(DeprecationWarning):
            self.assertEqual(self.ie._search_nextjs_data('', None, default='{}'), {})


if __name__ == '__main__':
    unittest.main()
test/test_YoutubeDL.py

@@ -4,15 +4,16 @@
import os
import sys
import unittest
from unittest.mock import patch

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))


import contextlib
import copy
import json
import urllib.error

from test.helper import FakeYDL, assertRegexpMatches
from test.helper import FakeYDL, assertRegexpMatches, try_rm
from yt_dlp import YoutubeDL
from yt_dlp.compat import compat_os_name
from yt_dlp.extractor import YoutubeIE

@@ -25,6 +26,7 @@
    int_or_none,
    match_filter_func,
)
from yt_dlp.utils.traversal import traverse_obj

TEST_URL = 'http://localhost/sample.mp4'

@@ -129,8 +131,8 @@ def test(inp, *expected, multi=False):
                'allow_multiple_audio_streams': multi,
            })
            ydl.process_ie_result(info_dict.copy())
            downloaded = map(lambda x: x['format_id'], ydl.downloaded_info_dicts)
            self.assertEqual(list(downloaded), list(expected))
            downloaded = [x['format_id'] for x in ydl.downloaded_info_dicts]
            self.assertEqual(downloaded, list(expected))

        test('20/47', '47')
        test('20/71/worst', '35')

@@ -140,6 +142,8 @@ def test(inp, *expected, multi=False):
        test('example-with-dashes', 'example-with-dashes')
        test('all', '2', '47', '45', 'example-with-dashes', '35')
        test('mergeall', '2+47+45+example-with-dashes+35', multi=True)
        # See: https://github.com/yt-dlp/yt-dlp/pulls/8797
        test('7_a/worst', '35')

    def test_format_selection_audio(self):
        formats = [

@@ -181,7 +185,7 @@ def test_format_selection_audio_exts(self):
        ]

        info_dict = _make_result(formats)
        ydl = YDL({'format': 'best'})
        ydl = YDL({'format': 'best', 'format_sort': ['abr', 'ext']})
        ydl.sort_formats(info_dict)
        ydl.process_ie_result(copy.deepcopy(info_dict))
        downloaded = ydl.downloaded_info_dicts[0]

@@ -193,7 +197,7 @@ def test_format_selection_audio_exts(self):
        downloaded = ydl.downloaded_info_dicts[0]
        self.assertEqual(downloaded['format_id'], 'mp3-64')

        ydl = YDL({'prefer_free_formats': True})
        ydl = YDL({'prefer_free_formats': True, 'format_sort': ['abr', 'ext']})
        ydl.sort_formats(info_dict)
        ydl.process_ie_result(copy.deepcopy(info_dict))
        downloaded = ydl.downloaded_info_dicts[0]

@@ -232,6 +236,35 @@ def test_format_selection_video(self):
        downloaded = ydl.downloaded_info_dicts[0]
        self.assertEqual(downloaded['format_id'], 'vid-vcodec-dot')

    def test_format_selection_by_vcodec_sort(self):
        formats = [
            {'format_id': 'av1-format', 'ext': 'mp4', 'vcodec': 'av1', 'acodec': 'none', 'url': TEST_URL},
            {'format_id': 'vp9-hdr-format', 'ext': 'mp4', 'vcodec': 'vp09.02.50.10.01.09.18.09.00', 'acodec': 'none', 'url': TEST_URL},
            {'format_id': 'vp9-sdr-format', 'ext': 'mp4', 'vcodec': 'vp09.00.50.08', 'acodec': 'none', 'url': TEST_URL},
            {'format_id': 'h265-format', 'ext': 'mp4', 'vcodec': 'h265', 'acodec': 'none', 'url': TEST_URL},
        ]
        info_dict = _make_result(formats)

        ydl = YDL({'format': 'bestvideo', 'format_sort': ['vcodec:vp9.2']})
        ydl.process_ie_result(info_dict.copy())
        downloaded = ydl.downloaded_info_dicts[0]
        self.assertEqual(downloaded['format_id'], 'vp9-hdr-format')

        ydl = YDL({'format': 'bestvideo', 'format_sort': ['vcodec:vp9']})
        ydl.process_ie_result(info_dict.copy())
        downloaded = ydl.downloaded_info_dicts[0]
        self.assertEqual(downloaded['format_id'], 'vp9-sdr-format')

        ydl = YDL({'format': 'bestvideo', 'format_sort': ['+vcodec:vp9.2']})
        ydl.process_ie_result(info_dict.copy())
        downloaded = ydl.downloaded_info_dicts[0]
        self.assertEqual(downloaded['format_id'], 'vp9-hdr-format')

        ydl = YDL({'format': 'bestvideo', 'format_sort': ['+vcodec:vp9']})
        ydl.process_ie_result(info_dict.copy())
        downloaded = ydl.downloaded_info_dicts[0]
        self.assertEqual(downloaded['format_id'], 'vp9-sdr-format')

    def test_format_selection_string_ops(self):
        formats = [
            {'format_id': 'abc-cba', 'ext': 'mp4', 'url': TEST_URL},

@@ -513,13 +546,37 @@ def test_format_filtering(self):
        self.assertEqual(downloaded_ids, ['D', 'C', 'B'])

        ydl = YDL({'format': 'best[height<40]'})
        try:
        with contextlib.suppress(ExtractorError):
            ydl.process_ie_result(info_dict)
        except ExtractorError:
            pass
        self.assertEqual(ydl.downloaded_info_dicts, [])

    def test_default_format_spec(self):
    @patch('yt_dlp.postprocessor.ffmpeg.FFmpegMergerPP.available', False)
    def test_default_format_spec_without_ffmpeg(self):
        ydl = YDL({})
        self.assertEqual(ydl._default_format_spec({}), 'best/bestvideo+bestaudio')

        ydl = YDL({'simulate': True})
        self.assertEqual(ydl._default_format_spec({}), 'best/bestvideo+bestaudio')

        ydl = YDL({})
        self.assertEqual(ydl._default_format_spec({'is_live': True}), 'best/bestvideo+bestaudio')

        ydl = YDL({'simulate': True})
        self.assertEqual(ydl._default_format_spec({'is_live': True}), 'best/bestvideo+bestaudio')

        ydl = YDL({'outtmpl': '-'})
        self.assertEqual(ydl._default_format_spec({}), 'best/bestvideo+bestaudio')

        ydl = YDL({})
        self.assertEqual(ydl._default_format_spec({}), 'best/bestvideo+bestaudio')
        self.assertEqual(ydl._default_format_spec({'is_live': True}), 'best/bestvideo+bestaudio')

    @patch('yt_dlp.postprocessor.ffmpeg.FFmpegMergerPP.available', True)
    @patch('yt_dlp.postprocessor.ffmpeg.FFmpegMergerPP.can_merge', lambda _: True)
    def test_default_format_spec_with_ffmpeg(self):
        ydl = YDL({})
        self.assertEqual(ydl._default_format_spec({}), 'bestvideo*+bestaudio/best')

        ydl = YDL({'simulate': True})
        self.assertEqual(ydl._default_format_spec({}), 'bestvideo*+bestaudio/best')

@@ -527,13 +584,13 @@ def test_default_format_spec(self):
        self.assertEqual(ydl._default_format_spec({'is_live': True}), 'best/bestvideo+bestaudio')

        ydl = YDL({'simulate': True})
        self.assertEqual(ydl._default_format_spec({'is_live': True}), 'bestvideo*+bestaudio/best')
        self.assertEqual(ydl._default_format_spec({'is_live': True}), 'best/bestvideo+bestaudio')

        ydl = YDL({'outtmpl': '-'})
        self.assertEqual(ydl._default_format_spec({}), 'best/bestvideo+bestaudio')

        ydl = YDL({})
        self.assertEqual(ydl._default_format_spec({}, download=False), 'bestvideo*+bestaudio/best')
        self.assertEqual(ydl._default_format_spec({}), 'bestvideo*+bestaudio/best')
        self.assertEqual(ydl._default_format_spec({'is_live': True}), 'best/bestvideo+bestaudio')

@@ -650,8 +707,8 @@ def test_add_extra_info(self):
        'formats': [
            {'id': 'id 1', 'height': 1080, 'width': 1920},
            {'id': 'id 2', 'height': 720},
            {'id': 'id 3'}
        ]
            {'id': 'id 3'},
        ],
    }

    def test_prepare_outtmpl_and_filename(self):

@@ -669,7 +726,7 @@ def test(tmpl, expected, *, info=None, **params):
            for (name, got), expect in zip((('outtmpl', out), ('filename', fname)), expected):
                if callable(expect):
                    self.assertTrue(expect(got), f'Wrong {name} from {tmpl}')
                else:
                elif expect is not None:
                    self.assertEqual(got, expect, f'Wrong {name} from {tmpl}')

        # Side-effects

@@ -684,7 +741,8 @@ def test(tmpl, expected, *, info=None, **params):
        test('%(id)s.%(ext)s', '1234.mp4')
        test('%(duration_string)s', ('27:46:40', '27-46-40'))
        test('%(resolution)s', '1080p')
        test('%(playlist_index)s', '001')
        test('%(playlist_index|)s', '001')
        test('%(playlist_index&{}!)s', '1!')
        test('%(playlist_autonumber)s', '02')
        test('%(autonumber)s', '00001')
        test('%(autonumber+2)03d', '005', autonumber_start=3)

@@ -727,7 +785,7 @@ def expect_same_infodict(out):
                self.assertEqual(got_dict.get(info_field), expected, info_field)
            return True

        test('%()j', (expect_same_infodict, str))
        test('%()j', (expect_same_infodict, None))

        # NA placeholder
        NA_TEST_OUTTMPL = '%(uploader_date)s-%(width)d-%(x|def)s-%(id)s.%(ext)s'

@@ -755,20 +813,23 @@ def expect_same_infodict(out):
        test('%(ext)c', 'm')
        test('%(id)d %(id)r', "1234 '1234'")
        test('%(id)r %(height)r', "'1234' 1080")
        test('%(title5)a %(height)a', (R"'\xe1\xe9\xed \U0001d400' 1080", None))
        test('%(ext)s-%(ext|def)d', 'mp4-def')
        test('%(width|0)04d', '0000')
        test('a%(width|)d', 'a', outtmpl_na_placeholder='none')
        test('%(width|0)04d', '0')
        test('a%(width|b)d', 'ab', outtmpl_na_placeholder='none')

        FORMATS = self.outtmpl_info['formats']
        sanitize = lambda x: x.replace(':', '：').replace('"', '＂').replace('\n', ' ')

        # Custom type casting
        test('%(formats.:.id)l', 'id 1, id 2, id 3')
        test('%(formats.:.id)#l', ('id 1\nid 2\nid 3', 'id 1 id 2 id 3'))
        test('%(ext)l', 'mp4')
        test('%(formats.:.id) 18l', '  id 1, id 2, id 3')
        test('%(formats)j', (json.dumps(FORMATS), sanitize(json.dumps(FORMATS))))
        test('%(formats)#j', (json.dumps(FORMATS, indent=4), sanitize(json.dumps(FORMATS, indent=4))))
        test('%(formats)j', (json.dumps(FORMATS), None))
        test('%(formats)#j', (
            json.dumps(FORMATS, indent=4),
            json.dumps(FORMATS, indent=4).replace(':', '：').replace('"', '＂').replace('\n', ' '),
        ))
        test('%(title5).3B', 'á')
        test('%(title5)U', 'áéí 𝐀')
        test('%(title5)#U', 'a\u0301e\u0301i\u0301 𝐀')

@@ -780,9 +841,9 @@ def expect_same_infodict(out):
        test('%(title4)#S', 'foo_bar_test')
        test('%(title4).10S', ('foo ＂bar＂ ', 'foo ＂bar＂' + ('#' if compat_os_name == 'nt' else ' ')))
        if compat_os_name == 'nt':
            test('%(title4)q', ('"foo \\"bar\\" test"', '＂foo ⧹＂bar⧹＂ test＂'))
            test('%(formats.:.id)#q', ('"id 1" "id 2" "id 3"', '＂id 1＂ ＂id 2＂ ＂id 3＂'))
            test('%(formats.0.id)#q', ('"id 1"', '＂id 1＂'))
            test('%(title4)q', ('"foo ""bar"" test"', None))
            test('%(formats.:.id)#q', ('"id 1" "id 2" "id 3"', None))
            test('%(formats.0.id)#q', ('"id 1"', None))
        else:
            test('%(title4)q', ('\'foo "bar" test\'', '\'foo "bar" test\''))
            test('%(formats.:.id)#q', "'id 1' 'id 2' 'id 3'")

@@ -793,8 +854,9 @@ def expect_same_infodict(out):
        test('%(title|%)s %(title|%%)s', '% %%')
        test('%(id+1-height+3)05d', '00158')
        test('%(width+100)05d', 'NA')
        test('%(formats.0) 15s', ('% 15s' % FORMATS[0], '% 15s' % sanitize(str(FORMATS[0]))))
        test('%(formats.0)r', (repr(FORMATS[0]), sanitize(repr(FORMATS[0]))))
        test('%(filesize*8)d', '8192')
        test('%(formats.0) 15s', ('% 15s' % FORMATS[0], None))
        test('%(formats.0)r', (repr(FORMATS[0]), None))
        test('%(height.0)03d', '001')
        test('%(-height.0)04d', '-001')
        test('%(formats.-1.id)s', FORMATS[-1]['id'])

@@ -806,7 +868,7 @@ def expect_same_infodict(out):
        out = json.dumps([{'id': f['id'], 'height.:2': str(f['height'])[:2]}
                          if 'height' in f else {'id': f['id']}
                          for f in FORMATS])
        test('%(formats.:.{id,height.:2})j', (out, sanitize(out)))
        test('%(formats.:.{id,height.:2})j', (out, None))
        test('%(formats.:.{id,height}.id)l', ', '.join(f['id'] for f in FORMATS))
        test('%(.{id,title})j', ('{"id": "1234"}', '{＂id＂: ＂1234＂}'))

@@ -822,6 +884,11 @@ def expect_same_infodict(out):
        test('%(title&foo|baz)s.bar', 'baz.bar')
        test('%(x,id&foo|baz)s.bar', 'foo.bar')
        test('%(x,title&foo|baz)s.bar', 'baz.bar')
        test('%(id&a\nb|)s', ('a\nb', 'a b'))
        test('%(id&hi {:>10} {}|)s', 'hi       1234 1234')
        test(R'%(id&{0} {}|)s', 'NA')
        test(R'%(id&{0.1}|)s', 'NA')
        test('%(height&{:,d})S', '1,080')

        # Laziness
        def gen():

@@ -831,8 +898,8 @@ def gen():

        # Empty filename
        test('%(foo|)s-%(bar|)s.%(ext)s', '-.mp4')
        # test('%(foo|)s.%(ext)s', ('.mp4', '_.mp4'))  # fixme
        # test('%(foo|)s', ('', '_'))  # fixme
        # test('%(foo|)s.%(ext)s', ('.mp4', '_.mp4'))  # FIXME: ?
        # test('%(foo|)s', ('', '_'))  # FIXME: ?

        # Environment variable expansion for prepare_filename
        os.environ['__yt_dlp_var'] = 'expanded'

@@ -849,7 +916,7 @@ def gen():
        test('Hello %(title1)s', 'Hello $PATH')
        test('Hello %(title2)s', 'Hello %PATH%')
        test('%(title3)s', ('foo/bar\\test', 'foo⧸bar⧹test'))
        test('folder/%(title3)s', ('folder/foo/bar\\test', 'folder%sfoo⧸bar⧹test' % os.path.sep))
        test('folder/%(title3)s', ('folder/foo/bar\\test', f'folder{os.path.sep}foo⧸bar⧹test'))
|
||||
|
||||
def test_format_note(self):
|
||||
ydl = YoutubeDL()
|
||||
|
@ -867,36 +934,36 @@ def test_postprocessors(self):
|
|||
|
||||
class SimplePP(PostProcessor):
|
||||
def run(self, info):
|
||||
with open(audiofile, 'wt') as f:
|
||||
with open(audiofile, 'w') as f:
|
||||
f.write('EXAMPLE')
|
||||
return [info['filepath']], info
|
||||
|
||||
def run_pp(params, PP):
|
||||
with open(filename, 'wt') as f:
|
||||
def run_pp(params, pp):
|
||||
with open(filename, 'w') as f:
|
||||
f.write('EXAMPLE')
|
||||
ydl = YoutubeDL(params)
|
||||
ydl.add_post_processor(PP())
|
||||
ydl.add_post_processor(pp())
|
||||
ydl.post_process(filename, {'filepath': filename})
|
||||
|
||||
run_pp({'keepvideo': True}, SimplePP)
|
||||
self.assertTrue(os.path.exists(filename), '%s doesn\'t exist' % filename)
|
||||
self.assertTrue(os.path.exists(audiofile), '%s doesn\'t exist' % audiofile)
|
||||
self.assertTrue(os.path.exists(filename), f'{filename} doesn\'t exist')
|
||||
self.assertTrue(os.path.exists(audiofile), f'{audiofile} doesn\'t exist')
|
||||
os.unlink(filename)
|
||||
os.unlink(audiofile)
|
||||
|
||||
run_pp({'keepvideo': False}, SimplePP)
|
||||
self.assertFalse(os.path.exists(filename), '%s exists' % filename)
|
||||
self.assertTrue(os.path.exists(audiofile), '%s doesn\'t exist' % audiofile)
|
||||
self.assertFalse(os.path.exists(filename), f'{filename} exists')
|
||||
self.assertTrue(os.path.exists(audiofile), f'{audiofile} doesn\'t exist')
|
||||
os.unlink(audiofile)
|
||||
|
||||
class ModifierPP(PostProcessor):
|
||||
def run(self, info):
|
||||
with open(info['filepath'], 'wt') as f:
|
||||
with open(info['filepath'], 'w') as f:
|
||||
f.write('MODIFIED')
|
||||
return [], info
|
||||
|
||||
run_pp({'keepvideo': False}, ModifierPP)
|
||||
self.assertTrue(os.path.exists(filename), '%s doesn\'t exist' % filename)
|
||||
self.assertTrue(os.path.exists(filename), f'{filename} doesn\'t exist')
|
||||
os.unlink(filename)
|
||||
|
||||
def test_match_filter(self):
|
||||
|
@ -908,7 +975,7 @@ def test_match_filter(self):
|
|||
'duration': 30,
|
||||
'filesize': 10 * 1024,
|
||||
'playlist_id': '42',
|
||||
'uploader': "變態妍字幕版 太妍 тест",
|
||||
'uploader': '變態妍字幕版 太妍 тест',
|
||||
'creator': "тест ' 123 ' тест--",
|
||||
'webpage_url': 'http://example.com/watch?v=shenanigans',
|
||||
}
|
||||
|
@ -921,7 +988,7 @@ def test_match_filter(self):
|
|||
'description': 'foo',
|
||||
'filesize': 5 * 1024,
|
||||
'playlist_id': '43',
|
||||
'uploader': "тест 123",
|
||||
'uploader': 'тест 123',
|
||||
'webpage_url': 'http://example.com/watch?v=SHENANIGANS',
|
||||
}
|
||||
videos = [first, second]
|
||||
|
@ -929,7 +996,7 @@ def test_match_filter(self):
|
|||
def get_videos(filter_=None):
|
||||
ydl = YDL({'match_filter': filter_, 'simulate': True})
|
||||
for v in videos:
|
||||
ydl.process_ie_result(v, download=True)
|
||||
ydl.process_ie_result(v.copy(), download=True)
|
||||
return [v['id'] for v in ydl.downloaded_info_dicts]
|
||||
|
||||
res = get_videos()
|
||||
|
@ -1093,11 +1160,6 @@ def test_selection(params, expected_ids, evaluate_all=False):
|
|||
test_selection({'playlist_items': '-15::2'}, INDICES[1::2], True)
|
||||
test_selection({'playlist_items': '-15::15'}, [], True)
|
||||
|
||||
def test_urlopen_no_file_protocol(self):
|
||||
# see https://github.com/ytdl-org/youtube-dl/issues/8227
|
||||
ydl = YDL()
|
||||
self.assertRaises(urllib.error.URLError, ydl.urlopen, 'file:///etc/passwd')
|
||||
|
||||
def test_do_not_override_ie_key_in_url_transparent(self):
|
||||
ydl = YDL()
|
||||
|
||||
|
@ -1173,7 +1235,7 @@ def _real_extract(self, url):
|
|||
})
|
||||
return {
|
||||
'id': video_id,
|
||||
'title': 'Video %s' % video_id,
|
||||
'title': f'Video {video_id}',
|
||||
'formats': formats,
|
||||
}
|
||||
|
||||
|
@ -1187,8 +1249,8 @@ def _entries(self):
|
|||
'_type': 'url_transparent',
|
||||
'ie_key': VideoIE.ie_key(),
|
||||
'id': video_id,
|
||||
'url': 'video:%s' % video_id,
|
||||
'title': 'Video Transparent %s' % video_id,
|
||||
'url': f'video:{video_id}',
|
||||
'title': f'Video Transparent {video_id}',
|
||||
}
|
||||
|
||||
def _real_extract(self, url):
|
||||
|
@ -1211,6 +1273,129 @@ def _real_extract(self, url):
|
|||
self.assertEqual(downloaded['extractor'], 'Video')
|
||||
self.assertEqual(downloaded['extractor_key'], 'Video')
|
||||
|
||||
def test_header_cookies(self):
|
||||
from http.cookiejar import Cookie
|
||||
|
||||
ydl = FakeYDL()
|
||||
ydl.report_warning = lambda *_, **__: None
|
||||
|
||||
def cookie(name, value, version=None, domain='', path='', secure=False, expires=None):
|
||||
return Cookie(
|
||||
version or 0, name, value, None, False,
|
||||
domain, bool(domain), bool(domain), path, bool(path),
|
||||
secure, expires, False, None, None, rest={})
|
||||
|
||||
_test_url = 'https://yt.dlp/test'
|
||||
|
||||
def test(encoded_cookies, cookies, *, headers=False, round_trip=None, error_re=None):
|
||||
def _test():
|
||||
ydl.cookiejar.clear()
|
||||
ydl._load_cookies(encoded_cookies, autoscope=headers)
|
||||
if headers:
|
||||
ydl._apply_header_cookies(_test_url)
|
||||
data = {'url': _test_url}
|
||||
ydl._calc_headers(data)
|
||||
self.assertCountEqual(
|
||||
map(vars, ydl.cookiejar), map(vars, cookies),
|
||||
'Extracted cookiejar.Cookie is not the same')
|
||||
if not headers:
|
||||
self.assertEqual(
|
||||
data.get('cookies'), round_trip or encoded_cookies,
|
||||
'Cookie is not the same as round trip')
|
||||
ydl.__dict__['_YoutubeDL__header_cookies'] = []
|
||||
|
||||
with self.subTest(msg=encoded_cookies):
|
||||
if not error_re:
|
||||
_test()
|
||||
return
|
||||
with self.assertRaisesRegex(Exception, error_re):
|
||||
_test()
|
||||
|
||||
test('test=value; Domain=.yt.dlp', [cookie('test', 'value', domain='.yt.dlp')])
|
||||
test('test=value', [cookie('test', 'value')], error_re=r'Unscoped cookies are not allowed')
|
||||
test('cookie1=value1; Domain=.yt.dlp; Path=/test; cookie2=value2; Domain=.yt.dlp; Path=/', [
|
||||
cookie('cookie1', 'value1', domain='.yt.dlp', path='/test'),
|
||||
cookie('cookie2', 'value2', domain='.yt.dlp', path='/')])
|
||||
test('test=value; Domain=.yt.dlp; Path=/test; Secure; Expires=9999999999', [
|
||||
cookie('test', 'value', domain='.yt.dlp', path='/test', secure=True, expires=9999999999)])
|
||||
test('test="value; "; path=/test; domain=.yt.dlp', [
|
||||
cookie('test', 'value; ', domain='.yt.dlp', path='/test')],
|
||||
round_trip='test="value\\073 "; Domain=.yt.dlp; Path=/test')
|
||||
test('name=; Domain=.yt.dlp', [cookie('name', '', domain='.yt.dlp')],
|
||||
round_trip='name=""; Domain=.yt.dlp')
|
||||
|
||||
test('test=value', [cookie('test', 'value', domain='.yt.dlp')], headers=True)
|
||||
test('cookie1=value; Domain=.yt.dlp; cookie2=value', [], headers=True, error_re=r'Invalid syntax')
|
||||
ydl.deprecated_feature = ydl.report_error
|
||||
test('test=value', [], headers=True, error_re=r'Passing cookies as a header is a potential security risk')
|
||||
|
||||
def test_infojson_cookies(self):
|
||||
TEST_FILE = 'test_infojson_cookies.info.json'
|
||||
TEST_URL = 'https://example.com/example.mp4'
|
||||
COOKIES = 'a=b; Domain=.example.com; c=d; Domain=.example.com'
|
||||
COOKIE_HEADER = {'Cookie': 'a=b; c=d'}
|
||||
|
||||
ydl = FakeYDL()
|
||||
ydl.process_info = lambda x: ydl._write_info_json('test', x, TEST_FILE)
|
||||
|
||||
def make_info(info_header_cookies=False, fmts_header_cookies=False, cookies_field=False):
|
||||
fmt = {'url': TEST_URL}
|
||||
if fmts_header_cookies:
|
||||
fmt['http_headers'] = COOKIE_HEADER
|
||||
if cookies_field:
|
||||
fmt['cookies'] = COOKIES
|
||||
return _make_result([fmt], http_headers=COOKIE_HEADER if info_header_cookies else None)
|
||||
|
||||
def test(initial_info, note):
|
||||
result = {}
|
||||
result['processed'] = ydl.process_ie_result(initial_info)
|
||||
self.assertTrue(ydl.cookiejar.get_cookies_for_url(TEST_URL),
|
||||
msg=f'No cookies set in cookiejar after initial process when {note}')
|
||||
ydl.cookiejar.clear()
|
||||
with open(TEST_FILE) as infojson:
|
||||
result['loaded'] = ydl.sanitize_info(json.load(infojson), True)
|
||||
result['final'] = ydl.process_ie_result(result['loaded'].copy(), download=False)
|
||||
self.assertTrue(ydl.cookiejar.get_cookies_for_url(TEST_URL),
|
||||
msg=f'No cookies set in cookiejar after final process when {note}')
|
||||
ydl.cookiejar.clear()
|
||||
for key in ('processed', 'loaded', 'final'):
|
||||
info = result[key]
|
||||
self.assertIsNone(
|
||||
traverse_obj(info, ((None, ('formats', 0)), 'http_headers', 'Cookie'), casesense=False, get_all=False),
|
||||
msg=f'Cookie header not removed in {key} result when {note}')
|
||||
self.assertEqual(
|
||||
traverse_obj(info, ((None, ('formats', 0)), 'cookies'), get_all=False), COOKIES,
|
||||
msg=f'No cookies field found in {key} result when {note}')
|
||||
|
||||
test({'url': TEST_URL, 'http_headers': COOKIE_HEADER, 'id': '1', 'title': 'x'}, 'no formats field')
|
||||
test(make_info(info_header_cookies=True), 'info_dict header cokies')
|
||||
test(make_info(fmts_header_cookies=True), 'format header cookies')
|
||||
test(make_info(info_header_cookies=True, fmts_header_cookies=True), 'info_dict and format header cookies')
|
||||
test(make_info(info_header_cookies=True, fmts_header_cookies=True, cookies_field=True), 'all cookies fields')
|
||||
test(make_info(cookies_field=True), 'cookies format field')
|
||||
test({'url': TEST_URL, 'cookies': COOKIES, 'id': '1', 'title': 'x'}, 'info_dict cookies field only')
|
||||
|
||||
try_rm(TEST_FILE)
|
||||
|
||||
def test_add_headers_cookie(self):
|
||||
def check_for_cookie_header(result):
|
||||
return traverse_obj(result, ((None, ('formats', 0)), 'http_headers', 'Cookie'), casesense=False, get_all=False)
|
||||
|
||||
ydl = FakeYDL({'http_headers': {'Cookie': 'a=b'}})
|
||||
ydl._apply_header_cookies(_make_result([])['webpage_url']) # Scope to input webpage URL: .example.com
|
||||
|
||||
fmt = {'url': 'https://example.com/video.mp4'}
|
||||
result = ydl.process_ie_result(_make_result([fmt]), download=False)
|
||||
self.assertIsNone(check_for_cookie_header(result), msg='http_headers cookies in result info_dict')
|
||||
self.assertEqual(result.get('cookies'), 'a=b; Domain=.example.com', msg='No cookies were set in cookies field')
|
||||
self.assertIn('a=b', ydl.cookiejar.get_cookie_header(fmt['url']), msg='No cookies were set in cookiejar')
|
||||
|
||||
fmt = {'url': 'https://wrong.com/video.mp4'}
|
||||
result = ydl.process_ie_result(_make_result([fmt]), download=False)
|
||||
self.assertIsNone(check_for_cookie_header(result), msg='http_headers cookies for wrong domain')
|
||||
self.assertFalse(result.get('cookies'), msg='Cookies set in cookies field for wrong domain')
|
||||
self.assertFalse(ydl.cookiejar.get_cookie_header(fmt['url']), msg='Cookies set in cookiejar for wrong domain')
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
unittest.main()
|
||||
|
|
|
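The hunks above exercise yt-dlp's output-template engine. For orientation, a minimal sketch of driving the same machinery directly (assuming a local yt_dlp install; evaluate_outtmpl and prepare_filename are the entry points the test wraps):

# Minimal sketch, not part of the diff: the "l" conversion joins traversed
# values with ', ', and prepare_filename applies the configured outtmpl.
from yt_dlp import YoutubeDL

ydl = YoutubeDL({'outtmpl': '%(id)s.%(ext)s'})
info = {'id': '1234', 'ext': 'mp4', 'formats': [{'id': 'id 1'}, {'id': 'id 2'}]}
print(ydl.evaluate_outtmpl('%(formats.:.id)l', info))  # 'id 1, id 2'
print(ydl.prepare_filename(info))                      # '1234.mp4'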
@@ -11,16 +11,16 @@
import re
import tempfile

from yt_dlp.utils import YoutubeDLCookieJar
from yt_dlp.cookies import YoutubeDLCookieJar


class TestYoutubeDLCookieJar(unittest.TestCase):
def test_keep_session_cookies(self):
cookiejar = YoutubeDLCookieJar('./test/testdata/cookies/session_cookies.txt')
cookiejar.load(ignore_discard=True, ignore_expires=True)
cookiejar.load()
tf = tempfile.NamedTemporaryFile(delete=False)
try:
cookiejar.save(filename=tf.name, ignore_discard=True, ignore_expires=True)
cookiejar.save(filename=tf.name)
temp = tf.read().decode()
self.assertTrue(re.search(
r'www\.foobar\.foobar\s+FALSE\s+/\s+TRUE\s+0\s+YoutubeDLExpiresEmpty\s+YoutubeDLExpiresEmptyValue', temp))

@@ -32,7 +32,7 @@ def test_keep_session_cookies(self):

def test_strip_httponly_prefix(self):
cookiejar = YoutubeDLCookieJar('./test/testdata/cookies/httponly_cookies.txt')
cookiejar.load(ignore_discard=True, ignore_expires=True)
cookiejar.load()

def assert_cookie_has_value(key):
self.assertEqual(cookiejar._cookies['www.foobar.foobar']['/'][key].value, key + '_VALUE')

@@ -42,11 +42,25 @@ def assert_cookie_has_value(key):

def test_malformed_cookies(self):
cookiejar = YoutubeDLCookieJar('./test/testdata/cookies/malformed_cookies.txt')
cookiejar.load(ignore_discard=True, ignore_expires=True)
cookiejar.load()
# Cookies should be empty since all malformed cookie file entries
# will be ignored
self.assertFalse(cookiejar._cookies)

def test_get_cookie_header(self):
cookiejar = YoutubeDLCookieJar('./test/testdata/cookies/httponly_cookies.txt')
cookiejar.load()
header = cookiejar.get_cookie_header('https://www.foobar.foobar')
self.assertIn('HTTPONLY_COOKIE', header)

def test_get_cookies_for_url(self):
cookiejar = YoutubeDLCookieJar('./test/testdata/cookies/session_cookies.txt')
cookiejar.load()
cookies = cookiejar.get_cookies_for_url('https://www.foobar.foobar/')
self.assertEqual(len(cookies), 2)
cookies = cookiejar.get_cookies_for_url('https://foobar.foobar/')
self.assertFalse(cookies)


if __name__ == '__main__':
unittest.main()
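For reference, the two helpers covered by the new tests can be exercised directly against the bundled test data; a minimal sketch (assuming it is run from the repository root):

# Sketch, not part of the diff: querying a loaded YoutubeDLCookieJar.
from yt_dlp.cookies import YoutubeDLCookieJar

jar = YoutubeDLCookieJar('./test/testdata/cookies/session_cookies.txt')
jar.load()  # session cookies are kept; the ignore_* flags are no longer needed
print(jar.get_cookie_header('https://www.foobar.foobar'))          # 'name=value; ...'
print(len(jar.get_cookies_for_url('https://www.foobar.foobar/')))  # 2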
@@ -26,7 +26,7 @@
key_expansion,
pad_block,
)
from yt_dlp.dependencies import Cryptodome_AES
from yt_dlp.dependencies import Cryptodome
from yt_dlp.utils import bytes_to_intlist, intlist_to_bytes

# the encrypted data can be generated with 'devscripts/generate_aes_testdata.py'

@@ -48,7 +48,7 @@ def test_cbc_decrypt(self):
data = b'\x97\x92+\xe5\x0b\xc3\x18\x91ky9m&\xb3\xb5@\xe6\x27\xc2\x96.\xc8u\x88\xab9-[\x9e|\xf1\xcd'
decrypted = intlist_to_bytes(aes_cbc_decrypt(bytes_to_intlist(data), self.key, self.iv))
self.assertEqual(decrypted.rstrip(b'\x08'), self.secret_msg)
if Cryptodome_AES:
if Cryptodome.AES:
decrypted = aes_cbc_decrypt_bytes(data, intlist_to_bytes(self.key), intlist_to_bytes(self.iv))
self.assertEqual(decrypted.rstrip(b'\x08'), self.secret_msg)

@@ -78,7 +78,7 @@ def test_gcm_decrypt(self):
decrypted = intlist_to_bytes(aes_gcm_decrypt_and_verify(
bytes_to_intlist(data), self.key, bytes_to_intlist(authentication_tag), self.iv[:12]))
self.assertEqual(decrypted.rstrip(b'\x08'), self.secret_msg)
if Cryptodome_AES:
if Cryptodome.AES:
decrypted = aes_gcm_decrypt_and_verify_bytes(
data, intlist_to_bytes(self.key), authentication_tag, intlist_to_bytes(self.iv[:12]))
self.assertEqual(decrypted.rstrip(b'\x08'), self.secret_msg)

@@ -87,7 +87,7 @@ def test_decrypt_text(self):
password = intlist_to_bytes(self.key).decode()
encrypted = base64.b64encode(
intlist_to_bytes(self.iv[:8])
+ b'\x17\x15\x93\xab\x8d\x80V\xcdV\xe0\t\xcdo\xc2\xa5\xd8ksM\r\xe27N\xae'
+ b'\x17\x15\x93\xab\x8d\x80V\xcdV\xe0\t\xcdo\xc2\xa5\xd8ksM\r\xe27N\xae',
).decode()
decrypted = (aes_decrypt_text(encrypted, password, 16))
self.assertEqual(decrypted, self.secret_msg)

@@ -95,7 +95,7 @@ def test_decrypt_text(self):
password = intlist_to_bytes(self.key).decode()
encrypted = base64.b64encode(
intlist_to_bytes(self.iv[:8])
+ b'\x0b\xe6\xa4\xd9z\x0e\xb8\xb9\xd0\xd4i_\x85\x1d\x99\x98_\xe5\x80\xe7.\xbf\xa5\x83'
+ b'\x0b\xe6\xa4\xd9z\x0e\xb8\xb9\xd0\xd4i_\x85\x1d\x99\x98_\xe5\x80\xe7.\xbf\xa5\x83',
).decode()
decrypted = (aes_decrypt_text(encrypted, password, 32))
self.assertEqual(decrypted, self.secret_msg)

@@ -132,16 +132,16 @@ def test_pad_block(self):
block = [0x21, 0xA0, 0x43, 0xFF]

self.assertEqual(pad_block(block, 'pkcs7'),
block + [0x0C, 0x0C, 0x0C, 0x0C, 0x0C, 0x0C, 0x0C, 0x0C, 0x0C, 0x0C, 0x0C, 0x0C])
[*block, 0x0C, 0x0C, 0x0C, 0x0C, 0x0C, 0x0C, 0x0C, 0x0C, 0x0C, 0x0C, 0x0C, 0x0C])

self.assertEqual(pad_block(block, 'iso7816'),
block + [0x80, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00])
[*block, 0x80, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00])

self.assertEqual(pad_block(block, 'whitespace'),
block + [0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20])
[*block, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20, 0x20])

self.assertEqual(pad_block(block, 'zero'),
block + [0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00])
[*block, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00])

block = list(range(16))
for mode in ('pkcs7', 'iso7816', 'whitespace', 'zero'):
@@ -10,6 +10,7 @@

from test.helper import is_download_test, try_rm
from yt_dlp import YoutubeDL
from yt_dlp.utils import DownloadError


def _download_restricted(url, filename, age):

@@ -25,10 +26,14 @@ def _download_restricted(url, filename, age):
ydl.add_default_info_extractors()
json_filename = os.path.splitext(filename)[0] + '.info.json'
try_rm(json_filename)
ydl.download([url])
res = os.path.exists(json_filename)
try_rm(json_filename)
return res
try:
ydl.download([url])
except DownloadError:
pass
else:
return os.path.exists(json_filename)
finally:
try_rm(json_filename)


@is_download_test

@@ -38,12 +43,12 @@ def _assert_restricted(self, url, filename, age, old_age=None):
self.assertFalse(_download_restricted(url, filename, age))

def test_youtube(self):
self._assert_restricted('07FYdnEawAQ', '07FYdnEawAQ.mp4', 10)
self._assert_restricted('HtVdAasjOgU', 'HtVdAasjOgU.mp4', 10)

def test_youporn(self):
self._assert_restricted(
'http://www.youporn.com/watch/505835/sex-ed-is-it-safe-to-masturbate-daily/',
'505835.mp4', 2, old_age=25)
'https://www.youporn.com/watch/16715086/sex-ed-in-detention-18-asmr/',
'16715086.mp4', 2, old_age=25)


if __name__ == '__main__':
@@ -9,27 +9,30 @@

import struct
import urllib.parse

from yt_dlp import compat
from yt_dlp.compat import urllib  # isort: split
from yt_dlp.compat import (
compat_etree_fromstring,
compat_expanduser,
compat_urllib_parse_unquote,
compat_urllib_parse_urlencode,
compat_urllib_parse_unquote,  # noqa: TID251
compat_urllib_parse_urlencode,  # noqa: TID251
)
from yt_dlp.compat.urllib.request import getproxies


class TestCompat(unittest.TestCase):
def test_compat_passthrough(self):
with self.assertWarns(DeprecationWarning):
compat.compat_basestring
_ = compat.compat_basestring

with self.assertWarns(DeprecationWarning):
compat.WINDOWS_VT_MODE
_ = compat.WINDOWS_VT_MODE

# TODO: Test submodule
# compat.asyncio.events  # Must not raise error
self.assertEqual(urllib.request.getproxies, getproxies)

with self.assertWarns(DeprecationWarning):
_ = compat.compat_pycrypto_AES  # Must not raise error

def test_compat_expanduser(self):
old_home = os.environ.get('HOME')
227 test/test_config.py Normal file

@@ -0,0 +1,227 @@
#!/usr/bin/env python3

# Allow direct execution
import os
import sys
import unittest
import unittest.mock

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

import contextlib
import itertools
from pathlib import Path

from yt_dlp.compat import compat_expanduser
from yt_dlp.options import create_parser, parseOpts
from yt_dlp.utils import Config, get_executable_path

ENVIRON_DEFAULTS = {
'HOME': None,
'XDG_CONFIG_HOME': '/_xdg_config_home/',
'USERPROFILE': 'C:/Users/testing/',
'APPDATA': 'C:/Users/testing/AppData/Roaming/',
'HOMEDRIVE': 'C:/',
'HOMEPATH': 'Users/testing/',
}


@contextlib.contextmanager
def set_environ(**kwargs):
saved_environ = os.environ.copy()

for name, value in {**ENVIRON_DEFAULTS, **kwargs}.items():
if value is None:
os.environ.pop(name, None)
else:
os.environ[name] = value

yield

os.environ.clear()
os.environ.update(saved_environ)


def _generate_expected_groups():
xdg_config_home = os.getenv('XDG_CONFIG_HOME') or compat_expanduser('~/.config')
appdata_dir = os.getenv('appdata')
home_dir = compat_expanduser('~')
return {
'Portable': [
Path(get_executable_path(), 'yt-dlp.conf'),
],
'Home': [
Path('yt-dlp.conf'),
],
'User': [
Path(xdg_config_home, 'yt-dlp.conf'),
Path(xdg_config_home, 'yt-dlp', 'config'),
Path(xdg_config_home, 'yt-dlp', 'config.txt'),
*((
Path(appdata_dir, 'yt-dlp.conf'),
Path(appdata_dir, 'yt-dlp', 'config'),
Path(appdata_dir, 'yt-dlp', 'config.txt'),
) if appdata_dir else ()),
Path(home_dir, 'yt-dlp.conf'),
Path(home_dir, 'yt-dlp.conf.txt'),
Path(home_dir, '.yt-dlp', 'config'),
Path(home_dir, '.yt-dlp', 'config.txt'),
],
'System': [
Path('/etc/yt-dlp.conf'),
Path('/etc/yt-dlp/config'),
Path('/etc/yt-dlp/config.txt'),
],
}


class TestConfig(unittest.TestCase):
maxDiff = None

@set_environ()
def test_config__ENVIRON_DEFAULTS_sanity(self):
expected = make_expected()
self.assertCountEqual(
set(expected), expected,
'ENVIRON_DEFAULTS produces non unique names')

def test_config_all_environ_values(self):
for name, value in ENVIRON_DEFAULTS.items():
for new_value in (None, '', '.', value or '/some/dir'):
with set_environ(**{name: new_value}):
self._simple_grouping_test()

def test_config_default_expected_locations(self):
files, _ = self._simple_config_test()
self.assertEqual(
files, make_expected(),
'Not all expected locations have been checked')

def test_config_default_grouping(self):
self._simple_grouping_test()

def _simple_grouping_test(self):
expected_groups = make_expected_groups()
for name, group in expected_groups.items():
for index, existing_path in enumerate(group):
result, opts = self._simple_config_test(existing_path)
expected = expected_from_expected_groups(expected_groups, existing_path)
self.assertEqual(
result, expected,
f'The checked locations do not match the expected ({name}, {index})')
self.assertEqual(
opts.outtmpl['default'], '1',
f'The used result value was incorrect ({name}, {index})')

def _simple_config_test(self, *stop_paths):
encountered = 0
paths = []

def read_file(filename, default=[]):
nonlocal encountered
path = Path(filename)
paths.append(path)
if path in stop_paths:
encountered += 1
return ['-o', f'{encountered}']

with ConfigMock(read_file):
_, opts, _ = parseOpts([], False)

return paths, opts

@set_environ()
def test_config_early_exit_commandline(self):
self._early_exit_test(0, '--ignore-config')

@set_environ()
def test_config_early_exit_files(self):
for index, _ in enumerate(make_expected(), 1):
self._early_exit_test(index)

def _early_exit_test(self, allowed_reads, *args):
reads = 0

def read_file(filename, default=[]):
nonlocal reads
reads += 1

if reads > allowed_reads:
self.fail('The remaining config was not ignored')
elif reads == allowed_reads:
return ['--ignore-config']

with ConfigMock(read_file):
parseOpts(args, False)

@set_environ()
def test_config_override_commandline(self):
self._override_test(0, '-o', 'pass')

@set_environ()
def test_config_override_files(self):
for index, _ in enumerate(make_expected(), 1):
self._override_test(index)

def _override_test(self, start_index, *args):
index = 0

def read_file(filename, default=[]):
nonlocal index
index += 1

if index > start_index:
return ['-o', 'fail']
elif index == start_index:
return ['-o', 'pass']

with ConfigMock(read_file):
_, opts, _ = parseOpts(args, False)

self.assertEqual(
opts.outtmpl['default'], 'pass',
'The earlier group did not override the later ones')


@contextlib.contextmanager
def ConfigMock(read_file=None):
with unittest.mock.patch('yt_dlp.options.Config') as mock:
mock.return_value = Config(create_parser())
if read_file is not None:
mock.read_file = read_file

yield mock


def make_expected(*filepaths):
return expected_from_expected_groups(_generate_expected_groups(), *filepaths)


def make_expected_groups(*filepaths):
return _filter_expected_groups(_generate_expected_groups(), filepaths)


def expected_from_expected_groups(expected_groups, *filepaths):
return list(itertools.chain.from_iterable(
_filter_expected_groups(expected_groups, filepaths).values()))


def _filter_expected_groups(expected, filepaths):
if not filepaths:
return expected

result = {}
for group, paths in expected.items():
new_paths = []
for path in paths:
new_paths.append(path)
if path in filepaths:
break

result[group] = new_paths

return result


if __name__ == '__main__':
unittest.main()
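The new test module fakes every config location by patching yt_dlp.options.Config; a condensed sketch of the same pattern (names mirror the ConfigMock helper above):

# Sketch, not part of the diff: parseOpts walks the config locations
# through the mocked Config.read_file instead of the filesystem.
import unittest.mock

from yt_dlp.options import create_parser, parseOpts
from yt_dlp.utils import Config

def read_file(filename, default=[]):
    return []  # pretend every candidate config file is empty

with unittest.mock.patch('yt_dlp.options.Config') as mock:
    mock.return_value = Config(create_parser())
    mock.read_file = read_file
    _, opts, _ = parseOpts([], False)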
@@ -1,5 +1,5 @@
import datetime as dt
import unittest
from datetime import datetime, timezone

from yt_dlp import cookies
from yt_dlp.cookies import (

@@ -49,32 +49,39 @@ def test_get_desktop_environment(self):
""" based on https://chromium.googlesource.com/chromium/src/+/refs/heads/main/base/nix/xdg_util_unittest.cc """
test_cases = [
({}, _LinuxDesktopEnvironment.OTHER),
({'DESKTOP_SESSION': 'my_custom_de'}, _LinuxDesktopEnvironment.OTHER),
({'XDG_CURRENT_DESKTOP': 'my_custom_de'}, _LinuxDesktopEnvironment.OTHER),

({'DESKTOP_SESSION': 'gnome'}, _LinuxDesktopEnvironment.GNOME),
({'DESKTOP_SESSION': 'mate'}, _LinuxDesktopEnvironment.GNOME),
({'DESKTOP_SESSION': 'kde4'}, _LinuxDesktopEnvironment.KDE),
({'DESKTOP_SESSION': 'kde'}, _LinuxDesktopEnvironment.KDE),
({'DESKTOP_SESSION': 'kde4'}, _LinuxDesktopEnvironment.KDE4),
({'DESKTOP_SESSION': 'kde'}, _LinuxDesktopEnvironment.KDE3),
({'DESKTOP_SESSION': 'xfce'}, _LinuxDesktopEnvironment.XFCE),

({'GNOME_DESKTOP_SESSION_ID': 1}, _LinuxDesktopEnvironment.GNOME),
({'KDE_FULL_SESSION': 1}, _LinuxDesktopEnvironment.KDE),
({'KDE_FULL_SESSION': 1}, _LinuxDesktopEnvironment.KDE3),
({'KDE_FULL_SESSION': 1, 'DESKTOP_SESSION': 'kde4'}, _LinuxDesktopEnvironment.KDE4),

({'XDG_CURRENT_DESKTOP': 'X-Cinnamon'}, _LinuxDesktopEnvironment.CINNAMON),
({'XDG_CURRENT_DESKTOP': 'Deepin'}, _LinuxDesktopEnvironment.DEEPIN),
({'XDG_CURRENT_DESKTOP': 'GNOME'}, _LinuxDesktopEnvironment.GNOME),
({'XDG_CURRENT_DESKTOP': 'GNOME:GNOME-Classic'}, _LinuxDesktopEnvironment.GNOME),
({'XDG_CURRENT_DESKTOP': 'GNOME : GNOME-Classic'}, _LinuxDesktopEnvironment.GNOME),
({'XDG_CURRENT_DESKTOP': 'ubuntu:GNOME'}, _LinuxDesktopEnvironment.GNOME),

({'XDG_CURRENT_DESKTOP': 'Unity', 'DESKTOP_SESSION': 'gnome-fallback'}, _LinuxDesktopEnvironment.GNOME),
({'XDG_CURRENT_DESKTOP': 'KDE', 'KDE_SESSION_VERSION': '5'}, _LinuxDesktopEnvironment.KDE),
({'XDG_CURRENT_DESKTOP': 'KDE'}, _LinuxDesktopEnvironment.KDE),
({'XDG_CURRENT_DESKTOP': 'KDE', 'KDE_SESSION_VERSION': '5'}, _LinuxDesktopEnvironment.KDE5),
({'XDG_CURRENT_DESKTOP': 'KDE', 'KDE_SESSION_VERSION': '6'}, _LinuxDesktopEnvironment.KDE6),
({'XDG_CURRENT_DESKTOP': 'KDE'}, _LinuxDesktopEnvironment.KDE4),
({'XDG_CURRENT_DESKTOP': 'Pantheon'}, _LinuxDesktopEnvironment.PANTHEON),
({'XDG_CURRENT_DESKTOP': 'UKUI'}, _LinuxDesktopEnvironment.UKUI),
({'XDG_CURRENT_DESKTOP': 'Unity'}, _LinuxDesktopEnvironment.UNITY),
({'XDG_CURRENT_DESKTOP': 'Unity:Unity7'}, _LinuxDesktopEnvironment.UNITY),
({'XDG_CURRENT_DESKTOP': 'Unity:Unity8'}, _LinuxDesktopEnvironment.UNITY),
]

for env, expected_desktop_environment in test_cases:
self.assertEqual(_get_linux_desktop_environment(env), expected_desktop_environment)
self.assertEqual(_get_linux_desktop_environment(env, Logger()), expected_desktop_environment)

def test_chrome_cookie_decryptor_linux_derive_key(self):
key = LinuxChromeCookieDecryptor.derive_key(b'abc')

@@ -100,7 +107,7 @@ def test_chrome_cookie_decryptor_linux_v11(self):

def test_chrome_cookie_decryptor_windows_v10(self):
with MonkeyPatch(cookies, {
'_get_windows_v10_key': lambda *args, **kwargs: b'Y\xef\xad\xad\xeerp\xf0Y\xe6\x9b\x12\xc2<z\x16]\n\xbb\xb8\xcb\xd7\x9bA\xc3\x14e\x99{\xd6\xf4&'
'_get_windows_v10_key': lambda *args, **kwargs: b'Y\xef\xad\xad\xeerp\xf0Y\xe6\x9b\x12\xc2<z\x16]\n\xbb\xb8\xcb\xd7\x9bA\xc3\x14e\x99{\xd6\xf4&',
}):
encrypted_value = b'v10T\xb8\xf3\xb8\x01\xa7TtcV\xfc\x88\xb8\xb8\xef\x05\xb5\xfd\x18\xc90\x009\xab\xb1\x893\x85)\x87\xe1\xa9-\xa3\xad='
value = '32101439'

@@ -115,24 +122,24 @@ def test_chrome_cookie_decryptor_mac_v10(self):
self.assertEqual(decryptor.decrypt(encrypted_value), value)

def test_safari_cookie_parsing(self):
cookies = \
b'cook\x00\x00\x00\x01\x00\x00\x00i\x00\x00\x01\x00\x01\x00\x00\x00\x10\x00\x00\x00\x00\x00\x00\x00Y' \
b'\x00\x00\x00\x00\x00\x00\x00 \x00\x00\x00\x00\x00\x00\x008\x00\x00\x00B\x00\x00\x00F\x00\x00\x00H' \
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x80\x03\xa5>\xc3A\x00\x00\x80\xc3\x07:\xc3A' \
b'localhost\x00foo\x00/\x00test%20%3Bcookie\x00\x00\x00\x054\x07\x17 \x05\x00\x00\x00Kbplist00\xd1\x01' \
b'\x02_\x10\x18NSHTTPCookieAcceptPolicy\x10\x02\x08\x0b&\x00\x00\x00\x00\x00\x00\x01\x01\x00\x00\x00' \
b'\x00\x00\x00\x00\x03\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00('
cookies = (
b'cook\x00\x00\x00\x01\x00\x00\x00i\x00\x00\x01\x00\x01\x00\x00\x00\x10\x00\x00\x00\x00\x00\x00\x00Y'
b'\x00\x00\x00\x00\x00\x00\x00 \x00\x00\x00\x00\x00\x00\x008\x00\x00\x00B\x00\x00\x00F\x00\x00\x00H'
b'\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x80\x03\xa5>\xc3A\x00\x00\x80\xc3\x07:\xc3A'
b'localhost\x00foo\x00/\x00test%20%3Bcookie\x00\x00\x00\x054\x07\x17 \x05\x00\x00\x00Kbplist00\xd1\x01'
b'\x02_\x10\x18NSHTTPCookieAcceptPolicy\x10\x02\x08\x0b&\x00\x00\x00\x00\x00\x00\x01\x01\x00\x00\x00'
b'\x00\x00\x00\x00\x03\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00(')

jar = parse_safari_cookies(cookies)
self.assertEqual(len(jar), 1)
cookie = list(jar)[0]
cookie = next(iter(jar))
self.assertEqual(cookie.domain, 'localhost')
self.assertEqual(cookie.port, None)
self.assertEqual(cookie.path, '/')
self.assertEqual(cookie.name, 'foo')
self.assertEqual(cookie.value, 'test%20%3Bcookie')
self.assertFalse(cookie.secure)
expected_expiration = datetime(2021, 6, 18, 21, 39, 19, tzinfo=timezone.utc)
expected_expiration = dt.datetime(2021, 6, 18, 21, 39, 19, tzinfo=dt.timezone.utc)
self.assertEqual(cookie.expires, int(expected_expiration.timestamp()))

def test_pbkdf2_sha1(self):

@@ -158,7 +165,7 @@ def _run_tests(self, *cases):
attributes = {
key: value
for key, value in dict(morsel).items()
if value != ""
if value != ''
}
self.assertEqual(attributes, expected_attributes, message)

@@ -168,133 +175,133 @@ def test_parsing(self):
self._run_tests(
# Copied from https://github.com/python/cpython/blob/v3.10.7/Lib/test/test_http_cookies.py
(
"Test basic cookie",
"chips=ahoy; vienna=finger",
{"chips": "ahoy", "vienna": "finger"},
'Test basic cookie',
'chips=ahoy; vienna=finger',
{'chips': 'ahoy', 'vienna': 'finger'},
),
(
"Test quoted cookie",
'Test quoted cookie',
'keebler="E=mc2; L=\\"Loves\\"; fudge=\\012;"',
{"keebler": 'E=mc2; L="Loves"; fudge=\012;'},
{'keebler': 'E=mc2; L="Loves"; fudge=\012;'},
),
(
"Allow '=' in an unquoted value",
"keebler=E=mc2",
{"keebler": "E=mc2"},
'keebler=E=mc2',
{'keebler': 'E=mc2'},
),
(
"Allow cookies with ':' in their name",
"key:term=value:term",
{"key:term": "value:term"},
'key:term=value:term',
{'key:term': 'value:term'},
),
(
"Allow '[' and ']' in cookie values",
"a=b; c=[; d=r; f=h",
{"a": "b", "c": "[", "d": "r", "f": "h"},
'a=b; c=[; d=r; f=h',
{'a': 'b', 'c': '[', 'd': 'r', 'f': 'h'},
),
(
"Test basic cookie attributes",
'Test basic cookie attributes',
'Customer="WILE_E_COYOTE"; Version=1; Path=/acme',
{"Customer": ("WILE_E_COYOTE", {"version": "1", "path": "/acme"})},
{'Customer': ('WILE_E_COYOTE', {'version': '1', 'path': '/acme'})},
),
(
"Test flag only cookie attributes",
'Test flag only cookie attributes',
'Customer="WILE_E_COYOTE"; HttpOnly; Secure',
{"Customer": ("WILE_E_COYOTE", {"httponly": True, "secure": True})},
{'Customer': ('WILE_E_COYOTE', {'httponly': True, 'secure': True})},
),
(
"Test flag only attribute with values",
"eggs=scrambled; httponly=foo; secure=bar; Path=/bacon",
{"eggs": ("scrambled", {"httponly": "foo", "secure": "bar", "path": "/bacon"})},
'Test flag only attribute with values',
'eggs=scrambled; httponly=foo; secure=bar; Path=/bacon',
{'eggs': ('scrambled', {'httponly': 'foo', 'secure': 'bar', 'path': '/bacon'})},
),
(
"Test special case for 'expires' attribute, 4 digit year",
'Customer="W"; expires=Wed, 01 Jan 2010 00:00:00 GMT',
{"Customer": ("W", {"expires": "Wed, 01 Jan 2010 00:00:00 GMT"})},
{'Customer': ('W', {'expires': 'Wed, 01 Jan 2010 00:00:00 GMT'})},
),
(
"Test special case for 'expires' attribute, 2 digit year",
'Customer="W"; expires=Wed, 01 Jan 98 00:00:00 GMT',
{"Customer": ("W", {"expires": "Wed, 01 Jan 98 00:00:00 GMT"})},
{'Customer': ('W', {'expires': 'Wed, 01 Jan 98 00:00:00 GMT'})},
),
(
"Test extra spaces in keys and values",
"eggs = scrambled ; secure ; path = bar ; foo=foo ",
{"eggs": ("scrambled", {"secure": True, "path": "bar"}), "foo": "foo"},
'Test extra spaces in keys and values',
'eggs = scrambled ; secure ; path = bar ; foo=foo ',
{'eggs': ('scrambled', {'secure': True, 'path': 'bar'}), 'foo': 'foo'},
),
(
"Test quoted attributes",
'Test quoted attributes',
'Customer="WILE_E_COYOTE"; Version="1"; Path="/acme"',
{"Customer": ("WILE_E_COYOTE", {"version": "1", "path": "/acme"})}
{'Customer': ('WILE_E_COYOTE', {'version': '1', 'path': '/acme'})},
),
# Our own tests that CPython passes
(
"Allow ';' in quoted value",
'chips="a;hoy"; vienna=finger',
{"chips": "a;hoy", "vienna": "finger"},
{'chips': 'a;hoy', 'vienna': 'finger'},
),
(
"Keep only the last set value",
"a=c; a=b",
{"a": "b"},
'Keep only the last set value',
'a=c; a=b',
{'a': 'b'},
),
)

def test_lenient_parsing(self):
self._run_tests(
(
"Ignore and try to skip invalid cookies",
'Ignore and try to skip invalid cookies',
'chips={"ahoy;": 1}; vienna="finger;"',
{"vienna": "finger;"},
{'vienna': 'finger;'},
),
(
"Ignore cookies without a name",
"a=b; unnamed; c=d",
{"a": "b", "c": "d"},
'Ignore cookies without a name',
'a=b; unnamed; c=d',
{'a': 'b', 'c': 'd'},
),
(
"Ignore '\"' cookie without name",
'a=b; "; c=d',
{"a": "b", "c": "d"},
{'a': 'b', 'c': 'd'},
),
(
"Skip all space separated values",
"x a=b c=d x; e=f",
{"a": "b", "c": "d", "e": "f"},
'Skip all space separated values',
'x a=b c=d x; e=f',
{'a': 'b', 'c': 'd', 'e': 'f'},
),
(
"Skip all space separated values",
'Skip all space separated values',
'x a=b; data={"complex": "json", "with": "key=value"}; x c=d x',
{"a": "b", "c": "d"},
{'a': 'b', 'c': 'd'},
),
(
"Expect quote mending",
'Expect quote mending',
'a=b; invalid="; c=d',
{"a": "b", "c": "d"},
{'a': 'b', 'c': 'd'},
),
(
"Reset morsel after invalid to not capture attributes",
"a=b; invalid; Version=1; c=d",
{"a": "b", "c": "d"},
'Reset morsel after invalid to not capture attributes',
'a=b; invalid; Version=1; c=d',
{'a': 'b', 'c': 'd'},
),
(
"Reset morsel after invalid to not capture attributes",
"a=b; $invalid; $Version=1; c=d",
{"a": "b", "c": "d"},
'Reset morsel after invalid to not capture attributes',
'a=b; $invalid; $Version=1; c=d',
{'a': 'b', 'c': 'd'},
),
(
"Continue after non-flag attribute without value",
"a=b; path; Version=1; c=d",
{"a": "b", "c": "d"},
'Continue after non-flag attribute without value',
'a=b; path; Version=1; c=d',
{'a': 'b', 'c': 'd'},
),
(
"Allow cookie attributes with `$` prefix",
'Allow cookie attributes with `$` prefix',
'Customer="WILE_E_COYOTE"; $Version=1; $Secure; $Path=/acme',
{"Customer": ("WILE_E_COYOTE", {"version": "1", "secure": True, "path": "/acme"})},
{'Customer': ('WILE_E_COYOTE', {'version': '1', 'secure': True, 'path': '/acme'})},
),
(
"Invalid Morsel keys should not result in an error",
"Key=Value; [Invalid]=Value; Another=Value",
{"Key": "Value", "Another": "Value"},
'Invalid Morsel keys should not result in an error',
'Key=Value; [Invalid]=Value; Another=Value',
{'Key': 'Value', 'Another': 'Value'},
),
)
@@ -10,10 +10,7 @@

import collections
import hashlib
import http.client
import json
import socket
import urllib.error

from test.helper import (
assertGreaterEqual,

@@ -23,16 +20,17 @@
gettestcases,
getwebpagetestcases,
is_download_test,
report_warning,
try_rm,
)

import yt_dlp.YoutubeDL  # isort: split
from yt_dlp.extractor import get_info_extractor
from yt_dlp.networking.exceptions import HTTPError, TransportError
from yt_dlp.utils import (
DownloadError,
ExtractorError,
UnavailableVideoError,
YoutubeDLError,
format_bytes,
join_nonempty,
)

@@ -95,13 +93,15 @@ def test_template(self):
'playlist', [] if is_playlist else [test_case])

def print_skipping(reason):
print('Skipping %s: %s' % (test_case['name'], reason))
print('Skipping {}: {}'.format(test_case['name'], reason))
self.skipTest(reason)

if not ie.working():
print_skipping('IE marked as not _WORKING')

for tc in test_cases:
if tc.get('expected_exception'):
continue
info_dict = tc.get('info_dict', {})
params = tc.get('params', {})
if not info_dict.get('id'):

@@ -116,7 +116,7 @@ def print_skipping(reason):

for other_ie in other_ies:
if not other_ie.working():
print_skipping('test depends on %sIE, marked as not WORKING' % other_ie.ie_key())
print_skipping(f'test depends on {other_ie.ie_key()}IE, marked as not WORKING')

params = get_params(test_case.get('params', {}))
params['outtmpl'] = tname + '_' + params['outtmpl']

@@ -141,6 +141,14 @@ def get_tc_filename(tc):

res_dict = None

def match_exception(err):
expected_exception = test_case.get('expected_exception')
if not expected_exception:
return False
if err.__class__.__name__ == expected_exception:
return True
return any(exc.__class__.__name__ == expected_exception for exc in err.exc_info)

def try_rm_tcs_files(tcs=None):
if tcs is None:
tcs = test_cases

@@ -162,18 +170,22 @@ def try_rm_tcs_files(tcs=None):
force_generic_extractor=params.get('force_generic_extractor', False))
except (DownloadError, ExtractorError) as err:
# Check if the exception is not a network related one
if (err.exc_info[0] not in (urllib.error.URLError, socket.timeout, UnavailableVideoError, http.client.BadStatusLine)
or (err.exc_info[0] == urllib.error.HTTPError and err.exc_info[1].code == 503)):
if not isinstance(err.exc_info[1], (TransportError, UnavailableVideoError)) or (isinstance(err.exc_info[1], HTTPError) and err.exc_info[1].status == 503):
if match_exception(err):
return
err.msg = f'{getattr(err, "msg", err)} ({tname})'
raise

if try_num == RETRIES:
report_warning('%s failed due to network errors, skipping...' % tname)
return
raise

print(f'Retrying: {try_num} failed tries\n\n##########\n\n')

try_num += 1
except YoutubeDLError as err:
if match_exception(err):
return
raise
else:
break

@@ -227,9 +239,8 @@ def try_rm_tcs_files(tcs=None):
got_fsize = os.path.getsize(tc_filename)
assertGreaterEqual(
self, got_fsize, expected_minsize,
'Expected %s to be at least %s, but it\'s only %s ' %
(tc_filename, format_bytes(expected_minsize),
format_bytes(got_fsize)))
f'Expected {tc_filename} to be at least {format_bytes(expected_minsize)}, '
f'but it\'s only {format_bytes(got_fsize)} ')
if 'md5' in tc:
md5_for_file = _file_md5(tc_filename)
self.assertEqual(tc['md5'], md5_for_file)

@@ -238,7 +249,7 @@ def try_rm_tcs_files(tcs=None):
info_json_fn = os.path.splitext(tc_filename)[0] + '.info.json'
self.assertTrue(
os.path.exists(info_json_fn),
'Missing info file %s' % info_json_fn)
f'Missing info file {info_json_fn}')
with open(info_json_fn, encoding='utf-8') as infof:
info_dict = json.load(infof)
expect_info_dict(self, info_dict, tc.get('info_dict', {}))

@@ -249,7 +260,7 @@ def try_rm_tcs_files(tcs=None):
# extractor returns full results even with extract_flat
res_tcs = [{'info_dict': e} for e in res_dict['entries']]
try_rm_tcs_files(res_tcs)

ydl.close()
return test_template
139 test/test_downloader_external.py Normal file

@@ -0,0 +1,139 @@
#!/usr/bin/env python3

# Allow direct execution
import os
import sys
import unittest

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

import http.cookiejar

from test.helper import FakeYDL
from yt_dlp.downloader.external import (
Aria2cFD,
AxelFD,
CurlFD,
FFmpegFD,
HttpieFD,
WgetFD,
)

TEST_COOKIE = {
'version': 0,
'name': 'test',
'value': 'ytdlp',
'port': None,
'port_specified': False,
'domain': '.example.com',
'domain_specified': True,
'domain_initial_dot': False,
'path': '/',
'path_specified': True,
'secure': False,
'expires': None,
'discard': False,
'comment': None,
'comment_url': None,
'rest': {},
}

TEST_INFO = {'url': 'http://www.example.com/'}


class TestHttpieFD(unittest.TestCase):
def test_make_cmd(self):
with FakeYDL() as ydl:
downloader = HttpieFD(ydl, {})
self.assertEqual(
downloader._make_cmd('test', TEST_INFO),
['http', '--download', '--output', 'test', 'http://www.example.com/'])

# Test cookie header is added
ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
self.assertEqual(
downloader._make_cmd('test', TEST_INFO),
['http', '--download', '--output', 'test', 'http://www.example.com/', 'Cookie:test=ytdlp'])


class TestAxelFD(unittest.TestCase):
def test_make_cmd(self):
with FakeYDL() as ydl:
downloader = AxelFD(ydl, {})
self.assertEqual(
downloader._make_cmd('test', TEST_INFO),
['axel', '-o', 'test', '--', 'http://www.example.com/'])

# Test cookie header is added
ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
self.assertEqual(
downloader._make_cmd('test', TEST_INFO),
['axel', '-o', 'test', '-H', 'Cookie: test=ytdlp', '--max-redirect=0', '--', 'http://www.example.com/'])


class TestWgetFD(unittest.TestCase):
def test_make_cmd(self):
with FakeYDL() as ydl:
downloader = WgetFD(ydl, {})
self.assertNotIn('--load-cookies', downloader._make_cmd('test', TEST_INFO))
# Test cookiejar tempfile arg is added
ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
self.assertIn('--load-cookies', downloader._make_cmd('test', TEST_INFO))


class TestCurlFD(unittest.TestCase):
def test_make_cmd(self):
with FakeYDL() as ydl:
downloader = CurlFD(ydl, {})
self.assertNotIn('--cookie', downloader._make_cmd('test', TEST_INFO))
# Test cookie header is added
ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
self.assertIn('--cookie', downloader._make_cmd('test', TEST_INFO))
self.assertIn('test=ytdlp', downloader._make_cmd('test', TEST_INFO))


class TestAria2cFD(unittest.TestCase):
def test_make_cmd(self):
with FakeYDL() as ydl:
downloader = Aria2cFD(ydl, {})
downloader._make_cmd('test', TEST_INFO)
self.assertFalse(hasattr(downloader, '_cookies_tempfile'))

# Test cookiejar tempfile arg is added
ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
cmd = downloader._make_cmd('test', TEST_INFO)
self.assertIn(f'--load-cookies={downloader._cookies_tempfile}', cmd)


@unittest.skipUnless(FFmpegFD.available(), 'ffmpeg not found')
class TestFFmpegFD(unittest.TestCase):
_args = []

def _test_cmd(self, args):
self._args = args

def test_make_cmd(self):
with FakeYDL() as ydl:
downloader = FFmpegFD(ydl, {})
downloader._debug_cmd = self._test_cmd

downloader._call_downloader('test', {**TEST_INFO, 'ext': 'mp4'})
self.assertEqual(self._args, [
'ffmpeg', '-y', '-hide_banner', '-i', 'http://www.example.com/',
'-c', 'copy', '-f', 'mp4', 'file:test'])

# Test cookies arg is added
ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
downloader._call_downloader('test', {**TEST_INFO, 'ext': 'mp4'})
self.assertEqual(self._args, [
'ffmpeg', '-y', '-hide_banner', '-cookies', 'test=ytdlp; path=/; domain=.example.com;\r\n',
'-i', 'http://www.example.com/', '-c', 'copy', '-f', 'mp4', 'file:test'])

# Test with non-url input (ffmpeg reads from stdin '-' for websockets)
downloader._call_downloader('test', {'url': 'x', 'ext': 'mp4'})
self.assertEqual(self._args, [
'ffmpeg', '-y', '-hide_banner', '-i', 'x', '-c', 'copy', '-f', 'mp4', 'file:test'])


if __name__ == '__main__':
unittest.main()
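The command builders above can be poked at interactively in the same way the new tests do; a minimal sketch (assuming it is run from the repository root so that test.helper is importable):

# Sketch, not part of the diff: without cookies in the jar,
# CurlFD emits no --cookie flag.
from test.helper import FakeYDL
from yt_dlp.downloader.external import CurlFD

with FakeYDL() as ydl:
    fd = CurlFD(ydl, {})
    cmd = fd._make_cmd('out.mp4', {'url': 'http://www.example.com/'})
    assert '--cookie' not in cmd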
@@ -16,6 +16,7 @@
from yt_dlp import YoutubeDL
from yt_dlp.downloader.http import HttpFD
from yt_dlp.utils import encodeFilename
from yt_dlp.utils._utils import _YDLLogger as FakeLogger

TEST_DIR = os.path.dirname(os.path.abspath(__file__))

@@ -37,9 +38,9 @@ def send_content_range(self, total=None):
end = int(mobj.group(2))
valid_range = start is not None and end is not None
if valid_range:
content_range = 'bytes %d-%d' % (start, end)
content_range = f'bytes {start}-{end}'
if total:
content_range += '/%d' % total
content_range += f'/{total}'
self.send_header('Content-Range', content_range)
return (end - start + 1) if valid_range else total

@@ -67,17 +68,6 @@ def do_GET(self):
assert False


class FakeLogger:
def debug(self, msg):
pass

def warning(self, msg):
pass

def error(self, msg):
pass


class TestHttpFD(unittest.TestCase):
def setUp(self):
self.httpd = http.server.HTTPServer(

@@ -94,7 +84,7 @@ def download(self, params, ep):
filename = 'testfile.mp4'
try_rm(encodeFilename(filename))
self.assertTrue(downloader.real_download(filename, {
'url': 'http://127.0.0.1:%d/%s' % (self.port, ep),
'url': f'http://127.0.0.1:{self.port}/{ep}',
}), ep)
self.assertEqual(os.path.getsize(encodeFilename(filename)), TEST_SIZE, ep)
try_rm(encodeFilename(filename))
@@ -45,6 +45,9 @@ def test_lazy_extractors(self):
self.assertTrue(os.path.exists(LAZY_EXTRACTORS))

_, stderr = self.run_yt_dlp(opts=('-s', 'test:'))
# `MIN_RECOMMENDED` emits a deprecated feature warning for deprecated Python versions
if stderr and stderr.startswith('Deprecated Feature: Support for Python'):
stderr = ''
self.assertFalse(stderr)

subprocess.check_call([sys.executable, 'test/test_all_urls.py'], cwd=rootDir, stdout=subprocess.DEVNULL)
@ -1,192 +0,0 @@
#!/usr/bin/env python3

# Allow direct execution
import os
import sys
import unittest

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))


import http.server
import ssl
import threading
import urllib.request

from test.helper import http_server_port
from yt_dlp import YoutubeDL

TEST_DIR = os.path.dirname(os.path.abspath(__file__))


class HTTPTestRequestHandler(http.server.BaseHTTPRequestHandler):
    def log_message(self, format, *args):
        pass

    def do_GET(self):
        if self.path == '/video.html':
            self.send_response(200)
            self.send_header('Content-Type', 'text/html; charset=utf-8')
            self.end_headers()
            self.wfile.write(b'<html><video src="/vid.mp4" /></html>')
        elif self.path == '/vid.mp4':
            self.send_response(200)
            self.send_header('Content-Type', 'video/mp4')
            self.end_headers()
            self.wfile.write(b'\x00\x00\x00\x00\x20\x66\x74[video]')
        elif self.path == '/%E4%B8%AD%E6%96%87.html':
            self.send_response(200)
            self.send_header('Content-Type', 'text/html; charset=utf-8')
            self.end_headers()
            self.wfile.write(b'<html><video src="/vid.mp4" /></html>')
        else:
            assert False


class FakeLogger:
    def debug(self, msg):
        pass

    def warning(self, msg):
        pass

    def error(self, msg):
        pass


class TestHTTP(unittest.TestCase):
    def setUp(self):
        self.httpd = http.server.HTTPServer(
            ('127.0.0.1', 0), HTTPTestRequestHandler)
        self.port = http_server_port(self.httpd)
        self.server_thread = threading.Thread(target=self.httpd.serve_forever)
        self.server_thread.daemon = True
        self.server_thread.start()


class TestHTTPS(unittest.TestCase):
    def setUp(self):
        certfn = os.path.join(TEST_DIR, 'testcert.pem')
        self.httpd = http.server.HTTPServer(
            ('127.0.0.1', 0), HTTPTestRequestHandler)
        sslctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
        sslctx.load_cert_chain(certfn, None)
        self.httpd.socket = sslctx.wrap_socket(self.httpd.socket, server_side=True)
        self.port = http_server_port(self.httpd)
        self.server_thread = threading.Thread(target=self.httpd.serve_forever)
        self.server_thread.daemon = True
        self.server_thread.start()

    def test_nocheckcertificate(self):
        ydl = YoutubeDL({'logger': FakeLogger()})
        self.assertRaises(
            Exception,
            ydl.extract_info, 'https://127.0.0.1:%d/video.html' % self.port)

        ydl = YoutubeDL({'logger': FakeLogger(), 'nocheckcertificate': True})
        r = ydl.extract_info('https://127.0.0.1:%d/video.html' % self.port)
        self.assertEqual(r['url'], 'https://127.0.0.1:%d/vid.mp4' % self.port)


class TestClientCert(unittest.TestCase):
    def setUp(self):
        certfn = os.path.join(TEST_DIR, 'testcert.pem')
        self.certdir = os.path.join(TEST_DIR, 'testdata', 'certificate')
        cacertfn = os.path.join(self.certdir, 'ca.crt')
        self.httpd = http.server.HTTPServer(('127.0.0.1', 0), HTTPTestRequestHandler)
        sslctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
        sslctx.verify_mode = ssl.CERT_REQUIRED
        sslctx.load_verify_locations(cafile=cacertfn)
        sslctx.load_cert_chain(certfn, None)
        self.httpd.socket = sslctx.wrap_socket(self.httpd.socket, server_side=True)
        self.port = http_server_port(self.httpd)
        self.server_thread = threading.Thread(target=self.httpd.serve_forever)
        self.server_thread.daemon = True
        self.server_thread.start()

    def _run_test(self, **params):
        ydl = YoutubeDL({
            'logger': FakeLogger(),
            # Disable client-side validation of unacceptable self-signed testcert.pem
            # The test is of a check on the server side, so unaffected
            'nocheckcertificate': True,
            **params,
        })
        r = ydl.extract_info('https://127.0.0.1:%d/video.html' % self.port)
        self.assertEqual(r['url'], 'https://127.0.0.1:%d/vid.mp4' % self.port)

    def test_certificate_combined_nopass(self):
        self._run_test(client_certificate=os.path.join(self.certdir, 'clientwithkey.crt'))

    def test_certificate_nocombined_nopass(self):
        self._run_test(client_certificate=os.path.join(self.certdir, 'client.crt'),
                       client_certificate_key=os.path.join(self.certdir, 'client.key'))

    def test_certificate_combined_pass(self):
        self._run_test(client_certificate=os.path.join(self.certdir, 'clientwithencryptedkey.crt'),
                       client_certificate_password='foobar')

    def test_certificate_nocombined_pass(self):
        self._run_test(client_certificate=os.path.join(self.certdir, 'client.crt'),
                       client_certificate_key=os.path.join(self.certdir, 'clientencrypted.key'),
                       client_certificate_password='foobar')


def _build_proxy_handler(name):
    class HTTPTestRequestHandler(http.server.BaseHTTPRequestHandler):
        proxy_name = name

        def log_message(self, format, *args):
            pass

        def do_GET(self):
            self.send_response(200)
            self.send_header('Content-Type', 'text/plain; charset=utf-8')
            self.end_headers()
            self.wfile.write(f'{self.proxy_name}: {self.path}'.encode())
    return HTTPTestRequestHandler


class TestProxy(unittest.TestCase):
    def setUp(self):
        self.proxy = http.server.HTTPServer(
            ('127.0.0.1', 0), _build_proxy_handler('normal'))
        self.port = http_server_port(self.proxy)
        self.proxy_thread = threading.Thread(target=self.proxy.serve_forever)
        self.proxy_thread.daemon = True
        self.proxy_thread.start()

        self.geo_proxy = http.server.HTTPServer(
            ('127.0.0.1', 0), _build_proxy_handler('geo'))
        self.geo_port = http_server_port(self.geo_proxy)
        self.geo_proxy_thread = threading.Thread(target=self.geo_proxy.serve_forever)
        self.geo_proxy_thread.daemon = True
        self.geo_proxy_thread.start()

    def test_proxy(self):
        geo_proxy = f'127.0.0.1:{self.geo_port}'
        ydl = YoutubeDL({
            'proxy': f'127.0.0.1:{self.port}',
            'geo_verification_proxy': geo_proxy,
        })
        url = 'http://foo.com/bar'
        response = ydl.urlopen(url).read().decode()
        self.assertEqual(response, f'normal: {url}')

        req = urllib.request.Request(url)
        req.add_header('Ytdl-request-proxy', geo_proxy)
        response = ydl.urlopen(req).read().decode()
        self.assertEqual(response, f'geo: {url}')

    def test_proxy_with_idn(self):
        ydl = YoutubeDL({
            'proxy': f'127.0.0.1:{self.port}',
        })
        url = 'http://中文.tw/'
        response = ydl.urlopen(url).read().decode()
        # b'xn--fiq228c' is '中文'.encode('idna')
        self.assertEqual(response, 'normal: http://xn--fiq228c.tw/')


if __name__ == '__main__':
    unittest.main()
380
test/test_http_proxy.py
Normal file

@ -0,0 +1,380 @@
import abc
import base64
import contextlib
import functools
import json
import os
import random
import ssl
import threading
from http.server import BaseHTTPRequestHandler
from socketserver import ThreadingTCPServer

import pytest

from test.helper import http_server_port, verify_address_availability
from test.test_networking import TEST_DIR
from test.test_socks import IPv6ThreadingTCPServer
from yt_dlp.dependencies import urllib3
from yt_dlp.networking import Request
from yt_dlp.networking.exceptions import HTTPError, ProxyError, SSLError


class HTTPProxyAuthMixin:

    def proxy_auth_error(self):
        self.send_response(407)
        self.send_header('Proxy-Authenticate', 'Basic realm="test http proxy"')
        self.end_headers()
        return False

    def do_proxy_auth(self, username, password):
        if username is None and password is None:
            return True

        proxy_auth_header = self.headers.get('Proxy-Authorization', None)
        if proxy_auth_header is None:
            return self.proxy_auth_error()

        if not proxy_auth_header.startswith('Basic '):
            return self.proxy_auth_error()

        auth = proxy_auth_header[6:]

        try:
            auth_username, auth_password = base64.b64decode(auth).decode().split(':', 1)
        except Exception:
            return self.proxy_auth_error()

        if auth_username != (username or '') or auth_password != (password or ''):
            return self.proxy_auth_error()
        return True
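A minimal sketch (not part of the diff) of the client side of the Basic scheme that do_proxy_auth() above verifies; the credentials are the test values used below:

    import base64

    credentials = base64.b64encode(b'test:test').decode()  # 'dGVzdDp0ZXN0'
    headers = {'Proxy-Authorization': f'Basic {credentials}'}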


class HTTPProxyHandler(BaseHTTPRequestHandler, HTTPProxyAuthMixin):
    def __init__(self, *args, proxy_info=None, username=None, password=None, request_handler=None, **kwargs):
        self.username = username
        self.password = password
        self.proxy_info = proxy_info
        super().__init__(*args, **kwargs)

    def do_GET(self):
        if not self.do_proxy_auth(self.username, self.password):
            self.server.close_request(self.request)
            return
        if self.path.endswith('/proxy_info'):
            payload = json.dumps(self.proxy_info or {
                'client_address': self.client_address,
                'connect': False,
                'connect_host': None,
                'connect_port': None,
                'headers': dict(self.headers),
                'path': self.path,
                'proxy': ':'.join(str(y) for y in self.connection.getsockname()),
            })
            self.send_response(200)
            self.send_header('Content-Type', 'application/json; charset=utf-8')
            self.send_header('Content-Length', str(len(payload)))
            self.end_headers()
            self.wfile.write(payload.encode())
        else:
            self.send_response(404)
            self.end_headers()

        self.server.close_request(self.request)


if urllib3:
    import urllib3.util.ssltransport

    class SSLTransport(urllib3.util.ssltransport.SSLTransport):
        """
        Modified version of urllib3 SSLTransport to support server-side SSL

        This allows us to chain multiple TLS connections.
        """

        def __init__(self, socket, ssl_context, server_hostname=None, suppress_ragged_eofs=True, server_side=False):
            self.incoming = ssl.MemoryBIO()
            self.outgoing = ssl.MemoryBIO()

            self.suppress_ragged_eofs = suppress_ragged_eofs
            self.socket = socket

            self.sslobj = ssl_context.wrap_bio(
                self.incoming,
                self.outgoing,
                server_hostname=server_hostname,
                server_side=server_side,
            )
            self._ssl_io_loop(self.sslobj.do_handshake)

        @property
        def _io_refs(self):
            return self.socket._io_refs

        @_io_refs.setter
        def _io_refs(self, value):
            self.socket._io_refs = value

        def shutdown(self, *args, **kwargs):
            self.socket.shutdown(*args, **kwargs)
else:
    SSLTransport = None
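The class above exists because ssl.SSLContext.wrap_socket() refuses to wrap an ssl.SSLSocket, so a second TLS layer (TLS-in-TLS, needed when an HTTPS proxy carries HTTPS traffic) has to be driven through memory BIOs. A simplified sketch of that handshake loop, assuming a connected socket `sock` and a suitably configured context `ctx`; this is an illustration, not the urllib3 implementation:

    import ssl

    def handshake_via_bio(sock, ctx, server_side=True):
        incoming, outgoing = ssl.MemoryBIO(), ssl.MemoryBIO()
        sslobj = ctx.wrap_bio(incoming, outgoing, server_side=server_side)
        while True:
            try:
                sslobj.do_handshake()
                break
            except ssl.SSLWantReadError:
                if outgoing.pending:
                    sock.sendall(outgoing.read())  # flush our handshake bytes
                incoming.write(sock.recv(16384))   # feed the peer's bytes back in
        if outgoing.pending:
            sock.sendall(outgoing.read())          # flush the final flight
        return sslobj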


class HTTPSProxyHandler(HTTPProxyHandler):
    def __init__(self, request, *args, **kwargs):
        certfn = os.path.join(TEST_DIR, 'testcert.pem')
        sslctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
        sslctx.load_cert_chain(certfn, None)
        if isinstance(request, ssl.SSLSocket):
            request = SSLTransport(request, ssl_context=sslctx, server_side=True)
        else:
            request = sslctx.wrap_socket(request, server_side=True)
        super().__init__(request, *args, **kwargs)


class HTTPConnectProxyHandler(BaseHTTPRequestHandler, HTTPProxyAuthMixin):
    protocol_version = 'HTTP/1.1'
    default_request_version = 'HTTP/1.1'

    def __init__(self, *args, username=None, password=None, request_handler=None, **kwargs):
        self.username = username
        self.password = password
        self.request_handler = request_handler
        super().__init__(*args, **kwargs)

    def do_CONNECT(self):
        if not self.do_proxy_auth(self.username, self.password):
            self.server.close_request(self.request)
            return
        self.send_response(200)
        self.end_headers()
        proxy_info = {
            'client_address': self.client_address,
            'connect': True,
            'connect_host': self.path.split(':')[0],
            'connect_port': int(self.path.split(':')[1]),
            'headers': dict(self.headers),
            'path': self.path,
            'proxy': ':'.join(str(y) for y in self.connection.getsockname()),
        }
        self.request_handler(self.request, self.client_address, self.server, proxy_info=proxy_info)
        self.server.close_request(self.request)
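For orientation (not part of the diff): a CONNECT proxy replies 200 and from then on relays raw bytes, so the client speaks TLS to the target through the same socket. A hedged sketch of the client side, with hypothetical proxy and target addresses:

    import socket

    sock = socket.create_connection(('127.0.0.1', 8080))  # the proxy
    sock.sendall(b'CONNECT example.com:443 HTTP/1.1\r\n'
                 b'Host: example.com:443\r\n'
                 b'\r\n')
    reply = sock.recv(4096)  # expect an 'HTTP/1.1 200 ...' status line first
    # after this, the socket carries the tunnelled (typically TLS) stream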


class HTTPSConnectProxyHandler(HTTPConnectProxyHandler):
    def __init__(self, request, *args, **kwargs):
        certfn = os.path.join(TEST_DIR, 'testcert.pem')
        sslctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
        sslctx.load_cert_chain(certfn, None)
        request = sslctx.wrap_socket(request, server_side=True)
        self._original_request = request
        super().__init__(request, *args, **kwargs)

    def do_CONNECT(self):
        super().do_CONNECT()
        self.server.close_request(self._original_request)


@contextlib.contextmanager
def proxy_server(proxy_server_class, request_handler, bind_ip=None, **proxy_server_kwargs):
    server = server_thread = None
    try:
        bind_address = bind_ip or '127.0.0.1'
        server_type = ThreadingTCPServer if '.' in bind_address else IPv6ThreadingTCPServer
        server = server_type(
            (bind_address, 0), functools.partial(proxy_server_class, request_handler=request_handler, **proxy_server_kwargs))
        server_port = http_server_port(server)
        server_thread = threading.Thread(target=server.serve_forever)
        server_thread.daemon = True
        server_thread.start()
        if '.' not in bind_address:
            yield f'[{bind_address}]:{server_port}'
        else:
            yield f'{bind_address}:{server_port}'
    finally:
        server.shutdown()
        server.server_close()
        server_thread.join(2.0)
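Sketch of how the context manager above is used outside the fixtures; the yielded value is the 'host:port' string the proxy listens on:

    with proxy_server(HTTPProxyHandler, request_handler=HTTPProxyHandler) as server_address:
        print(f'test proxy listening on {server_address}')  # e.g. '127.0.0.1:54321'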


class HTTPProxyTestContext(abc.ABC):
    REQUEST_HANDLER_CLASS = None
    REQUEST_PROTO = None

    def http_server(self, server_class, *args, **kwargs):
        return proxy_server(server_class, self.REQUEST_HANDLER_CLASS, *args, **kwargs)

    @abc.abstractmethod
    def proxy_info_request(self, handler, target_domain=None, target_port=None, **req_kwargs) -> dict:
        """return a dict of proxy_info"""


class HTTPProxyHTTPTestContext(HTTPProxyTestContext):
    # Standard HTTP Proxy for http requests
    REQUEST_HANDLER_CLASS = HTTPProxyHandler
    REQUEST_PROTO = 'http'

    def proxy_info_request(self, handler, target_domain=None, target_port=None, **req_kwargs):
        request = Request(f'http://{target_domain or "127.0.0.1"}:{target_port or "40000"}/proxy_info', **req_kwargs)
        handler.validate(request)
        return json.loads(handler.send(request).read().decode())


class HTTPProxyHTTPSTestContext(HTTPProxyTestContext):
    # HTTP Connect proxy, for https requests
    REQUEST_HANDLER_CLASS = HTTPSProxyHandler
    REQUEST_PROTO = 'https'

    def proxy_info_request(self, handler, target_domain=None, target_port=None, **req_kwargs):
        request = Request(f'https://{target_domain or "127.0.0.1"}:{target_port or "40000"}/proxy_info', **req_kwargs)
        handler.validate(request)
        return json.loads(handler.send(request).read().decode())


CTX_MAP = {
    'http': HTTPProxyHTTPTestContext,
    'https': HTTPProxyHTTPSTestContext,
}


@pytest.fixture(scope='module')
def ctx(request):
    return CTX_MAP[request.param]()


@pytest.mark.parametrize(
    'handler', ['Urllib', 'Requests', 'CurlCFFI'], indirect=True)
@pytest.mark.parametrize('ctx', ['http'], indirect=True)  # pure http proxy can only support http
class TestHTTPProxy:
    def test_http_no_auth(self, handler, ctx):
        with ctx.http_server(HTTPProxyHandler) as server_address:
            with handler(proxies={ctx.REQUEST_PROTO: f'http://{server_address}'}) as rh:
                proxy_info = ctx.proxy_info_request(rh)
                assert proxy_info['proxy'] == server_address
                assert proxy_info['connect'] is False
                assert 'Proxy-Authorization' not in proxy_info['headers']

    def test_http_auth(self, handler, ctx):
        with ctx.http_server(HTTPProxyHandler, username='test', password='test') as server_address:
            with handler(proxies={ctx.REQUEST_PROTO: f'http://test:test@{server_address}'}) as rh:
                proxy_info = ctx.proxy_info_request(rh)
                assert proxy_info['proxy'] == server_address
                assert 'Proxy-Authorization' in proxy_info['headers']

    def test_http_bad_auth(self, handler, ctx):
        with ctx.http_server(HTTPProxyHandler, username='test', password='test') as server_address:
            with handler(proxies={ctx.REQUEST_PROTO: f'http://test:bad@{server_address}'}) as rh:
                with pytest.raises(HTTPError) as exc_info:
                    ctx.proxy_info_request(rh)
                assert exc_info.value.response.status == 407
                exc_info.value.response.close()

    def test_http_source_address(self, handler, ctx):
        with ctx.http_server(HTTPProxyHandler) as server_address:
            source_address = f'127.0.0.{random.randint(5, 255)}'
            verify_address_availability(source_address)
            with handler(proxies={ctx.REQUEST_PROTO: f'http://{server_address}'},
                         source_address=source_address) as rh:
                proxy_info = ctx.proxy_info_request(rh)
                assert proxy_info['proxy'] == server_address
                assert proxy_info['client_address'][0] == source_address

    @pytest.mark.skip_handler('Urllib', 'urllib does not support https proxies')
    def test_https(self, handler, ctx):
        with ctx.http_server(HTTPSProxyHandler) as server_address:
            with handler(verify=False, proxies={ctx.REQUEST_PROTO: f'https://{server_address}'}) as rh:
                proxy_info = ctx.proxy_info_request(rh)
                assert proxy_info['proxy'] == server_address
                assert proxy_info['connect'] is False
                assert 'Proxy-Authorization' not in proxy_info['headers']

    @pytest.mark.skip_handler('Urllib', 'urllib does not support https proxies')
    def test_https_verify_failed(self, handler, ctx):
        with ctx.http_server(HTTPSProxyHandler) as server_address:
            with handler(verify=True, proxies={ctx.REQUEST_PROTO: f'https://{server_address}'}) as rh:
                # Accept SSLError, as it may not be feasible to tell whether it is a proxy or request error.
                # note: if the request proto also does ssl verification, this may also be an error of the request.
                # Until we can support passing custom cacerts to handlers, we cannot properly test this for all cases.
                with pytest.raises((ProxyError, SSLError)):
                    ctx.proxy_info_request(rh)

    def test_http_with_idn(self, handler, ctx):
        with ctx.http_server(HTTPProxyHandler) as server_address:
            with handler(proxies={ctx.REQUEST_PROTO: f'http://{server_address}'}) as rh:
                proxy_info = ctx.proxy_info_request(rh, target_domain='中文.tw')
                assert proxy_info['proxy'] == server_address
                assert proxy_info['path'].startswith('http://xn--fiq228c.tw')
                assert proxy_info['headers']['Host'].split(':', 1)[0] == 'xn--fiq228c.tw'
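A worked check (not part of the diff) of the IDN expectations above, using the stdlib idna codec:

    assert '中文'.encode('idna') == b'xn--fiq228c'
    assert b'xn--fiq228c'.decode('idna') == '中文'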


@pytest.mark.parametrize(
    'handler,ctx', [
        ('Requests', 'https'),
        ('CurlCFFI', 'https'),
    ], indirect=True)
class TestHTTPConnectProxy:
    def test_http_connect_no_auth(self, handler, ctx):
        with ctx.http_server(HTTPConnectProxyHandler) as server_address:
            with handler(verify=False, proxies={ctx.REQUEST_PROTO: f'http://{server_address}'}) as rh:
                proxy_info = ctx.proxy_info_request(rh)
                assert proxy_info['proxy'] == server_address
                assert proxy_info['connect'] is True
                assert 'Proxy-Authorization' not in proxy_info['headers']

    def test_http_connect_auth(self, handler, ctx):
        with ctx.http_server(HTTPConnectProxyHandler, username='test', password='test') as server_address:
            with handler(verify=False, proxies={ctx.REQUEST_PROTO: f'http://test:test@{server_address}'}) as rh:
                proxy_info = ctx.proxy_info_request(rh)
                assert proxy_info['proxy'] == server_address
                assert 'Proxy-Authorization' in proxy_info['headers']

    @pytest.mark.skip_handler(
        'Requests',
        'bug in urllib3 causes unclosed socket: https://github.com/urllib3/urllib3/issues/3374',
    )
    def test_http_connect_bad_auth(self, handler, ctx):
        with ctx.http_server(HTTPConnectProxyHandler, username='test', password='test') as server_address:
            with handler(verify=False, proxies={ctx.REQUEST_PROTO: f'http://test:bad@{server_address}'}) as rh:
                with pytest.raises(ProxyError):
                    ctx.proxy_info_request(rh)

    def test_http_connect_source_address(self, handler, ctx):
        with ctx.http_server(HTTPConnectProxyHandler) as server_address:
            source_address = f'127.0.0.{random.randint(5, 255)}'
            verify_address_availability(source_address)
            with handler(proxies={ctx.REQUEST_PROTO: f'http://{server_address}'},
                         source_address=source_address,
                         verify=False) as rh:
                proxy_info = ctx.proxy_info_request(rh)
                assert proxy_info['proxy'] == server_address
                assert proxy_info['client_address'][0] == source_address

    @pytest.mark.skipif(urllib3 is None, reason='requires urllib3 to test')
    def test_https_connect_proxy(self, handler, ctx):
        with ctx.http_server(HTTPSConnectProxyHandler) as server_address:
            with handler(verify=False, proxies={ctx.REQUEST_PROTO: f'https://{server_address}'}) as rh:
                proxy_info = ctx.proxy_info_request(rh)
                assert proxy_info['proxy'] == server_address
                assert proxy_info['connect'] is True
                assert 'Proxy-Authorization' not in proxy_info['headers']

    @pytest.mark.skipif(urllib3 is None, reason='requires urllib3 to test')
    def test_https_connect_verify_failed(self, handler, ctx):
        with ctx.http_server(HTTPSConnectProxyHandler) as server_address:
            with handler(verify=True, proxies={ctx.REQUEST_PROTO: f'https://{server_address}'}) as rh:
                # Accept SSLError, as it may not be feasible to tell whether it is a proxy or request error.
                # note: if the request proto also does ssl verification, this may also be an error of the request.
                # Until we can support passing custom cacerts to handlers, we cannot properly test this for all cases.
                with pytest.raises((ProxyError, SSLError)):
                    ctx.proxy_info_request(rh)

    @pytest.mark.skipif(urllib3 is None, reason='requires urllib3 to test')
    def test_https_connect_proxy_auth(self, handler, ctx):
        with ctx.http_server(HTTPSConnectProxyHandler, username='test', password='test') as server_address:
            with handler(verify=False, proxies={ctx.REQUEST_PROTO: f'https://test:test@{server_address}'}) as rh:
                proxy_info = ctx.proxy_info_request(rh)
                assert proxy_info['proxy'] == server_address
                assert 'Proxy-Authorization' in proxy_info['headers']

@ -29,11 +29,11 @@ def error(self, msg):
@is_download_test
class TestIqiyiSDKInterpreter(unittest.TestCase):
    def test_iqiyi_sdk_interpreter(self):
        '''
        """
        Test the functionality of IqiyiSDKInterpreter by trying to log in

        If `sign` is incorrect, /validate call throws an HTTP 556 error
        '''
        """
        logger = WarningLogger()
        ie = IqiyiIE(FakeYDL({'logger': logger}))
        ie._perform_login('foo', 'bar')

@ -8,410 +8,428 @@
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

import math
import re

from yt_dlp.jsinterp import JS_Undefined, JSInterpreter


class NaN:
    pass


class TestJSInterpreter(unittest.TestCase):
    def _test(self, jsi_or_code, expected, func='f', args=()):
        if isinstance(jsi_or_code, str):
            jsi_or_code = JSInterpreter(jsi_or_code)
        got = jsi_or_code.call_function(func, *args)
        if expected is NaN:
            self.assertTrue(math.isnan(got), f'{got} is not NaN')
        else:
            self.assertEqual(got, expected)
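Illustration (not part of the diff) of the new `_test` helper that the rewritten cases below rely on; it accepts either a source string or a prebuilt interpreter, and `NaN` is the sentinel class defined above:

    # inside a TestJSInterpreter method:
    self._test('function f(){return 42;}', 42)                   # compile and call
    self._test(JSInterpreter('function f(){return 0/0;}'), NaN)  # reuse an instance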

    def test_basic(self):
        jsi = JSInterpreter('function x(){;}')
        self.assertEqual(jsi.call_function('x'), None)
        jsi = JSInterpreter('function f(){;}')
        self.assertEqual(repr(jsi.extract_function('f')), 'F<f>')
        self._test(jsi, None)

        jsi = JSInterpreter('function x3(){return 42;}')
        self.assertEqual(jsi.call_function('x3'), 42)
        self._test('function f(){return 42;}', 42)
        self._test('function f(){42}', None)
        self._test('var f = function(){return 42;}', 42)

        jsi = JSInterpreter('function x3(){42}')
        self.assertEqual(jsi.call_function('x3'), None)
    def test_add(self):
        self._test('function f(){return 42 + 7;}', 49)
        self._test('function f(){return 42 + undefined;}', NaN)
        self._test('function f(){return 42 + null;}', 42)

        jsi = JSInterpreter('var x5 = function(){return 42;}')
        self.assertEqual(jsi.call_function('x5'), 42)
    def test_sub(self):
        self._test('function f(){return 42 - 7;}', 35)
        self._test('function f(){return 42 - undefined;}', NaN)
        self._test('function f(){return 42 - null;}', 42)

    def test_mul(self):
        self._test('function f(){return 42 * 7;}', 294)
        self._test('function f(){return 42 * undefined;}', NaN)
        self._test('function f(){return 42 * null;}', 0)

    def test_div(self):
        jsi = JSInterpreter('function f(a, b){return a / b;}')
        self._test(jsi, NaN, args=(0, 0))
        self._test(jsi, NaN, args=(JS_Undefined, 1))
        self._test(jsi, float('inf'), args=(2, 0))
        self._test(jsi, 0, args=(0, 3))

    def test_mod(self):
        self._test('function f(){return 42 % 7;}', 0)
        self._test('function f(){return 42 % 0;}', NaN)
        self._test('function f(){return 42 % undefined;}', NaN)

    def test_exp(self):
        self._test('function f(){return 42 ** 2;}', 1764)
        self._test('function f(){return 42 ** undefined;}', NaN)
        self._test('function f(){return 42 ** null;}', 1)
        self._test('function f(){return undefined ** 42;}', NaN)

    def test_calc(self):
        jsi = JSInterpreter('function x4(a){return 2*a+1;}')
        self.assertEqual(jsi.call_function('x4', 3), 7)
        self._test('function f(a){return 2*a+1;}', 7, args=[3])

    def test_empty_return(self):
        jsi = JSInterpreter('function f(){return; y()}')
        self.assertEqual(jsi.call_function('f'), None)
        self._test('function f(){return; y()}', None)

    def test_morespace(self):
        jsi = JSInterpreter('function x (a) { return 2 * a + 1 ; }')
        self.assertEqual(jsi.call_function('x', 3), 7)

        jsi = JSInterpreter('function f () { x = 2 ; return x; }')
        self.assertEqual(jsi.call_function('f'), 2)
        self._test('function f (a) { return 2 * a + 1 ; }', 7, args=[3])
        self._test('function f () { x = 2 ; return x; }', 2)

    def test_strange_chars(self):
        jsi = JSInterpreter('function $_xY1 ($_axY1) { var $_axY2 = $_axY1 + 1; return $_axY2; }')
        self.assertEqual(jsi.call_function('$_xY1', 20), 21)
        self._test('function $_xY1 ($_axY1) { var $_axY2 = $_axY1 + 1; return $_axY2; }',
                   21, args=[20], func='$_xY1')

    def test_operators(self):
        jsi = JSInterpreter('function f(){return 1 << 5;}')
        self.assertEqual(jsi.call_function('f'), 32)

        jsi = JSInterpreter('function f(){return 2 ** 5}')
        self.assertEqual(jsi.call_function('f'), 32)

        jsi = JSInterpreter('function f(){return 19 & 21;}')
        self.assertEqual(jsi.call_function('f'), 17)

        jsi = JSInterpreter('function f(){return 11 >> 2;}')
        self.assertEqual(jsi.call_function('f'), 2)

        jsi = JSInterpreter('function f(){return []? 2+3: 4;}')
        self.assertEqual(jsi.call_function('f'), 5)

        jsi = JSInterpreter('function f(){return 1 == 2}')
        self.assertEqual(jsi.call_function('f'), False)

        jsi = JSInterpreter('function f(){return 0 && 1 || 2;}')
        self.assertEqual(jsi.call_function('f'), 2)

        jsi = JSInterpreter('function f(){return 0 ?? 42;}')
        self.assertEqual(jsi.call_function('f'), 0)

        jsi = JSInterpreter('function f(){return "life, the universe and everything" < 42;}')
        self.assertFalse(jsi.call_function('f'))
        self._test('function f(){return 1 << 5;}', 32)
        self._test('function f(){return 2 ** 5}', 32)
        self._test('function f(){return 19 & 21;}', 17)
        self._test('function f(){return 11 >> 2;}', 2)
        self._test('function f(){return []? 2+3: 4;}', 5)
        self._test('function f(){return 1 == 2}', False)
        self._test('function f(){return 0 && 1 || 2;}', 2)
        self._test('function f(){return 0 ?? 42;}', 0)
        self._test('function f(){return "life, the universe and everything" < 42;}', False)
        self._test('function f(){return 0 - 7 * - 6;}', 42)

    def test_array_access(self):
        jsi = JSInterpreter('function f(){var x = [1,2,3]; x[0] = 4; x[0] = 5; x[2.0] = 7; return x;}')
        self.assertEqual(jsi.call_function('f'), [5, 2, 7])
        self._test('function f(){var x = [1,2,3]; x[0] = 4; x[0] = 5; x[2.0] = 7; return x;}', [5, 2, 7])

    def test_parens(self):
        jsi = JSInterpreter('function f(){return (1) + (2) * ((( (( (((((3)))))) )) ));}')
        self.assertEqual(jsi.call_function('f'), 7)

        jsi = JSInterpreter('function f(){return (1 + 2) * 3;}')
        self.assertEqual(jsi.call_function('f'), 9)
        self._test('function f(){return (1) + (2) * ((( (( (((((3)))))) )) ));}', 7)
        self._test('function f(){return (1 + 2) * 3;}', 9)

    def test_quotes(self):
        jsi = JSInterpreter(R'function f(){return "a\"\\("}')
        self.assertEqual(jsi.call_function('f'), R'a"\(')
        self._test(R'function f(){return "a\"\\("}', R'a"\(')

    def test_assignments(self):
        jsi = JSInterpreter('function f(){var x = 20; x = 30 + 1; return x;}')
        self.assertEqual(jsi.call_function('f'), 31)

        jsi = JSInterpreter('function f(){var x = 20; x += 30 + 1; return x;}')
        self.assertEqual(jsi.call_function('f'), 51)

        jsi = JSInterpreter('function f(){var x = 20; x -= 30 + 1; return x;}')
        self.assertEqual(jsi.call_function('f'), -11)
        self._test('function f(){var x = 20; x = 30 + 1; return x;}', 31)
        self._test('function f(){var x = 20; x += 30 + 1; return x;}', 51)
        self._test('function f(){var x = 20; x -= 30 + 1; return x;}', -11)

    @unittest.skip('Not implemented')
    def test_comments(self):
        'Skipping: Not yet fully implemented'
        return
        jsi = JSInterpreter('''
        function x() {
            var x = /* 1 + */ 2;
            var y = /* 30
            * 40 */ 50;
            return x + y;
        }
        ''')
        self.assertEqual(jsi.call_function('x'), 52)
        self._test('''
            function f() {
                var x = /* 1 + */ 2;
                var y = /* 30
                * 40 */ 50;
                return x + y;
            }
        ''', 52)

        jsi = JSInterpreter('''
        function f() {
            var x = "/*";
            var y = 1 /* comment */ + 2;
            return y;
        }
        ''')
        self.assertEqual(jsi.call_function('f'), 3)
        self._test('''
            function f() {
                var x = "/*";
                var y = 1 /* comment */ + 2;
                return y;
            }
        ''', 3)

    def test_precedence(self):
        jsi = JSInterpreter('''
        function x() {
            var a = [10, 20, 30, 40, 50];
            var b = 6;
            a[0]=a[b%a.length];
            return a;
        }''')
        self.assertEqual(jsi.call_function('x'), [20, 20, 30, 40, 50])
        self._test('''
            function f() {
                var a = [10, 20, 30, 40, 50];
                var b = 6;
                a[0]=a[b%a.length];
                return a;
            }
        ''', [20, 20, 30, 40, 50])

    def test_builtins(self):
        jsi = JSInterpreter('''
        function x() { return NaN }
        ''')
        self.assertTrue(math.isnan(jsi.call_function('x')))
        self._test('function f() { return NaN }', NaN)

        jsi = JSInterpreter('''
        function x() { return new Date('Wednesday 31 December 1969 18:01:26 MDT') - 0; }
        ''')
        self.assertEqual(jsi.call_function('x'), 86000)
        jsi = JSInterpreter('''
        function x(dt) { return new Date(dt) - 0; }
        ''')
        self.assertEqual(jsi.call_function('x', 'Wednesday 31 December 1969 18:01:26 MDT'), 86000)
    def test_date(self):
        self._test('function f() { return new Date("Wednesday 31 December 1969 18:01:26 MDT") - 0; }', 86000)

        jsi = JSInterpreter('function f(dt) { return new Date(dt) - 0; }')
        self._test(jsi, 86000, args=['Wednesday 31 December 1969 18:01:26 MDT'])
        self._test(jsi, 86000, args=['12/31/1969 18:01:26 MDT'])  # m/d/y
        self._test(jsi, 0, args=['1 January 1970 00:00:00 UTC'])
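A worked check of the 86000 expected above: `new Date(...) - 0` coerces the date to milliseconds since the Unix epoch, and MDT is UTC-6, so 1969-12-31 18:01:26 MDT is 1970-01-01 00:01:26 UTC, i.e. 86 seconds:

    from datetime import datetime, timedelta, timezone

    mdt = timezone(timedelta(hours=-6))  # Mountain Daylight Time
    dt = datetime(1969, 12, 31, 18, 1, 26, tzinfo=mdt)
    assert dt.timestamp() * 1000 == 86000
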
    def test_call(self):
        jsi = JSInterpreter('''
        function x() { return 2; }
        function y(a) { return x() + (a?a:0); }
        function z() { return y(3); }
            function x() { return 2; }
            function y(a) { return x() + (a?a:0); }
            function z() { return y(3); }
        ''')
        self.assertEqual(jsi.call_function('z'), 5)
        self.assertEqual(jsi.call_function('y'), 2)
        self._test(jsi, 5, func='z')
        self._test(jsi, 2, func='y')

    def test_if(self):
        self._test('''
            function f() {
                let a = 9;
                if (0==0) {a++}
                return a
            }
        ''', 10)

        self._test('''
            function f() {
                if (0==0) {return 10}
            }
        ''', 10)

        self._test('''
            function f() {
                if (0!=0) {return 1}
                else {return 10}
            }
        ''', 10)

        """  # Unsupported
        self._test('''
            function f() {
                if (0!=0) {return 1}
                else if (1==0) {return 2}
                else {return 10}
            }
        ''', 10)
        """

    def test_for_loop(self):
        jsi = JSInterpreter('''
        function x() { a=0; for (i=0; i-10; i++) {a++} return a }
        ''')
        self.assertEqual(jsi.call_function('x'), 10)
        self._test('function f() { a=0; for (i=0; i-10; i++) {a++} return a }', 10)

    def test_switch(self):
        jsi = JSInterpreter('''
        function x(f) { switch(f){
            case 1:f+=1;
            case 2:f+=2;
            case 3:f+=3;break;
            case 4:f+=4;
            default:f=0;
        } return f }
        function f(x) { switch(x){
            case 1:x+=1;
            case 2:x+=2;
            case 3:x+=3;break;
            case 4:x+=4;
            default:x=0;
        } return x }
        ''')
        self.assertEqual(jsi.call_function('x', 1), 7)
        self.assertEqual(jsi.call_function('x', 3), 6)
        self.assertEqual(jsi.call_function('x', 5), 0)
        self._test(jsi, 7, args=[1])
        self._test(jsi, 6, args=[3])
        self._test(jsi, 0, args=[5])

    def test_switch_default(self):
        jsi = JSInterpreter('''
        function x(f) { switch(f){
            case 2: f+=2;
            default: f-=1;
            case 5:
            case 6: f+=6;
            case 0: break;
            case 1: f+=1;
        } return f }
        function f(x) { switch(x){
            case 2: x+=2;
            default: x-=1;
            case 5:
            case 6: x+=6;
            case 0: break;
            case 1: x+=1;
        } return x }
        ''')
        self.assertEqual(jsi.call_function('x', 1), 2)
        self.assertEqual(jsi.call_function('x', 5), 11)
        self.assertEqual(jsi.call_function('x', 9), 14)
        self._test(jsi, 2, args=[1])
        self._test(jsi, 11, args=[5])
        self._test(jsi, 14, args=[9])

    def test_try(self):
        jsi = JSInterpreter('''
        function x() { try{return 10} catch(e){return 5} }
        ''')
        self.assertEqual(jsi.call_function('x'), 10)
        self._test('function f() { try{return 10} catch(e){return 5} }', 10)

    def test_catch(self):
        jsi = JSInterpreter('''
        function x() { try{throw 10} catch(e){return 5} }
        ''')
        self.assertEqual(jsi.call_function('x'), 5)
        self._test('function f() { try{throw 10} catch(e){return 5} }', 5)

    def test_finally(self):
        jsi = JSInterpreter('''
        function x() { try{throw 10} finally {return 42} }
        ''')
        self.assertEqual(jsi.call_function('x'), 42)
        jsi = JSInterpreter('''
        function x() { try{throw 10} catch(e){return 5} finally {return 42} }
        ''')
        self.assertEqual(jsi.call_function('x'), 42)
        self._test('function f() { try{throw 10} finally {return 42} }', 42)
        self._test('function f() { try{throw 10} catch(e){return 5} finally {return 42} }', 42)

    def test_nested_try(self):
        jsi = JSInterpreter('''
        function x() {try {
            try{throw 10} finally {throw 42}
        } catch(e){return 5} }
        ''')
        self.assertEqual(jsi.call_function('x'), 5)
        self._test('''
            function f() {try {
                try{throw 10} finally {throw 42}
            } catch(e){return 5} }
        ''', 5)

    def test_for_loop_continue(self):
        jsi = JSInterpreter('''
        function x() { a=0; for (i=0; i-10; i++) { continue; a++ } return a }
        ''')
        self.assertEqual(jsi.call_function('x'), 0)
        self._test('function f() { a=0; for (i=0; i-10; i++) { continue; a++ } return a }', 0)

    def test_for_loop_break(self):
        jsi = JSInterpreter('''
        function x() { a=0; for (i=0; i-10; i++) { break; a++ } return a }
        ''')
        self.assertEqual(jsi.call_function('x'), 0)
        self._test('function f() { a=0; for (i=0; i-10; i++) { break; a++ } return a }', 0)

    def test_for_loop_try(self):
        jsi = JSInterpreter('''
        function x() {
            for (i=0; i-10; i++) { try { if (i == 5) throw i} catch {return 10} finally {break} };
            return 42 }
        ''')
        self.assertEqual(jsi.call_function('x'), 42)
        self._test('''
            function f() {
                for (i=0; i-10; i++) { try { if (i == 5) throw i} catch {return 10} finally {break} };
                return 42 }
        ''', 42)

    def test_literal_list(self):
        jsi = JSInterpreter('''
        function x() { return [1, 2, "asdf", [5, 6, 7]][3] }
        ''')
        self.assertEqual(jsi.call_function('x'), [5, 6, 7])
        self._test('function f() { return [1, 2, "asdf", [5, 6, 7]][3] }', [5, 6, 7])

    def test_comma(self):
        jsi = JSInterpreter('''
        function x() { a=5; a -= 1, a+=3; return a }
        ''')
        self.assertEqual(jsi.call_function('x'), 7)

        jsi = JSInterpreter('''
        function x() { a=5; return (a -= 1, a+=3, a); }
        ''')
        self.assertEqual(jsi.call_function('x'), 7)

        jsi = JSInterpreter('''
        function x() { return (l=[0,1,2,3], function(a, b){return a+b})((l[1], l[2]), l[3]) }
        ''')
        self.assertEqual(jsi.call_function('x'), 5)
        self._test('function f() { a=5; a -= 1, a+=3; return a }', 7)
        self._test('function f() { a=5; return (a -= 1, a+=3, a); }', 7)
        self._test('function f() { return (l=[0,1,2,3], function(a, b){return a+b})((l[1], l[2]), l[3]) }', 5)

    def test_void(self):
        jsi = JSInterpreter('''
        function x() { return void 42; }
        ''')
        self.assertEqual(jsi.call_function('x'), None)
        self._test('function f() { return void 42; }', None)

    def test_return_function(self):
        jsi = JSInterpreter('''
        function x() { return [1, function(){return 1}][1] }
            function f() { return [1, function(){return 1}][1] }
        ''')
        self.assertEqual(jsi.call_function('x')([]), 1)
        self.assertEqual(jsi.call_function('f')([]), 1)

    def test_null(self):
        jsi = JSInterpreter('''
        function x() { return null; }
        ''')
        self.assertEqual(jsi.call_function('x'), None)

        jsi = JSInterpreter('''
        function x() { return [null > 0, null < 0, null == 0, null === 0]; }
        ''')
        self.assertEqual(jsi.call_function('x'), [False, False, False, False])

        jsi = JSInterpreter('''
        function x() { return [null >= 0, null <= 0]; }
        ''')
        self.assertEqual(jsi.call_function('x'), [True, True])
        self._test('function f() { return null; }', None)
        self._test('function f() { return [null > 0, null < 0, null == 0, null === 0]; }',
                   [False, False, False, False])
        self._test('function f() { return [null >= 0, null <= 0]; }', [True, True])

    def test_undefined(self):
        jsi = JSInterpreter('''
        function x() { return undefined === undefined; }
        ''')
        self.assertEqual(jsi.call_function('x'), True)
        self._test('function f() { return undefined === undefined; }', True)
        self._test('function f() { return undefined; }', JS_Undefined)
        self._test('function f() {return undefined ?? 42; }', 42)
        self._test('function f() { let v; return v; }', JS_Undefined)
        self._test('function f() { let v; return v**0; }', 1)
        self._test('function f() { let v; return [v>42, v<=42, v&&42, 42&&v]; }',
                   [False, False, JS_Undefined, JS_Undefined])

        self._test('''
            function f() { return [
                undefined === undefined,
                undefined == undefined,
                undefined == null,
                undefined < undefined,
                undefined > undefined,
                undefined === 0,
                undefined == 0,
                undefined < 0,
                undefined > 0,
                undefined >= 0,
                undefined <= 0,
                undefined > null,
                undefined < null,
                undefined === null
            ]; }
        ''', list(map(bool, (1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0))))

        jsi = JSInterpreter('''
        function x() { return undefined; }
            function f() { let v; return [42+v, v+42, v**42, 42**v, 0**v]; }
        ''')
        self.assertEqual(jsi.call_function('x'), JS_Undefined)

        jsi = JSInterpreter('''
        function x() { let v; return v; }
        ''')
        self.assertEqual(jsi.call_function('x'), JS_Undefined)

        jsi = JSInterpreter('''
        function x() { return [undefined === undefined, undefined == undefined, undefined < undefined, undefined > undefined]; }
        ''')
        self.assertEqual(jsi.call_function('x'), [True, True, False, False])

        jsi = JSInterpreter('''
        function x() { return [undefined === 0, undefined == 0, undefined < 0, undefined > 0]; }
        ''')
        self.assertEqual(jsi.call_function('x'), [False, False, False, False])

        jsi = JSInterpreter('''
        function x() { return [undefined >= 0, undefined <= 0]; }
        ''')
        self.assertEqual(jsi.call_function('x'), [False, False])

        jsi = JSInterpreter('''
        function x() { return [undefined > null, undefined < null, undefined == null, undefined === null]; }
        ''')
        self.assertEqual(jsi.call_function('x'), [False, False, True, False])

        jsi = JSInterpreter('''
        function x() { return [undefined === null, undefined == null, undefined < null, undefined > null]; }
        ''')
        self.assertEqual(jsi.call_function('x'), [False, True, False, False])

        jsi = JSInterpreter('''
        function x() { let v; return [42+v, v+42, v**42, 42**v, 0**v]; }
        ''')
        for y in jsi.call_function('x'):
        for y in jsi.call_function('f'):
            self.assertTrue(math.isnan(y))

        jsi = JSInterpreter('''
        function x() { let v; return v**0; }
        ''')
        self.assertEqual(jsi.call_function('x'), 1)

        jsi = JSInterpreter('''
        function x() { let v; return [v>42, v<=42, v&&42, 42&&v]; }
        ''')
        self.assertEqual(jsi.call_function('x'), [False, False, JS_Undefined, JS_Undefined])

        jsi = JSInterpreter('function x(){return undefined ?? 42; }')
        self.assertEqual(jsi.call_function('x'), 42)

    def test_object(self):
        jsi = JSInterpreter('''
        function x() { return {}; }
        ''')
        self.assertEqual(jsi.call_function('x'), {})

        jsi = JSInterpreter('''
        function x() { let a = {m1: 42, m2: 0 }; return [a["m1"], a.m2]; }
        ''')
        self.assertEqual(jsi.call_function('x'), [42, 0])

        jsi = JSInterpreter('''
        function x() { let a; return a?.qq; }
        ''')
        self.assertEqual(jsi.call_function('x'), JS_Undefined)

        jsi = JSInterpreter('''
        function x() { let a = {m1: 42, m2: 0 }; return a?.qq; }
        ''')
        self.assertEqual(jsi.call_function('x'), JS_Undefined)
        self._test('function f() { return {}; }', {})
        self._test('function f() { let a = {m1: 42, m2: 0 }; return [a["m1"], a.m2]; }', [42, 0])
        self._test('function f() { let a; return a?.qq; }', JS_Undefined)
        self._test('function f() { let a = {m1: 42, m2: 0 }; return a?.qq; }', JS_Undefined)

    def test_regex(self):
        jsi = JSInterpreter('''
        function x() { let a=/,,[/,913,/](,)}/; }
        ''')
        self.assertEqual(jsi.call_function('x'), None)
        self._test('function f() { let a=/,,[/,913,/](,)}/; }', None)
        self._test('function f() { let a=/,,[/,913,/](,)}/; return a; }', R'/,,[/,913,/](,)}/0')

        jsi = JSInterpreter('''
        function x() { let a=/,,[/,913,/](,)}/; return a; }
        ''')
        self.assertIsInstance(jsi.call_function('x'), re.Pattern)
        R'''  # We are not compiling regex
        jsi = JSInterpreter('function f() { let a=/,,[/,913,/](,)}/; return a; }')
        self.assertIsInstance(jsi.call_function('f'), re.Pattern)

        jsi = JSInterpreter('''
        function x() { let a=/,,[/,913,/](,)}/i; return a; }
        ''')
        self.assertEqual(jsi.call_function('x').flags & re.I, re.I)
        jsi = JSInterpreter('function f() { let a=/,,[/,913,/](,)}/i; return a; }')
        self.assertEqual(jsi.call_function('f').flags & re.I, re.I)

        jsi = JSInterpreter(R'''
        function x() { let a=/,][}",],()}(\[)/; return a; }
        ''')
        self.assertEqual(jsi.call_function('x').pattern, r',][}",],()}(\[)')
        jsi = JSInterpreter(R'function f() { let a=/,][}",],()}(\[)/; return a; }')
        self.assertEqual(jsi.call_function('f').pattern, r',][}",],()}(\[)')

        jsi = JSInterpreter(R'''
        function x() { let a=[/[)\\]/]; return a[0]; }
        ''')
        self.assertEqual(jsi.call_function('x').pattern, r'[)\\]')
        jsi = JSInterpreter(R'function f() { let a=[/[)\\]/]; return a[0]; }')
        self.assertEqual(jsi.call_function('f').pattern, r'[)\\]')
        '''

    @unittest.skip('Not implemented')
    def test_replace(self):
        self._test('function f() { let a="data-name".replace("data-", ""); return a }',
                   'name')
        self._test('function f() { let a="data-name".replace(new RegExp("^.+-"), ""); return a; }',
                   'name')
        self._test('function f() { let a="data-name".replace(/^.+-/, ""); return a; }',
                   'name')
        self._test('function f() { let a="data-name".replace(/a/g, "o"); return a; }',
                   'doto-nome')
        self._test('function f() { let a="data-name".replaceAll("a", "o"); return a; }',
                   'doto-nome')

    def test_char_code_at(self):
        jsi = JSInterpreter('function x(i){return "test".charCodeAt(i)}')
        self.assertEqual(jsi.call_function('x', 0), 116)
        self.assertEqual(jsi.call_function('x', 1), 101)
        self.assertEqual(jsi.call_function('x', 2), 115)
        self.assertEqual(jsi.call_function('x', 3), 116)
        self.assertEqual(jsi.call_function('x', 4), None)
        self.assertEqual(jsi.call_function('x', 'not_a_number'), 116)
        jsi = JSInterpreter('function f(i){return "test".charCodeAt(i)}')
        self._test(jsi, 116, args=[0])
        self._test(jsi, 101, args=[1])
        self._test(jsi, 115, args=[2])
        self._test(jsi, 116, args=[3])
        self._test(jsi, None, args=[4])
        self._test(jsi, 116, args=['not_a_number'])

    def test_bitwise_operators_overflow(self):
        jsi = JSInterpreter('function x(){return -524999584 << 5}')
        self.assertEqual(jsi.call_function('x'), 379882496)
        self._test('function f(){return -524999584 << 5}', 379882496)
        self._test('function f(){return 1236566549 << 5}', 915423904)

        jsi = JSInterpreter('function x(){return 1236566549 << 5}')
        self.assertEqual(jsi.call_function('x'), 915423904)
    def test_bitwise_operators_typecast(self):
        self._test('function f(){return null << 5}', 0)
        self._test('function f(){return undefined >> 5}', 0)
        self._test('function f(){return 42 << NaN}', 42)
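The overflow results above follow from JavaScript coercing shift operands to 32-bit integers. A sketch of that coercion (hypothetical helper, not yt-dlp code):

    def js_lshift32(a, b):
        result = (int(a) << (int(b) & 0x1F)) & 0xFFFFFFFF  # wrap to uint32
        return result - 0x100000000 if result >= 0x80000000 else result  # reinterpret as int32

    assert js_lshift32(-524999584, 5) == 379882496
    assert js_lshift32(1236566549, 5) == 915423904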
    def test_negative(self):
        self._test('function f(){return 2 * -2.0 ;}', -4)
        self._test('function f(){return 2 - - -2 ;}', 0)
        self._test('function f(){return 2 - - - -2 ;}', 4)
        self._test('function f(){return 2 - + + - -2;}', 0)
        self._test('function f(){return 2 + - + - -2;}', 0)

    @unittest.skip('Not implemented')
    def test_packed(self):
        jsi = JSInterpreter('''function f(p,a,c,k,e,d){while(c--)if(k[c])p=p.replace(new RegExp('\\b'+c.toString(a)+'\\b','g'),k[c]);return p}''')
self.assertEqual(jsi.call_function('f', '''h 7=g("1j");7.7h({7g:[{33:"w://7f-7e-7d-7c.v.7b/7a/79/78/77/76.74?t=73&s=2s&e=72&f=2t&71=70.0.0.1&6z=6y&6x=6w"}],6v:"w://32.v.u/6u.31",16:"r%",15:"r%",6t:"6s",6r:"",6q:"l",6p:"l",6o:"6n",6m:\'6l\',6k:"6j",9:[{33:"/2u?b=6i&n=50&6h=w://32.v.u/6g.31",6f:"6e"}],1y:{6d:1,6c:\'#6b\',6a:\'#69\',68:"67",66:30,65:r,},"64":{63:"%62 2m%m%61%5z%5y%5x.u%5w%5v%5u.2y%22 2k%m%1o%22 5t%m%1o%22 5s%m%1o%22 2j%m%5r%22 16%m%5q%22 15%m%5p%22 5o%2z%5n%5m%2z",5l:"w://v.u/d/1k/5k.2y",5j:[]},\'5i\':{"5h":"5g"},5f:"5e",5d:"w://v.u",5c:{},5b:l,1x:[0.25,0.50,0.75,1,1.25,1.5,2]});h 1m,1n,5a;h 59=0,58=0;h 7=g("1j");h 2x=0,57=0,56=0;$.55({54:{\'53-52\':\'2i-51\'}});7.j(\'4z\',6(x){c(5>0&&x.1l>=5&&1n!=1){1n=1;$(\'q.4y\').4x(\'4w\')}});7.j(\'13\',6(x){2x=x.1l});7.j(\'2g\',6(x){2w(x)});7.j(\'4v\',6(){$(\'q.2v\').4u()});6 2w(x){$(\'q.2v\').4t();c(1m)19;1m=1;17=0;c(4s.4r===l){17=1}$.4q(\'/2u?b=4p&2l=1k&4o=2t-4n-4m-2s-4l&4k=&4j=&4i=&17=\'+17,6(2r){$(\'#4h\').4g(2r)});$(\'.3-8-4f-4e:4d("4c")\').2h(6(e){2q();g().4b(0);g().4a(l)});6 2q(){h $14=$("<q />").2p({1l:"49",16:"r%",15:"r%",48:0,2n:0,2o:47,46:"45(10%, 10%, 10%, 0.4)","44-43":"42"});$("<41 />").2p({16:"60%",15:"60%",2o:40,"3z-2n":"3y"}).3x({\'2m\':\'/?b=3w&2l=1k\',\'2k\':\'0\',\'2j\':\'2i\'}).2f($14);$14.2h(6(){$(3v).3u();g().2g()});$14.2f($(\'#1j\'))}g().13(0);}6 3t(){h 9=7.1b(2e);2d.2c(9);c(9.n>1){1r(i=0;i<9.n;i++){c(9[i].1a==2e){2d.2c(\'!!=\'+i);7.1p(i)}}}}7.j(\'3s\',6(){g().1h("/2a/3r.29","3q 10 28",6(){g().13(g().27()+10)},"2b");$("q[26=2b]").23().21(\'.3-20-1z\');g().1h("/2a/3p.29","3o 10 28",6(){h 12=g().27()-10;c(12<0)12=0;g().13(12)},"24");$("q[26=24]").23().21(\'.3-20-1z\');});6 1i(){}7.j(\'3n\',6(){1i()});7.j(\'3m\',6(){1i()});7.j("k",6(y){h 9=7.1b();c(9.n<2)19;$(\'.3-8-3l-3k\').3j(6(){$(\'#3-8-a-k\').1e(\'3-8-a-z\');$(\'.3-a-k\').p(\'o-1f\',\'11\')});7.1h("/3i/3h.3g","3f 3e",6(){$(\'.3-1w\').3d(\'3-8-1v\');$(\'.3-8-1y, .3-8-1x\').p(\'o-1g\',\'11\');c($(\'.3-1w\').3c(\'3-8-1v\')){$(\'.3-a-k\').p(\'o-1g\',\'l\');$(\'.3-a-k\').p(\'o-1f\',\'l\');$(\'.3-8-a\').1e(\'3-8-a-z\');$(\'.3-8-a:1u\').3b(\'3-8-a-z\')}3a{$(\'.3-a-k\').p(\'o-1g\',\'11\');$(\'.3-a-k\').p(\'o-1f\',\'11\');$(\'.3-8-a:1u\').1e(\'3-8-a-z\')}},"39");7.j("38",6(y){1d.37(\'1c\',y.9[y.36].1a)});c(1d.1t(\'1c\')){35("1s(1d.1t(\'1c\'));",34)}});h 18;6 1s(1q){h 
9=7.1b();c(9.n>1){1r(i=0;i<9.n;i++){c(9[i].1a==1q){c(i==18){19}18=i;7.1p(i)}}}}',36,270,'|||jw|||function|player|settings|tracks|submenu||if||||jwplayer|var||on|audioTracks|true|3D|length|aria|attr|div|100|||sx|filemoon|https||event|active||false|tt|seek|dd|height|width|adb|current_audio|return|name|getAudioTracks|default_audio|localStorage|removeClass|expanded|checked|addButton|callMeMaybe|vplayer|0fxcyc2ajhp1|position|vvplay|vvad|220|setCurrentAudioTrack|audio_name|for|audio_set|getItem|last|open|controls|playbackRates|captions|rewind|icon|insertAfter||detach|ff00||button|getPosition|sec|png|player8|ff11|log|console|track_name|appendTo|play|click|no|scrolling|frameborder|file_code|src|top|zIndex|css|showCCform|data|1662367683|383371|dl|video_ad|doPlay|prevt|mp4|3E||jpg|thumbs|file|300|setTimeout|currentTrack|setItem|audioTrackChanged|dualSound|else|addClass|hasClass|toggleClass|Track|Audio|svg|dualy|images|mousedown|buttons|topbar|playAttemptFailed|beforePlay|Rewind|fr|Forward|ff|ready|set_audio_track|remove|this|upload_srt|prop|50px|margin|1000001|iframe|center|align|text|rgba|background|1000000|left|absolute|pause|setCurrentCaptions|Upload|contains|item|content|html|fviews|referer|prem|embed|3e57249ef633e0d03bf76ceb8d8a4b65|216|83|hash|view|get|TokenZir|window|hide|show|complete|slow|fadeIn|video_ad_fadein|time||cache|Cache|Content|headers|ajaxSetup|v2done|tott|vastdone2|vastdone1|vvbefore|playbackRateControls|cast|aboutlink|FileMoon|abouttext|UHD|1870|qualityLabels|sites|GNOME_POWER|link|2Fiframe|3C|allowfullscreen|22360|22640|22no|marginheight|marginwidth|2FGNOME_POWER|2F0fxcyc2ajhp1|2Fe|2Ffilemoon|2F|3A||22https|3Ciframe|code|sharing|fontOpacity|backgroundOpacity|Tahoma|fontFamily|303030|backgroundColor|FFFFFF|color|userFontScale|thumbnails|kind|0fxcyc2ajhp10000|url|get_slides|start|startparam|none|preload|html5|primary|hlshtml|androidhls|duration|uniform|stretching|0fxcyc2ajhp1_xt|image|2048|sp|6871|asn|127|srv|43200|_g3XlBcu2lmD9oDexD2NLWSmah2Nu3XcDrl93m9PwXY|m3u8||master|0fxcyc2ajhp1_x|00076|01|hls2|to|s01|delivery|storage|moon|sources|setup'''.split('|')))

    def test_join(self):
        test_input = list('test')
        tests = [
            'function f(a, b){return a.join(b)}',
            'function f(a, b){return Array.prototype.join.call(a, b)}',
            'function f(a, b){return Array.prototype.join.apply(a, [b])}',
        ]
        for test in tests:
            jsi = JSInterpreter(test)
            self._test(jsi, 'test', args=[test_input, ''])
            self._test(jsi, 't-e-s-t', args=[test_input, '-'])
            self._test(jsi, '', args=[[], '-'])

    def test_split(self):
        test_result = list('test')
        tests = [
            'function f(a, b){return a.split(b)}',
            'function f(a, b){return String.prototype.split.call(a, b)}',
            'function f(a, b){return String.prototype.split.apply(a, [b])}',
        ]
        for test in tests:
            jsi = JSInterpreter(test)
            self._test(jsi, test_result, args=['test', ''])
            self._test(jsi, test_result, args=['t-e-s-t', '-'])
            self._test(jsi, [''], args=['', '-'])
            self._test(jsi, [], args=['', ''])

    def test_slice(self):
        self._test('function f(){return [0, 1, 2, 3, 4, 5, 6, 7, 8].slice()}', [0, 1, 2, 3, 4, 5, 6, 7, 8])
        self._test('function f(){return [0, 1, 2, 3, 4, 5, 6, 7, 8].slice(0)}', [0, 1, 2, 3, 4, 5, 6, 7, 8])
        self._test('function f(){return [0, 1, 2, 3, 4, 5, 6, 7, 8].slice(5)}', [5, 6, 7, 8])
        self._test('function f(){return [0, 1, 2, 3, 4, 5, 6, 7, 8].slice(99)}', [])
        self._test('function f(){return [0, 1, 2, 3, 4, 5, 6, 7, 8].slice(-2)}', [7, 8])
        self._test('function f(){return [0, 1, 2, 3, 4, 5, 6, 7, 8].slice(-99)}', [0, 1, 2, 3, 4, 5, 6, 7, 8])
        self._test('function f(){return [0, 1, 2, 3, 4, 5, 6, 7, 8].slice(0, 0)}', [])
        self._test('function f(){return [0, 1, 2, 3, 4, 5, 6, 7, 8].slice(1, 0)}', [])
        self._test('function f(){return [0, 1, 2, 3, 4, 5, 6, 7, 8].slice(0, 1)}', [0])
        self._test('function f(){return [0, 1, 2, 3, 4, 5, 6, 7, 8].slice(3, 6)}', [3, 4, 5])
        self._test('function f(){return [0, 1, 2, 3, 4, 5, 6, 7, 8].slice(1, -1)}', [1, 2, 3, 4, 5, 6, 7])
        self._test('function f(){return [0, 1, 2, 3, 4, 5, 6, 7, 8].slice(-1, 1)}', [])
        self._test('function f(){return [0, 1, 2, 3, 4, 5, 6, 7, 8].slice(-3, -1)}', [6, 7])
        self._test('function f(){return "012345678".slice()}', '012345678')
        self._test('function f(){return "012345678".slice(0)}', '012345678')
        self._test('function f(){return "012345678".slice(5)}', '5678')
        self._test('function f(){return "012345678".slice(99)}', '')
        self._test('function f(){return "012345678".slice(-2)}', '78')
        self._test('function f(){return "012345678".slice(-99)}', '012345678')
        self._test('function f(){return "012345678".slice(0, 0)}', '')
        self._test('function f(){return "012345678".slice(1, 0)}', '')
        self._test('function f(){return "012345678".slice(0, 1)}', '0')
        self._test('function f(){return "012345678".slice(3, 6)}', '345')
        self._test('function f(){return "012345678".slice(1, -1)}', '1234567')
        self._test('function f(){return "012345678".slice(-1, 1)}', '')
        self._test('function f(){return "012345678".slice(-3, -1)}', '67')


if __name__ == '__main__':

@@ -21,7 +21,7 @@ def test_netrc_present(self):
                continue
            self.assertTrue(
                ie._NETRC_MACHINE,
                'Extractor %s supports login, but is missing a _NETRC_MACHINE property' % ie.IE_NAME)
                f'Extractor {ie.IE_NAME} supports login, but is missing a _NETRC_MACHINE property')


if __name__ == '__main__':
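
The `_NETRC_MACHINE` property checked above is what ties an extractor to a `.netrc` entry. A hedged sketch of the lookup using the stdlib module (the machine name 'youtube' is only an illustrative example of what an extractor might declare):

import netrc

# yt-dlp's --netrc support keys credentials by each extractor's _NETRC_MACHINE;
# assumes a ~/.netrc entry like: machine youtube login USER password PASS
auth = netrc.netrc().authenticators('youtube')
if auth:
    username, _, password = auth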
2080 test/test_networking.py Normal file
File diff suppressed because it is too large
208 test/test_networking_utils.py Normal file
@@ -0,0 +1,208 @@
#!/usr/bin/env python3

# Allow direct execution
import os
import sys

import pytest

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

import io
import random
import ssl

from yt_dlp.cookies import YoutubeDLCookieJar
from yt_dlp.dependencies import certifi
from yt_dlp.networking import Response
from yt_dlp.networking._helper import (
    InstanceStoreMixin,
    add_accept_encoding_header,
    get_redirect_method,
    make_socks_proxy_opts,
    select_proxy,
    ssl_load_certs,
)
from yt_dlp.networking.exceptions import (
    HTTPError,
    IncompleteRead,
)
from yt_dlp.socks import ProxyType
from yt_dlp.utils.networking import HTTPHeaderDict

TEST_DIR = os.path.dirname(os.path.abspath(__file__))


class TestNetworkingUtils:

    def test_select_proxy(self):
        proxies = {
            'all': 'socks5://example.com',
            'http': 'http://example.com:1080',
            'no': 'bypass.example.com,yt-dl.org',
        }

        assert select_proxy('https://example.com', proxies) == proxies['all']
        assert select_proxy('http://example.com', proxies) == proxies['http']
        assert select_proxy('http://bypass.example.com', proxies) is None
        assert select_proxy('https://yt-dl.org', proxies) is None

    @pytest.mark.parametrize('socks_proxy,expected', [
        ('socks5h://example.com', {
            'proxytype': ProxyType.SOCKS5,
            'addr': 'example.com',
            'port': 1080,
            'rdns': True,
            'username': None,
            'password': None,
        }),
        ('socks5://user:@example.com:5555', {
            'proxytype': ProxyType.SOCKS5,
            'addr': 'example.com',
            'port': 5555,
            'rdns': False,
            'username': 'user',
            'password': '',
        }),
        ('socks4://u%40ser:pa%20ss@127.0.0.1:1080', {
            'proxytype': ProxyType.SOCKS4,
            'addr': '127.0.0.1',
            'port': 1080,
            'rdns': False,
            'username': 'u@ser',
            'password': 'pa ss',
        }),
        ('socks4a://:pa%20ss@127.0.0.1', {
            'proxytype': ProxyType.SOCKS4A,
            'addr': '127.0.0.1',
            'port': 1080,
            'rdns': True,
            'username': '',
            'password': 'pa ss',
        }),
    ])
    def test_make_socks_proxy_opts(self, socks_proxy, expected):
        assert make_socks_proxy_opts(socks_proxy) == expected

    def test_make_socks_proxy_unknown(self):
        with pytest.raises(ValueError, match='Unknown SOCKS proxy version: socks'):
            make_socks_proxy_opts('socks://127.0.0.1')

    @pytest.mark.skipif(not certifi, reason='certifi is not installed')
    def test_load_certifi(self):
        context_certifi = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        context_certifi.load_verify_locations(cafile=certifi.where())
        context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        ssl_load_certs(context, use_certifi=True)
        assert context.get_ca_certs() == context_certifi.get_ca_certs()

        context_default = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        context_default.load_default_certs()
        context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        ssl_load_certs(context, use_certifi=False)
        assert context.get_ca_certs() == context_default.get_ca_certs()

        if context_default.get_ca_certs() == context_certifi.get_ca_certs():
            pytest.skip('System uses certifi as default. The test is not valid')

    @pytest.mark.parametrize('method,status,expected', [
        ('GET', 303, 'GET'),
        ('HEAD', 303, 'HEAD'),
        ('PUT', 303, 'GET'),
        ('POST', 301, 'GET'),
        ('HEAD', 301, 'HEAD'),
        ('POST', 302, 'GET'),
        ('HEAD', 302, 'HEAD'),
        ('PUT', 302, 'PUT'),
        ('POST', 308, 'POST'),
        ('POST', 307, 'POST'),
        ('HEAD', 308, 'HEAD'),
        ('HEAD', 307, 'HEAD'),
    ])
    def test_get_redirect_method(self, method, status, expected):
        assert get_redirect_method(method, status) == expected

    @pytest.mark.parametrize('headers,supported_encodings,expected', [
        ({'Accept-Encoding': 'br'}, ['gzip', 'br'], {'Accept-Encoding': 'br'}),
        ({}, ['gzip', 'br'], {'Accept-Encoding': 'gzip, br'}),
        ({'Content-type': 'application/json'}, [], {'Content-type': 'application/json', 'Accept-Encoding': 'identity'}),
    ])
    def test_add_accept_encoding_header(self, headers, supported_encodings, expected):
        headers = HTTPHeaderDict(headers)
        add_accept_encoding_header(headers, supported_encodings)
        assert headers == HTTPHeaderDict(expected)


class TestInstanceStoreMixin:

    class FakeInstanceStoreMixin(InstanceStoreMixin):
        def _create_instance(self, **kwargs):
            return random.randint(0, 1000000)

        def _close_instance(self, instance):
            pass

    def test_mixin(self):
        mixin = self.FakeInstanceStoreMixin()
        assert mixin._get_instance(d={'a': 1, 'b': 2, 'c': {'d', 4}}) == mixin._get_instance(d={'a': 1, 'b': 2, 'c': {'d', 4}})

        assert mixin._get_instance(d={'a': 1, 'b': 2, 'c': {'e', 4}}) != mixin._get_instance(d={'a': 1, 'b': 2, 'c': {'d', 4}})
        assert mixin._get_instance(d={'a': 1, 'b': 2, 'c': {'d', 4}}) != mixin._get_instance(d={'a': 1, 'b': 2, 'g': {'d', 4}})

        assert mixin._get_instance(d={'a': 1}, e=[1, 2, 3]) == mixin._get_instance(d={'a': 1}, e=[1, 2, 3])

        assert mixin._get_instance(d={'a': 1}, e=[1, 2, 3]) != mixin._get_instance(d={'a': 1}, e=[1, 2, 3, 4])

        cookiejar = YoutubeDLCookieJar()
        assert mixin._get_instance(b=[1, 2], c=cookiejar) == mixin._get_instance(b=[1, 2], c=cookiejar)

        assert mixin._get_instance(b=[1, 2], c=cookiejar) != mixin._get_instance(b=[1, 2], c=YoutubeDLCookieJar())

        # Different order
        assert mixin._get_instance(c=cookiejar, b=[1, 2]) == mixin._get_instance(b=[1, 2], c=cookiejar)

        m = mixin._get_instance(t=1234)
        assert mixin._get_instance(t=1234) == m
        mixin._clear_instances()
        assert mixin._get_instance(t=1234) != m


class TestNetworkingExceptions:

    @staticmethod
    def create_response(status):
        return Response(fp=io.BytesIO(b'test'), url='http://example.com', headers={'tesT': 'test'}, status=status)

    def test_http_error(self):

        response = self.create_response(403)
        error = HTTPError(response)

        assert error.status == 403
        assert str(error) == error.msg == 'HTTP Error 403: Forbidden'
        assert error.reason == response.reason
        assert error.response is response

        data = error.response.read()
        assert data == b'test'
        assert repr(error) == '<HTTPError 403: Forbidden>'

    def test_redirect_http_error(self):
        response = self.create_response(301)
        error = HTTPError(response, redirect_loop=True)
        assert str(error) == error.msg == 'HTTP Error 301: Moved Permanently (redirect loop detected)'
        assert error.reason == 'Moved Permanently'

    def test_incomplete_read_error(self):
        error = IncompleteRead(4, 3, cause='test')
        assert isinstance(error, IncompleteRead)
        assert repr(error) == '<IncompleteRead: 4 bytes read, 3 more expected>'
        assert str(error) == error.msg == '4 bytes read, 3 more expected'
        assert error.partial == 4
        assert error.expected == 3
        assert error.cause == 'test'

        error = IncompleteRead(3)
        assert repr(error) == '<IncompleteRead: 3 bytes read>'
        assert str(error) == '3 bytes read'
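
Taken together, the select_proxy cases above pin down a clear contract: a scheme-specific key ('http') beats the 'all' fallback, and hosts matched by the 'no' list bypass proxying entirely. A hedged re-implementation of that lookup, for illustration only (not the actual `_helper` code, whose no-proxy matching is more thorough):

from urllib.parse import urlparse


def select_proxy_sketch(url, proxies):
    # Illustrative only: mirrors the behaviour the tests above assert
    parts = urlparse(url)
    if any(host and parts.hostname and parts.hostname.endswith(host.strip())
           for host in (proxies.get('no') or '').split(',')):
        return None  # host is on the bypass list
    return proxies.get(parts.scheme, proxies.get('all'))


proxies = {'all': 'socks5://example.com', 'http': 'http://example.com:1080', 'no': 'bypass.example.com,yt-dl.org'}
assert select_proxy_sketch('http://example.com', proxies) == 'http://example.com:1080'
assert select_proxy_sketch('https://example.com', proxies) == 'socks5://example.com'
assert select_proxy_sketch('https://yt-dl.org', proxies) is None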
@@ -27,7 +27,7 @@ def test_default_overwrites(self):
            [
                sys.executable, 'yt_dlp/__main__.py',
                '-o', 'test.webm',
                'https://www.youtube.com/watch?v=jNQXAC9IVRw'
                'https://www.youtube.com/watch?v=jNQXAC9IVRw',
            ], cwd=root_dir, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        sout, serr = outp.communicate()
        self.assertTrue(b'has already been downloaded' in sout)

@@ -39,7 +39,7 @@ def test_yes_overwrites(self):
            [
                sys.executable, 'yt_dlp/__main__.py', '--yes-overwrites',
                '-o', 'test.webm',
                'https://www.youtube.com/watch?v=jNQXAC9IVRw'
                'https://www.youtube.com/watch?v=jNQXAC9IVRw',
            ], cwd=root_dir, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        sout, serr = outp.communicate()
        self.assertTrue(b'has already been downloaded' not in sout)
92 test/test_plugins.py Normal file
@@ -0,0 +1,92 @@
import importlib
import os
import shutil
import sys
import unittest
from pathlib import Path

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
TEST_DATA_DIR = Path(os.path.dirname(os.path.abspath(__file__)), 'testdata')
sys.path.append(str(TEST_DATA_DIR))
importlib.invalidate_caches()

from yt_dlp.utils import Config
from yt_dlp.plugins import PACKAGE_NAME, directories, load_plugins


class TestPlugins(unittest.TestCase):

    TEST_PLUGIN_DIR = TEST_DATA_DIR / PACKAGE_NAME

    def test_directories_containing_plugins(self):
        self.assertIn(self.TEST_PLUGIN_DIR, map(Path, directories()))

    def test_extractor_classes(self):
        for module_name in tuple(sys.modules):
            if module_name.startswith(f'{PACKAGE_NAME}.extractor'):
                del sys.modules[module_name]
        plugins_ie = load_plugins('extractor', 'IE')

        self.assertIn(f'{PACKAGE_NAME}.extractor.normal', sys.modules.keys())
        self.assertIn('NormalPluginIE', plugins_ie.keys())

        # don't load modules with underscore prefix
        self.assertFalse(
            f'{PACKAGE_NAME}.extractor._ignore' in sys.modules,
            'loaded module beginning with underscore')
        self.assertNotIn('IgnorePluginIE', plugins_ie.keys())

        # Don't load extractors with underscore prefix
        self.assertNotIn('_IgnoreUnderscorePluginIE', plugins_ie.keys())

        # Don't load extractors not specified in __all__ (if supplied)
        self.assertNotIn('IgnoreNotInAllPluginIE', plugins_ie.keys())
        self.assertIn('InAllPluginIE', plugins_ie.keys())

    def test_postprocessor_classes(self):
        plugins_pp = load_plugins('postprocessor', 'PP')
        self.assertIn('NormalPluginPP', plugins_pp.keys())

    def test_importing_zipped_module(self):
        zip_path = TEST_DATA_DIR / 'zipped_plugins.zip'
        shutil.make_archive(str(zip_path)[:-4], 'zip', str(zip_path)[:-4])
        sys.path.append(str(zip_path))  # add zip to search paths
        importlib.invalidate_caches()  # reset the import caches

        try:
            for plugin_type in ('extractor', 'postprocessor'):
                package = importlib.import_module(f'{PACKAGE_NAME}.{plugin_type}')
                self.assertIn(zip_path / PACKAGE_NAME / plugin_type, map(Path, package.__path__))

            plugins_ie = load_plugins('extractor', 'IE')
            self.assertIn('ZippedPluginIE', plugins_ie.keys())

            plugins_pp = load_plugins('postprocessor', 'PP')
            self.assertIn('ZippedPluginPP', plugins_pp.keys())

        finally:
            sys.path.remove(str(zip_path))
            os.remove(zip_path)
            importlib.invalidate_caches()  # reset the import caches

    def test_plugin_dirs(self):
        # Internal plugin dirs hack for CLI --plugin-dirs
        # To be replaced with proper system later
        custom_plugin_dir = TEST_DATA_DIR / 'plugin_packages'
        Config._plugin_dirs = [str(custom_plugin_dir)]
        importlib.invalidate_caches()  # reset the import caches

        try:
            package = importlib.import_module(f'{PACKAGE_NAME}.extractor')
            self.assertIn(custom_plugin_dir / 'testpackage' / PACKAGE_NAME / 'extractor', map(Path, package.__path__))

            plugins_ie = load_plugins('extractor', 'IE')
            self.assertIn('PackagePluginIE', plugins_ie.keys())

        finally:
            Config._plugin_dirs = []
            importlib.invalidate_caches()  # reset the import caches


if __name__ == '__main__':
    unittest.main()
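
For orientation, a minimal plugin that a loader like the one tested above would pick up. This is a hypothetical module and extractor name, assuming the `yt_dlp_plugins` namespace-package convention (`PACKAGE_NAME`) and the rule, asserted in the tests, that only classes whose names end in 'IE' are collected:

# yt_dlp_plugins/extractor/sample.py  (hypothetical module path)
from yt_dlp.extractor.common import InfoExtractor


class SamplePluginIE(InfoExtractor):  # collected by load_plugins('extractor', 'IE')
    _VALID_URL = r'https?://example\.com/video/(?P<id>\d+)'

    def _real_extract(self, url):
        video_id = self._match_id(url)
        # delegate to another URL; a real plugin would build an info dict here
        return self.url_result(f'https://example.com/embed/{video_id}')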
@@ -59,7 +59,7 @@ def hook_two(self, filename):

    def hook_three(self, filename):
        self.files.append(filename)
        raise Exception('Test exception for \'%s\'' % filename)
        raise Exception(f'Test exception for \'{filename}\'')

    def tearDown(self):
        for f in self.files:
@@ -9,7 +9,7 @@


from yt_dlp import YoutubeDL
from yt_dlp.compat import compat_shlex_quote
from yt_dlp.utils import shell_quote
from yt_dlp.postprocessor import (
    ExecPP,
    FFmpegThumbnailsConvertorPP,

@@ -65,7 +65,7 @@ class TestExec(unittest.TestCase):
    def test_parse_cmd(self):
        pp = ExecPP(YoutubeDL(), '')
        info = {'filepath': 'file name'}
        cmd = 'echo %s' % compat_shlex_quote(info['filepath'])
        cmd = 'echo {}'.format(shell_quote(info['filepath']))

        self.assertEqual(pp.parse_cmd('echo', info), cmd)
        self.assertEqual(pp.parse_cmd('echo {}', info), cmd)
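
The swap from `compat_shlex_quote` to `shell_quote` keeps the same contract: the filepath is quoted so the shell receives it as a single argument. The stdlib equivalent, for illustration:

import shlex

# 'file name' must reach the command as one argv entry despite the space
cmd = 'echo {}'.format(shlex.quote('file name'))
assert cmd == "echo 'file name'"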
@@ -125,7 +125,8 @@ def test_remove_marked_arrange_sponsors_CanGetThroughUnaltered(self):
        self._remove_marked_arrange_sponsors_test_impl(chapters, chapters, [])

    def test_remove_marked_arrange_sponsors_ChapterWithSponsors(self):
        chapters = self._chapters([70], ['c']) + [
        chapters = [
            *self._chapters([70], ['c']),
            self._sponsor_chapter(10, 20, 'sponsor'),
            self._sponsor_chapter(30, 40, 'preview'),
            self._sponsor_chapter(50, 60, 'filler')]

@@ -136,7 +137,8 @@ def test_remove_marked_arrange_sponsors_ChapterWithSponsors(self):
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, [])

    def test_remove_marked_arrange_sponsors_SponsorBlockChapters(self):
        chapters = self._chapters([70], ['c']) + [
        chapters = [
            *self._chapters([70], ['c']),
            self._sponsor_chapter(10, 20, 'chapter', title='sb c1'),
            self._sponsor_chapter(15, 16, 'chapter', title='sb c2'),
            self._sponsor_chapter(30, 40, 'preview'),

@@ -149,10 +151,14 @@ def test_remove_marked_arrange_sponsors_SponsorBlockChapters(self):
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, [])

    def test_remove_marked_arrange_sponsors_UniqueNamesForOverlappingSponsors(self):
        chapters = self._chapters([120], ['c']) + [
            self._sponsor_chapter(10, 45, 'sponsor'), self._sponsor_chapter(20, 40, 'selfpromo'),
            self._sponsor_chapter(50, 70, 'sponsor'), self._sponsor_chapter(60, 85, 'selfpromo'),
            self._sponsor_chapter(90, 120, 'selfpromo'), self._sponsor_chapter(100, 110, 'sponsor')]
        chapters = [
            *self._chapters([120], ['c']),
            self._sponsor_chapter(10, 45, 'sponsor'),
            self._sponsor_chapter(20, 40, 'selfpromo'),
            self._sponsor_chapter(50, 70, 'sponsor'),
            self._sponsor_chapter(60, 85, 'selfpromo'),
            self._sponsor_chapter(90, 120, 'selfpromo'),
            self._sponsor_chapter(100, 110, 'sponsor')]
        expected = self._chapters(
            [10, 20, 40, 45, 50, 60, 70, 85, 90, 100, 110, 120],
            ['c', '[SponsorBlock]: Sponsor', '[SponsorBlock]: Sponsor, Unpaid/Self Promotion',

@@ -172,7 +178,8 @@ def test_remove_marked_arrange_sponsors_ChapterWithCuts(self):
            chapters, self._chapters([40], ['c']), cuts)

    def test_remove_marked_arrange_sponsors_ChapterWithSponsorsAndCuts(self):
        chapters = self._chapters([70], ['c']) + [
        chapters = [
            *self._chapters([70], ['c']),
            self._sponsor_chapter(10, 20, 'sponsor'),
            self._sponsor_chapter(30, 40, 'selfpromo', remove=True),
            self._sponsor_chapter(50, 60, 'interaction')]

@@ -185,24 +192,29 @@ def test_remove_marked_arrange_sponsors_ChapterWithSponsorsAndCuts(self):
    def test_remove_marked_arrange_sponsors_ChapterWithSponsorCutInTheMiddle(self):
        cuts = [self._sponsor_chapter(20, 30, 'selfpromo', remove=True),
                self._chapter(40, 50, remove=True)]
        chapters = self._chapters([70], ['c']) + [self._sponsor_chapter(10, 60, 'sponsor')] + cuts
        chapters = [
            *self._chapters([70], ['c']),
            self._sponsor_chapter(10, 60, 'sponsor'),
            *cuts]
        expected = self._chapters(
            [10, 40, 50], ['c', '[SponsorBlock]: Sponsor', 'c'])
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, cuts)

    def test_remove_marked_arrange_sponsors_ChapterWithCutHidingSponsor(self):
        cuts = [self._sponsor_chapter(20, 50, 'selfpromo', remove=True)]
        chapters = self._chapters([60], ['c']) + [
        chapters = [
            *self._chapters([60], ['c']),
            self._sponsor_chapter(10, 20, 'intro'),
            self._sponsor_chapter(30, 40, 'sponsor'),
            self._sponsor_chapter(50, 60, 'outro'),
        ] + cuts
            *cuts]
        expected = self._chapters(
            [10, 20, 30], ['c', '[SponsorBlock]: Intermission/Intro Animation', '[SponsorBlock]: Endcards/Credits'])
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, cuts)

    def test_remove_marked_arrange_sponsors_ChapterWithAdjacentSponsors(self):
        chapters = self._chapters([70], ['c']) + [
        chapters = [
            *self._chapters([70], ['c']),
            self._sponsor_chapter(10, 20, 'sponsor'),
            self._sponsor_chapter(20, 30, 'selfpromo'),
            self._sponsor_chapter(30, 40, 'interaction')]

@@ -213,7 +225,8 @@ def test_remove_marked_arrange_sponsors_ChapterWithAdjacentSponsors(self):
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, [])

    def test_remove_marked_arrange_sponsors_ChapterWithAdjacentCuts(self):
        chapters = self._chapters([70], ['c']) + [
        chapters = [
            *self._chapters([70], ['c']),
            self._sponsor_chapter(10, 20, 'sponsor'),
            self._sponsor_chapter(20, 30, 'interaction', remove=True),
            self._chapter(30, 40, remove=True),

@@ -226,7 +239,8 @@ def test_remove_marked_arrange_sponsors_ChapterWithAdjacentCuts(self):
            chapters, expected, [self._chapter(20, 50, remove=True)])

    def test_remove_marked_arrange_sponsors_ChapterWithOverlappingSponsors(self):
        chapters = self._chapters([70], ['c']) + [
        chapters = [
            *self._chapters([70], ['c']),
            self._sponsor_chapter(10, 30, 'sponsor'),
            self._sponsor_chapter(20, 50, 'selfpromo'),
            self._sponsor_chapter(40, 60, 'interaction')]

@@ -238,7 +252,8 @@ def test_remove_marked_arrange_sponsors_ChapterWithOverlappingSponsors(self):
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, [])

    def test_remove_marked_arrange_sponsors_ChapterWithOverlappingCuts(self):
        chapters = self._chapters([70], ['c']) + [
        chapters = [
            *self._chapters([70], ['c']),
            self._sponsor_chapter(10, 30, 'sponsor', remove=True),
            self._sponsor_chapter(20, 50, 'selfpromo', remove=True),
            self._sponsor_chapter(40, 60, 'interaction', remove=True)]

@@ -246,7 +261,8 @@ def test_remove_marked_arrange_sponsors_ChapterWithOverlappingCuts(self):
            chapters, self._chapters([20], ['c']), [self._chapter(10, 60, remove=True)])

    def test_remove_marked_arrange_sponsors_ChapterWithRunsOfOverlappingSponsors(self):
        chapters = self._chapters([170], ['c']) + [
        chapters = [
            *self._chapters([170], ['c']),
            self._sponsor_chapter(0, 30, 'intro'),
            self._sponsor_chapter(20, 50, 'sponsor'),
            self._sponsor_chapter(40, 60, 'selfpromo'),

@@ -267,7 +283,8 @@ def test_remove_marked_arrange_sponsors_ChapterWithRunsOfOverlappingSponsors(sel
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, [])

    def test_remove_marked_arrange_sponsors_ChapterWithRunsOfOverlappingCuts(self):
        chapters = self._chapters([170], ['c']) + [
        chapters = [
            *self._chapters([170], ['c']),
            self._chapter(0, 30, remove=True),
            self._sponsor_chapter(20, 50, 'sponsor', remove=True),
            self._chapter(40, 60, remove=True),

@@ -284,7 +301,8 @@ def test_remove_marked_arrange_sponsors_ChapterWithRunsOfOverlappingCuts(self):
            chapters, self._chapters([20], ['c']), expected_cuts)

    def test_remove_marked_arrange_sponsors_OverlappingSponsorsDifferentTitlesAfterCut(self):
        chapters = self._chapters([60], ['c']) + [
        chapters = [
            *self._chapters([60], ['c']),
            self._sponsor_chapter(10, 60, 'sponsor'),
            self._sponsor_chapter(10, 40, 'intro'),
            self._sponsor_chapter(30, 50, 'interaction'),

@@ -297,7 +315,8 @@ def test_remove_marked_arrange_sponsors_OverlappingSponsorsDifferentTitlesAfterC
            chapters, expected, [self._chapter(30, 50, remove=True)])

    def test_remove_marked_arrange_sponsors_SponsorsNoLongerOverlapAfterCut(self):
        chapters = self._chapters([70], ['c']) + [
        chapters = [
            *self._chapters([70], ['c']),
            self._sponsor_chapter(10, 30, 'sponsor'),
            self._sponsor_chapter(20, 50, 'interaction'),
            self._sponsor_chapter(30, 50, 'selfpromo', remove=True),

@@ -310,7 +329,8 @@ def test_remove_marked_arrange_sponsors_SponsorsNoLongerOverlapAfterCut(self):
            chapters, expected, [self._chapter(30, 50, remove=True)])

    def test_remove_marked_arrange_sponsors_SponsorsStillOverlapAfterCut(self):
        chapters = self._chapters([70], ['c']) + [
        chapters = [
            *self._chapters([70], ['c']),
            self._sponsor_chapter(10, 60, 'sponsor'),
            self._sponsor_chapter(20, 60, 'interaction'),
            self._sponsor_chapter(30, 50, 'selfpromo', remove=True)]

@@ -321,7 +341,8 @@ def test_remove_marked_arrange_sponsors_SponsorsStillOverlapAfterCut(self):
            chapters, expected, [self._chapter(30, 50, remove=True)])

    def test_remove_marked_arrange_sponsors_ChapterWithRunsOfOverlappingSponsorsAndCuts(self):
        chapters = self._chapters([200], ['c']) + [
        chapters = [
            *self._chapters([200], ['c']),
            self._sponsor_chapter(10, 40, 'sponsor'),
            self._sponsor_chapter(10, 30, 'intro'),
            self._chapter(20, 30, remove=True),

@@ -347,8 +368,9 @@ def test_remove_marked_arrange_sponsors_ChapterWithRunsOfOverlappingSponsorsAndC
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, expected_cuts)

    def test_remove_marked_arrange_sponsors_SponsorOverlapsMultipleChapters(self):
        chapters = (self._chapters([20, 40, 60, 80, 100], ['c1', 'c2', 'c3', 'c4', 'c5'])
                    + [self._sponsor_chapter(10, 90, 'sponsor')])
        chapters = [
            *self._chapters([20, 40, 60, 80, 100], ['c1', 'c2', 'c3', 'c4', 'c5']),
            self._sponsor_chapter(10, 90, 'sponsor')]
        expected = self._chapters([10, 90, 100], ['c1', '[SponsorBlock]: Sponsor', 'c5'])
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, [])

@@ -359,9 +381,10 @@ def test_remove_marked_arrange_sponsors_CutOverlapsMultipleChapters(self):
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, cuts)

    def test_remove_marked_arrange_sponsors_SponsorsWithinSomeChaptersAndOverlappingOthers(self):
        chapters = (self._chapters([10, 40, 60, 80], ['c1', 'c2', 'c3', 'c4'])
                    + [self._sponsor_chapter(20, 30, 'sponsor'),
                       self._sponsor_chapter(50, 70, 'selfpromo')])
        chapters = [
            *self._chapters([10, 40, 60, 80], ['c1', 'c2', 'c3', 'c4']),
            self._sponsor_chapter(20, 30, 'sponsor'),
            self._sponsor_chapter(50, 70, 'selfpromo')]
        expected = self._chapters([10, 20, 30, 40, 50, 70, 80],
                                  ['c1', 'c2', '[SponsorBlock]: Sponsor', 'c2', 'c3',
                                   '[SponsorBlock]: Unpaid/Self Promotion', 'c4'])

@@ -374,8 +397,9 @@ def test_remove_marked_arrange_sponsors_CutsWithinSomeChaptersAndOverlappingOthe
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, cuts)

    def test_remove_marked_arrange_sponsors_ChaptersAfterLastSponsor(self):
        chapters = (self._chapters([20, 40, 50, 60], ['c1', 'c2', 'c3', 'c4'])
                    + [self._sponsor_chapter(10, 30, 'music_offtopic')])
        chapters = [
            *self._chapters([20, 40, 50, 60], ['c1', 'c2', 'c3', 'c4']),
            self._sponsor_chapter(10, 30, 'music_offtopic')]
        expected = self._chapters(
            [10, 30, 40, 50, 60],
            ['c1', '[SponsorBlock]: Non-Music Section', 'c2', 'c3', 'c4'])

@@ -388,8 +412,9 @@ def test_remove_marked_arrange_sponsors_ChaptersAfterLastCut(self):
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, cuts)

    def test_remove_marked_arrange_sponsors_SponsorStartsAtChapterStart(self):
        chapters = (self._chapters([10, 20, 40], ['c1', 'c2', 'c3'])
                    + [self._sponsor_chapter(20, 30, 'sponsor')])
        chapters = [
            *self._chapters([10, 20, 40], ['c1', 'c2', 'c3']),
            self._sponsor_chapter(20, 30, 'sponsor')]
        expected = self._chapters([10, 20, 30, 40], ['c1', 'c2', '[SponsorBlock]: Sponsor', 'c3'])
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, [])

@@ -400,8 +425,9 @@ def test_remove_marked_arrange_sponsors_CutStartsAtChapterStart(self):
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, cuts)

    def test_remove_marked_arrange_sponsors_SponsorEndsAtChapterEnd(self):
        chapters = (self._chapters([10, 30, 40], ['c1', 'c2', 'c3'])
                    + [self._sponsor_chapter(20, 30, 'sponsor')])
        chapters = [
            *self._chapters([10, 30, 40], ['c1', 'c2', 'c3']),
            self._sponsor_chapter(20, 30, 'sponsor')]
        expected = self._chapters([10, 20, 30, 40], ['c1', 'c2', '[SponsorBlock]: Sponsor', 'c3'])
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, [])

@@ -412,8 +438,9 @@ def test_remove_marked_arrange_sponsors_CutEndsAtChapterEnd(self):
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, cuts)

    def test_remove_marked_arrange_sponsors_SponsorCoincidesWithChapters(self):
        chapters = (self._chapters([10, 20, 30, 40], ['c1', 'c2', 'c3', 'c4'])
                    + [self._sponsor_chapter(10, 30, 'sponsor')])
        chapters = [
            *self._chapters([10, 20, 30, 40], ['c1', 'c2', 'c3', 'c4']),
            self._sponsor_chapter(10, 30, 'sponsor')]
        expected = self._chapters([10, 30, 40], ['c1', '[SponsorBlock]: Sponsor', 'c4'])
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, [])

@@ -424,8 +451,9 @@ def test_remove_marked_arrange_sponsors_CutCoincidesWithChapters(self):
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, cuts)

    def test_remove_marked_arrange_sponsors_SponsorsAtVideoBoundaries(self):
        chapters = (self._chapters([20, 40, 60], ['c1', 'c2', 'c3'])
                    + [self._sponsor_chapter(0, 10, 'intro'), self._sponsor_chapter(50, 60, 'outro')])
        chapters = [
            *self._chapters([20, 40, 60], ['c1', 'c2', 'c3']),
            self._sponsor_chapter(0, 10, 'intro'), self._sponsor_chapter(50, 60, 'outro')]
        expected = self._chapters(
            [10, 20, 40, 50, 60], ['[SponsorBlock]: Intermission/Intro Animation', 'c1', 'c2', 'c3', '[SponsorBlock]: Endcards/Credits'])
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, [])

@@ -437,8 +465,10 @@ def test_remove_marked_arrange_sponsors_CutsAtVideoBoundaries(self):
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, cuts)

    def test_remove_marked_arrange_sponsors_SponsorsOverlapChaptersAtVideoBoundaries(self):
        chapters = (self._chapters([10, 40, 50], ['c1', 'c2', 'c3'])
                    + [self._sponsor_chapter(0, 20, 'intro'), self._sponsor_chapter(30, 50, 'outro')])
        chapters = [
            *self._chapters([10, 40, 50], ['c1', 'c2', 'c3']),
            self._sponsor_chapter(0, 20, 'intro'),
            self._sponsor_chapter(30, 50, 'outro')]
        expected = self._chapters(
            [20, 30, 50], ['[SponsorBlock]: Intermission/Intro Animation', 'c2', '[SponsorBlock]: Endcards/Credits'])
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, [])

@@ -450,8 +480,10 @@ def test_remove_marked_arrange_sponsors_CutsOverlapChaptersAtVideoBoundaries(sel
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, cuts)

    def test_remove_marked_arrange_sponsors_EverythingSponsored(self):
        chapters = (self._chapters([10, 20, 30, 40], ['c1', 'c2', 'c3', 'c4'])
                    + [self._sponsor_chapter(0, 20, 'intro'), self._sponsor_chapter(20, 40, 'outro')])
        chapters = [
            *self._chapters([10, 20, 30, 40], ['c1', 'c2', 'c3', 'c4']),
            self._sponsor_chapter(0, 20, 'intro'),
            self._sponsor_chapter(20, 40, 'outro')]
        expected = self._chapters([20, 40], ['[SponsorBlock]: Intermission/Intro Animation', '[SponsorBlock]: Endcards/Credits'])
        self._remove_marked_arrange_sponsors_test_impl(chapters, expected, [])

@@ -491,38 +523,39 @@ def test_remove_marked_arrange_sponsors_TinyChapterAtTheStartPrependedToTheNext(
            chapters, self._chapters([2.5], ['c2']), cuts)

    def test_remove_marked_arrange_sponsors_TinyChaptersResultingFromSponsorOverlapAreIgnored(self):
        chapters = self._chapters([1, 3, 4], ['c1', 'c2', 'c3']) + [
        chapters = [
            *self._chapters([1, 3, 4], ['c1', 'c2', 'c3']),
            self._sponsor_chapter(1.5, 2.5, 'sponsor')]
        self._remove_marked_arrange_sponsors_test_impl(
            chapters, self._chapters([1.5, 2.5, 4], ['c1', '[SponsorBlock]: Sponsor', 'c3']), [])

    def test_remove_marked_arrange_sponsors_TinySponsorsOverlapsAreIgnored(self):
        chapters = self._chapters([2, 3, 5], ['c1', 'c2', 'c3']) + [
        chapters = [
            *self._chapters([2, 3, 5], ['c1', 'c2', 'c3']),
            self._sponsor_chapter(1, 3, 'sponsor'),
            self._sponsor_chapter(2.5, 4, 'selfpromo')
        ]
            self._sponsor_chapter(2.5, 4, 'selfpromo')]
        self._remove_marked_arrange_sponsors_test_impl(
            chapters, self._chapters([1, 3, 4, 5], [
                'c1', '[SponsorBlock]: Sponsor', '[SponsorBlock]: Unpaid/Self Promotion', 'c3']), [])

    def test_remove_marked_arrange_sponsors_TinySponsorsPrependedToTheNextSponsor(self):
        chapters = self._chapters([4], ['c']) + [
        chapters = [
            *self._chapters([4], ['c']),
            self._sponsor_chapter(1.5, 2, 'sponsor'),
            self._sponsor_chapter(2, 4, 'selfpromo')
        ]
            self._sponsor_chapter(2, 4, 'selfpromo')]
        self._remove_marked_arrange_sponsors_test_impl(
            chapters, self._chapters([1.5, 4], ['c', '[SponsorBlock]: Unpaid/Self Promotion']), [])

    def test_remove_marked_arrange_sponsors_SmallestSponsorInTheOverlapGetsNamed(self):
        self._pp._sponsorblock_chapter_title = '[SponsorBlock]: %(name)s'
        chapters = self._chapters([10], ['c']) + [
        chapters = [
            *self._chapters([10], ['c']),
            self._sponsor_chapter(2, 8, 'sponsor'),
            self._sponsor_chapter(4, 6, 'selfpromo')
        ]
            self._sponsor_chapter(4, 6, 'selfpromo')]
        self._remove_marked_arrange_sponsors_test_impl(
            chapters, self._chapters([2, 4, 6, 8, 10], [
                'c', '[SponsorBlock]: Sponsor', '[SponsorBlock]: Unpaid/Self Promotion',
                '[SponsorBlock]: Sponsor', 'c'
                '[SponsorBlock]: Sponsor', 'c',
            ]), [])

    def test_make_concat_opts_CommonCase(self):
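
The recurring change in the hunks above is purely stylistic: list concatenation is replaced by unpacking into a single literal, which avoids building an intermediate list. A minimal illustration of the equivalence (stand-in helper, not the test code itself):

def chapters(ends):
    # stand-in for the _chapters test helper above
    return [{'end_time': e} for e in ends]

extra = {'end_time': 99}

old_style = chapters([70]) + [extra]   # concatenation builds a temporary list
new_style = [*chapters([70]), extra]   # unpacking builds the result directly
assert old_style == new_style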
@@ -1,113 +1,471 @@
#!/usr/bin/env python3

# Allow direct execution
import os
import sys
import threading
import unittest

import pytest

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))


import abc
import contextlib
import enum
import functools
import http.server
import json
import random
import subprocess
import urllib.request
import socket
import struct
import time
from socketserver import (
    BaseRequestHandler,
    StreamRequestHandler,
    ThreadingTCPServer,
)

from test.helper import FakeYDL, get_params, is_download_test
from test.helper import http_server_port, verify_address_availability
from yt_dlp.networking import Request
from yt_dlp.networking.exceptions import ProxyError, TransportError
from yt_dlp.socks import (
    SOCKS4_REPLY_VERSION,
    SOCKS4_VERSION,
    SOCKS5_USER_AUTH_SUCCESS,
    SOCKS5_USER_AUTH_VERSION,
    SOCKS5_VERSION,
    Socks5AddressType,
    Socks5Auth,
)

SOCKS5_USER_AUTH_FAILURE = 0x1


@is_download_test
class TestMultipleSocks(unittest.TestCase):
    @staticmethod
    def _check_params(attrs):
        params = get_params()
        for attr in attrs:
            if attr not in params:
                print('Missing %s. Skipping.' % attr)
class Socks4CD(enum.IntEnum):
    REQUEST_GRANTED = 90
    REQUEST_REJECTED_OR_FAILED = 91
    REQUEST_REJECTED_CANNOT_CONNECT_TO_IDENTD = 92
    REQUEST_REJECTED_DIFFERENT_USERID = 93


class Socks5Reply(enum.IntEnum):
    SUCCEEDED = 0x0
    GENERAL_FAILURE = 0x1
    CONNECTION_NOT_ALLOWED = 0x2
    NETWORK_UNREACHABLE = 0x3
    HOST_UNREACHABLE = 0x4
    CONNECTION_REFUSED = 0x5
    TTL_EXPIRED = 0x6
    COMMAND_NOT_SUPPORTED = 0x7
    ADDRESS_TYPE_NOT_SUPPORTED = 0x8


class SocksTestRequestHandler(BaseRequestHandler):

    def __init__(self, *args, socks_info=None, **kwargs):
        self.socks_info = socks_info
        super().__init__(*args, **kwargs)


class SocksProxyHandler(BaseRequestHandler):
    def __init__(self, request_handler_class, socks_server_kwargs, *args, **kwargs):
        self.socks_kwargs = socks_server_kwargs or {}
        self.request_handler_class = request_handler_class
        super().__init__(*args, **kwargs)


class Socks5ProxyHandler(StreamRequestHandler, SocksProxyHandler):

    # SOCKS5 protocol https://tools.ietf.org/html/rfc1928
    # SOCKS5 username/password authentication https://tools.ietf.org/html/rfc1929

    def handle(self):
        sleep = self.socks_kwargs.get('sleep')
        if sleep:
            time.sleep(sleep)
        version, nmethods = self.connection.recv(2)
        assert version == SOCKS5_VERSION
        methods = list(self.connection.recv(nmethods))

        auth = self.socks_kwargs.get('auth')

        if auth is not None and Socks5Auth.AUTH_USER_PASS not in methods:
            self.connection.sendall(struct.pack('!BB', SOCKS5_VERSION, Socks5Auth.AUTH_NO_ACCEPTABLE))
            self.server.close_request(self.request)
            return

        elif Socks5Auth.AUTH_USER_PASS in methods:
            self.connection.sendall(struct.pack('!BB', SOCKS5_VERSION, Socks5Auth.AUTH_USER_PASS))

            _, user_len = struct.unpack('!BB', self.connection.recv(2))
            username = self.connection.recv(user_len).decode()
            pass_len = ord(self.connection.recv(1))
            password = self.connection.recv(pass_len).decode()

            if username == auth[0] and password == auth[1]:
                self.connection.sendall(struct.pack('!BB', SOCKS5_USER_AUTH_VERSION, SOCKS5_USER_AUTH_SUCCESS))
            else:
                self.connection.sendall(struct.pack('!BB', SOCKS5_USER_AUTH_VERSION, SOCKS5_USER_AUTH_FAILURE))
                self.server.close_request(self.request)
                return
        return params

    def test_proxy_http(self):
        params = self._check_params(['primary_proxy', 'primary_server_ip'])
        if params is None:
            return
        ydl = FakeYDL({
            'proxy': params['primary_proxy']
        })
        self.assertEqual(
            ydl.urlopen('http://yt-dl.org/ip').read().decode(),
            params['primary_server_ip'])

    def test_proxy_https(self):
        params = self._check_params(['primary_proxy', 'primary_server_ip'])
        if params is None:
            return
        ydl = FakeYDL({
            'proxy': params['primary_proxy']
        })
        self.assertEqual(
            ydl.urlopen('https://yt-dl.org/ip').read().decode(),
            params['primary_server_ip'])

    def test_secondary_proxy_http(self):
        params = self._check_params(['secondary_proxy', 'secondary_server_ip'])
        if params is None:
            return
        ydl = FakeYDL()
        req = urllib.request.Request('http://yt-dl.org/ip')
        req.add_header('Ytdl-request-proxy', params['secondary_proxy'])
        self.assertEqual(
            ydl.urlopen(req).read().decode(),
            params['secondary_server_ip'])

    def test_secondary_proxy_https(self):
        params = self._check_params(['secondary_proxy', 'secondary_server_ip'])
        if params is None:
            return
        ydl = FakeYDL()
        req = urllib.request.Request('https://yt-dl.org/ip')
        req.add_header('Ytdl-request-proxy', params['secondary_proxy'])
        self.assertEqual(
            ydl.urlopen(req).read().decode(),
            params['secondary_server_ip'])


@is_download_test
class TestSocks(unittest.TestCase):
    _SKIP_SOCKS_TEST = True

    def setUp(self):
        if self._SKIP_SOCKS_TEST:
        elif Socks5Auth.AUTH_NONE in methods:
            self.connection.sendall(struct.pack('!BB', SOCKS5_VERSION, Socks5Auth.AUTH_NONE))
        else:
            self.connection.sendall(struct.pack('!BB', SOCKS5_VERSION, Socks5Auth.AUTH_NO_ACCEPTABLE))
            self.server.close_request(self.request)
            return

        self.port = random.randint(20000, 30000)
        self.server_process = subprocess.Popen([
            'srelay', '-f', '-i', '127.0.0.1:%d' % self.port],
            stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
        version, command, _, address_type = struct.unpack('!BBBB', self.connection.recv(4))
        socks_info = {
            'version': version,
            'auth_methods': methods,
            'command': command,
            'client_address': self.client_address,
            'ipv4_address': None,
            'domain_address': None,
            'ipv6_address': None,
        }
        if address_type == Socks5AddressType.ATYP_IPV4:
            socks_info['ipv4_address'] = socket.inet_ntoa(self.connection.recv(4))
        elif address_type == Socks5AddressType.ATYP_DOMAINNAME:
            socks_info['domain_address'] = self.connection.recv(ord(self.connection.recv(1))).decode()
        elif address_type == Socks5AddressType.ATYP_IPV6:
            socks_info['ipv6_address'] = socket.inet_ntop(socket.AF_INET6, self.connection.recv(16))
        else:
            self.server.close_request(self.request)

    def tearDown(self):
        if self._SKIP_SOCKS_TEST:
        socks_info['port'] = struct.unpack('!H', self.connection.recv(2))[0]

        # dummy response, the returned IP is just a placeholder
        self.connection.sendall(struct.pack(
            '!BBBBIH', SOCKS5_VERSION, self.socks_kwargs.get('reply', Socks5Reply.SUCCEEDED), 0x0, 0x1, 0x7f000001, 40000))

        self.request_handler_class(self.request, self.client_address, self.server, socks_info=socks_info)


class Socks4ProxyHandler(StreamRequestHandler, SocksProxyHandler):

    # SOCKS4 protocol http://www.openssh.com/txt/socks4.protocol
    # SOCKS4A protocol http://www.openssh.com/txt/socks4a.protocol

    def _read_until_null(self):
        return b''.join(iter(functools.partial(self.connection.recv, 1), b'\x00'))

    def handle(self):
        sleep = self.socks_kwargs.get('sleep')
        if sleep:
            time.sleep(sleep)
        socks_info = {
            'version': SOCKS4_VERSION,
            'command': None,
            'client_address': self.client_address,
            'ipv4_address': None,
            'port': None,
            'domain_address': None,
        }
        version, command, dest_port, dest_ip = struct.unpack('!BBHI', self.connection.recv(8))
        socks_info['port'] = dest_port
        socks_info['command'] = command
        if version != SOCKS4_VERSION:
            self.server.close_request(self.request)
            return
        use_remote_dns = False
        if 0x0 < dest_ip <= 0xFF:
            use_remote_dns = True
        else:
            socks_info['ipv4_address'] = socket.inet_ntoa(struct.pack('!I', dest_ip))

        user_id = self._read_until_null().decode()
        if user_id != (self.socks_kwargs.get('user_id') or ''):
            self.connection.sendall(struct.pack(
                '!BBHI', SOCKS4_REPLY_VERSION, Socks4CD.REQUEST_REJECTED_DIFFERENT_USERID, 0x00, 0x00000000))
            self.server.close_request(self.request)
            return

        self.server_process.terminate()
        self.server_process.communicate()
        if use_remote_dns:
            socks_info['domain_address'] = self._read_until_null().decode()

    def _get_ip(self, protocol):
        if self._SKIP_SOCKS_TEST:
            return '127.0.0.1'
        # dummy response, the returned IP is just a placeholder
        self.connection.sendall(
            struct.pack(
                '!BBHI', SOCKS4_REPLY_VERSION,
                self.socks_kwargs.get('cd_reply', Socks4CD.REQUEST_GRANTED), 40000, 0x7f000001))

        ydl = FakeYDL({
            'proxy': '%s://127.0.0.1:%d' % (protocol, self.port),
        })
        return ydl.urlopen('http://yt-dl.org/ip').read().decode()
        self.request_handler_class(self.request, self.client_address, self.server, socks_info=socks_info)

    def test_socks4(self):
        self.assertTrue(isinstance(self._get_ip('socks4'), str))

    def test_socks4a(self):
        self.assertTrue(isinstance(self._get_ip('socks4a'), str))
class IPv6ThreadingTCPServer(ThreadingTCPServer):
    address_family = socket.AF_INET6

    def test_socks5(self):
        self.assertTrue(isinstance(self._get_ip('socks5'), str))

class SocksHTTPTestRequestHandler(http.server.BaseHTTPRequestHandler, SocksTestRequestHandler):
    def do_GET(self):
        if self.path == '/socks_info':
            payload = json.dumps(self.socks_info.copy())
            self.send_response(200)
            self.send_header('Content-Type', 'application/json; charset=utf-8')
            self.send_header('Content-Length', str(len(payload)))
            self.end_headers()
            self.wfile.write(payload.encode())


class SocksWebSocketTestRequestHandler(SocksTestRequestHandler):
    def handle(self):
        import websockets.sync.server
        protocol = websockets.ServerProtocol()
        connection = websockets.sync.server.ServerConnection(socket=self.request, protocol=protocol, close_timeout=0)
        connection.handshake()
        connection.send(json.dumps(self.socks_info))
        connection.close()


@contextlib.contextmanager
def socks_server(socks_server_class, request_handler, bind_ip=None, **socks_server_kwargs):
    server = server_thread = None
    try:
        bind_address = bind_ip or '127.0.0.1'
        server_type = ThreadingTCPServer if '.' in bind_address else IPv6ThreadingTCPServer
        server = server_type(
            (bind_address, 0), functools.partial(socks_server_class, request_handler, socks_server_kwargs))
        server_port = http_server_port(server)
        server_thread = threading.Thread(target=server.serve_forever)
        server_thread.daemon = True
        server_thread.start()
        if '.' not in bind_address:
            yield f'[{bind_address}]:{server_port}'
        else:
            yield f'{bind_address}:{server_port}'
    finally:
        server.shutdown()
        server.server_close()
        server_thread.join(2.0)


class SocksProxyTestContext(abc.ABC):
    REQUEST_HANDLER_CLASS = None

    def socks_server(self, server_class, *args, **kwargs):
        return socks_server(server_class, self.REQUEST_HANDLER_CLASS, *args, **kwargs)

    @abc.abstractmethod
    def socks_info_request(self, handler, target_domain=None, target_port=None, **req_kwargs) -> dict:
        """return a dict of socks_info"""


class HTTPSocksTestProxyContext(SocksProxyTestContext):
    REQUEST_HANDLER_CLASS = SocksHTTPTestRequestHandler

    def socks_info_request(self, handler, target_domain=None, target_port=None, **req_kwargs):
        request = Request(f'http://{target_domain or "127.0.0.1"}:{target_port or "40000"}/socks_info', **req_kwargs)
        handler.validate(request)
        return json.loads(handler.send(request).read().decode())


class WebSocketSocksTestProxyContext(SocksProxyTestContext):
    REQUEST_HANDLER_CLASS = SocksWebSocketTestRequestHandler

    def socks_info_request(self, handler, target_domain=None, target_port=None, **req_kwargs):
        request = Request(f'ws://{target_domain or "127.0.0.1"}:{target_port or "40000"}', **req_kwargs)
        handler.validate(request)
        ws = handler.send(request)
        ws.send('socks_info')
        socks_info = ws.recv()
        ws.close()
        return json.loads(socks_info)


CTX_MAP = {
    'http': HTTPSocksTestProxyContext,
    'ws': WebSocketSocksTestProxyContext,
}


@pytest.fixture(scope='module')
def ctx(request):
    return CTX_MAP[request.param]()


@pytest.mark.parametrize(
    'handler,ctx', [
        ('Urllib', 'http'),
        ('Requests', 'http'),
        ('Websockets', 'ws'),
        ('CurlCFFI', 'http'),
    ], indirect=True)
class TestSocks4Proxy:
    def test_socks4_no_auth(self, handler, ctx):
        with handler() as rh:
            with ctx.socks_server(Socks4ProxyHandler) as server_address:
                response = ctx.socks_info_request(
                    rh, proxies={'all': f'socks4://{server_address}'})
                assert response['version'] == 4

    def test_socks4_auth(self, handler, ctx):
        with handler() as rh:
            with ctx.socks_server(Socks4ProxyHandler, user_id='user') as server_address:
                with pytest.raises(ProxyError):
                    ctx.socks_info_request(rh, proxies={'all': f'socks4://{server_address}'})
                response = ctx.socks_info_request(
                    rh, proxies={'all': f'socks4://user:@{server_address}'})
                assert response['version'] == 4

    def test_socks4a_ipv4_target(self, handler, ctx):
        with ctx.socks_server(Socks4ProxyHandler) as server_address:
            with handler(proxies={'all': f'socks4a://{server_address}'}) as rh:
                response = ctx.socks_info_request(rh, target_domain='127.0.0.1')
                assert response['version'] == 4
                assert (response['ipv4_address'] == '127.0.0.1') != (response['domain_address'] == '127.0.0.1')

    def test_socks4a_domain_target(self, handler, ctx):
        with ctx.socks_server(Socks4ProxyHandler) as server_address:
            with handler(proxies={'all': f'socks4a://{server_address}'}) as rh:
                response = ctx.socks_info_request(rh, target_domain='localhost')
                assert response['version'] == 4
                assert response['ipv4_address'] is None
                assert response['domain_address'] == 'localhost'

    def test_ipv4_client_source_address(self, handler, ctx):
        with ctx.socks_server(Socks4ProxyHandler) as server_address:
            source_address = f'127.0.0.{random.randint(5, 255)}'
            verify_address_availability(source_address)
            with handler(proxies={'all': f'socks4://{server_address}'},
                         source_address=source_address) as rh:
                response = ctx.socks_info_request(rh)
                assert response['client_address'][0] == source_address
                assert response['version'] == 4

    @pytest.mark.parametrize('reply_code', [
        Socks4CD.REQUEST_REJECTED_OR_FAILED,
        Socks4CD.REQUEST_REJECTED_CANNOT_CONNECT_TO_IDENTD,
        Socks4CD.REQUEST_REJECTED_DIFFERENT_USERID,
    ])
    def test_socks4_errors(self, handler, ctx, reply_code):
        with ctx.socks_server(Socks4ProxyHandler, cd_reply=reply_code) as server_address:
            with handler(proxies={'all': f'socks4://{server_address}'}) as rh:
                with pytest.raises(ProxyError):
                    ctx.socks_info_request(rh)

    def test_ipv6_socks4_proxy(self, handler, ctx):
        with ctx.socks_server(Socks4ProxyHandler, bind_ip='::1') as server_address:
            with handler(proxies={'all': f'socks4://{server_address}'}) as rh:
                response = ctx.socks_info_request(rh, target_domain='127.0.0.1')
                assert response['client_address'][0] == '::1'
                assert response['ipv4_address'] == '127.0.0.1'
                assert response['version'] == 4

    def test_timeout(self, handler, ctx):
        with ctx.socks_server(Socks4ProxyHandler, sleep=2) as server_address:
            with handler(proxies={'all': f'socks4://{server_address}'}, timeout=0.5) as rh:
                with pytest.raises(TransportError):
                    ctx.socks_info_request(rh)


@pytest.mark.parametrize(
    'handler,ctx', [
        ('Urllib', 'http'),
        ('Requests', 'http'),
        ('Websockets', 'ws'),
        ('CurlCFFI', 'http'),
    ], indirect=True)
class TestSocks5Proxy:

    def test_socks5_no_auth(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler) as server_address:
            with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
                response = ctx.socks_info_request(rh)
                assert response['auth_methods'] == [0x0]
                assert response['version'] == 5

    def test_socks5_user_pass(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler, auth=('test', 'testpass')) as server_address:
            with handler() as rh:
                with pytest.raises(ProxyError):
                    ctx.socks_info_request(rh, proxies={'all': f'socks5://{server_address}'})

                response = ctx.socks_info_request(
                    rh, proxies={'all': f'socks5://test:testpass@{server_address}'})

                assert response['auth_methods'] == [Socks5Auth.AUTH_NONE, Socks5Auth.AUTH_USER_PASS]
                assert response['version'] == 5

    def test_socks5_ipv4_target(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler) as server_address:
            with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
                response = ctx.socks_info_request(rh, target_domain='127.0.0.1')
                assert response['ipv4_address'] == '127.0.0.1'
                assert response['version'] == 5

    def test_socks5_domain_target(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler) as server_address:
            with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
                response = ctx.socks_info_request(rh, target_domain='localhost')
                assert (response['ipv4_address'] == '127.0.0.1') != (response['ipv6_address'] == '::1')
                assert response['version'] == 5

    def test_socks5h_domain_target(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler) as server_address:
            with handler(proxies={'all': f'socks5h://{server_address}'}) as rh:
                response = ctx.socks_info_request(rh, target_domain='localhost')
                assert response['ipv4_address'] is None
                assert response['domain_address'] == 'localhost'
                assert response['version'] == 5

    def test_socks5h_ip_target(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler) as server_address:
            with handler(proxies={'all': f'socks5h://{server_address}'}) as rh:
                response = ctx.socks_info_request(rh, target_domain='127.0.0.1')
                assert response['ipv4_address'] == '127.0.0.1'
                assert response['domain_address'] is None
                assert response['version'] == 5

    def test_socks5_ipv6_destination(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler) as server_address:
            with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
                response = ctx.socks_info_request(rh, target_domain='[::1]')
                assert response['ipv6_address'] == '::1'
                assert response['version'] == 5

    def test_ipv6_socks5_proxy(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler, bind_ip='::1') as server_address:
            with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
                response = ctx.socks_info_request(rh, target_domain='127.0.0.1')
                assert response['client_address'][0] == '::1'
                assert response['ipv4_address'] == '127.0.0.1'
                assert response['version'] == 5

    # XXX: is there any feasible way of testing IPv6 source addresses?
    # Same would go for non-proxy source_address test...
    def test_ipv4_client_source_address(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler) as server_address:
            source_address = f'127.0.0.{random.randint(5, 255)}'
            verify_address_availability(source_address)
            with handler(proxies={'all': f'socks5://{server_address}'}, source_address=source_address) as rh:
                response = ctx.socks_info_request(rh)
                assert response['client_address'][0] == source_address
                assert response['version'] == 5

    @pytest.mark.parametrize('reply_code', [
        Socks5Reply.GENERAL_FAILURE,
        Socks5Reply.CONNECTION_NOT_ALLOWED,
        Socks5Reply.NETWORK_UNREACHABLE,
        Socks5Reply.HOST_UNREACHABLE,
        Socks5Reply.CONNECTION_REFUSED,
        Socks5Reply.TTL_EXPIRED,
        Socks5Reply.COMMAND_NOT_SUPPORTED,
        Socks5Reply.ADDRESS_TYPE_NOT_SUPPORTED,
    ])
    def test_socks5_errors(self, handler, ctx, reply_code):
        with ctx.socks_server(Socks5ProxyHandler, reply=reply_code) as server_address:
            with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
                with pytest.raises(ProxyError):
                    ctx.socks_info_request(rh)

    def test_timeout(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler, sleep=2) as server_address:
            with handler(proxies={'all': f'socks5://{server_address}'}, timeout=1) as rh:
                with pytest.raises(TransportError):
                    ctx.socks_info_request(rh)


if __name__ == '__main__':
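
One detail worth calling out from the SOCKS test matrix above: the proxy-URL scheme encodes who resolves hostnames. The `h`/`a` variants (`socks5h`, `socks4a`) ask the proxy to resolve names remotely, which `make_socks_proxy_opts` surfaces as the `rdns` flag. A short sketch, with values taken directly from the parametrized cases earlier in this diff:

from yt_dlp.networking._helper import make_socks_proxy_opts

# Remote DNS (rdns) is implied by the scheme, per the test cases above
assert make_socks_proxy_opts('socks5h://example.com')['rdns'] is True
assert make_socks_proxy_opts('socks5://user:@example.com:5555')['rdns'] is False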
@@ -40,12 +40,11 @@ def setUp(self):
        self.ie = self.IE()
        self.DL.add_info_extractor(self.ie)
        if not self.IE.working():
-            print('Skipping: %s marked as not _WORKING' % self.IE.ie_key())
+            print(f'Skipping: {self.IE.ie_key()} marked as not _WORKING')
            self.skipTest('IE marked as not _WORKING')

    def getInfoDict(self):
-        info_dict = self.DL.extract_info(self.url, download=False)
-        return info_dict
+        return self.DL.extract_info(self.url, download=False)

    def getSubtitles(self):
        info_dict = self.getInfoDict()
@@ -87,7 +86,7 @@ def test_youtube_allsubtitles(self):
        self.assertEqual(md5(subtitles['en']), 'ae1bd34126571a77aabd4d276b28044d')
        self.assertEqual(md5(subtitles['it']), '0e0b667ba68411d88fd1c5f4f4eab2f9')
        for lang in ['fr', 'de']:
-            self.assertTrue(subtitles.get(lang) is not None, 'Subtitles for \'%s\' not extracted' % lang)
+            self.assertTrue(subtitles.get(lang) is not None, f'Subtitles for \'{lang}\' not extracted')

    def _test_subtitles_format(self, fmt, md5_hash, lang='en'):
        self.DL.params['writesubtitles'] = True
@@ -157,7 +156,7 @@ def test_allsubtitles(self):
        self.assertEqual(md5(subtitles['en']), '976553874490cba125086bbfea3ff76f')
        self.assertEqual(md5(subtitles['fr']), '594564ec7d588942e384e920e5341792')
        for lang in ['es', 'fr', 'de']:
-            self.assertTrue(subtitles.get(lang) is not None, 'Subtitles for \'%s\' not extracted' % lang)
+            self.assertTrue(subtitles.get(lang) is not None, f'Subtitles for \'{lang}\' not extracted')

    def test_nosubtitles(self):
        self.DL.expect_warning('video doesn\'t have subtitles')
@@ -182,7 +181,7 @@ def test_allsubtitles(self):
        self.assertEqual(md5(subtitles['en']), '4262c1665ff928a2dada178f62cb8d14')
        self.assertEqual(md5(subtitles['fr']), '66a63f7f42c97a50f8c0e90bc7797bb5')
        for lang in ['es', 'fr', 'de']:
-            self.assertTrue(subtitles.get(lang) is not None, 'Subtitles for \'%s\' not extracted' % lang)
+            self.assertTrue(subtitles.get(lang) is not None, f'Subtitles for \'{lang}\' not extracted')


@is_download_test

519  test/test_traversal.py  (new file)
@@ -0,0 +1,519 @@
import http.cookies
import re
import xml.etree.ElementTree

import pytest

from yt_dlp.utils import (
    ExtractorError,
    determine_ext,
    dict_get,
    int_or_none,
    str_or_none,
)
from yt_dlp.utils.traversal import (
    traverse_obj,
    require,
    subs_list_to_dict,
)

_TEST_DATA = {
    100: 100,
    1.2: 1.2,
    'str': 'str',
    'None': None,
    '...': ...,
    'urls': [
        {'index': 0, 'url': 'https://www.example.com/0'},
        {'index': 1, 'url': 'https://www.example.com/1'},
    ],
    'data': (
        {'index': 2},
        {'index': 3},
    ),
    'dict': {},
}
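
# Orientation sketch for the cases below (evaluated against _TEST_DATA above,
# per the asserts that follow):
#   traverse_obj(_TEST_DATA, ('urls', 0, 'url'))   == 'https://www.example.com/0'
#   traverse_obj(_TEST_DATA, ('urls', ..., 'url')) == ['https://www.example.com/0', 'https://www.example.com/1']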


class TestTraversal:
    def test_traversal_base(self):
        assert traverse_obj(_TEST_DATA, ('str',)) == 'str', \
            'allow tuple path'
        assert traverse_obj(_TEST_DATA, ['str']) == 'str', \
            'allow list path'
        assert traverse_obj(_TEST_DATA, (value for value in ('str',))) == 'str', \
            'allow iterable path'
        assert traverse_obj(_TEST_DATA, 'str') == 'str', \
            'single items should be treated as a path'
        assert traverse_obj(_TEST_DATA, 100) == 100, \
            'allow int path'
        assert traverse_obj(_TEST_DATA, 1.2) == 1.2, \
            'allow float path'
        assert traverse_obj(_TEST_DATA, None) == _TEST_DATA, \
            '`None` should not perform any modification'

    def test_traversal_ellipsis(self):
        assert traverse_obj(_TEST_DATA, ...) == [x for x in _TEST_DATA.values() if x not in (None, {})], \
            '`...` should give all non-discarded values'
        assert traverse_obj(_TEST_DATA, ('urls', 0, ...)) == list(_TEST_DATA['urls'][0].values()), \
            '`...` selection for dicts should select all values'
        assert traverse_obj(_TEST_DATA, (..., ..., 'url')) == ['https://www.example.com/0', 'https://www.example.com/1'], \
            'nested `...` queries should work'
        assert traverse_obj(_TEST_DATA, (..., ..., 'index')) == list(range(4)), \
            '`...` query result should be flattened'
        assert traverse_obj(iter(range(4)), ...) == list(range(4)), \
            '`...` should accept iterables'

    def test_traversal_function(self):
        filter_func = lambda x, y: x == 'urls' and isinstance(y, list)
        assert traverse_obj(_TEST_DATA, filter_func) == [_TEST_DATA['urls']], \
            'function as query key should perform a filter based on (key, value)'
        assert traverse_obj(_TEST_DATA, lambda _, x: isinstance(x[0], str)) == ['str'], \
            'exceptions in the query function should be caught'
        assert traverse_obj(iter(range(4)), lambda _, x: x % 2 == 0) == [0, 2], \
            'function key should accept iterables'
        # Wrong function signature should raise (debug mode)
        with pytest.raises(Exception):
            traverse_obj(_TEST_DATA, lambda a: ...)
        with pytest.raises(Exception):
            traverse_obj(_TEST_DATA, lambda a, b, c: ...)
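
    # Takeaway from the cases above: a callable key must accept exactly
    # (key, value), and function keys branch, so matches come back as a list.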

    def test_traversal_set(self):
        # transformation/type, like `expected_type`
        assert traverse_obj(_TEST_DATA, (..., {str.upper})) == ['STR'], \
            'Function in set should be a transformation'
        assert traverse_obj(_TEST_DATA, (..., {str})) == ['str'], \
            'Type in set should be a type filter'
        assert traverse_obj(_TEST_DATA, (..., {str, int})) == [100, 'str'], \
            'Multiple types in set should be a type filter'
        assert traverse_obj(_TEST_DATA, {dict}) == _TEST_DATA, \
            'A single set should be wrapped into a path'
        assert traverse_obj(_TEST_DATA, (..., {str.upper})) == ['STR'], \
            'Transformation function should not raise'
        expected = [x for x in map(str_or_none, _TEST_DATA.values()) if x is not None]
        assert traverse_obj(_TEST_DATA, (..., {str_or_none})) == expected, \
            'Function in set should be a transformation'
        assert traverse_obj(_TEST_DATA, ('fail', {lambda _: 'const'})) == 'const', \
            'Function in set should always be called'
        # Sets must contain exactly one callable, or only types; anything else should raise
        with pytest.raises(Exception):
            traverse_obj(_TEST_DATA, set())
        with pytest.raises(Exception):
            traverse_obj(_TEST_DATA, {str.upper, str})
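
    # Why sets matter in practice: they compose with paths for coercion, e.g.
    # a sketch using the imported helper:
    #   traverse_obj({'view_count': '123'}, ('view_count', {int_or_none})) == 123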

    def test_traversal_slice(self):
        _SLICE_DATA = [0, 1, 2, 3, 4]

        assert traverse_obj(_TEST_DATA, ('dict', slice(1))) is None, \
            'slice on a dictionary should not throw'
        assert traverse_obj(_SLICE_DATA, slice(1)) == _SLICE_DATA[:1], \
            'slice key should apply slice to sequence'
        assert traverse_obj(_SLICE_DATA, slice(1, 2)) == _SLICE_DATA[1:2], \
            'slice key should apply slice to sequence'
        assert traverse_obj(_SLICE_DATA, slice(1, 4, 2)) == _SLICE_DATA[1:4:2], \
            'slice key should apply slice to sequence'

    def test_traversal_alternatives(self):
        assert traverse_obj(_TEST_DATA, 'fail', 'str') == 'str', \
            'multiple `paths` should be treated as alternative paths'
        assert traverse_obj(_TEST_DATA, 'str', 100) == 'str', \
            'alternatives should exit early'
        assert traverse_obj(_TEST_DATA, 'fail', 'fail') is None, \
            'alternatives should return `default` if exhausted'
        assert traverse_obj(_TEST_DATA, (..., 'fail'), 100) == 100, \
            'alternatives should track their own branching return'
        assert traverse_obj(_TEST_DATA, ('dict', ...), ('data', ...)) == list(_TEST_DATA['data']), \
            'alternatives on empty objects should search further'

    def test_traversal_branching_nesting(self):
        assert traverse_obj(_TEST_DATA, ('urls', (3, 0), 'url')) == ['https://www.example.com/0'], \
            'tuple as key should be treated as branches'
        assert traverse_obj(_TEST_DATA, ('urls', [3, 0], 'url')) == ['https://www.example.com/0'], \
            'list as key should be treated as branches'
        assert traverse_obj(_TEST_DATA, ('urls', ((1, 'fail'), (0, 'url')))) == ['https://www.example.com/0'], \
            'double nesting in path should be treated as paths'
        assert traverse_obj(['0', [1, 2]], [(0, 1), 0]) == [1], \
            'do not fail early on branching'
        expected = ['https://www.example.com/0', 'https://www.example.com/1']
        assert traverse_obj(_TEST_DATA, ('urls', ((0, ('fail', 'url')), (1, 'url')))) == expected, \
            'triple nesting in path should be treated as branches'
        assert traverse_obj(_TEST_DATA, ('urls', ('fail', (..., 'url')))) == expected, \
            'ellipsis as branch path start gets flattened'

    def test_traversal_dict(self):
        assert traverse_obj(_TEST_DATA, {0: 100, 1: 1.2}) == {0: 100, 1: 1.2}, \
            'dict key should result in a dict with the same keys'
        expected = {0: 'https://www.example.com/0'}
        assert traverse_obj(_TEST_DATA, {0: ('urls', 0, 'url')}) == expected, \
            'dict key should allow paths'
        expected = {0: ['https://www.example.com/0']}
        assert traverse_obj(_TEST_DATA, {0: ('urls', (3, 0), 'url')}) == expected, \
            'tuple in dict path should be treated as branches'
        assert traverse_obj(_TEST_DATA, {0: ('urls', ((1, 'fail'), (0, 'url')))}) == expected, \
            'double nesting in dict path should be treated as paths'
        expected = {0: ['https://www.example.com/1', 'https://www.example.com/0']}
        assert traverse_obj(_TEST_DATA, {0: ('urls', ((1, ('fail', 'url')), (0, 'url')))}) == expected, \
            'triple nesting in dict path should be treated as branches'
        assert traverse_obj(_TEST_DATA, {0: 'fail'}) == {}, \
            'remove `None` values when top level dict key fails'
        assert traverse_obj(_TEST_DATA, {0: 'fail'}, default=...) == {0: ...}, \
            'use `default` if key fails and `default` is given'
        assert traverse_obj(_TEST_DATA, {0: 'dict'}) == {}, \
            'remove empty values when dict key resolves to one'
        assert traverse_obj(_TEST_DATA, {0: 'dict'}, default=...) == {0: ...}, \
            'use `default` when dict key resolves to an empty value and `default` is given'
        assert traverse_obj(_TEST_DATA, {0: {0: 'fail'}}) == {}, \
            'remove empty values when nested dict key fails'
        assert traverse_obj(None, {0: 'fail'}) == {}, \
            'default to dict if pruned'
        assert traverse_obj(None, {0: 'fail'}, default=...) == {0: ...}, \
            'default to dict if pruned and default is given'
        assert traverse_obj(_TEST_DATA, {0: {0: 'fail'}}, default=...) == {0: {0: ...}}, \
            'use nested `default` when nested dict key fails and `default` is given'
        assert traverse_obj(_TEST_DATA, {0: ('dict', ...)}) == {}, \
            'remove key if branch in dict key not successful'
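
    # Practical upshot of the dict rules above (sketch): failed keys are
    # dropped rather than set to `None`, so
    #   traverse_obj(_TEST_DATA, {'url': ('urls', 0, 'url'), 'missing': 'fail'})
    # returns only {'url': 'https://www.example.com/0'}.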

    def test_traversal_default(self):
        _DEFAULT_DATA = {'None': None, 'int': 0, 'list': []}

        assert traverse_obj(_DEFAULT_DATA, 'fail') is None, \
            'default value should be `None`'
        assert traverse_obj(_DEFAULT_DATA, 'fail', 'fail', default=...) == ..., \
            'chained fails should result in default'
        assert traverse_obj(_DEFAULT_DATA, 'None', 'int') == 0, \
            'should not short-circuit on `None`'
        assert traverse_obj(_DEFAULT_DATA, 'fail', default=1) == 1, \
            'invalid dict key should result in `default`'
        assert traverse_obj(_DEFAULT_DATA, 'None', default=1) == 1, \
            '`None` is a deliberate sentinel and should become `default`'
        assert traverse_obj(_DEFAULT_DATA, ('list', 10)) is None, \
            '`IndexError` should result in `default`'
        assert traverse_obj(_DEFAULT_DATA, (..., 'fail'), default=1) == 1, \
            'if branched but not successful return `default` if defined, not `[]`'
        assert traverse_obj(_DEFAULT_DATA, (..., 'fail'), default=None) is None, \
            'if branched but not successful return `default` even if `default` is `None`'
        assert traverse_obj(_DEFAULT_DATA, (..., 'fail')) == [], \
            'if branched but not successful return `[]`, not `default`'
        assert traverse_obj(_DEFAULT_DATA, ('list', ...)) == [], \
            'if branched but object is empty return `[]`, not `default`'
        assert traverse_obj(None, ...) == [], \
            'if branched but object is `None` return `[]`, not `default`'
        assert traverse_obj({0: None}, (0, ...)) == [], \
            'if branched but state is `None` return `[]`, not `default`'

    @pytest.mark.parametrize('path', [
        ('fail', ...),
        (..., 'fail'),
        100 * ('fail',) + (...,),
        (...,) + 100 * ('fail',),
    ])
    def test_traversal_branching(self, path):
        assert traverse_obj({}, path) == [], \
            'if branched but state is `None`, return `[]` (not `default`)'
        assert traverse_obj({}, 'fail', path) == [], \
            'if branching in last alternative and previous did not match, return `[]` (not `default`)'
        assert traverse_obj({0: 'x'}, 0, path) == 'x', \
            'if branching in last alternative and previous did match, return single value'
        assert traverse_obj({0: 'x'}, path, 0) == 'x', \
            'if branching in first alternative and non-branching path does match, return single value'
        assert traverse_obj({}, path, 'fail') is None, \
            'if branching in first alternative and non-branching path does not match, return `default`'

    def test_traversal_expected_type(self):
        _EXPECTED_TYPE_DATA = {'str': 'str', 'int': 0}

        assert traverse_obj(_EXPECTED_TYPE_DATA, 'str', expected_type=str) == 'str', \
            'accept matching `expected_type` type'
        assert traverse_obj(_EXPECTED_TYPE_DATA, 'str', expected_type=int) is None, \
            'reject non-matching `expected_type` type'
        assert traverse_obj(_EXPECTED_TYPE_DATA, 'int', expected_type=lambda x: str(x)) == '0', \
            'transform type using type function'
        assert traverse_obj(_EXPECTED_TYPE_DATA, 'str', expected_type=lambda _: 1 / 0) is None, \
            'wrap `expected_type` function in `try_call`'
        assert traverse_obj(_EXPECTED_TYPE_DATA, ..., expected_type=str) == ['str'], \
            'eliminate items that `expected_type` fails on'
        assert traverse_obj(_TEST_DATA, {0: 100, 1: 1.2}, expected_type=int) == {0: 100}, \
            'type as `expected_type` should filter dict values'
        assert traverse_obj(_TEST_DATA, {0: 100, 1: 1.2, 2: 'None'}, expected_type=str_or_none) == {0: '100', 1: '1.2'}, \
            'function as `expected_type` should transform dict values'
        assert traverse_obj(_TEST_DATA, ({0: 1.2}, 0, {int_or_none}), expected_type=int) == 1, \
            '`expected_type` should not filter non-final dict values'
        assert traverse_obj(_TEST_DATA, {0: {0: 100, 1: 'str'}}, expected_type=int) == {0: {0: 100}}, \
            '`expected_type` should transform deep dict values'
        assert traverse_obj(_TEST_DATA, [({0: '...'}, {0: '...'})], expected_type=type(...)) == [{0: ...}, {0: ...}], \
            '`expected_type` should transform branched dict values'
        assert traverse_obj({1: {3: 4}}, [(1, 2), 3], expected_type=int) == [4], \
            '`expected_type` regression for type matching in tuple branching'
        assert traverse_obj(_TEST_DATA, ['data', ...], expected_type=int) == [], \
            '`expected_type` regression for type matching in dict result'

    def test_traversal_get_all(self):
        _GET_ALL_DATA = {'key': [0, 1, 2]}

        assert traverse_obj(_GET_ALL_DATA, ('key', ...), get_all=False) == 0, \
            'if not `get_all`, return only first matching value'
        assert traverse_obj(_GET_ALL_DATA, ..., get_all=False) == [0, 1, 2], \
            'do not overflatten if not `get_all`'

    def test_traversal_casesense(self):
        _CASESENSE_DATA = {
            'KeY': 'value0',
            0: {
                'KeY': 'value1',
                0: {'KeY': 'value2'},
            },
        }

        assert traverse_obj(_CASESENSE_DATA, 'key') is None, \
            'dict keys should be case sensitive unless `casesense`'
        assert traverse_obj(_CASESENSE_DATA, 'keY', casesense=False) == 'value0', \
            'allow non-matching key case if not `casesense`'
        assert traverse_obj(_CASESENSE_DATA, [0, ('keY',)], casesense=False) == ['value1'], \
            'allow non-matching key case in branch if not `casesense`'
        assert traverse_obj(_CASESENSE_DATA, [0, ([0, 'keY'],)], casesense=False) == ['value2'], \
            'allow non-matching key case in branch path if not `casesense`'

    def test_traversal_traverse_string(self):
        _TRAVERSE_STRING_DATA = {'str': 'str', 1.2: 1.2}

        assert traverse_obj(_TRAVERSE_STRING_DATA, ('str', 0)) is None, \
            'do not traverse into string if not `traverse_string`'
        assert traverse_obj(_TRAVERSE_STRING_DATA, ('str', 0), traverse_string=True) == 's', \
            'traverse into string if `traverse_string`'
        assert traverse_obj(_TRAVERSE_STRING_DATA, (1.2, 1), traverse_string=True) == '.', \
            'traverse into converted data if `traverse_string`'
        assert traverse_obj(_TRAVERSE_STRING_DATA, ('str', ...), traverse_string=True) == 'str', \
            '`...` should result in string (same value) if `traverse_string`'
        assert traverse_obj(_TRAVERSE_STRING_DATA, ('str', slice(0, None, 2)), traverse_string=True) == 'sr', \
            '`slice` should result in string if `traverse_string`'
        assert traverse_obj(_TRAVERSE_STRING_DATA, ('str', lambda i, v: i or v == 's'), traverse_string=True) == 'str', \
            'function should result in string if `traverse_string`'
        assert traverse_obj(_TRAVERSE_STRING_DATA, ('str', (0, 2)), traverse_string=True) == ['s', 'r'], \
            'branching should result in list if `traverse_string`'
        assert traverse_obj({}, (0, ...), traverse_string=True) == [], \
            'branching should result in list if `traverse_string`'
        assert traverse_obj({}, (0, lambda x, y: True), traverse_string=True) == [], \
            'branching should result in list if `traverse_string`'
        assert traverse_obj({}, (0, slice(1)), traverse_string=True) == [], \
            'branching should result in list if `traverse_string`'

    def test_traversal_re(self):
        mobj = re.fullmatch(r'0(12)(?P<group>3)(4)?', '0123')
        assert traverse_obj(mobj, ...) == [x for x in mobj.groups() if x is not None], \
            '`...` on a `re.Match` should give its `groups()`'
        assert traverse_obj(mobj, lambda k, _: k in (0, 2)) == ['0123', '3'], \
            'function on a `re.Match` should give groupno, value starting at 0'
        assert traverse_obj(mobj, 'group') == '3', \
            'str key on a `re.Match` should give group with that name'
        assert traverse_obj(mobj, 2) == '3', \
            'int key on a `re.Match` should give group with that number'
        assert traverse_obj(mobj, 'gRoUp', casesense=False) == '3', \
            'str key on a `re.Match` should respect `casesense`'
        assert traverse_obj(mobj, 'fail') is None, \
            'failing str key on a `re.Match` should return `default`'
        assert traverse_obj(mobj, 'gRoUpS', casesense=False) is None, \
            'failing str key on a `re.Match` should return `default`'
        assert traverse_obj(mobj, 8) is None, \
            'failing int key on a `re.Match` should return `default`'
        assert traverse_obj(mobj, lambda k, _: k in (0, 'group')) == ['0123', '3'], \
            'function on a `re.Match` should give group name as well'

    def test_traversal_xml_etree(self):
        etree = xml.etree.ElementTree.fromstring('''<?xml version="1.0"?>
        <data>
            <country name="Liechtenstein">
                <rank>1</rank>
                <year>2008</year>
                <gdppc>141100</gdppc>
                <neighbor name="Austria" direction="E"/>
                <neighbor name="Switzerland" direction="W"/>
            </country>
            <country name="Singapore">
                <rank>4</rank>
                <year>2011</year>
                <gdppc>59900</gdppc>
                <neighbor name="Malaysia" direction="N"/>
            </country>
            <country name="Panama">
                <rank>68</rank>
                <year>2011</year>
                <gdppc>13600</gdppc>
                <neighbor name="Costa Rica" direction="W"/>
                <neighbor name="Colombia" direction="E"/>
            </country>
        </data>''')
        assert traverse_obj(etree, '') == etree, \
            'empty str key should return the element itself'
        assert traverse_obj(etree, 'country') == list(etree), \
            'str key should yield all children with that tag name'
        assert traverse_obj(etree, ...) == list(etree), \
            '`...` as key should return all children'
        assert traverse_obj(etree, lambda _, x: x[0].text == '4') == [etree[1]], \
            'function as key should get element as value'
        assert traverse_obj(etree, lambda i, _: i == 1) == [etree[1]], \
            'function as key should get index as key'
        assert traverse_obj(etree, 0) == etree[0], \
            'int key should return the nth child'
        expected = ['Austria', 'Switzerland', 'Malaysia', 'Costa Rica', 'Colombia']
        assert traverse_obj(etree, './/neighbor/@name') == expected, \
            '`@<attribute>` at end of path should give that attribute'
        assert traverse_obj(etree, '//neighbor/@fail') == [None, None, None, None, None], \
            '`@<nonexistent>` at end of path should give `None`'
        assert traverse_obj(etree, ('//neighbor/@', 2)) == {'name': 'Malaysia', 'direction': 'N'}, \
            '`@` should give the full attribute dict'
        assert traverse_obj(etree, '//year/text()') == ['2008', '2011', '2011'], \
            '`text()` at end of path should give the inner text'
        assert traverse_obj(etree, '//*[@direction]/@direction') == ['E', 'W', 'N', 'W', 'E'], \
            'full Python xpath features should be supported'
        assert traverse_obj(etree, (0, '@name')) == 'Liechtenstein', \
            'special transformations should act on current element'
        assert traverse_obj(etree, ('country', 0, ..., 'text()', {int_or_none})) == [1, 2008, 141100], \
            'special transformations should act on current element'

    def test_traversal_unbranching(self):
        assert traverse_obj(_TEST_DATA, [(100, 1.2), all]) == [100, 1.2], \
            '`all` should give all results as list'
        assert traverse_obj(_TEST_DATA, [(100, 1.2), any]) == 100, \
            '`any` should give the first result'
        assert traverse_obj(_TEST_DATA, [100, all]) == [100], \
            '`all` should give list if non-branching'
        assert traverse_obj(_TEST_DATA, [100, any]) == 100, \
            '`any` should give single item if non-branching'
        assert traverse_obj(_TEST_DATA, [('dict', 'None', 100), all]) == [100], \
            '`all` should filter `None` and empty dict'
        assert traverse_obj(_TEST_DATA, [('dict', 'None', 100), any]) == 100, \
            '`any` should filter `None` and empty dict'
        assert traverse_obj(_TEST_DATA, [{
            'all': [('dict', 'None', 100, 1.2), all],
            'any': [('dict', 'None', 100, 1.2), any],
        }]) == {'all': [100, 1.2], 'any': 100}, \
            '`all`/`any` should apply to each dict path separately'
        assert traverse_obj(_TEST_DATA, [{
            'all': [('dict', 'None', 100, 1.2), all],
            'any': [('dict', 'None', 100, 1.2), any],
        }], get_all=False) == {'all': [100, 1.2], 'any': 100}, \
            '`all`/`any` should apply to dict regardless of `get_all`'
        assert traverse_obj(_TEST_DATA, [('dict', 'None', 100, 1.2), all, {float}]) is None, \
            '`all` should reset branching status'
        assert traverse_obj(_TEST_DATA, [('dict', 'None', 100, 1.2), any, {float}]) is None, \
            '`any` should reset branching status'
        assert traverse_obj(_TEST_DATA, [('dict', 'None', 100, 1.2), all, ..., {float}]) == [1.2], \
            '`all` should allow further branching'
        assert traverse_obj(_TEST_DATA, [('dict', 'None', 'urls', 'data'), any, ..., 'index']) == [0, 1], \
            '`any` should allow further branching'

    def test_traversal_morsel(self):
        values = {
            'expires': 'a',
            'path': 'b',
            'comment': 'c',
            'domain': 'd',
            'max-age': 'e',
            'secure': 'f',
            'httponly': 'g',
            'version': 'h',
            'samesite': 'i',
        }
        morsel = http.cookies.Morsel()
        morsel.set('item_key', 'item_value', 'coded_value')
        morsel.update(values)
        values['key'] = 'item_key'
        values['value'] = 'item_value'

        for key, value in values.items():
            assert traverse_obj(morsel, key) == value, \
                'Morsel should provide access to all values'
        assert traverse_obj(morsel, ...) == list(values.values()), \
            '`...` should yield all values'
        assert traverse_obj(morsel, lambda k, v: True) == list(values.values()), \
            'function key should yield all values'
        assert traverse_obj(morsel, [(None,), any]) == morsel, \
            'Morsel should not be implicitly changed to dict on usage'

    def test_traversal_filter(self):
        data = [None, False, True, 0, 1, 0.0, 1.1, '', 'str', {}, {0: 0}, [], [1]]

        assert traverse_obj(data, [..., filter]) == [True, 1, 1.1, 'str', {0: 0}, [1]], \
            '`filter` should filter falsy values'
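
    # `filter` here is the Python builtin used directly as a path key; per the
    # assert above it keeps exactly the truthy items, i.e. it acts like
    # [x for x in data if x] (a description of the observed behavior, not of
    # the implementation).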


class TestTraversalHelpers:
    def test_traversal_require(self):
        with pytest.raises(ExtractorError):
            traverse_obj(_TEST_DATA, ['None', {require('value')}])
        assert traverse_obj(_TEST_DATA, ['str', {require('value')}]) == 'str', \
            '`require` should pass through non-`None` values'

    def test_subs_list_to_dict(self):
        assert traverse_obj([
            {'name': 'de', 'url': 'https://example.com/subs/de.vtt'},
            {'name': 'en', 'url': 'https://example.com/subs/en1.ass'},
            {'name': 'en', 'url': 'https://example.com/subs/en2.ass'},
        ], [..., {
            'id': 'name',
            'url': 'url',
        }, all, {subs_list_to_dict}]) == {
            'de': [{'url': 'https://example.com/subs/de.vtt'}],
            'en': [
                {'url': 'https://example.com/subs/en1.ass'},
                {'url': 'https://example.com/subs/en2.ass'},
            ],
        }, 'function should build subtitle dict from list of subtitles'
        assert traverse_obj([
            {'name': 'de', 'url': 'https://example.com/subs/de.ass'},
            {'name': 'de'},
            {'name': 'en', 'content': 'content'},
            {'url': 'https://example.com/subs/en'},
        ], [..., {
            'id': 'name',
            'data': 'content',
            'url': 'url',
        }, all, {subs_list_to_dict}]) == {
            'de': [{'url': 'https://example.com/subs/de.ass'}],
            'en': [{'data': 'content'}],
        }, 'subs with mandatory items missing should be filtered'
        assert traverse_obj([
            {'url': 'https://example.com/subs/de.ass', 'name': 'de'},
            {'url': 'https://example.com/subs/en', 'name': 'en'},
        ], [..., {
            'id': 'name',
            'ext': ['url', {lambda x: determine_ext(x, default_ext=None)}],
            'url': 'url',
        }, all, {subs_list_to_dict(ext='ext')}]) == {
            'de': [{'url': 'https://example.com/subs/de.ass', 'ext': 'ass'}],
            'en': [{'url': 'https://example.com/subs/en', 'ext': 'ext'}],
        }, '`ext` should set default ext but leave existing value untouched'
        assert traverse_obj([
            {'name': 'en', 'url': 'https://example.com/subs/en2', 'prio': True},
            {'name': 'en', 'url': 'https://example.com/subs/en1', 'prio': False},
        ], [..., {
            'id': 'name',
            'quality': ['prio', {int}],
            'url': 'url',
        }, all, {subs_list_to_dict(ext='ext')}]) == {'en': [
            {'url': 'https://example.com/subs/en1', 'ext': 'ext'},
            {'url': 'https://example.com/subs/en2', 'ext': 'ext'},
        ]}, '`quality` key should sort subtitle list accordingly'
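
    # The mapping produced above matches the `subtitles` info-dict shape,
    # {lang: [{'url' or 'data', optional 'ext'}, ...]}; note `quality` only
    # orders each language's list and is dropped from the output.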


class TestDictGet:
    def test_dict_get(self):
        FALSE_VALUES = {
            'none': None,
            'false': False,
            'zero': 0,
            'empty_string': '',
            'empty_list': [],
        }
        d = {**FALSE_VALUES, 'a': 42}
        assert dict_get(d, 'a') == 42
        assert dict_get(d, 'b') is None
        assert dict_get(d, 'b', 42) == 42
        assert dict_get(d, ('a',)) == 42
        assert dict_get(d, ('b', 'a')) == 42
        assert dict_get(d, ('b', 'c', 'a', 'd')) == 42
        assert dict_get(d, ('b', 'c')) is None
        assert dict_get(d, ('b', 'c'), 42) == 42
        for key, false_value in FALSE_VALUES.items():
            assert dict_get(d, ('b', 'c', key)) is None
            assert dict_get(d, ('b', 'c', key), skip_false_values=False) == false_value

277  test/test_update.py  (new file)
@@ -0,0 +1,277 @@
#!/usr/bin/env python3

# Allow direct execution
import os
import sys
import unittest

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))


from test.helper import FakeYDL, report_warning
from yt_dlp.update import UpdateInfo, Updater


# XXX: Keep in sync with yt_dlp.update.UPDATE_SOURCES
TEST_UPDATE_SOURCES = {
    'stable': 'yt-dlp/yt-dlp',
    'nightly': 'yt-dlp/yt-dlp-nightly-builds',
    'master': 'yt-dlp/yt-dlp-master-builds',
}

TEST_API_DATA = {
    'yt-dlp/yt-dlp/latest': {
        'tag_name': '2023.12.31',
        'target_commitish': 'bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb',
        'name': 'yt-dlp 2023.12.31',
        'body': 'BODY',
    },
    'yt-dlp/yt-dlp-nightly-builds/latest': {
        'tag_name': '2023.12.31.123456',
        'target_commitish': 'master',
        'name': 'yt-dlp nightly 2023.12.31.123456',
        'body': 'Generated from: https://github.com/yt-dlp/yt-dlp/commit/cccccccccccccccccccccccccccccccccccccccc',
    },
    'yt-dlp/yt-dlp-master-builds/latest': {
        'tag_name': '2023.12.31.987654',
        'target_commitish': 'master',
        'name': 'yt-dlp master 2023.12.31.987654',
        'body': 'Generated from: https://github.com/yt-dlp/yt-dlp/commit/dddddddddddddddddddddddddddddddddddddddd',
    },
    'yt-dlp/yt-dlp/tags/testing': {
        'tag_name': 'testing',
        'target_commitish': '9999999999999999999999999999999999999999',
        'name': 'testing',
        'body': 'BODY',
    },
    'fork/yt-dlp/latest': {
        'tag_name': '2050.12.31',
        'target_commitish': 'eeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee',
        'name': '2050.12.31',
        'body': 'BODY',
    },
    'fork/yt-dlp/tags/pr0000': {
        'tag_name': 'pr0000',
        'target_commitish': 'ffffffffffffffffffffffffffffffffffffffff',
        'name': 'pr1234 2023.11.11.000000',
        'body': 'BODY',
    },
    'fork/yt-dlp/tags/pr1234': {
        'tag_name': 'pr1234',
        'target_commitish': '0000000000000000000000000000000000000000',
        'name': 'pr1234 2023.12.31.555555',
        'body': 'BODY',
    },
    'fork/yt-dlp/tags/pr9999': {
        'tag_name': 'pr9999',
        'target_commitish': '1111111111111111111111111111111111111111',
        'name': 'pr9999',
        'body': 'BODY',
    },
    'fork/yt-dlp-satellite/tags/pr987': {
        'tag_name': 'pr987',
        'target_commitish': 'master',
        'name': 'pr987',
        'body': 'Generated from: https://github.com/yt-dlp/yt-dlp/commit/2222222222222222222222222222222222222222',
    },
}

TEST_LOCKFILE_COMMENT = '# This file is used for regulating self-update'

TEST_LOCKFILE_V1 = rf'''{TEST_LOCKFILE_COMMENT}
lock 2022.08.18.36 .+ Python 3\.6
lock 2023.11.16 (?!win_x86_exe).+ Python 3\.7
lock 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
lock 2024.10.22 py2exe .+
lock 2024.10.22 linux_(?:armv7l|aarch64)_exe .+-glibc2\.(?:[12]?\d|30)\b
lock 2024.10.22 (?!\w+_exe).+ Python 3\.8
lock 2024.10.22 win(?:_x86)?_exe Python 3\.[78].+ Windows-(?:7-|2008ServerR2)
'''

TEST_LOCKFILE_V2_TMPL = r'''%s
lockV2 yt-dlp/yt-dlp 2022.08.18.36 .+ Python 3\.6
lockV2 yt-dlp/yt-dlp 2023.11.16 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 yt-dlp/yt-dlp 2024.10.22 py2exe .+
lockV2 yt-dlp/yt-dlp 2024.10.22 linux_(?:armv7l|aarch64)_exe .+-glibc2\.(?:[12]?\d|30)\b
lockV2 yt-dlp/yt-dlp 2024.10.22 (?!\w+_exe).+ Python 3\.8
lockV2 yt-dlp/yt-dlp 2024.10.22 win(?:_x86)?_exe Python 3\.[78].+ Windows-(?:7-|2008ServerR2)
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 yt-dlp/yt-dlp-nightly-builds 2024.10.22.051025 py2exe .+
lockV2 yt-dlp/yt-dlp-nightly-builds 2024.10.22.051025 linux_(?:armv7l|aarch64)_exe .+-glibc2\.(?:[12]?\d|30)\b
lockV2 yt-dlp/yt-dlp-nightly-builds 2024.10.22.051025 (?!\w+_exe).+ Python 3\.8
lockV2 yt-dlp/yt-dlp-nightly-builds 2024.10.22.051025 win(?:_x86)?_exe Python 3\.[78].+ Windows-(?:7-|2008ServerR2)
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 yt-dlp/yt-dlp-master-builds 2024.10.22.045052 py2exe .+
lockV2 yt-dlp/yt-dlp-master-builds 2024.10.22.060347 linux_(?:armv7l|aarch64)_exe .+-glibc2\.(?:[12]?\d|30)\b
lockV2 yt-dlp/yt-dlp-master-builds 2024.10.22.060347 (?!\w+_exe).+ Python 3\.8
lockV2 yt-dlp/yt-dlp-master-builds 2024.10.22.060347 win(?:_x86)?_exe Python 3\.[78].+ Windows-(?:7-|2008ServerR2)
'''

TEST_LOCKFILE_V2 = TEST_LOCKFILE_V2_TMPL % TEST_LOCKFILE_COMMENT

TEST_LOCKFILE_ACTUAL = TEST_LOCKFILE_V2_TMPL % TEST_LOCKFILE_V1.rstrip('\n')

TEST_LOCKFILE_FORK = rf'''{TEST_LOCKFILE_ACTUAL}# Test if a fork blocks updates to non-numeric tags
lockV2 fork/yt-dlp pr0000 .+ Python 3.6
lockV2 fork/yt-dlp pr1234 (?!win_x86_exe).+ Python 3\.7
lockV2 fork/yt-dlp pr1234 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 fork/yt-dlp pr9999 .+ Python 3.11
'''
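
# How the lock lines above are read (as inferred from test_update_spec below):
# each entry is `lock <max_tag> <identifier regex>` (or
# `lockV2 <repo> <max_tag> <regex>`); an updater whose identifier, e.g.
# 'zip Python 3.6.0', matches the regex is held at <max_tag>, and an exact
# --update-to request beyond it is refused (the tests then expect None).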


class FakeUpdater(Updater):
    current_version = '2022.01.01'
    current_commit = 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa'

    _channel = 'stable'
    _origin = 'yt-dlp/yt-dlp'
    _update_sources = TEST_UPDATE_SOURCES

    def _download_update_spec(self, *args, **kwargs):
        return TEST_LOCKFILE_ACTUAL

    def _call_api(self, tag):
        tag = f'tags/{tag}' if tag != 'latest' else tag
        return TEST_API_DATA[f'{self.requested_repo}/{tag}']

    def _report_error(self, msg, *args, **kwargs):
        report_warning(msg)


class TestUpdate(unittest.TestCase):
    maxDiff = None

    def test_update_spec(self):
        ydl = FakeYDL()
        updater = FakeUpdater(ydl, 'stable')

        def test(lockfile, identifier, input_tag, expect_tag, exact=False, repo='yt-dlp/yt-dlp'):
            updater._identifier = identifier
            updater._exact = exact
            updater.requested_repo = repo
            result = updater._process_update_spec(lockfile, input_tag)
            self.assertEqual(
                result, expect_tag,
                f'{identifier!r} requesting {repo}@{input_tag} (exact={exact}) '
                f'returned {result!r} instead of {expect_tag!r}')

        for lockfile in (TEST_LOCKFILE_V1, TEST_LOCKFILE_V2, TEST_LOCKFILE_ACTUAL, TEST_LOCKFILE_FORK):
            # Normal operation
            test(lockfile, 'zip Python 3.12.0', '2023.12.31', '2023.12.31')
            test(lockfile, 'zip Python 3.12.0', '2023.12.31', '2023.12.31', exact=True)
            # py2exe should never update beyond 2024.10.22
            test(lockfile, 'py2exe Python 3.8', '2025.01.01', '2024.10.22')
            test(lockfile, 'py2exe Python 3.8', '2025.01.01', None, exact=True)
            # Python 3.6 --update should update only to the py3.6 lock
            test(lockfile, 'zip Python 3.6.0', '2023.11.16', '2022.08.18.36')
            # Python 3.6 --update-to an exact version later than the py3.6 lock should return None
            test(lockfile, 'zip Python 3.6.0', '2023.11.16', None, exact=True)
            # Python 3.7 should be able to update to the py3.7 lock
            test(lockfile, 'zip Python 3.7.0', '2023.11.16', '2023.11.16')
            test(lockfile, 'zip Python 3.7.1', '2023.11.16', '2023.11.16', exact=True)
            # Non-win_x86_exe builds on py3.7 must be locked at the py3.7 lock
            test(lockfile, 'zip Python 3.7.1', '2023.12.31', '2023.11.16')
            test(lockfile, 'zip Python 3.7.1', '2023.12.31', None, exact=True)
            # Python 3.8 should only update to the py3.8 lock
            test(lockfile, 'zip Python 3.8.10', '2025.01.01', '2024.10.22')
            test(lockfile, 'zip Python 3.8.110', '2025.01.01', None, exact=True)
            test(  # Windows Vista w/ win_x86_exe must be locked at the Vista lock
                lockfile, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-Vista-6.0.6003-SP2',
                '2023.12.31', '2023.11.16')
            test(  # Windows 2008Server w/ win_x86_exe must be locked at the Vista lock
                lockfile, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-2008Server',
                '2023.12.31', None, exact=True)
            test(  # Windows 7 w/ win_x86_exe py3.7 build should be able to update beyond the py3.7 lock
                lockfile, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-7-6.1.7601-SP1',
                '2023.12.31', '2023.12.31', exact=True)
            test(  # Windows 7 win_x86_exe should only update to the Win7 lock
                lockfile, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-7-6.1.7601-SP1',
                '2025.01.01', '2024.10.22')
            test(  # Windows 2008ServerR2 win_exe should only update to the Win7 lock
                lockfile, 'win_exe Python 3.8.10 (CPython x86 32bit) - Windows-2008ServerR2',
                '2025.12.31', '2024.10.22')
            test(  # Windows 8.1 w/ '2008Server' in the platform string should be able to update beyond the py3.7 lock
                lockfile, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-post2008Server-6.2.9200',
                '2023.12.31', '2023.12.31', exact=True)
            test(  # win_exe built w/ Python 3.8 on Windows >= 8 should be able to update beyond the py3.8 lock
                lockfile, 'win_exe Python 3.8.10 (CPython AMD64 64bit) - Windows-10-10.0.20348-SP0',
                '2025.01.01', '2025.01.01', exact=True)
            test(  # linux_armv7l_exe w/ glibc2.7 should only update to the glibc<2.31 lock
                lockfile, 'linux_armv7l_exe Python 3.8.0 (CPython armv7l 32bit) - Linux-6.5.0-1025-azure-armv7l-with-glibc2.7',
                '2025.01.01', '2024.10.22')
            test(  # linux_armv7l_exe w/ Python 3.8 and glibc>=2.31 should be able to update beyond the py3.8 and glibc<2.31 locks
                lockfile, 'linux_armv7l_exe Python 3.8.0 (CPython armv7l 32bit) - Linux-6.5.0-1025-azure-armv7l-with-glibc2.31',
                '2025.01.01', '2025.01.01')
            test(  # linux_armv7l_exe w/ glibc2.30 should only update to the glibc<2.31 lock
                lockfile, 'linux_armv7l_exe Python 3.8.0 (CPython armv7l 64bit) - Linux-6.5.0-1025-azure-aarch64-with-glibc2.30 (OpenSSL',
                '2025.01.01', '2024.10.22')
            test(  # linux_aarch64_exe w/ glibc2.17 should only update to the glibc<2.31 lock
                lockfile, 'linux_aarch64_exe Python 3.8.0 (CPython aarch64 64bit) - Linux-6.5.0-1025-azure-aarch64-with-glibc2.17',
                '2025.01.01', '2024.10.22')
            test(  # linux_aarch64_exe w/ glibc2.40 (>=2.31) should be able to update beyond the py3.8 and glibc<2.31 locks
                lockfile, 'linux_aarch64_exe Python 3.8.0 (CPython aarch64 64bit) - Linux-6.5.0-1025-azure-aarch64-with-glibc2.40',
                '2025.01.01', '2025.01.01')
            test(  # linux_aarch64_exe w/ glibc2.3 should only update to the glibc<2.31 lock
                lockfile, 'linux_aarch64_exe Python 3.8.0 (CPython aarch64 64bit) - Linux-6.5.0-1025-azure-aarch64-with-glibc2.3 (OpenSSL',
                '2025.01.01', '2024.10.22')

        # Forks can block updates to non-numeric tags rather than lock
        test(TEST_LOCKFILE_FORK, 'zip Python 3.6.3', 'pr0000', None, repo='fork/yt-dlp')
        test(TEST_LOCKFILE_FORK, 'zip Python 3.7.4', 'pr0000', 'pr0000', repo='fork/yt-dlp')
        test(TEST_LOCKFILE_FORK, 'zip Python 3.7.4', 'pr1234', None, repo='fork/yt-dlp')
        test(TEST_LOCKFILE_FORK, 'zip Python 3.8.1', 'pr1234', 'pr1234', repo='fork/yt-dlp', exact=True)
        test(
            TEST_LOCKFILE_FORK, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-Vista-6.0.6003-SP2',
            'pr1234', None, repo='fork/yt-dlp')
        test(
            TEST_LOCKFILE_FORK, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-7-6.1.7601-SP1',
            '2023.12.31', '2023.12.31', repo='fork/yt-dlp')
        test(TEST_LOCKFILE_FORK, 'zip Python 3.11.2', 'pr9999', None, repo='fork/yt-dlp', exact=True)
        test(TEST_LOCKFILE_FORK, 'zip Python 3.12.0', 'pr9999', 'pr9999', repo='fork/yt-dlp')

    def test_query_update(self):
        ydl = FakeYDL()

        def test(target, expected, current_version=None, current_commit=None, identifier=None):
            updater = FakeUpdater(ydl, target)
            if current_version:
                updater.current_version = current_version
            if current_commit:
                updater.current_commit = current_commit
            updater._identifier = identifier or 'zip'
            update_info = updater.query_update(_output=True)
            self.assertDictEqual(
                update_info.__dict__ if update_info else {}, expected.__dict__ if expected else {})
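
        # `target` here is either 'owner/repo@tag', a bare channel name
        # ('stable'/'nightly'/'master'), or a bare tag -- mirroring the
        # --update-to syntax exercised below.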

        test('yt-dlp/yt-dlp@latest', UpdateInfo(
            '2023.12.31', version='2023.12.31', requested_version='2023.12.31', commit='b' * 40))
        test('yt-dlp/yt-dlp-nightly-builds@latest', UpdateInfo(
            '2023.12.31.123456', version='2023.12.31.123456', requested_version='2023.12.31.123456', commit='c' * 40))
        test('yt-dlp/yt-dlp-master-builds@latest', UpdateInfo(
            '2023.12.31.987654', version='2023.12.31.987654', requested_version='2023.12.31.987654', commit='d' * 40))
        test('fork/yt-dlp@latest', UpdateInfo(
            '2050.12.31', version='2050.12.31', requested_version='2050.12.31', commit='e' * 40))
        test('fork/yt-dlp@pr0000', UpdateInfo(
            'pr0000', version='2023.11.11.000000', requested_version='2023.11.11.000000', commit='f' * 40))
        test('fork/yt-dlp@pr1234', UpdateInfo(
            'pr1234', version='2023.12.31.555555', requested_version='2023.12.31.555555', commit='0' * 40))
        test('fork/yt-dlp@pr9999', UpdateInfo(
            'pr9999', version=None, requested_version=None, commit='1' * 40))
        test('fork/yt-dlp-satellite@pr987', UpdateInfo(
            'pr987', version=None, requested_version=None, commit='2' * 40))
        test('yt-dlp/yt-dlp', None, current_version='2024.01.01')
        test('stable', UpdateInfo(
            '2023.12.31', version='2023.12.31', requested_version='2023.12.31', commit='b' * 40))
        test('nightly', UpdateInfo(
            '2023.12.31.123456', version='2023.12.31.123456', requested_version='2023.12.31.123456', commit='c' * 40))
        test('master', UpdateInfo(
            '2023.12.31.987654', version='2023.12.31.987654', requested_version='2023.12.31.987654', commit='d' * 40))
        test('testing', None, current_commit='9' * 40)
        test('testing', UpdateInfo('testing', commit='9' * 40))


if __name__ == '__main__':
    unittest.main()

@@ -1,30 +0,0 @@
#!/usr/bin/env python3

# Allow direct execution
import os
import sys
import unittest

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))


import json

from yt_dlp.update import rsa_verify


class TestUpdate(unittest.TestCase):
    def test_rsa_verify(self):
        UPDATES_RSA_KEY = (0x9d60ee4d8f805312fdb15a62f87b95bd66177b91df176765d13514a0f1754bcd2057295c5b6f1d35daa6742c3ffc9a82d3e118861c207995a8031e151d863c9927e304576bc80692bc8e094896fcf11b66f3e29e04e3a71e9a11558558acea1840aec37fc396fb6b65dc81a1c4144e03bd1c011de62e3f1357b327d08426fe93, 65537)
        with open(os.path.join(os.path.dirname(os.path.abspath(__file__)), 'versions.json'), 'rb') as f:
            versions_info = f.read().decode()
        versions_info = json.loads(versions_info)
        signature = versions_info['signature']
        del versions_info['signature']
        self.assertTrue(rsa_verify(
            json.dumps(versions_info, sort_keys=True).encode(),
            signature, UPDATES_RSA_KEY))


if __name__ == '__main__':
    unittest.main()
@ -2,9 +2,10 @@
|
|||
|
||||
# Allow direct execution
|
||||
import os
|
||||
import re
|
||||
import sys
|
||||
import unittest
|
||||
import warnings
|
||||
import datetime as dt
|
||||
|
||||
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
||||
|
||||
|
@ -13,6 +14,7 @@
|
|||
import io
|
||||
import itertools
|
||||
import json
|
||||
import subprocess
|
||||
import xml.etree.ElementTree
|
||||
|
||||
from yt_dlp.compat import (
|
||||
|
@ -26,7 +28,9 @@
|
|||
ExtractorError,
|
||||
InAdvancePagedList,
|
||||
LazyList,
|
||||
NO_DEFAULT,
|
||||
OnDemandPagedList,
|
||||
Popen,
|
||||
age_restricted,
|
||||
args_to_str,
|
||||
base_url,
|
||||
|
@ -42,14 +46,12 @@
|
|||
determine_ext,
|
||||
determine_file_encoding,
|
||||
dfxp2srt,
|
||||
dict_get,
|
||||
encode_base_n,
|
||||
encode_compat_str,
|
||||
encodeFilename,
|
||||
escape_rfc3986,
|
||||
escape_url,
|
||||
expand_path,
|
||||
extract_attributes,
|
||||
extract_basic_auth,
|
||||
find_xpath_attr,
|
||||
fix_xml_ampersands,
|
||||
float_or_none,
|
||||
|
@ -102,7 +104,6 @@
|
|||
sanitize_filename,
|
||||
sanitize_path,
|
||||
sanitize_url,
|
||||
sanitized_Request,
|
||||
shell_quote,
|
||||
smuggle_url,
|
||||
str_to_int,
|
||||
|
@ -110,7 +111,7 @@
|
|||
strip_or_none,
|
||||
subtitles_filename,
|
||||
timeconvert,
|
||||
traverse_obj,
|
||||
try_call,
|
||||
unescapeHTML,
|
||||
unified_strdate,
|
||||
unified_timestamp,
|
||||
|
@ -122,12 +123,20 @@
|
|||
urlencode_postdata,
|
||||
urljoin,
|
||||
urshift,
|
||||
variadic,
|
||||
version_tuple,
|
||||
xpath_attr,
|
||||
xpath_element,
|
||||
xpath_text,
|
||||
xpath_with_ns,
|
||||
)
|
||||
from yt_dlp.utils._utils import _UnsafeExtensionError
|
||||
from yt_dlp.utils.networking import (
|
||||
HTTPHeaderDict,
|
||||
escape_rfc3986,
|
||||
normalize_url,
|
||||
remove_dot_segments,
|
||||
)
|
||||
|
||||
|
||||
class TestUtil(unittest.TestCase):
|
||||
|
@ -212,9 +221,10 @@ def test_sanitize_ids(self):
|
|||
self.assertEqual(sanitize_filename('N0Y__7-UOdI', is_id=True), 'N0Y__7-UOdI')
|
||||
|
||||
def test_sanitize_path(self):
|
||||
if sys.platform != 'win32':
|
||||
return
|
||||
with unittest.mock.patch('sys.platform', 'win32'):
|
||||
self._test_sanitize_path()
|
||||
|
||||
def _test_sanitize_path(self):
|
||||
self.assertEqual(sanitize_path('abc'), 'abc')
|
||||
self.assertEqual(sanitize_path('abc/def'), 'abc\\def')
|
||||
self.assertEqual(sanitize_path('abc\\def'), 'abc\\def')
|
||||
|
@ -247,6 +257,11 @@ def test_sanitize_path(self):
|
|||
self.assertEqual(sanitize_path('./abc'), 'abc')
|
||||
self.assertEqual(sanitize_path('./../abc'), '..\\abc')
|
||||
|
||||
self.assertEqual(sanitize_path('\\abc'), '\\abc')
|
||||
self.assertEqual(sanitize_path('C:abc'), 'C:abc')
|
||||
self.assertEqual(sanitize_path('C:abc\\..\\'), 'C:..')
|
||||
self.assertEqual(sanitize_path('C:\\abc:%(title)s.%(ext)s'), 'C:\\abc#%(title)s.%(ext)s')
|
||||
|
||||
def test_sanitize_url(self):
|
||||
self.assertEqual(sanitize_url('//foo.bar'), 'http://foo.bar')
|
||||
self.assertEqual(sanitize_url('httpss://foo.bar'), 'https://foo.bar')
|
||||
|
@ -254,15 +269,6 @@ def test_sanitize_url(self):
|
|||
self.assertEqual(sanitize_url('https://foo.bar'), 'https://foo.bar')
|
||||
self.assertEqual(sanitize_url('foo bar'), 'foo bar')
|
||||
|
||||
def test_extract_basic_auth(self):
|
||||
auth_header = lambda url: sanitized_Request(url).get_header('Authorization')
|
||||
self.assertFalse(auth_header('http://foo.bar'))
|
||||
self.assertFalse(auth_header('http://:foo.bar'))
|
||||
self.assertEqual(auth_header('http://@foo.bar'), 'Basic Og==')
|
||||
self.assertEqual(auth_header('http://:pass@foo.bar'), 'Basic OnBhc3M=')
|
||||
self.assertEqual(auth_header('http://user:@foo.bar'), 'Basic dXNlcjo=')
|
||||
self.assertEqual(auth_header('http://user:pass@foo.bar'), 'Basic dXNlcjpwYXNz')
|
||||
|
||||
def test_expand_path(self):
|
||||
def env(var):
|
||||
return f'%{var}%' if sys.platform == 'win32' else f'${var}'
|
||||
|
@ -277,11 +283,18 @@ def env(var):
|
|||
self.assertEqual(expand_path(env('HOME')), os.getenv('HOME'))
|
||||
self.assertEqual(expand_path('~'), os.getenv('HOME'))
|
||||
self.assertEqual(
|
||||
expand_path('~/%s' % env('yt_dlp_EXPATH_PATH')),
|
||||
'%s/expanded' % os.getenv('HOME'))
|
||||
expand_path('~/{}'.format(env('yt_dlp_EXPATH_PATH'))),
|
||||
'{}/expanded'.format(os.getenv('HOME')))
|
||||
finally:
|
||||
os.environ['HOME'] = old_home or ''
|
||||
|
||||
_uncommon_extensions = [
|
||||
('exe', 'abc.exe.ext'),
|
||||
('de', 'abc.de.ext'),
|
||||
('../.mp4', None),
|
||||
('..\\.mp4', None),
|
||||
]
|
||||
|
||||
def test_prepend_extension(self):
|
||||
self.assertEqual(prepend_extension('abc.ext', 'temp'), 'abc.temp.ext')
|
||||
self.assertEqual(prepend_extension('abc.ext', 'temp', 'ext'), 'abc.temp.ext')
|
||||
|
@ -290,6 +303,19 @@ def test_prepend_extension(self):
|
|||
self.assertEqual(prepend_extension('.abc', 'temp'), '.abc.temp')
|
||||
self.assertEqual(prepend_extension('.abc.ext', 'temp'), '.abc.temp.ext')
|
||||
|
||||
# Test uncommon extensions
|
||||
self.assertEqual(prepend_extension('abc.ext', 'bin'), 'abc.bin.ext')
|
||||
for ext, result in self._uncommon_extensions:
|
||||
with self.assertRaises(_UnsafeExtensionError):
|
||||
prepend_extension('abc', ext)
|
||||
if result:
|
||||
self.assertEqual(prepend_extension('abc.ext', ext, 'ext'), result)
|
||||
else:
|
||||
with self.assertRaises(_UnsafeExtensionError):
|
||||
prepend_extension('abc.ext', ext, 'ext')
|
||||
with self.assertRaises(_UnsafeExtensionError):
|
||||
prepend_extension('abc.unexpected_ext', ext, 'ext')
|
||||
|
||||
def test_replace_extension(self):
|
||||
self.assertEqual(replace_extension('abc.ext', 'temp'), 'abc.temp')
|
||||
self.assertEqual(replace_extension('abc.ext', 'temp', 'ext'), 'abc.temp')
|
||||
|
@ -298,6 +324,16 @@ def test_replace_extension(self):
|
|||
self.assertEqual(replace_extension('.abc', 'temp'), '.abc.temp')
|
||||
self.assertEqual(replace_extension('.abc.ext', 'temp'), '.abc.temp')
|
||||
|
||||
# Test uncommon extensions
|
||||
self.assertEqual(replace_extension('abc.ext', 'bin'), 'abc.unknown_video')
|
||||
for ext, _ in self._uncommon_extensions:
|
||||
with self.assertRaises(_UnsafeExtensionError):
|
||||
replace_extension('abc', ext)
|
||||
with self.assertRaises(_UnsafeExtensionError):
|
||||
replace_extension('abc.ext', ext, 'ext')
|
||||
with self.assertRaises(_UnsafeExtensionError):
|
||||
replace_extension('abc.unexpected_ext', ext, 'ext')
|
||||
|
||||
def test_subtitles_filename(self):
|
||||
self.assertEqual(subtitles_filename('abc.ext', 'en', 'vtt'), 'abc.en.vtt')
|
||||
self.assertEqual(subtitles_filename('abc.ext', 'en', 'vtt', 'ext'), 'abc.en.vtt')
|
||||
|
@ -357,12 +393,12 @@ def test_datetime_from_str(self):
|
|||
self.assertEqual(datetime_from_str('now+23hours', precision='hour'), datetime_from_str('now+23hours', precision='auto'))
|
||||
|
||||
def test_daterange(self):
|
||||
_20century = DateRange("19000101", "20000101")
|
||||
self.assertFalse("17890714" in _20century)
|
||||
_ac = DateRange("00010101")
|
||||
self.assertTrue("19690721" in _ac)
|
||||
_firstmilenium = DateRange(end="10000101")
|
||||
self.assertTrue("07110427" in _firstmilenium)
|
||||
_20century = DateRange('19000101', '20000101')
|
||||
self.assertFalse('17890714' in _20century)
|
||||
_ac = DateRange('00010101')
|
||||
self.assertTrue('19690721' in _ac)
|
||||
_firstmilenium = DateRange(end='10000101')
|
||||
self.assertTrue('07110427' in _firstmilenium)
|
||||
|
||||
def test_unified_dates(self):
|
||||
self.assertEqual(unified_strdate('December 21, 2010'), '20101221')
|
||||
|
@ -414,6 +450,8 @@ def test_unified_timestamps(self):
|
|||
self.assertEqual(unified_timestamp('Sep 11, 2013 | 5:49 AM'), 1378878540)
|
||||
self.assertEqual(unified_timestamp('December 15, 2017 at 7:49 am'), 1513324140)
|
||||
self.assertEqual(unified_timestamp('2018-03-14T08:32:43.1493874+00:00'), 1521016363)
|
||||
self.assertEqual(unified_timestamp('Sunday, 26 Nov 2006, 19:00'), 1164567600)
|
||||
self.assertEqual(unified_timestamp('wed, aug 16, 2008, 12:00pm'), 1218931200)
|
||||
|
||||
self.assertEqual(unified_timestamp('December 31 1969 20:00:01 EDT'), 1)
|
||||
self.assertEqual(unified_timestamp('Wednesday 31 December 1969 18:01:26 MDT'), 86)
|
||||
|
@ -507,7 +545,7 @@ def test_xpath_attr(self):
|
|||
self.assertRaises(ExtractorError, xpath_attr, doc, 'div/p', 'y', fatal=True)
|
||||
|
||||
def test_smuggle_url(self):
|
||||
data = {"ö": "ö", "abc": [3]}
|
||||
data = {'ö': 'ö', 'abc': [3]}
|
||||
url = 'https://foo.bar/baz?x=y#a'
|
||||
smug_url = smuggle_url(url, data)
|
||||
unsmug_url, unsmug_data = unsmuggle_url(smug_url)
|
||||
|
@ -659,6 +697,8 @@ def test_parse_duration(self):
|
|||
self.assertEqual(parse_duration('P0Y0M0DT0H4M20.880S'), 260.88)
|
||||
self.assertEqual(parse_duration('01:02:03:050'), 3723.05)
|
||||
self.assertEqual(parse_duration('103:050'), 103.05)
|
||||
self.assertEqual(parse_duration('1HR 3MIN'), 3780)
|
||||
self.assertEqual(parse_duration('2hrs 3mins'), 7380)
|
||||
|
||||
def test_fix_xml_ampersands(self):
|
||||
self.assertEqual(
|
||||
|
@ -752,28 +792,6 @@ def test_multipart_encode(self):
|
|||
self.assertRaises(
|
||||
ValueError, multipart_encode, {b'field': b'value'}, boundary='value')
|
||||
|
||||
def test_dict_get(self):
|
||||
FALSE_VALUES = {
|
||||
'none': None,
|
||||
'false': False,
|
||||
'zero': 0,
|
||||
'empty_string': '',
|
||||
'empty_list': [],
|
||||
}
|
||||
d = FALSE_VALUES.copy()
|
||||
d['a'] = 42
|
||||
self.assertEqual(dict_get(d, 'a'), 42)
|
||||
self.assertEqual(dict_get(d, 'b'), None)
|
||||
self.assertEqual(dict_get(d, 'b', 42), 42)
|
||||
self.assertEqual(dict_get(d, ('a', )), 42)
|
||||
self.assertEqual(dict_get(d, ('b', 'a', )), 42)
|
||||
self.assertEqual(dict_get(d, ('b', 'c', 'a', 'd', )), 42)
|
||||
self.assertEqual(dict_get(d, ('b', 'c', )), None)
|
||||
self.assertEqual(dict_get(d, ('b', 'c', ), 42), 42)
|
||||
for key, false_value in FALSE_VALUES.items():
|
||||
self.assertEqual(dict_get(d, ('b', 'c', key, )), None)
|
||||
self.assertEqual(dict_get(d, ('b', 'c', key, ), skip_false_values=False), false_value)
|
||||
|
||||
def test_merge_dicts(self):
|
||||
self.assertEqual(merge_dicts({'a': 1}, {'b': 2}), {'a': 1, 'b': 2})
|
||||
self.assertEqual(merge_dicts({'a': 1}, {'a': 2}), {'a': 1})
|
||||
|
@ -791,6 +809,11 @@ def test_encode_compat_str(self):
|
|||
|
||||
def test_parse_iso8601(self):
|
||||
self.assertEqual(parse_iso8601('2014-03-23T23:04:26+0100'), 1395612266)
|
||||
self.assertEqual(parse_iso8601('2014-03-23T23:04:26-07:00'), 1395641066)
|
||||
self.assertEqual(parse_iso8601('2014-03-23T23:04:26', timezone=dt.timedelta(hours=-7)), 1395641066)
|
||||
self.assertEqual(parse_iso8601('2014-03-23T23:04:26', timezone=NO_DEFAULT), None)
|
||||
# default does not override timezone in date_str
|
||||
self.assertEqual(parse_iso8601('2014-03-23T23:04:26-07:00', timezone=dt.timedelta(hours=-10)), 1395641066)
|
||||
self.assertEqual(parse_iso8601('2014-03-23T22:04:26+0000'), 1395612266)
|
||||
self.assertEqual(parse_iso8601('2014-03-23T22:04:26Z'), 1395612266)
|
||||
self.assertEqual(parse_iso8601('2014-03-23T22:04:26.1234Z'), 1395612266)
|
||||
|
@ -800,7 +823,7 @@ def test_parse_iso8601(self):
|
|||
def test_strip_jsonp(self):
|
||||
stripped = strip_jsonp('cb ([ {"id":"532cb",\n\n\n"x":\n3}\n]\n);')
|
||||
d = json.loads(stripped)
|
||||
self.assertEqual(d, [{"id": "532cb", "x": 3}])
|
||||
self.assertEqual(d, [{'id': '532cb', 'x': 3}])
|
||||
|
||||
stripped = strip_jsonp('parseMetadata({"STATUS":"OK"})\n\n\n//epc')
|
||||
d = json.loads(stripped)
|
||||
|
@ -904,6 +927,11 @@ def test_parse_codecs(self):
|
|||
'acodec': 'none',
|
||||
'dynamic_range': 'HDR10',
|
||||
})
|
||||
self.assertEqual(parse_codecs('vp09.02.50.10.01.09.18.09.00'), {
|
||||
'vcodec': 'vp09.02.50.10.01.09.18.09.00',
|
||||
'acodec': 'none',
|
||||
'dynamic_range': 'HDR10',
|
||||
})
|
||||
self.assertEqual(parse_codecs('av01.0.12M.10.0.110.09.16.09.0'), {
|
||||
'vcodec': 'av01.0.12M.10.0.110.09.16.09.0',
|
||||
'acodec': 'none',
|
||||
|
@ -914,6 +942,11 @@ def test_parse_codecs(self):
|
|||
'acodec': 'none',
|
||||
'dynamic_range': 'DV',
|
||||
})
|
||||
self.assertEqual(parse_codecs('fLaC'), {
|
||||
'vcodec': 'none',
|
||||
'acodec': 'flac',
|
||||
'dynamic_range': None,
|
||||
})
|
||||
self.assertEqual(parse_codecs('theora, vorbis'), {
|
||||
'vcodec': 'theora',
|
||||
'acodec': 'vorbis',
|
||||
|
@@ -935,24 +968,124 @@ def test_escape_rfc3986(self):
         self.assertEqual(escape_rfc3986('foo bar'), 'foo%20bar')
         self.assertEqual(escape_rfc3986('foo%20bar'), 'foo%20bar')
 
-    def test_escape_url(self):
+    def test_normalize_url(self):
         self.assertEqual(
-            escape_url('http://wowza.imust.org/srv/vod/telemb/new/UPLOAD/UPLOAD/20224_IncendieHavré_FD.mp4'),
-            'http://wowza.imust.org/srv/vod/telemb/new/UPLOAD/UPLOAD/20224_IncendieHavre%CC%81_FD.mp4'
+            normalize_url('http://wowza.imust.org/srv/vod/telemb/new/UPLOAD/UPLOAD/20224_IncendieHavré_FD.mp4'),
+            'http://wowza.imust.org/srv/vod/telemb/new/UPLOAD/UPLOAD/20224_IncendieHavre%CC%81_FD.mp4',
         )
         self.assertEqual(
-            escape_url('http://www.ardmediathek.de/tv/Sturm-der-Liebe/Folge-2036-Zu-Mann-und-Frau-erklärt/Das-Erste/Video?documentId=22673108&bcastId=5290'),
-            'http://www.ardmediathek.de/tv/Sturm-der-Liebe/Folge-2036-Zu-Mann-und-Frau-erkl%C3%A4rt/Das-Erste/Video?documentId=22673108&bcastId=5290'
+            normalize_url('http://www.ardmediathek.de/tv/Sturm-der-Liebe/Folge-2036-Zu-Mann-und-Frau-erklärt/Das-Erste/Video?documentId=22673108&bcastId=5290'),
+            'http://www.ardmediathek.de/tv/Sturm-der-Liebe/Folge-2036-Zu-Mann-und-Frau-erkl%C3%A4rt/Das-Erste/Video?documentId=22673108&bcastId=5290',
         )
         self.assertEqual(
-            escape_url('http://тест.рф/фрагмент'),
-            'http://xn--e1aybc.xn--p1ai/%D1%84%D1%80%D0%B0%D0%B3%D0%BC%D0%B5%D0%BD%D1%82'
+            normalize_url('http://тест.рф/фрагмент'),
+            'http://xn--e1aybc.xn--p1ai/%D1%84%D1%80%D0%B0%D0%B3%D0%BC%D0%B5%D0%BD%D1%82',
         )
         self.assertEqual(
-            escape_url('http://тест.рф/абв?абв=абв#абв'),
-            'http://xn--e1aybc.xn--p1ai/%D0%B0%D0%B1%D0%B2?%D0%B0%D0%B1%D0%B2=%D0%B0%D0%B1%D0%B2#%D0%B0%D0%B1%D0%B2'
+            normalize_url('http://тест.рф/абв?абв=абв#абв'),
+            'http://xn--e1aybc.xn--p1ai/%D0%B0%D0%B1%D0%B2?%D0%B0%D0%B1%D0%B2=%D0%B0%D0%B1%D0%B2#%D0%B0%D0%B1%D0%B2',
         )
+        self.assertEqual(normalize_url('http://vimeo.com/56015672#at=0'), 'http://vimeo.com/56015672#at=0')
+
+        self.assertEqual(normalize_url('http://www.example.com/../a/b/../c/./d.html'), 'http://www.example.com/a/c/d.html')
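`normalize_url` (the renamed `escape_url`) punycodes the hostname, percent-encodes non-ASCII path, query and fragment parts, and, per the new assertions, also resolves `.`/`..` segments. A sketch using values from the tests above:

    from yt_dlp.utils.networking import normalize_url  # import path as in current yt-dlp

    normalize_url('http://тест.рф/фрагмент')
    # -> 'http://xn--e1aybc.xn--p1ai/%D1%84%D1%80%D0%B0%D0%B3%D0%BC%D0%B5%D0%BD%D1%82'
    normalize_url('http://www.example.com/../a/b/../c/./d.html')
    # -> 'http://www.example.com/a/c/d.html'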
+
+    def test_remove_dot_segments(self):
+        self.assertEqual(remove_dot_segments('/a/b/c/./../../g'), '/a/g')
+        self.assertEqual(remove_dot_segments('mid/content=5/../6'), 'mid/6')
+        self.assertEqual(remove_dot_segments('/ad/../cd'), '/cd')
+        self.assertEqual(remove_dot_segments('/ad/../cd/'), '/cd/')
+        self.assertEqual(remove_dot_segments('/..'), '/')
+        self.assertEqual(remove_dot_segments('/./'), '/')
+        self.assertEqual(remove_dot_segments('/./a'), '/a')
+        self.assertEqual(remove_dot_segments('/abc/./.././d/././e/.././f/./../../ghi'), '/ghi')
+        self.assertEqual(remove_dot_segments('/'), '/')
+        self.assertEqual(remove_dot_segments('/t'), '/t')
+        self.assertEqual(remove_dot_segments('t'), 't')
+        self.assertEqual(remove_dot_segments(''), '')
+        self.assertEqual(remove_dot_segments('/../a/b/c'), '/a/b/c')
+        self.assertEqual(remove_dot_segments('../a'), 'a')
+        self.assertEqual(remove_dot_segments('./a'), 'a')
+        self.assertEqual(remove_dot_segments('.'), '')
+        self.assertEqual(remove_dot_segments('////'), '////')
+
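`remove_dot_segments` is the RFC 3986 §5.2.4 path-normalisation routine that the new test pins down edge case by edge case. A short sketch:

    from yt_dlp.utils.networking import remove_dot_segments  # import path as in current yt-dlp

    remove_dot_segments('/a/b/c/./../../g')    # -> '/a/g'
    remove_dot_segments('mid/content=5/../6')  # -> 'mid/6'
    remove_dot_segments('/..')                 # -> '/'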
+    def test_js_to_json_vars_strings(self):
+        self.assertDictEqual(
+            json.loads(js_to_json(
+                '''{
+                    'null': a,
+                    'nullStr': b,
+                    'true': c,
+                    'trueStr': d,
+                    'false': e,
+                    'falseStr': f,
+                    'unresolvedVar': g,
+                }''',
+                {
+                    'a': 'null',
+                    'b': '"null"',
+                    'c': 'true',
+                    'd': '"true"',
+                    'e': 'false',
+                    'f': '"false"',
+                    'g': 'var',
+                },
+            )),
+            {
+                'null': None,
+                'nullStr': 'null',
+                'true': True,
+                'trueStr': 'true',
+                'false': False,
+                'falseStr': 'false',
+                'unresolvedVar': 'var',
+            },
+        )
+
+        self.assertDictEqual(
+            json.loads(js_to_json(
+                '''{
+                    'int': a,
+                    'intStr': b,
+                    'float': c,
+                    'floatStr': d,
+                }''',
+                {
+                    'a': '123',
+                    'b': '"123"',
+                    'c': '1.23',
+                    'd': '"1.23"',
+                },
+            )),
+            {
+                'int': 123,
+                'intStr': '123',
+                'float': 1.23,
+                'floatStr': '1.23',
+            },
+        )
+
+        self.assertDictEqual(
+            json.loads(js_to_json(
+                '''{
+                    'object': a,
+                    'objectStr': b,
+                    'array': c,
+                    'arrayStr': d,
+                }''',
+                {
+                    'a': '{}',
+                    'b': '"{}"',
+                    'c': '[]',
+                    'd': '"[]"',
+                },
+            )),
+            {
+                'object': {},
+                'objectStr': '{}',
+                'array': [],
+                'arrayStr': '[]',
+            },
+        )
-        self.assertEqual(escape_url('http://vimeo.com/56015672#at=0'), 'http://vimeo.com/56015672#at=0')
 
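The new `test_js_to_json_vars_strings` covers the `vars` argument of `js_to_json`: free JavaScript identifiers are substituted from the mapping before conversion, so a value of `'123'` becomes the JSON number `123` while `'"123"'` stays a string. A condensed sketch (the key name is illustrative; the substitution rules match the test):

    import json
    from yt_dlp.utils import js_to_json

    json.loads(js_to_json('{key: a}', {'a': '123'}))    # -> {'key': 123}
    json.loads(js_to_json('{key: b}', {'b': '"123"'}))  # -> {'key': '123'}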
     def test_js_to_json_realworld(self):
         inp = '''{
@@ -997,7 +1130,7 @@ def test_js_to_json_realworld(self):
 
     def test_js_to_json_edgecases(self):
         on = js_to_json("{abc_def:'1\\'\\\\2\\\\\\'3\"4'}")
-        self.assertEqual(json.loads(on), {"abc_def": "1'\\2\\'3\"4"})
+        self.assertEqual(json.loads(on), {'abc_def': "1'\\2\\'3\"4"})
 
         on = js_to_json('{"abc": true}')
         self.assertEqual(json.loads(on), {'abc': True})
@@ -1029,9 +1162,9 @@ def test_js_to_json_edgecases(self):
             'c': 0,
             'd': 42.42,
             'e': [],
-            'f': "abc",
-            'g': "",
-            '42': 42
+            'f': 'abc',
+            'g': '',
+            '42': 42,
         })
 
         on = js_to_json('["abc", "def",]')
@@ -1106,10 +1239,28 @@ def test_js_to_json_edgecases(self):
         on = js_to_json('\'"\\""\'')
         self.assertEqual(json.loads(on), '"""', msg='Unnecessary quote escape should be escaped')
 
+        on = js_to_json('[new Date("spam"), \'("eggs")\']')
+        self.assertEqual(json.loads(on), ['spam', '("eggs")'], msg='Date regex should match a single string')
+
     def test_js_to_json_malformed(self):
         self.assertEqual(js_to_json('42a1'), '42"a1"')
         self.assertEqual(js_to_json('42a-1'), '42"a"-1')
 
+    def test_js_to_json_template_literal(self):
+        self.assertEqual(js_to_json('`Hello ${name}`', {'name': '"world"'}), '"Hello world"')
+        self.assertEqual(js_to_json('`${name}${name}`', {'name': '"X"'}), '"XX"')
+        self.assertEqual(js_to_json('`${name}${name}`', {'name': '5'}), '"55"')
+        self.assertEqual(js_to_json('`${name}"${name}"`', {'name': '5'}), '"5\\"5\\""')
+        self.assertEqual(js_to_json('`${name}`', {}), '"name"')
+
+    def test_js_to_json_common_constructors(self):
+        self.assertEqual(json.loads(js_to_json('new Map([["a", 5]])')), {'a': 5})
+        self.assertEqual(json.loads(js_to_json('Array(5, 10)')), [5, 10])
+        self.assertEqual(json.loads(js_to_json('new Array(15,5)')), [15, 5])
+        self.assertEqual(json.loads(js_to_json('new Map([Array(5, 10),new Array(15,5)])')), {'5': 10, '15': 5})
+        self.assertEqual(json.loads(js_to_json('new Date("123")')), '123')
+        self.assertEqual(json.loads(js_to_json('new Date(\'2023-10-19\')')), '2023-10-19')
+
     def test_extract_attributes(self):
         self.assertEqual(extract_attributes('<e x="y">'), {'x': 'y'})
         self.assertEqual(extract_attributes("<e x='y'>"), {'x': 'y'})
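`js_to_json` likewise folds template literals, substituting `${…}` from `vars` and emitting a properly escaped JSON string, and rewrites common constructors (`new Map(…)`, `Array(…)`, `new Date(…)`) into plain JSON values. Two behaviours straight from the assertions above:

    from yt_dlp.utils import js_to_json

    js_to_json('`Hello ${name}`', {'name': '"world"'})  # -> '"Hello world"'
    js_to_json('new Map([["a", 5]])')                   # parses as {'a': 5}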
@@ -1163,7 +1314,7 @@ def test_intlist_to_bytes(self):
     def test_args_to_str(self):
         self.assertEqual(
             args_to_str(['foo', 'ba/r', '-baz', '2 be', '']),
-            'foo ba/r -baz \'2 be\' \'\'' if compat_os_name != 'nt' else 'foo ba/r -baz "2 be" ""'
+            'foo ba/r -baz \'2 be\' \'\'' if compat_os_name != 'nt' else 'foo ba/r -baz "2 be" ""',
         )
 
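`args_to_str` joins an argv list into a single display string, quoting per platform (POSIX single quotes vs. `cmd.exe` double quotes); the change above only adds a trailing comma. A sketch built from the test values:

    from yt_dlp.utils import args_to_str

    args_to_str(['foo', 'ba/r', '-baz', '2 be', ''])
    # POSIX: foo ba/r -baz '2 be' ''
    # Windows: foo ba/r -baz "2 be" ""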
     def test_parse_filesize(self):
@@ -1246,10 +1397,10 @@ def test_is_html(self):
         self.assertTrue(is_html(  # UTF-8 with BOM
             b'\xef\xbb\xbf<!DOCTYPE foo>\xaaa'))
         self.assertTrue(is_html(  # UTF-16-LE
-            b'\xff\xfe<\x00h\x00t\x00m\x00l\x00>\x00\xe4\x00'
+            b'\xff\xfe<\x00h\x00t\x00m\x00l\x00>\x00\xe4\x00',
         ))
         self.assertTrue(is_html(  # UTF-16-BE
-            b'\xfe\xff\x00<\x00h\x00t\x00m\x00l\x00>\x00\xe4'
+            b'\xfe\xff\x00<\x00h\x00t\x00m\x00l\x00>\x00\xe4',
         ))
         self.assertTrue(is_html(  # UTF-32-BE
             b'\x00\x00\xFE\xFF\x00\x00\x00<\x00\x00\x00h\x00\x00\x00t\x00\x00\x00m\x00\x00\x00l\x00\x00\x00>\x00\x00\x00\xe4'))
@@ -1745,6 +1896,8 @@ def test_iri_to_uri(self):
     def test_clean_podcast_url(self):
         self.assertEqual(clean_podcast_url('https://www.podtrac.com/pts/redirect.mp3/chtbl.com/track/5899E/traffic.megaphone.fm/HSW7835899191.mp3'), 'https://traffic.megaphone.fm/HSW7835899191.mp3')
         self.assertEqual(clean_podcast_url('https://play.podtrac.com/npr-344098539/edge1.pod.npr.org/anon.npr-podcasts/podcast/npr/waitwait/2020/10/20201003_waitwait_wwdtmpodcast201003-015621a5-f035-4eca-a9a1-7c118d90bc3c.mp3'), 'https://edge1.pod.npr.org/anon.npr-podcasts/podcast/npr/waitwait/2020/10/20201003_waitwait_wwdtmpodcast201003-015621a5-f035-4eca-a9a1-7c118d90bc3c.mp3')
+        self.assertEqual(clean_podcast_url('https://pdst.fm/e/2.gum.fm/chtbl.com/track/chrt.fm/track/34D33/pscrb.fm/rss/p/traffic.megaphone.fm/ITLLC7765286967.mp3?updated=1687282661'), 'https://traffic.megaphone.fm/ITLLC7765286967.mp3?updated=1687282661')
+        self.assertEqual(clean_podcast_url('https://pdst.fm/e/https://mgln.ai/e/441/www.buzzsprout.com/1121972/13019085-ep-252-the-deep-life-stack.mp3'), 'https://www.buzzsprout.com/1121972/13019085-ep-252-the-deep-life-stack.mp3')
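The added cases show `clean_podcast_url` unwrapping stacked measurement redirectors (pdst.fm, chtbl.com, chrt.fm, pscrb.fm, mgln.ai, …) down to the final media URL. Sketch from the assertions above:

    from yt_dlp.utils import clean_podcast_url

    clean_podcast_url('https://pdst.fm/e/https://mgln.ai/e/441/www.buzzsprout.com/1121972/13019085-ep-252-the-deep-life-stack.mp3')
    # -> 'https://www.buzzsprout.com/1121972/13019085-ep-252-the-deep-life-stack.mp3'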
 
     def test_LazyList(self):
         it = list(range(10))
@@ -1831,7 +1984,7 @@ def test_locked_file(self):
                 with locked_file(FILE, test_mode, False):
                     pass
             except (BlockingIOError, PermissionError):
-                if not testing_write:  # FIXME
+                if not testing_write:  # FIXME: blocked read access
                     print(f'Known issue: Exclusive lock ({lock_mode}) blocks read access ({test_mode})')
                     continue
             self.assertTrue(testing_write, f'{test_mode} is blocked by {lock_mode}')
@@ -1874,6 +2027,8 @@ def test_get_compatible_ext(self):
             vcodecs=[None], acodecs=[None], vexts=['webm'], aexts=['m4a']), 'mkv')
         self.assertEqual(get_compatible_ext(
             vcodecs=[None], acodecs=[None], vexts=['webm'], aexts=['webm']), 'webm')
+        self.assertEqual(get_compatible_ext(
+            vcodecs=[None], acodecs=[None], vexts=['webm'], aexts=['weba']), 'webm')
 
         self.assertEqual(get_compatible_ext(
             vcodecs=['h264'], acodecs=['mp4a'], vexts=['mov'], aexts=['m4a']), 'mp4')
@@ -1885,229 +2040,113 @@ def test_get_compatible_ext(self):
         self.assertEqual(get_compatible_ext(
             vcodecs=['av1'], acodecs=['mp4a'], vexts=['webm'], aexts=['m4a'], preferences=('webm', 'mkv')), 'mkv')
 
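`get_compatible_ext` picks a container for a given video/audio codec-and-extension pairing, honouring `preferences` and falling back to `mkv` when nothing narrower fits. Sketch from the assertions above:

    from yt_dlp.utils import get_compatible_ext

    get_compatible_ext(vcodecs=[None], acodecs=[None], vexts=['webm'], aexts=['weba'])    # -> 'webm'
    get_compatible_ext(vcodecs=['h264'], acodecs=['mp4a'], vexts=['mov'], aexts=['m4a'])  # -> 'mp4'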
-    def test_traverse_obj(self):
-        _TEST_DATA = {
-            100: 100,
-            1.2: 1.2,
-            'str': 'str',
-            'None': None,
-            '...': ...,
-            'urls': [
-                {'index': 0, 'url': 'https://www.example.com/0'},
-                {'index': 1, 'url': 'https://www.example.com/1'},
-            ],
-            'data': (
-                {'index': 2},
-                {'index': 3},
-            ),
-            'dict': {},
-        }
+    def test_try_call(self):
+        def total(*x, **kwargs):
+            return sum(x) + sum(kwargs.values())
 
-        # Test base functionality
-        self.assertEqual(traverse_obj(_TEST_DATA, ('str',)), 'str',
-                         msg='allow tuple path')
-        self.assertEqual(traverse_obj(_TEST_DATA, ['str']), 'str',
-                         msg='allow list path')
-        self.assertEqual(traverse_obj(_TEST_DATA, (value for value in ("str",))), 'str',
-                         msg='allow iterable path')
-        self.assertEqual(traverse_obj(_TEST_DATA, 'str'), 'str',
-                         msg='single items should be treated as a path')
-        self.assertEqual(traverse_obj(_TEST_DATA, None), _TEST_DATA)
-        self.assertEqual(traverse_obj(_TEST_DATA, 100), 100)
-        self.assertEqual(traverse_obj(_TEST_DATA, 1.2), 1.2)
+        self.assertEqual(try_call(None), None,
+                         msg='not a fn should give None')
+        self.assertEqual(try_call(lambda: 1), 1,
+                         msg='int fn with no expected_type should give int')
+        self.assertEqual(try_call(lambda: 1, expected_type=int), 1,
+                         msg='int fn with expected_type int should give int')
+        self.assertEqual(try_call(lambda: 1, expected_type=dict), None,
+                         msg='int fn with wrong expected_type should give None')
+        self.assertEqual(try_call(total, args=(0, 1, 0), expected_type=int), 1,
+                         msg='fn should accept arglist')
+        self.assertEqual(try_call(total, kwargs={'a': 0, 'b': 1, 'c': 0}, expected_type=int), 1,
+                         msg='fn should accept kwargs')
+        self.assertEqual(try_call(lambda: 1, expected_type=dict), None,
+                         msg='int fn with no expected_type should give None')
+        self.assertEqual(try_call(lambda x: {}, total, args=(42, ), expected_type=int), 42,
+                         msg='expect first int result with expected_type int')
 
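`try_call` invokes each candidate callable with the given `args`/`kwargs`, swallows the usual lookup/type errors, and returns the first result matching `expected_type` (if one is set). Sketch from the assertions above:

    from yt_dlp.utils import try_call

    try_call(lambda: 1, expected_type=int)   # -> 1
    try_call(lambda: 1, expected_type=dict)  # -> None
    # the first function returns a dict, so the int from the second one wins
    try_call(lambda x: {}, lambda x: x, args=(42,), expected_type=int)  # -> 42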
-        # Test Ellipsis behavior
-        self.assertCountEqual(traverse_obj(_TEST_DATA, ...),
-                              (item for item in _TEST_DATA.values() if item is not None),
-                              msg='`...` should give all values except `None`')
-        self.assertCountEqual(traverse_obj(_TEST_DATA, ('urls', 0, ...)), _TEST_DATA['urls'][0].values(),
-                              msg='`...` selection for dicts should select all values')
-        self.assertEqual(traverse_obj(_TEST_DATA, (..., ..., 'url')),
-                         ['https://www.example.com/0', 'https://www.example.com/1'],
-                         msg='nested `...` queries should work')
-        self.assertCountEqual(traverse_obj(_TEST_DATA, (..., ..., 'index')), range(4),
-                              msg='`...` query result should be flattened')
+    def test_variadic(self):
+        self.assertEqual(variadic(None), (None, ))
+        self.assertEqual(variadic('spam'), ('spam', ))
+        self.assertEqual(variadic('spam', allowed_types=dict), 'spam')
+        with warnings.catch_warnings():
+            warnings.simplefilter('ignore')
+            self.assertEqual(variadic('spam', allowed_types=[dict]), 'spam')
 
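`variadic` normalises scalars into 1-tuples while letting other iterables through; `allowed_types` overrides which types are treated as atoms (passing a list there is deprecated, hence the warning filter above). Sketch from the assertions:

    from yt_dlp.utils import variadic

    variadic(None)    # -> (None,)
    variadic('spam')  # -> ('spam',)
    variadic('spam', allowed_types=dict)  # -> 'spam' (str is no longer treated as an atom)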
-        # Test function as key
-        self.assertEqual(traverse_obj(_TEST_DATA, lambda x, y: x == 'urls' and isinstance(y, list)),
-                         [_TEST_DATA['urls']],
-                         msg='function as query key should perform a filter based on (key, value)')
-        self.assertCountEqual(traverse_obj(_TEST_DATA, lambda _, x: isinstance(x[0], str)), {'str'},
-                              msg='exceptions in the query function should be catched')
+    def test_http_header_dict(self):
+        headers = HTTPHeaderDict()
+        headers['ytdl-test'] = b'0'
+        self.assertEqual(list(headers.items()), [('Ytdl-Test', '0')])
+        headers['ytdl-test'] = 1
+        self.assertEqual(list(headers.items()), [('Ytdl-Test', '1')])
+        headers['Ytdl-test'] = '2'
+        self.assertEqual(list(headers.items()), [('Ytdl-Test', '2')])
+        self.assertTrue('ytDl-Test' in headers)
+        self.assertEqual(str(headers), str(dict(headers)))
+        self.assertEqual(repr(headers), str(dict(headers)))
 
-        # Test alternative paths
-        self.assertEqual(traverse_obj(_TEST_DATA, 'fail', 'str'), 'str',
-                         msg='multiple `paths` should be treated as alternative paths')
-        self.assertEqual(traverse_obj(_TEST_DATA, 'str', 100), 'str',
-                         msg='alternatives should exit early')
-        self.assertEqual(traverse_obj(_TEST_DATA, 'fail', 'fail'), None,
-                         msg='alternatives should return `default` if exhausted')
-        self.assertEqual(traverse_obj(_TEST_DATA, (..., 'fail'), 100), 100,
-                         msg='alternatives should track their own branching return')
-        self.assertEqual(traverse_obj(_TEST_DATA, ('dict', ...), ('data', ...)), list(_TEST_DATA['data']),
-                         msg='alternatives on empty objects should search further')
+        headers.update({'X-dlp': 'data'})
+        self.assertEqual(set(headers.items()), {('Ytdl-Test', '2'), ('X-Dlp', 'data')})
+        self.assertEqual(dict(headers), {'Ytdl-Test': '2', 'X-Dlp': 'data'})
+        self.assertEqual(len(headers), 2)
+        self.assertEqual(headers.copy(), headers)
+        headers2 = HTTPHeaderDict({'X-dlp': 'data3'}, **headers, **{'X-dlp': 'data2'})
+        self.assertEqual(set(headers2.items()), {('Ytdl-Test', '2'), ('X-Dlp', 'data2')})
+        self.assertEqual(len(headers2), 2)
+        headers2.clear()
+        self.assertEqual(len(headers2), 0)
 
-        # Test branch and path nesting
-        self.assertEqual(traverse_obj(_TEST_DATA, ('urls', (3, 0), 'url')), ['https://www.example.com/0'],
-                         msg='tuple as key should be treated as branches')
-        self.assertEqual(traverse_obj(_TEST_DATA, ('urls', [3, 0], 'url')), ['https://www.example.com/0'],
-                         msg='list as key should be treated as branches')
-        self.assertEqual(traverse_obj(_TEST_DATA, ('urls', ((1, 'fail'), (0, 'url')))), ['https://www.example.com/0'],
-                         msg='double nesting in path should be treated as paths')
-        self.assertEqual(traverse_obj(['0', [1, 2]], [(0, 1), 0]), [1],
-                         msg='do not fail early on branching')
-        self.assertCountEqual(traverse_obj(_TEST_DATA, ('urls', ((1, ('fail', 'url')), (0, 'url')))),
-                              ['https://www.example.com/0', 'https://www.example.com/1'],
-                              msg='tripple nesting in path should be treated as branches')
-        self.assertEqual(traverse_obj(_TEST_DATA, ('urls', ('fail', (..., 'url')))),
-                         ['https://www.example.com/0', 'https://www.example.com/1'],
-                         msg='ellipsis as branch path start gets flattened')
+        # ensure we prefer latter headers
+        headers3 = HTTPHeaderDict({'Ytdl-TeSt': 1}, {'Ytdl-test': 2})
+        self.assertEqual(set(headers3.items()), {('Ytdl-Test', '2')})
+        del headers3['ytdl-tesT']
+        self.assertEqual(dict(headers3), {})
 
-        # Test dictionary as key
-        self.assertEqual(traverse_obj(_TEST_DATA, {0: 100, 1: 1.2}), {0: 100, 1: 1.2},
-                         msg='dict key should result in a dict with the same keys')
-        self.assertEqual(traverse_obj(_TEST_DATA, {0: ('urls', 0, 'url')}),
-                         {0: 'https://www.example.com/0'},
-                         msg='dict key should allow paths')
-        self.assertEqual(traverse_obj(_TEST_DATA, {0: ('urls', (3, 0), 'url')}),
-                         {0: ['https://www.example.com/0']},
-                         msg='tuple in dict path should be treated as branches')
-        self.assertEqual(traverse_obj(_TEST_DATA, {0: ('urls', ((1, 'fail'), (0, 'url')))}),
-                         {0: ['https://www.example.com/0']},
-                         msg='double nesting in dict path should be treated as paths')
-        self.assertEqual(traverse_obj(_TEST_DATA, {0: ('urls', ((1, ('fail', 'url')), (0, 'url')))}),
-                         {0: ['https://www.example.com/1', 'https://www.example.com/0']},
-                         msg='tripple nesting in dict path should be treated as branches')
-        self.assertEqual(traverse_obj(_TEST_DATA, {0: 'fail'}), {},
-                         msg='remove `None` values when dict key')
-        self.assertEqual(traverse_obj(_TEST_DATA, {0: 'fail'}, default=...), {0: ...},
-                         msg='do not remove `None` values if `default`')
-        self.assertEqual(traverse_obj(_TEST_DATA, {0: 'dict'}), {0: {}},
-                         msg='do not remove empty values when dict key')
-        self.assertEqual(traverse_obj(_TEST_DATA, {0: 'dict'}, default=...), {0: {}},
-                         msg='do not remove empty values when dict key and a default')
-        self.assertEqual(traverse_obj(_TEST_DATA, {0: ('dict', ...)}), {0: []},
-                         msg='if branch in dict key not successful, return `[]`')
+        headers4 = HTTPHeaderDict({'ytdl-test': 'data;'})
+        self.assertEqual(set(headers4.items()), {('Ytdl-Test', 'data;')})
 
-        # Testing default parameter behavior
-        _DEFAULT_DATA = {'None': None, 'int': 0, 'list': []}
-        self.assertEqual(traverse_obj(_DEFAULT_DATA, 'fail'), None,
-                         msg='default value should be `None`')
-        self.assertEqual(traverse_obj(_DEFAULT_DATA, 'fail', 'fail', default=...), ...,
-                         msg='chained fails should result in default')
-        self.assertEqual(traverse_obj(_DEFAULT_DATA, 'None', 'int'), 0,
-                         msg='should not short cirquit on `None`')
-        self.assertEqual(traverse_obj(_DEFAULT_DATA, 'fail', default=1), 1,
-                         msg='invalid dict key should result in `default`')
-        self.assertEqual(traverse_obj(_DEFAULT_DATA, 'None', default=1), 1,
-                         msg='`None` is a deliberate sentinel and should become `default`')
-        self.assertEqual(traverse_obj(_DEFAULT_DATA, ('list', 10)), None,
-                         msg='`IndexError` should result in `default`')
-        self.assertEqual(traverse_obj(_DEFAULT_DATA, (..., 'fail'), default=1), 1,
-                         msg='if branched but not successful return `default` if defined, not `[]`')
-        self.assertEqual(traverse_obj(_DEFAULT_DATA, (..., 'fail'), default=None), None,
-                         msg='if branched but not successful return `default` even if `default` is `None`')
-        self.assertEqual(traverse_obj(_DEFAULT_DATA, (..., 'fail')), [],
-                         msg='if branched but not successful return `[]`, not `default`')
-        self.assertEqual(traverse_obj(_DEFAULT_DATA, ('list', ...)), [],
-                         msg='if branched but object is empty return `[]`, not `default`')
+        # common mistake: strip whitespace from values
+        # https://github.com/yt-dlp/yt-dlp/issues/8729
+        headers5 = HTTPHeaderDict({'ytdl-test': ' data; '})
+        self.assertEqual(set(headers5.items()), {('Ytdl-Test', 'data;')})
 
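`HTTPHeaderDict` is a case-insensitive mapping that title-cases header names, coerces values to `str`, lets later duplicates win, and (per the issue referenced above) strips surrounding whitespace from values. Sketch from the assertions:

    from yt_dlp.utils.networking import HTTPHeaderDict  # import path as in current yt-dlp

    headers = HTTPHeaderDict({'Ytdl-TeSt': 1}, {'Ytdl-test': 2})
    dict(headers)  # -> {'Ytdl-Test': '2'}  (later header wins, value coerced to str)
    HTTPHeaderDict({'ytdl-test': ' data; '})['YTDL-TEST']  # -> 'data;'  (whitespace stripped)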
-        # Testing expected_type behavior
-        _EXPECTED_TYPE_DATA = {'str': 'str', 'int': 0}
-        self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, 'str', expected_type=str), 'str',
-                         msg='accept matching `expected_type` type')
-        self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, 'str', expected_type=int), None,
-                         msg='reject non matching `expected_type` type')
-        self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, 'int', expected_type=lambda x: str(x)), '0',
-                         msg='transform type using type function')
-        self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, 'str',
-                                      expected_type=lambda _: 1 / 0), None,
-                         msg='wrap expected_type fuction in try_call')
-        self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, ..., expected_type=str), ['str'],
-                         msg='eliminate items that expected_type fails on')
+    def test_extract_basic_auth(self):
+        assert extract_basic_auth('http://:foo.bar') == ('http://:foo.bar', None)
+        assert extract_basic_auth('http://foo.bar') == ('http://foo.bar', None)
+        assert extract_basic_auth('http://@foo.bar') == ('http://foo.bar', 'Basic Og==')
+        assert extract_basic_auth('http://:pass@foo.bar') == ('http://foo.bar', 'Basic OnBhc3M=')
+        assert extract_basic_auth('http://user:@foo.bar') == ('http://foo.bar', 'Basic dXNlcjo=')
+        assert extract_basic_auth('http://user:pass@foo.bar') == ('http://foo.bar', 'Basic dXNlcjpwYXNz')
 
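`extract_basic_auth` splits the userinfo out of a URL, returning the cleaned URL together with a ready-made `Basic` authorization value (or `None` when no credentials are present). Sketch from the assertions above:

    from yt_dlp.utils import extract_basic_auth

    extract_basic_auth('http://user:pass@foo.bar')
    # -> ('http://foo.bar', 'Basic dXNlcjpwYXNz')
    extract_basic_auth('http://foo.bar')
    # -> ('http://foo.bar', None)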
-        # Test get_all behavior
-        _GET_ALL_DATA = {'key': [0, 1, 2]}
-        self.assertEqual(traverse_obj(_GET_ALL_DATA, ('key', ...), get_all=False), 0,
-                         msg='if not `get_all`, return only first matching value')
-        self.assertEqual(traverse_obj(_GET_ALL_DATA, ..., get_all=False), [0, 1, 2],
-                         msg='do not overflatten if not `get_all`')
+    @unittest.skipUnless(compat_os_name == 'nt', 'Only relevant on Windows')
+    def test_windows_escaping(self):
+        tests = [
+            'test"&',
+            '%CMDCMDLINE:~-1%&',
+            'a\nb',
+            '"',
+            '\\',
+            '!',
+            '^!',
+            'a \\ b',
+            'a \\" b',
+            'a \\ b\\',
+            # We replace \r with \n
+            ('a\r\ra', 'a\n\na'),
+        ]
 
-        # Test casesense behavior
-        _CASESENSE_DATA = {
-            'KeY': 'value0',
-            0: {
-                'KeY': 'value1',
-                0: {'KeY': 'value2'},
-            },
-        }
-        self.assertEqual(traverse_obj(_CASESENSE_DATA, 'key'), None,
-                         msg='dict keys should be case sensitive unless `casesense`')
-        self.assertEqual(traverse_obj(_CASESENSE_DATA, 'keY',
-                                      casesense=False), 'value0',
-                         msg='allow non matching key case if `casesense`')
-        self.assertEqual(traverse_obj(_CASESENSE_DATA, (0, ('keY',)),
-                                      casesense=False), ['value1'],
-                         msg='allow non matching key case in branch if `casesense`')
-        self.assertEqual(traverse_obj(_CASESENSE_DATA, (0, ((0, 'keY'),)),
-                                      casesense=False), ['value2'],
-                         msg='allow non matching key case in branch path if `casesense`')
+        def run_shell(args):
+            stdout, stderr, error = Popen.run(
+                args, text=True, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+            assert not stderr
+            assert not error
+            return stdout
 
-        # Test traverse_string behavior
-        _TRAVERSE_STRING_DATA = {'str': 'str', 1.2: 1.2}
-        self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, ('str', 0)), None,
-                         msg='do not traverse into string if not `traverse_string`')
-        self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, ('str', 0),
-                                      traverse_string=True), 's',
-                         msg='traverse into string if `traverse_string`')
-        self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, (1.2, 1),
-                                      traverse_string=True), '.',
-                         msg='traverse into converted data if `traverse_string`')
-        self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, ('str', ...),
-                                      traverse_string=True), list('str'),
-                         msg='`...` branching into string should result in list')
-        self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, ('str', (0, 2)),
-                                      traverse_string=True), ['s', 'r'],
-                         msg='branching into string should result in list')
-        self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, ('str', lambda _, x: x),
-                                      traverse_string=True), list('str'),
-                         msg='function branching into string should result in list')
+        for argument in tests:
+            if isinstance(argument, str):
+                expected = argument
+            else:
+                argument, expected = argument
 
-        # Test is_user_input behavior
-        _IS_USER_INPUT_DATA = {'range8': list(range(8))}
-        self.assertEqual(traverse_obj(_IS_USER_INPUT_DATA, ('range8', '3'),
-                                      is_user_input=True), 3,
-                         msg='allow for string indexing if `is_user_input`')
-        self.assertCountEqual(traverse_obj(_IS_USER_INPUT_DATA, ('range8', '3:'),
-                                           is_user_input=True), tuple(range(8))[3:],
-                              msg='allow for string slice if `is_user_input`')
-        self.assertCountEqual(traverse_obj(_IS_USER_INPUT_DATA, ('range8', ':4:2'),
-                                           is_user_input=True), tuple(range(8))[:4:2],
-                              msg='allow step in string slice if `is_user_input`')
-        self.assertCountEqual(traverse_obj(_IS_USER_INPUT_DATA, ('range8', ':'),
-                                           is_user_input=True), range(8),
-                              msg='`:` should be treated as `...` if `is_user_input`')
-        with self.assertRaises(TypeError, msg='too many params should result in error'):
-            traverse_obj(_IS_USER_INPUT_DATA, ('range8', ':::'), is_user_input=True)
 
-        # Test re.Match as input obj
-        mobj = re.fullmatch(r'0(12)(?P<group>3)(4)?', '0123')
-        self.assertEqual(traverse_obj(mobj, ...), [x for x in mobj.groups() if x is not None],
-                         msg='`...` on a `re.Match` should give its `groups()`')
-        self.assertEqual(traverse_obj(mobj, lambda k, _: k in (0, 2)), ['0123', '3'],
-                         msg='function on a `re.Match` should give groupno, value starting at 0')
-        self.assertEqual(traverse_obj(mobj, 'group'), '3',
-                         msg='str key on a `re.Match` should give group with that name')
-        self.assertEqual(traverse_obj(mobj, 2), '3',
-                         msg='int key on a `re.Match` should give group with that name')
-        self.assertEqual(traverse_obj(mobj, 'gRoUp', casesense=False), '3',
-                         msg='str key on a `re.Match` should respect casesense')
-        self.assertEqual(traverse_obj(mobj, 'fail'), None,
-                         msg='failing str key on a `re.Match` should return `default`')
-        self.assertEqual(traverse_obj(mobj, 'gRoUpS', casesense=False), None,
-                         msg='failing str key on a `re.Match` should return `default`')
-        self.assertEqual(traverse_obj(mobj, 8), None,
-                         msg='failing int key on a `re.Match` should return `default`')
+            args = [sys.executable, '-c', 'import sys; print(end=sys.argv[1])', argument, 'end']
+            assert run_shell(args) == expected
+            assert run_shell(shell_quote(args, shell=True)) == expected
 
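The Windows-only test round-trips awkward arguments (`"`, `\`, `!`, `%CMDCMDLINE:~-1%&`, embedded newlines) through a real `cmd.exe` invocation: `shell_quote(args, shell=True)` must reproduce every argument byte-for-byte, with `\r` normalised to `\n`. A plain-stdlib sketch of the same round-trip, using `subprocess.run` in place of yt-dlp's `Popen.run` wrapper:

    import subprocess
    import sys

    from yt_dlp.utils import shell_quote

    args = [sys.executable, '-c', 'import sys; print(end=sys.argv[1])', 'test"&', 'end']
    out = subprocess.run(shell_quote(args, shell=True), shell=True,
                         capture_output=True, text=True).stdout
    assert out == 'test"&'  # only meaningful on Windows, per the skipUnless above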
 if __name__ == '__main__':
Some files were not shown because too many files have changed in this diff.