Mirror of https://github.com/yt-dlp/yt-dlp.git, synced 2024-11-15 21:03:18 +00:00
Merge branch 'yt-dlp:master' into pr/malformed-manifest-fix
commit 258fbc9f96
.github/ISSUE_TEMPLATE/1_broken_site.yml (vendored, 29 changes)

@@ -1,5 +1,5 @@
-name: Broken site
-description: Report broken or misfunctioning site
+name: Broken site support
+description: Report issue with yt-dlp on a supported site
 labels: [triage, site-bug]
 body:
   - type: checkboxes
@@ -7,7 +7,7 @@ body:
       label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
       description: Fill all fields even if you think it is irrelevant for the issue
       options:
-        - label: I understand that I will be **blocked** if I remove or skip any mandatory\* field
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
           required: true
   - type: checkboxes
     id: checklist
@@ -16,15 +16,15 @@ body:
       description: |
         Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
       options:
-        - label: I'm reporting a broken site
+        - label: I'm reporting that yt-dlp is broken on a **supported** site
           required: true
-        - label: I've verified that I'm running yt-dlp version **2023.01.06** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
         - label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
@@ -50,6 +50,8 @@ body:
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
           required: true
+        - label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
+          required: false
         - label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
           required: true
   - type: textarea
@@ -59,19 +61,18 @@
       description: |
         It should start like this:
       placeholder: |
-        [debug] Command-line config: ['-vU', 'test:youtube']
-        [debug] Portable config "yt-dlp.conf": ['-i']
+        [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2023.01.06 [9d339c4] (win32_exe)
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
-        [debug] Checking exe version: ffmpeg -bsfs
-        [debug] Checking exe version: ffprobe -bsfs
         [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2023.01.06, Current version: 2023.01.06
-        yt-dlp is up to date (2023.01.06)
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
+        yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
+        [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
     validations:
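Aside: the new optional checkbox above points API users at setting `'verbose': True` in the `YoutubeDL` params rather than passing `-vU`. A minimal sketch of what that looks like when embedding yt-dlp (the URL is only an example):

# Minimal sketch: enable verbose debug output from the Python API,
# the embedded equivalent of running `yt-dlp -v` on the command line.
from yt_dlp import YoutubeDL

params = {
    'verbose': True,  # emits the [debug] lines the issue templates ask for
}
with YoutubeDL(params) as ydl:
    ydl.download(['https://www.youtube.com/watch?v=BaW_jenozKc'])  # example URL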

.github/ISSUE_TEMPLATE/2_site_support_request.yml (vendored)

@@ -7,7 +7,7 @@ body:
       label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
       description: Fill all fields even if you think it is irrelevant for the issue
       options:
-        - label: I understand that I will be **blocked** if I remove or skip any mandatory\* field
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
           required: true
   - type: checkboxes
     id: checklist
@@ -18,13 +18,13 @@ body:
       options:
         - label: I'm reporting a new site support request
           required: true
-        - label: I've verified that I'm running yt-dlp version **2023.01.06** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
         - label: I've checked that none of provided URLs [violate any copyrights](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-website-primarily-used-for-piracy) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
@@ -62,6 +62,8 @@ body:
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
           required: true
+        - label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
+          required: false
         - label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
           required: true
   - type: textarea
@@ -71,19 +73,18 @@
       description: |
         It should start like this:
       placeholder: |
-        [debug] Command-line config: ['-vU', 'test:youtube']
-        [debug] Portable config "yt-dlp.conf": ['-i']
+        [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2023.01.06 [9d339c4] (win32_exe)
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
-        [debug] Checking exe version: ffmpeg -bsfs
-        [debug] Checking exe version: ffprobe -bsfs
         [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2023.01.06, Current version: 2023.01.06
-        yt-dlp is up to date (2023.01.06)
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
+        yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
+        [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
     validations:

.github/ISSUE_TEMPLATE/3_site_feature_request.yml (vendored)

@@ -7,7 +7,7 @@ body:
       label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
       description: Fill all fields even if you think it is irrelevant for the issue
       options:
-        - label: I understand that I will be **blocked** if I remove or skip any mandatory\* field
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
           required: true
   - type: checkboxes
     id: checklist
@@ -18,11 +18,11 @@ body:
       options:
         - label: I'm requesting a site-specific feature
           required: true
-        - label: I've verified that I'm running yt-dlp version **2023.01.06** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
@@ -58,6 +58,8 @@ body:
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
           required: true
+        - label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
+          required: false
         - label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
           required: true
   - type: textarea
@@ -67,19 +69,18 @@
       description: |
         It should start like this:
       placeholder: |
-        [debug] Command-line config: ['-vU', 'test:youtube']
-        [debug] Portable config "yt-dlp.conf": ['-i']
+        [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2023.01.06 [9d339c4] (win32_exe)
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
-        [debug] Checking exe version: ffmpeg -bsfs
-        [debug] Checking exe version: ffprobe -bsfs
         [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2023.01.06, Current version: 2023.01.06
-        yt-dlp is up to date (2023.01.06)
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
+        yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
+        [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
     validations:
.github/ISSUE_TEMPLATE/4_bug_report.yml (vendored, 25 changes)

@@ -1,4 +1,4 @@
-name: Bug report
+name: Core bug report
 description: Report a bug unrelated to any particular site or extractor
 labels: [triage, bug]
 body:
@@ -7,7 +7,7 @@ body:
       label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
       description: Fill all fields even if you think it is irrelevant for the issue
       options:
-        - label: I understand that I will be **blocked** if I remove or skip any mandatory\* field
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
           required: true
   - type: checkboxes
     id: checklist
@@ -18,13 +18,13 @@ body:
       options:
         - label: I'm reporting a bug unrelated to a specific site
           required: true
-        - label: I've verified that I'm running yt-dlp version **2023.01.06** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
         - label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
@@ -43,6 +43,8 @@ body:
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
           required: true
+        - label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
+          required: false
         - label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
           required: true
   - type: textarea
@@ -52,19 +54,18 @@
       description: |
         It should start like this:
       placeholder: |
-        [debug] Command-line config: ['-vU', 'test:youtube']
-        [debug] Portable config "yt-dlp.conf": ['-i']
+        [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2023.01.06 [9d339c4] (win32_exe)
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
-        [debug] Checking exe version: ffmpeg -bsfs
-        [debug] Checking exe version: ffprobe -bsfs
         [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2023.01.06, Current version: 2023.01.06
-        yt-dlp is up to date (2023.01.06)
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
+        yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
+        [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
     validations:

.github/ISSUE_TEMPLATE/5_feature_request.yml (vendored, 23 changes)

@@ -7,7 +7,7 @@ body:
       label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
       description: Fill all fields even if you think it is irrelevant for the issue
       options:
-        - label: I understand that I will be **blocked** if I remove or skip any mandatory\* field
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
           required: true
   - type: checkboxes
     id: checklist
@@ -20,9 +20,9 @@ body:
           required: true
         - label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
           required: true
-        - label: I've verified that I'm running yt-dlp version **2023.01.06** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
@@ -40,6 +40,8 @@ body:
       label: Provide verbose output that clearly demonstrates the problem
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
+        - label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
+          required: false
         - label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
   - type: textarea
     id: log
@@ -48,18 +50,17 @@
       description: |
         It should start like this:
       placeholder: |
-        [debug] Command-line config: ['-vU', 'test:youtube']
-        [debug] Portable config "yt-dlp.conf": ['-i']
+        [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2023.01.06 [9d339c4] (win32_exe)
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
-        [debug] Checking exe version: ffmpeg -bsfs
-        [debug] Checking exe version: ffprobe -bsfs
         [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2023.01.06, Current version: 2023.01.06
-        yt-dlp is up to date (2023.01.06)
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
+        yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
+        [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell

.github/ISSUE_TEMPLATE/6_question.yml (vendored, 23 changes)

@@ -7,7 +7,7 @@ body:
       label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
       description: Fill all fields even if you think it is irrelevant for the issue
       options:
-        - label: I understand that I will be **blocked** if I remove or skip any mandatory\* field
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\* field
           required: true
   - type: markdown
     attributes:
@@ -26,9 +26,9 @@ body:
           required: true
         - label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
           required: true
-        - label: I've verified that I'm running yt-dlp version **2023.01.06** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
@@ -46,6 +46,8 @@ body:
       label: Provide verbose output that clearly demonstrates the problem
       options:
         - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
+        - label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
+          required: false
         - label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
   - type: textarea
     id: log
@@ -54,18 +56,17 @@
       description: |
         It should start like this:
       placeholder: |
-        [debug] Command-line config: ['-vU', 'test:youtube']
-        [debug] Portable config "yt-dlp.conf": ['-i']
+        [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
        [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version 2023.01.06 [9d339c4] (win32_exe)
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
-        [debug] Checking exe version: ffmpeg -bsfs
-        [debug] Checking exe version: ffprobe -bsfs
         [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: 2023.01.06, Current version: 2023.01.06
-        yt-dlp is up to date (2023.01.06)
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
+        yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
+        [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell

.github/ISSUE_TEMPLATE_tmpl/1_broken_site.yml (vendored, 10 changes)

@@ -1,5 +1,5 @@
-name: Broken site
-description: Report broken or misfunctioning site
+name: Broken site support
+description: Report issue with yt-dlp on a supported site
 labels: [triage, site-bug]
 body:
   %(no_skip)s
@@ -10,15 +10,15 @@ body:
       description: |
         Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
       options:
-        - label: I'm reporting a broken site
+        - label: I'm reporting that yt-dlp is broken on a **supported** site
           required: true
-        - label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
         - label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true

.github/ISSUE_TEMPLATE_tmpl/2_site_support_request.yml (vendored)

@@ -12,13 +12,13 @@ body:
       options:
         - label: I'm reporting a new site support request
           required: true
-        - label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
         - label: I've checked that none of provided URLs [violate any copyrights](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#is-the-website-primarily-used-for-piracy) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true

.github/ISSUE_TEMPLATE_tmpl/3_site_feature_request.yml (vendored)

@@ -12,11 +12,11 @@ body:
       options:
         - label: I'm requesting a site-specific feature
           required: true
-        - label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
.github/ISSUE_TEMPLATE_tmpl/4_bug_report.yml (vendored, 6 changes)

@@ -1,4 +1,4 @@
-name: Bug report
+name: Core bug report
 description: Report a bug unrelated to any particular site or extractor
 labels: [triage, bug]
 body:
@@ -12,13 +12,13 @@ body:
       options:
         - label: I'm reporting a bug unrelated to a specific site
           required: true
-        - label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
         - label: I've checked that all provided URLs are playable in a browser with the same IP and same login details
           required: true
         - label: I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/yt-dlp/yt-dlp/wiki/FAQ#video-url-contains-an-ampersand--and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true

.github/ISSUE_TEMPLATE_tmpl/5_feature_request.yml (vendored)

@@ -14,9 +14,9 @@ body:
           required: true
         - label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
           required: true
-        - label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
.github/ISSUE_TEMPLATE_tmpl/6_question.yml (vendored, 4 changes)

@@ -20,9 +20,9 @@ body:
           required: true
         - label: I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
           required: true
-        - label: I've verified that I'm running yt-dlp version **%(version)s** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
+        - label: I've verified that I have **updated yt-dlp to nightly or master** ([update instructions](https://github.com/yt-dlp/yt-dlp#update-channels))
           required: true
-        - label: I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
+        - label: I've searched [known issues](https://github.com/yt-dlp/yt-dlp/issues/3766) and the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
           required: true
         - label: I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
           required: true
.github/PULL_REQUEST_TEMPLATE.md (vendored, 2 changes)

@@ -30,7 +30,7 @@ ### Before submitting a *pull request* make sure you have:
 - [ ] [Searched](https://github.com/yt-dlp/yt-dlp/search?q=is%3Apr&type=Issues) the bugtracker for similar pull requests
 - [ ] Checked the code with [flake8](https://pypi.python.org/pypi/flake8) and [ran relevant tests](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#developer-instructions)

-### In order to be accepted and merged into yt-dlp each piece of code must be in public domain or released under [Unlicense](http://unlicense.org/). Check one of the following options:
+### In order to be accepted and merged into yt-dlp each piece of code must be in public domain or released under [Unlicense](http://unlicense.org/). Check all of the following options that apply:
 - [ ] I am the original author of this code and I am willing to release it under [Unlicense](http://unlicense.org/)
 - [ ] I am not the original author of this code but it is in public domain or released under [Unlicense](http://unlicense.org/) (provide reliable evidence)

.github/workflows/build.yml (vendored, 711 changes)

@@ -1,393 +1,486 @@
-name: Build
-on: workflow_dispatch
+name: Build Artifacts
+on:
+  workflow_call:
+    inputs:
+      version:
+        required: true
+        type: string
+      channel:
+        required: false
+        default: stable
+        type: string
+      unix:
+        default: true
+        type: boolean
+      linux_arm:
+        default: true
+        type: boolean
+      macos:
+        default: true
+        type: boolean
+      macos_legacy:
+        default: true
+        type: boolean
+      windows:
+        default: true
+        type: boolean
+      windows32:
+        default: true
+        type: boolean
+      meta_files:
+        default: true
+        type: boolean
+      origin:
+        required: false
+        default: ''
+        type: string
+    secrets:
+      GPG_SIGNING_KEY:
+        required: false
+
+  workflow_dispatch:
+    inputs:
+      version:
+        description: |
+          VERSION: yyyy.mm.dd[.rev] or rev
+        required: true
+        type: string
+      channel:
+        description: |
+          SOURCE of this build's updates: stable/nightly/master/<repo>
+        required: true
+        default: stable
+        type: string
+      unix:
+        description: yt-dlp, yt-dlp.tar.gz, yt-dlp_linux, yt-dlp_linux.zip
+        default: true
+        type: boolean
+      linux_arm:
+        description: yt-dlp_linux_aarch64, yt-dlp_linux_armv7l
+        default: true
+        type: boolean
+      macos:
+        description: yt-dlp_macos, yt-dlp_macos.zip
+        default: true
+        type: boolean
+      macos_legacy:
+        description: yt-dlp_macos_legacy
+        default: true
+        type: boolean
+      windows:
+        description: yt-dlp.exe, yt-dlp_min.exe, yt-dlp_win.zip
+        default: true
+        type: boolean
+      windows32:
+        description: yt-dlp_x86.exe
+        default: true
+        type: boolean
+      meta_files:
+        description: SHA2-256SUMS, SHA2-512SUMS, _update_spec
+        default: true
+        type: boolean
+      origin:
+        description: Origin
+        required: false
+        default: 'current repo'
+        type: choice
+        options:
+        - 'current repo'
+
 permissions:
   contents: read

 jobs:
-  prepare:
-    permissions:
-      contents: write # for push_release
+  process:
     runs-on: ubuntu-latest
     outputs:
-      version_suffix: ${{ steps.version_suffix.outputs.version_suffix }}
-      ytdlp_version: ${{ steps.bump_version.outputs.ytdlp_version }}
-      head_sha: ${{ steps.push_release.outputs.head_sha }}
+      origin: ${{ steps.process_origin.outputs.origin }}
     steps:
-      - uses: actions/checkout@v3
-        with:
-          fetch-depth: 0
-      - uses: actions/setup-python@v4
-        with:
-          python-version: '3.10'
-
-      - name: Set version suffix
-        id: version_suffix
-        env:
-          PUSH_VERSION_COMMIT: ${{ secrets.PUSH_VERSION_COMMIT }}
-        if: "env.PUSH_VERSION_COMMIT == ''"
-        run: echo "version_suffix=$(date -u +"%H%M%S")" >> "$GITHUB_OUTPUT"
-      - name: Bump version
-        id: bump_version
-        run: |
-          python devscripts/update-version.py ${{ steps.version_suffix.outputs.version_suffix }}
-          make issuetemplates
-
-      - name: Push to release
-        id: push_release
-        run: |
-          git config --global user.name github-actions
-          git config --global user.email github-actions@example.com
-          git add -u
-          git commit -m "[version] update" -m "Created by: ${{ github.event.sender.login }}" -m ":ci skip all :ci run dl"
-          git push origin --force ${{ github.event.ref }}:release
-          echo "head_sha=$(git rev-parse HEAD)" >> "$GITHUB_OUTPUT"
-      - name: Update master
-        env:
-          PUSH_VERSION_COMMIT: ${{ secrets.PUSH_VERSION_COMMIT }}
-        if: "env.PUSH_VERSION_COMMIT != ''"
-        run: git push origin ${{ github.event.ref }}
-
-
-  build_unix:
-    needs: prepare
+      - name: Process origin
+        id: process_origin
+        run: |
+          echo "origin=${{ inputs.origin == 'current repo' && github.repository || inputs.origin }}" | tee "$GITHUB_OUTPUT"
+
+  unix:
+    needs: process
+    if: inputs.unix
     runs-on: ubuntu-latest

     steps:
-      - uses: actions/checkout@v3
-      - uses: actions/setup-python@v4
+      - uses: actions/checkout@v4
+      - uses: actions/setup-python@v5
         with:
-          python-version: '3.10'
-      - uses: conda-incubator/setup-miniconda@v2
+          python-version: "3.10"
+      - uses: conda-incubator/setup-miniconda@v3
         with:
           miniforge-variant: Mambaforge
           use-mamba: true
           channels: conda-forge
           auto-update-conda: true
-          activate-environment: ''
+          activate-environment: ""
           auto-activate-base: false
       - name: Install Requirements
         run: |
-          sudo apt-get -y install zip pandoc man sed
-          python -m pip install -U pip setuptools wheel twine
-          python -m pip install -U Pyinstaller -r requirements.txt
-          reqs=$(mktemp)
-          echo -e 'python=3.10.*\npyinstaller' >$reqs
-          sed 's/^brotli.*/brotli-python/' <requirements.txt >>$reqs
-          mamba create -n build --file $reqs
+          sudo apt -y install zip pandoc man sed
+          cat > ./requirements.txt << EOF
+          python=3.10.*
+          brotli-python
+          EOF
+          python devscripts/install_deps.py --print \
+            --exclude brotli --exclude brotlicffi \
+            --include secretstorage --include pyinstaller >> ./requirements.txt
+          mamba create -n build --file ./requirements.txt

       - name: Prepare
         run: |
-          python devscripts/update-version.py ${{ needs.prepare.outputs.version_suffix }}
+          python devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
           python devscripts/make_lazy_extractors.py
       - name: Build Unix platform-independent binary
         run: |
           make all tar
       - name: Build Unix standalone binary
         shell: bash -l {0}
         run: |
           unset LD_LIBRARY_PATH # Harmful; set by setup-python
           conda activate build
-          python pyinst.py --onedir
+          python -m bundle.pyinstaller --onedir
           (cd ./dist/yt-dlp_linux && zip -r ../yt-dlp_linux.zip .)
-          python pyinst.py
+          python -m bundle.pyinstaller
+          mv ./dist/yt-dlp_linux ./yt-dlp_linux
+          mv ./dist/yt-dlp_linux.zip ./yt-dlp_linux.zip

-      - name: Upload artifacts
-        uses: actions/upload-artifact@v3
-        with:
-          path: |
-            yt-dlp
-            yt-dlp.tar.gz
-            dist/yt-dlp_linux
-            dist/yt-dlp_linux.zip
+      - name: Verify --update-to
+        if: vars.UPDATE_TO_VERIFICATION
+        run: |
+          binaries=("yt-dlp" "yt-dlp_linux")
+          for binary in "${binaries[@]}"; do
+            chmod +x ./${binary}
+            cp ./${binary} ./${binary}_downgraded
+            version="$(./${binary} --version)"
+            ./${binary}_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
+            downgraded_version="$(./${binary}_downgraded --version)"
+            [[ "$version" != "$downgraded_version" ]]
+          done

-      - name: Build and publish on PyPi
-        env:
-          TWINE_USERNAME: __token__
-          TWINE_PASSWORD: ${{ secrets.PYPI_TOKEN }}
-        if: "env.TWINE_PASSWORD != ''"
-        run: |
-          rm -rf dist/*
-          python devscripts/set-variant.py pip -M "You installed yt-dlp with pip or using the wheel from PyPi; Use that to update"
-          python setup.py sdist bdist_wheel
-          twine upload dist/*
+      - name: Upload artifacts
+        uses: actions/upload-artifact@v4
+        with:
+          name: build-${{ github.job }}
+          path: |
+            yt-dlp
+            yt-dlp.tar.gz
+            yt-dlp_linux
+            yt-dlp_linux.zip
+          compression-level: 0

-      - name: Install SSH private key for Homebrew
-        env:
-          BREW_TOKEN: ${{ secrets.BREW_TOKEN }}
-        if: "env.BREW_TOKEN != ''"
-        uses: yt-dlp/ssh-agent@v0.5.3
-        with:
-          ssh-private-key: ${{ env.BREW_TOKEN }}
-      - name: Update Homebrew Formulae
-        env:
-          BREW_TOKEN: ${{ secrets.BREW_TOKEN }}
-        if: "env.BREW_TOKEN != ''"
-        run: |
-          git clone git@github.com:yt-dlp/homebrew-taps taps/
-          python devscripts/update-formulae.py taps/Formula/yt-dlp.rb "${{ needs.prepare.outputs.ytdlp_version }}"
-          git -C taps/ config user.name github-actions
-          git -C taps/ config user.email github-actions@example.com
-          git -C taps/ commit -am 'yt-dlp: ${{ needs.prepare.outputs.ytdlp_version }}'
-          git -C taps/ push
-
-
-  build_linux_arm:
+  linux_arm:
+    needs: process
+    if: inputs.linux_arm
     permissions:
-      packages: write # for Creating cache
+      contents: read
+      packages: write # for creating cache
     runs-on: ubuntu-latest
-    needs: prepare
     strategy:
       matrix:
         architecture:
         - armv7
         - aarch64

     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
         with:
           path: ./repo
       - name: Virtualized Install, Prepare & Build
         uses: yt-dlp/run-on-arch-action@v2
         with:
+          # Ref: https://github.com/uraimo/run-on-arch-action/issues/55
+          env: |
+            GITHUB_WORKFLOW: build
           githubToken: ${{ github.token }} # To cache image
           arch: ${{ matrix.architecture }}
           distro: ubuntu18.04 # Standalone executable should be built on minimum supported OS
           dockerRunArgs: --volume "${PWD}/repo:/repo"
           install: | # Installing Python 3.10 from the Deadsnakes repo raises errors
             apt update
-            apt -y install zlib1g-dev python3.8 python3.8-dev python3.8-distutils python3-pip
-            python3.8 -m pip install -U pip setuptools wheel
-            # Cannot access requirements.txt from the repo directory at this stage
-            python3.8 -m pip install -U Pyinstaller mutagen pycryptodomex websockets brotli certifi
+            apt -y install zlib1g-dev libffi-dev python3.8 python3.8-dev python3.8-distutils python3-pip
|
||||||
|
python3.8 -m pip install -U pip setuptools wheel
|
||||||
|
# Cannot access any files from the repo directory at this stage
|
||||||
|
python3.8 -m pip install -U Pyinstaller mutagen pycryptodomex websockets brotli certifi secretstorage cffi
|
||||||
|
|
||||||
run: |
|
run: |
|
||||||
cd repo
|
cd repo
|
||||||
python3.8 -m pip install -U Pyinstaller -r requirements.txt # Cached version may be out of date
|
python3.8 devscripts/install_deps.py -o --include build
|
||||||
python3.8 devscripts/update-version.py ${{ needs.prepare.outputs.version_suffix }}
|
python3.8 devscripts/install_deps.py --include pyinstaller --include secretstorage # Cached version may be out of date
|
||||||
python3.8 devscripts/make_lazy_extractors.py
|
python3.8 devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
|
||||||
python3.8 pyinst.py
|
python3.8 devscripts/make_lazy_extractors.py
|
||||||
|
python3.8 -m bundle.pyinstaller
|
||||||
|
|
||||||
- name: Upload artifacts
|
if ${{ vars.UPDATE_TO_VERIFICATION && 'true' || 'false' }}; then
|
||||||
uses: actions/upload-artifact@v3
|
arch="${{ (matrix.architecture == 'armv7' && 'armv7l') || matrix.architecture }}"
|
||||||
with:
|
chmod +x ./dist/yt-dlp_linux_${arch}
|
||||||
path: | # run-on-arch-action designates armv7l as armv7
|
cp ./dist/yt-dlp_linux_${arch} ./dist/yt-dlp_linux_${arch}_downgraded
|
||||||
repo/dist/yt-dlp_linux_${{ (matrix.architecture == 'armv7' && 'armv7l') || matrix.architecture }}
|
version="$(./dist/yt-dlp_linux_${arch} --version)"
|
||||||
|
./dist/yt-dlp_linux_${arch}_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
|
||||||
|
downgraded_version="$(./dist/yt-dlp_linux_${arch}_downgraded --version)"
|
||||||
|
[[ "$version" != "$downgraded_version" ]]
|
||||||
|
fi
|
||||||
|
|
||||||
|
- name: Upload artifacts
|
||||||
|
uses: actions/upload-artifact@v4
|
||||||
|
with:
|
||||||
|
name: build-linux_${{ matrix.architecture }}
|
||||||
|
path: | # run-on-arch-action designates armv7l as armv7
|
||||||
|
repo/dist/yt-dlp_linux_${{ (matrix.architecture == 'armv7' && 'armv7l') || matrix.architecture }}
|
||||||
|
compression-level: 0
|
||||||
|
|
||||||
build_macos:
|
macos:
|
||||||
|
needs: process
|
||||||
|
if: inputs.macos
|
||||||
runs-on: macos-11
|
runs-on: macos-11
|
||||||
needs: prepare
|
|
||||||
|
|
||||||
steps:
|
steps:
|
||||||
- uses: actions/checkout@v3
|
- uses: actions/checkout@v4
|
||||||
# NB: In order to create a universal2 application, the version of python3 in /usr/bin has to be used
|
# NB: Building universal2 does not work with python from actions/setup-python
|
||||||
- name: Install Requirements
|
- name: Install Requirements
|
||||||
run: |
|
run: |
|
||||||
brew install coreutils
|
brew install coreutils
|
||||||
/usr/bin/python3 -m pip install -U --user pip Pyinstaller -r requirements.txt
|
python3 devscripts/install_deps.py --user -o --include build
|
||||||
|
python3 devscripts/install_deps.py --print --include pyinstaller > requirements.txt
|
||||||
|
# We need to ignore wheels otherwise we break universal2 builds
|
||||||
|
python3 -m pip install -U --user --no-binary :all: -r requirements.txt
|
||||||
|
|
||||||
- name: Prepare
|
- name: Prepare
|
||||||
run: |
|
run: |
|
||||||
/usr/bin/python3 devscripts/update-version.py ${{ needs.prepare.outputs.version_suffix }}
|
python3 devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
|
||||||
/usr/bin/python3 devscripts/make_lazy_extractors.py
|
python3 devscripts/make_lazy_extractors.py
|
||||||
- name: Build
|
- name: Build
|
||||||
run: |
|
run: |
|
||||||
/usr/bin/python3 pyinst.py --target-architecture universal2 --onedir
|
python3 -m bundle.pyinstaller --target-architecture universal2 --onedir
|
||||||
(cd ./dist/yt-dlp_macos && zip -r ../yt-dlp_macos.zip .)
|
(cd ./dist/yt-dlp_macos && zip -r ../yt-dlp_macos.zip .)
|
||||||
/usr/bin/python3 pyinst.py --target-architecture universal2
|
python3 -m bundle.pyinstaller --target-architecture universal2
|
||||||
|
|
||||||
- name: Upload artifacts
|
- name: Verify --update-to
|
||||||
uses: actions/upload-artifact@v3
|
if: vars.UPDATE_TO_VERIFICATION
|
||||||
with:
|
run: |
|
||||||
path: |
|
chmod +x ./dist/yt-dlp_macos
|
||||||
dist/yt-dlp_macos
|
cp ./dist/yt-dlp_macos ./dist/yt-dlp_macos_downgraded
|
||||||
dist/yt-dlp_macos.zip
|
version="$(./dist/yt-dlp_macos --version)"
|
||||||
|
./dist/yt-dlp_macos_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
|
||||||
|
downgraded_version="$(./dist/yt-dlp_macos_downgraded --version)"
|
||||||
|
[[ "$version" != "$downgraded_version" ]]
|
||||||
|
|
||||||
|
- name: Upload artifacts
|
||||||
|
uses: actions/upload-artifact@v4
|
||||||
|
with:
|
||||||
|
name: build-${{ github.job }}
|
||||||
|
path: |
|
||||||
|
dist/yt-dlp_macos
|
||||||
|
dist/yt-dlp_macos.zip
|
||||||
|
compression-level: 0
|
||||||
|
|
||||||
build_macos_legacy:
|
macos_legacy:
|
||||||
|
needs: process
|
||||||
|
if: inputs.macos_legacy
|
||||||
runs-on: macos-latest
|
runs-on: macos-latest
|
||||||
needs: prepare
|
|
||||||
|
|
||||||
steps:
|
steps:
|
||||||
- uses: actions/checkout@v3
|
- uses: actions/checkout@v4
|
||||||
- name: Install Python
|
- name: Install Python
|
||||||
# We need the official Python, because the GA ones only support newer macOS versions
|
# We need the official Python, because the GA ones only support newer macOS versions
|
||||||
env:
|
env:
|
||||||
PYTHON_VERSION: 3.10.5
|
PYTHON_VERSION: 3.10.5
|
||||||
MACOSX_DEPLOYMENT_TARGET: 10.9 # Used up by the Python build tools
|
MACOSX_DEPLOYMENT_TARGET: 10.9 # Used up by the Python build tools
|
||||||
run: |
|
run: |
|
||||||
# Hack to get the latest patch version. Uncomment if needed
|
# Hack to get the latest patch version. Uncomment if needed
|
||||||
#brew install python@3.10
|
#brew install python@3.10
|
||||||
#export PYTHON_VERSION=$( $(brew --prefix)/opt/python@3.10/bin/python3 --version | cut -d ' ' -f 2 )
|
#export PYTHON_VERSION=$( $(brew --prefix)/opt/python@3.10/bin/python3 --version | cut -d ' ' -f 2 )
|
||||||
curl https://www.python.org/ftp/python/${PYTHON_VERSION}/python-${PYTHON_VERSION}-macos11.pkg -o "python.pkg"
|
curl https://www.python.org/ftp/python/${PYTHON_VERSION}/python-${PYTHON_VERSION}-macos11.pkg -o "python.pkg"
|
||||||
sudo installer -pkg python.pkg -target /
|
sudo installer -pkg python.pkg -target /
|
||||||
python3 --version
|
python3 --version
|
||||||
- name: Install Requirements
|
- name: Install Requirements
|
||||||
run: |
|
run: |
|
||||||
brew install coreutils
|
brew install coreutils
|
||||||
python3 -m pip install -U --user pip Pyinstaller -r requirements.txt
|
python3 devscripts/install_deps.py --user -o --include build
|
||||||
|
python3 devscripts/install_deps.py --user --include pyinstaller
|
||||||
|
|
||||||
- name: Prepare
|
- name: Prepare
|
||||||
run: |
|
run: |
|
||||||
python3 devscripts/update-version.py ${{ needs.prepare.outputs.version_suffix }}
|
python3 devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
|
||||||
python3 devscripts/make_lazy_extractors.py
|
python3 devscripts/make_lazy_extractors.py
|
||||||
- name: Build
|
- name: Build
|
||||||
run: |
|
run: |
|
||||||
python3 pyinst.py
|
python3 -m bundle.pyinstaller
|
||||||
mv dist/yt-dlp_macos dist/yt-dlp_macos_legacy
|
mv dist/yt-dlp_macos dist/yt-dlp_macos_legacy
|
||||||
|
|
||||||
- name: Upload artifacts
|
- name: Verify --update-to
|
||||||
uses: actions/upload-artifact@v3
|
if: vars.UPDATE_TO_VERIFICATION
|
||||||
with:
|
run: |
|
||||||
path: |
|
chmod +x ./dist/yt-dlp_macos_legacy
|
||||||
dist/yt-dlp_macos_legacy
|
cp ./dist/yt-dlp_macos_legacy ./dist/yt-dlp_macos_legacy_downgraded
|
||||||
|
version="$(./dist/yt-dlp_macos_legacy --version)"
|
||||||
|
./dist/yt-dlp_macos_legacy_downgraded -v --update-to yt-dlp/yt-dlp@2023.03.04
|
||||||
|
downgraded_version="$(./dist/yt-dlp_macos_legacy_downgraded --version)"
|
||||||
|
[[ "$version" != "$downgraded_version" ]]
|
||||||
|
|
||||||
|
- name: Upload artifacts
|
||||||
|
uses: actions/upload-artifact@v4
|
||||||
|
with:
|
||||||
|
name: build-${{ github.job }}
|
||||||
|
path: |
|
||||||
|
dist/yt-dlp_macos_legacy
|
||||||
|
compression-level: 0
|
||||||
|
|
||||||
build_windows:
|
windows:
|
||||||
|
needs: process
|
||||||
|
if: inputs.windows
|
||||||
runs-on: windows-latest
|
runs-on: windows-latest
|
||||||
needs: prepare
|
|
||||||
|
|
||||||
steps:
|
steps:
|
||||||
- uses: actions/checkout@v3
|
- uses: actions/checkout@v4
|
||||||
- uses: actions/setup-python@v4
|
- uses: actions/setup-python@v5
|
||||||
with: # 3.8 is used for Win7 support
|
with: # 3.8 is used for Win7 support
|
||||||
python-version: '3.8'
|
python-version: "3.8"
|
||||||
- name: Install Requirements
|
- name: Install Requirements
|
||||||
run: | # Custom pyinstaller built with https://github.com/yt-dlp/pyinstaller-builds
|
run: | # Custom pyinstaller built with https://github.com/yt-dlp/pyinstaller-builds
|
||||||
python -m pip install -U pip setuptools wheel py2exe
|
python devscripts/install_deps.py -o --include build
|
||||||
pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/x86_64/pyinstaller-5.3-py3-none-any.whl" -r requirements.txt
|
python devscripts/install_deps.py --include py2exe
|
||||||
|
python -m pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/x86_64/pyinstaller-5.8.0-py3-none-any.whl"
|
||||||
|
|
||||||
- name: Prepare
|
- name: Prepare
|
||||||
run: |
|
run: |
|
||||||
python devscripts/update-version.py ${{ needs.prepare.outputs.version_suffix }}
|
python devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
|
||||||
python devscripts/make_lazy_extractors.py
|
python devscripts/make_lazy_extractors.py
|
||||||
- name: Build
|
- name: Build
|
||||||
run: |
|
run: |
|
||||||
python setup.py py2exe
|
python -m bundle.py2exe
|
||||||
Move-Item ./dist/yt-dlp.exe ./dist/yt-dlp_min.exe
|
Move-Item ./dist/yt-dlp.exe ./dist/yt-dlp_min.exe
|
||||||
python pyinst.py
|
python -m bundle.pyinstaller
|
||||||
python pyinst.py --onedir
|
python -m bundle.pyinstaller --onedir
|
||||||
Compress-Archive -Path ./dist/yt-dlp/* -DestinationPath ./dist/yt-dlp_win.zip
|
Compress-Archive -Path ./dist/yt-dlp/* -DestinationPath ./dist/yt-dlp_win.zip
|
||||||
|
|
||||||
- name: Upload artifacts
|
- name: Verify --update-to
|
||||||
uses: actions/upload-artifact@v3
|
if: vars.UPDATE_TO_VERIFICATION
|
||||||
with:
|
run: |
|
||||||
path: |
|
foreach ($name in @("yt-dlp","yt-dlp_min")) {
|
||||||
dist/yt-dlp.exe
|
Copy-Item "./dist/${name}.exe" "./dist/${name}_downgraded.exe"
|
||||||
dist/yt-dlp_min.exe
|
$version = & "./dist/${name}.exe" --version
|
||||||
dist/yt-dlp_win.zip
|
& "./dist/${name}_downgraded.exe" -v --update-to yt-dlp/yt-dlp@2023.03.04
|
||||||
|
$downgraded_version = & "./dist/${name}_downgraded.exe" --version
|
||||||
|
if ($version -eq $downgraded_version) {
|
||||||
|
exit 1
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
- name: Upload artifacts
|
||||||
|
uses: actions/upload-artifact@v4
|
||||||
|
with:
|
||||||
|
name: build-${{ github.job }}
|
||||||
|
path: |
|
||||||
|
dist/yt-dlp.exe
|
||||||
|
dist/yt-dlp_min.exe
|
||||||
|
dist/yt-dlp_win.zip
|
||||||
|
compression-level: 0
|
||||||
|
|
||||||
build_windows32:
|
windows32:
|
||||||
|
needs: process
|
||||||
|
if: inputs.windows32
|
||||||
runs-on: windows-latest
|
runs-on: windows-latest
|
||||||
needs: prepare
|
|
||||||
|
|
||||||
steps:
|
steps:
|
||||||
- uses: actions/checkout@v3
|
- uses: actions/checkout@v4
|
||||||
- uses: actions/setup-python@v4
|
- uses: actions/setup-python@v5
|
||||||
with: # 3.7 is used for Vista support. See https://github.com/yt-dlp/yt-dlp/issues/390
|
with:
|
||||||
python-version: '3.7'
|
python-version: "3.8"
|
||||||
architecture: 'x86'
|
architecture: "x86"
|
||||||
- name: Install Requirements
|
- name: Install Requirements
|
||||||
run: |
|
run: |
|
||||||
python -m pip install -U pip setuptools wheel
|
python devscripts/install_deps.py -o --include build
|
||||||
pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/i686/pyinstaller-5.3-py3-none-any.whl" -r requirements.txt
|
python devscripts/install_deps.py
|
||||||
|
python -m pip install -U "https://yt-dlp.github.io/Pyinstaller-Builds/i686/pyinstaller-5.8.0-py3-none-any.whl"
|
||||||
|
|
||||||
- name: Prepare
|
- name: Prepare
|
||||||
run: |
|
run: |
|
||||||
python devscripts/update-version.py ${{ needs.prepare.outputs.version_suffix }}
|
python devscripts/update-version.py -c "${{ inputs.channel }}" -r "${{ needs.process.outputs.origin }}" "${{ inputs.version }}"
|
||||||
python devscripts/make_lazy_extractors.py
|
python devscripts/make_lazy_extractors.py
|
||||||
- name: Build
|
- name: Build
|
||||||
run: |
|
run: |
|
||||||
python pyinst.py
|
python -m bundle.pyinstaller
|
||||||
|
|
||||||
- name: Upload artifacts
|
- name: Verify --update-to
|
||||||
uses: actions/upload-artifact@v3
|
if: vars.UPDATE_TO_VERIFICATION
|
||||||
with:
|
run: |
|
||||||
path: |
|
foreach ($name in @("yt-dlp_x86")) {
|
||||||
dist/yt-dlp_x86.exe
|
Copy-Item "./dist/${name}.exe" "./dist/${name}_downgraded.exe"
|
||||||
|
$version = & "./dist/${name}.exe" --version
|
||||||
|
& "./dist/${name}_downgraded.exe" -v --update-to yt-dlp/yt-dlp@2023.03.04
|
||||||
|
$downgraded_version = & "./dist/${name}_downgraded.exe" --version
|
||||||
|
if ($version -eq $downgraded_version) {
|
||||||
|
exit 1
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
- name: Upload artifacts
|
||||||
|
uses: actions/upload-artifact@v4
|
||||||
|
with:
|
||||||
|
name: build-${{ github.job }}
|
||||||
|
path: |
|
||||||
|
dist/yt-dlp_x86.exe
|
||||||
|
compression-level: 0
|
||||||
|
|
||||||
publish_release:
|
meta_files:
|
||||||
permissions:
|
if: inputs.meta_files && always() && !cancelled()
|
||||||
contents: write # for action-gh-release
|
needs:
|
||||||
|
- process
|
||||||
|
- unix
|
||||||
|
- linux_arm
|
||||||
|
- macos
|
||||||
|
- macos_legacy
|
||||||
|
- windows
|
||||||
|
- windows32
|
||||||
runs-on: ubuntu-latest
|
runs-on: ubuntu-latest
|
||||||
needs: [prepare, build_unix, build_linux_arm, build_windows, build_windows32, build_macos, build_macos_legacy]
|
|
||||||
|
|
||||||
steps:
|
steps:
|
||||||
- uses: actions/checkout@v3
|
- uses: actions/download-artifact@v4
|
||||||
- uses: actions/download-artifact@v3
|
with:
|
||||||
|
path: artifact
|
||||||
|
pattern: build-*
|
||||||
|
merge-multiple: true
|
||||||
|
|
||||||
- name: Get Changelog
|
- name: Make SHA2-SUMS files
|
||||||
run: |
|
run: |
|
||||||
changelog=$(grep -oPz '(?s)(?<=### ${{ needs.prepare.outputs.ytdlp_version }}\n{2}).+?(?=\n{2,3}###)' Changelog.md) || true
|
cd ./artifact/
|
||||||
echo "changelog<<EOF" >> $GITHUB_ENV
|
sha256sum * > ../SHA2-256SUMS
|
||||||
echo "$changelog" >> $GITHUB_ENV
|
sha512sum * > ../SHA2-512SUMS
|
||||||
echo "EOF" >> $GITHUB_ENV
|
|
||||||
- name: Make Update spec
|
|
||||||
run: |
|
|
||||||
echo "# This file is used for regulating self-update" >> _update_spec
|
|
||||||
echo "lock 2022.07.18 .+ Python 3.6" >> _update_spec
|
|
||||||
- name: Make SHA2-SUMS files
|
|
||||||
run: |
|
|
||||||
sha256sum artifact/yt-dlp | awk '{print $1 " yt-dlp"}' >> SHA2-256SUMS
|
|
||||||
sha256sum artifact/yt-dlp.tar.gz | awk '{print $1 " yt-dlp.tar.gz"}' >> SHA2-256SUMS
|
|
||||||
sha256sum artifact/yt-dlp.exe | awk '{print $1 " yt-dlp.exe"}' >> SHA2-256SUMS
|
|
||||||
sha256sum artifact/yt-dlp_win.zip | awk '{print $1 " yt-dlp_win.zip"}' >> SHA2-256SUMS
|
|
||||||
sha256sum artifact/yt-dlp_min.exe | awk '{print $1 " yt-dlp_min.exe"}' >> SHA2-256SUMS
|
|
||||||
sha256sum artifact/yt-dlp_x86.exe | awk '{print $1 " yt-dlp_x86.exe"}' >> SHA2-256SUMS
|
|
||||||
sha256sum artifact/yt-dlp_macos | awk '{print $1 " yt-dlp_macos"}' >> SHA2-256SUMS
|
|
||||||
sha256sum artifact/yt-dlp_macos.zip | awk '{print $1 " yt-dlp_macos.zip"}' >> SHA2-256SUMS
|
|
||||||
sha256sum artifact/yt-dlp_macos_legacy | awk '{print $1 " yt-dlp_macos_legacy"}' >> SHA2-256SUMS
|
|
||||||
sha256sum artifact/yt-dlp_linux_armv7l | awk '{print $1 " yt-dlp_linux_armv7l"}' >> SHA2-256SUMS
|
|
||||||
sha256sum artifact/yt-dlp_linux_aarch64 | awk '{print $1 " yt-dlp_linux_aarch64"}' >> SHA2-256SUMS
|
|
||||||
sha256sum artifact/dist/yt-dlp_linux | awk '{print $1 " yt-dlp_linux"}' >> SHA2-256SUMS
|
|
||||||
sha256sum artifact/dist/yt-dlp_linux.zip | awk '{print $1 " yt-dlp_linux.zip"}' >> SHA2-256SUMS
|
|
||||||
sha512sum artifact/yt-dlp | awk '{print $1 " yt-dlp"}' >> SHA2-512SUMS
|
|
||||||
sha512sum artifact/yt-dlp.tar.gz | awk '{print $1 " yt-dlp.tar.gz"}' >> SHA2-512SUMS
|
|
||||||
sha512sum artifact/yt-dlp.exe | awk '{print $1 " yt-dlp.exe"}' >> SHA2-512SUMS
|
|
||||||
sha512sum artifact/yt-dlp_win.zip | awk '{print $1 " yt-dlp_win.zip"}' >> SHA2-512SUMS
|
|
||||||
sha512sum artifact/yt-dlp_min.exe | awk '{print $1 " yt-dlp_min.exe"}' >> SHA2-512SUMS
|
|
||||||
sha512sum artifact/yt-dlp_x86.exe | awk '{print $1 " yt-dlp_x86.exe"}' >> SHA2-512SUMS
|
|
||||||
sha512sum artifact/yt-dlp_macos | awk '{print $1 " yt-dlp_macos"}' >> SHA2-512SUMS
|
|
||||||
sha512sum artifact/yt-dlp_macos.zip | awk '{print $1 " yt-dlp_macos.zip"}' >> SHA2-512SUMS
|
|
||||||
sha512sum artifact/yt-dlp_macos_legacy | awk '{print $1 " yt-dlp_macos_legacy"}' >> SHA2-512SUMS
|
|
||||||
sha512sum artifact/yt-dlp_linux_armv7l | awk '{print $1 " yt-dlp_linux_armv7l"}' >> SHA2-512SUMS
|
|
||||||
sha512sum artifact/yt-dlp_linux_aarch64 | awk '{print $1 " yt-dlp_linux_aarch64"}' >> SHA2-512SUMS
|
|
||||||
sha512sum artifact/dist/yt-dlp_linux | awk '{print $1 " yt-dlp_linux"}' >> SHA2-512SUMS
|
|
||||||
sha512sum artifact/dist/yt-dlp_linux.zip | awk '{print $1 " yt-dlp_linux.zip"}' >> SHA2-512SUMS
|
|
||||||
|
|
||||||
- name: Publish Release
|
- name: Make Update spec
|
||||||
uses: yt-dlp/action-gh-release@v1
|
run: |
|
||||||
with:
|
cat >> _update_spec << EOF
|
||||||
tag_name: ${{ needs.prepare.outputs.ytdlp_version }}
|
# This file is used for regulating self-update
|
||||||
name: yt-dlp ${{ needs.prepare.outputs.ytdlp_version }}
|
lock 2022.08.18.36 .+ Python 3\.6
|
||||||
target_commitish: ${{ needs.prepare.outputs.head_sha }}
|
lock 2023.11.16 (?!win_x86_exe).+ Python 3\.7
|
||||||
body: |
|
lock 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
|
||||||
#### [A description of the various files]((https://github.com/yt-dlp/yt-dlp#release-files)) are in the README
|
lockV2 yt-dlp/yt-dlp 2022.08.18.36 .+ Python 3\.6
|
||||||
|
lockV2 yt-dlp/yt-dlp 2023.11.16 (?!win_x86_exe).+ Python 3\.7
|
||||||
|
lockV2 yt-dlp/yt-dlp 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
|
||||||
|
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 (?!win_x86_exe).+ Python 3\.7
|
||||||
|
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 win_x86_exe .+ Windows-(?:Vista|2008Server)
|
||||||
|
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 (?!win_x86_exe).+ Python 3\.7
|
||||||
|
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 win_x86_exe .+ Windows-(?:Vista|2008Server)
|
||||||
|
EOF
|
||||||
|
|
||||||
---
|
- name: Sign checksum files
|
||||||
<details open><summary><h3>Changelog</summary>
|
env:
|
||||||
<p>
|
GPG_SIGNING_KEY: ${{ secrets.GPG_SIGNING_KEY }}
|
||||||
|
if: env.GPG_SIGNING_KEY != ''
|
||||||
|
run: |
|
||||||
|
gpg --batch --import <<< "${{ secrets.GPG_SIGNING_KEY }}"
|
||||||
|
for signfile in ./SHA*SUMS; do
|
||||||
|
gpg --batch --detach-sign "$signfile"
|
||||||
|
done
|
||||||
|
|
||||||
${{ env.changelog }}
|
- name: Upload artifacts
|
||||||
|
uses: actions/upload-artifact@v4
|
||||||
</p>
|
with:
|
||||||
</details>
|
name: build-${{ github.job }}
|
||||||
files: |
|
path: |
|
||||||
SHA2-256SUMS
|
_update_spec
|
||||||
SHA2-512SUMS
|
SHA*SUMS*
|
||||||
artifact/yt-dlp
|
compression-level: 0
|
||||||
artifact/yt-dlp.tar.gz
|
|
||||||
artifact/yt-dlp.exe
|
|
||||||
artifact/yt-dlp_win.zip
|
|
||||||
artifact/yt-dlp_min.exe
|
|
||||||
artifact/yt-dlp_x86.exe
|
|
||||||
artifact/yt-dlp_macos
|
|
||||||
artifact/yt-dlp_macos.zip
|
|
||||||
artifact/yt-dlp_macos_legacy
|
|
||||||
artifact/yt-dlp_linux_armv7l
|
|
||||||
artifact/yt-dlp_linux_aarch64
|
|
||||||
artifact/dist/yt-dlp_linux
|
|
||||||
artifact/dist/yt-dlp_linux.zip
|
|
||||||
_update_spec
|
|
||||||
|
|
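The `_update_spec` generated above regulates self-updates: each `lock`/`lockV2` line pairs a last-allowed version with a regular expression that the updater matches against a description of the running build (variant, Python version, OS). As a minimal sketch of how such lock lines could be evaluated — the client-descriptor format below is an assumption made for illustration; the authoritative logic lives in `yt_dlp/update.py`:

```python
import re

# Two lock lines copied from the _update_spec heredoc above
SPEC = '''\
# This file is used for regulating self-update
lock 2023.11.16 (?!win_x86_exe).+ Python 3\\.7
lock 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
'''


def parse_version(version):
    # '2023.11.16' -> (2023, 11, 16), so comparison is numeric, not lexicographic
    return tuple(int(part) for part in version.split('.'))


def max_allowed_version(spec, descriptor, requested):
    """Return the newest version `descriptor` may update to under `spec`.

    `descriptor` is assumed to be the variant plus runtime details,
    e.g. 'zip Python 3.7.9'; the real updater's string may differ.
    """
    for line in spec.splitlines():
        fields = line.split(maxsplit=2)
        if len(fields) != 3 or fields[0] != 'lock':
            continue  # skip comments; lockV2 (repo-qualified) is omitted for brevity
        _, locked, pattern = fields
        # re.match anchors at the start only, so patterns behave as prefixes
        if re.match(pattern, descriptor) and parse_version(requested) > parse_version(locked):
            return locked  # this build is pinned at the locked version
    return requested


print(max_allowed_version(SPEC, 'zip Python 3.7.9', '2024.03.10'))   # -> 2023.11.16
print(max_allowed_version(SPEC, 'zip Python 3.11.5', '2024.03.10'))  # -> 2024.03.10
```

Expressing locks as regexes lets one spec line retire an entire class of builds (for example every `win_x86_exe` running on Vista) without enumerating individual artifacts.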
65 .github/workflows/codeql.yml vendored Normal file
@@ -0,0 +1,65 @@
+name: "CodeQL"
+
+on:
+  push:
+    branches: [ 'master', 'gh-pages', 'release' ]
+  pull_request:
+    # The branches below must be a subset of the branches above
+    branches: [ 'master' ]
+  schedule:
+    - cron: '59 11 * * 5'
+
+jobs:
+  analyze:
+    name: Analyze
+    runs-on: ubuntu-latest
+    permissions:
+      actions: read
+      contents: read
+      security-events: write
+
+    strategy:
+      fail-fast: false
+      matrix:
+        language: [ 'python' ]
+        # CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]
+        # Use only 'java' to analyze code written in Java, Kotlin or both
+        # Use only 'javascript' to analyze code written in JavaScript, TypeScript or both
+        # Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support
+
+    steps:
+      - name: Checkout repository
+        uses: actions/checkout@v4
+
+      # Initializes the CodeQL tools for scanning.
+      - name: Initialize CodeQL
+        uses: github/codeql-action/init@v2
+        with:
+          languages: ${{ matrix.language }}
+          # If you wish to specify custom queries, you can do so here or in a config file.
+          # By default, queries listed here will override any specified in a config file.
+          # Prefix the list here with "+" to use these queries and those in the config file.
+
+          # For more details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
+          # queries: security-extended,security-and-quality
+
+      # Autobuild attempts to build any compiled languages (C/C++, C#, Go, Java, or Swift).
+      # If this step fails, then you should remove it and run the build manually (see below)
+      - name: Autobuild
+        uses: github/codeql-action/autobuild@v2
+
+      # ℹ️ Command-line programs to run using the OS shell.
+      # 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
+
+      # If the Autobuild fails above, remove it and uncomment the following three lines.
+      # modify them (or add more) to build your code if your project, please refer to the EXAMPLE below for guidance.
+
+      # - run: |
+      #     echo "Run, Build Application using script"
+      #     ./location_of_script_within_repo/buildscript.sh
+
+      - name: Perform CodeQL Analysis
+        uses: github/codeql-action/analyze@v2
+        with:
+          category: "/language:${{matrix.language}}"
47 .github/workflows/core.yml vendored
@@ -1,8 +1,32 @@
 name: Core Tests
-on: [push, pull_request]
+on:
+  push:
+    paths:
+      - .github/**
+      - devscripts/**
+      - test/**
+      - yt_dlp/**.py
+      - '!yt_dlp/extractor/*.py'
+      - yt_dlp/extractor/__init__.py
+      - yt_dlp/extractor/common.py
+      - yt_dlp/extractor/extractors.py
+  pull_request:
+    paths:
+      - .github/**
+      - devscripts/**
+      - test/**
+      - yt_dlp/**.py
+      - '!yt_dlp/extractor/*.py'
+      - yt_dlp/extractor/__init__.py
+      - yt_dlp/extractor/common.py
+      - yt_dlp/extractor/extractors.py
 permissions:
   contents: read
+
+concurrency:
+  group: core-${{ github.event.pull_request.number || github.ref }}
+  cancel-in-progress: ${{ github.event_name == 'pull_request' }}
+
 jobs:
   tests:
     name: Core Tests
@@ -12,27 +36,26 @@ jobs:
       fail-fast: false
       matrix:
         os: [ubuntu-latest]
-        # CPython 3.11 is in quick-test
-        python-version: ['3.8', '3.9', '3.10', pypy-3.7, pypy-3.8]
-        run-tests-ext: [sh]
+        # CPython 3.8 is in quick-test
+        python-version: ['3.9', '3.10', '3.11', '3.12', pypy-3.8, pypy-3.10]
         include:
           # atleast one of each CPython/PyPy tests must be in windows
           - os: windows-latest
-            python-version: '3.7'
-            run-tests-ext: bat
+            python-version: '3.8'
+          - os: windows-latest
+            python-version: '3.12'
           - os: windows-latest
             python-version: pypy-3.9
-            run-tests-ext: bat
     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
       - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v4
+        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
-      - name: Install pytest
-        run: pip install pytest
+      - name: Install test requirements
+        run: python3 ./devscripts/install_deps.py --include dev
       - name: Run tests
        continue-on-error: False
        run: |
          python3 -m yt_dlp -v || true # Print debug head
-          ./devscripts/run_tests.${{ matrix.run-tests-ext }} core
+          python3 ./devscripts/run_tests.py core
23 .github/workflows/download.yml vendored
@@ -9,16 +9,16 @@ jobs:
     if: "contains(github.event.head_commit.message, 'ci run dl')"
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
       - name: Set up Python
-        uses: actions/setup-python@v4
+        uses: actions/setup-python@v5
         with:
           python-version: 3.9
       - name: Install test requirements
-        run: pip install pytest
+        run: python3 ./devscripts/install_deps.py --include dev
       - name: Run tests
         continue-on-error: true
-        run: ./devscripts/run_tests.sh download
+        run: python3 ./devscripts/run_tests.py download

   full:
     name: Full Download Tests
@@ -28,24 +28,21 @@ jobs:
       fail-fast: true
       matrix:
         os: [ubuntu-latest]
-        python-version: ['3.7', '3.10', 3.11-dev, pypy-3.7, pypy-3.8]
-        run-tests-ext: [sh]
+        python-version: ['3.10', '3.11', '3.12', pypy-3.8, pypy-3.10]
         include:
           # atleast one of each CPython/PyPy tests must be in windows
           - os: windows-latest
             python-version: '3.8'
-            run-tests-ext: bat
           - os: windows-latest
             python-version: pypy-3.9
-            run-tests-ext: bat
     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
       - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v4
+        uses: actions/setup-python@v5
         with:
           python-version: ${{ matrix.python-version }}
-      - name: Install pytest
-        run: pip install pytest
+      - name: Install test requirements
+        run: python3 ./devscripts/install_deps.py --include dev
       - name: Run tests
         continue-on-error: true
-        run: ./devscripts/run_tests.${{ matrix.run-tests-ext }} download
+        run: python3 ./devscripts/run_tests.py download
20 .github/workflows/quick-test.yml vendored
@@ -9,27 +9,27 @@ jobs:
     if: "!contains(github.event.head_commit.message, 'ci skip all')"
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
-      - name: Set up Python 3.11
-        uses: actions/setup-python@v4
+      - name: Set up Python 3.8
+        uses: actions/setup-python@v5
         with:
-          python-version: '3.11'
+          python-version: '3.8'
       - name: Install test requirements
-        run: pip install pytest pycryptodomex
+        run: python3 ./devscripts/install_deps.py --include dev
       - name: Run tests
         run: |
           python3 -m yt_dlp -v || true
-          ./devscripts/run_tests.sh core
+          python3 ./devscripts/run_tests.py core
   flake8:
     name: Linter
     if: "!contains(github.event.head_commit.message, 'ci skip all')"
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
-      - uses: actions/setup-python@v4
+      - uses: actions/setup-python@v5
       - name: Install flake8
-        run: pip install flake8
+        run: python3 ./devscripts/install_deps.py -o --include dev
       - name: Make lazy extractors
-        run: python devscripts/make_lazy_extractors.py
+        run: python3 ./devscripts/make_lazy_extractors.py
       - name: Run flake8
         run: flake8 .
29 .github/workflows/release-master.yml vendored Normal file
@@ -0,0 +1,29 @@
+name: Release (master)
+on:
+  push:
+    branches:
+      - master
+    paths:
+      - "yt_dlp/**.py"
+      - "!yt_dlp/version.py"
+      - "bundle/*.py"
+      - "pyproject.toml"
+      - "Makefile"
+      - ".github/workflows/build.yml"
+concurrency:
+  group: release-master
+permissions:
+  contents: read
+
+jobs:
+  release:
+    if: vars.BUILD_MASTER != ''
+    uses: ./.github/workflows/release.yml
+    with:
+      prerelease: true
+      source: master
+    permissions:
+      contents: write
+      packages: write
+      id-token: write # mandatory for trusted publishing
+    secrets: inherit
42 .github/workflows/release-nightly.yml vendored Normal file
@@ -0,0 +1,42 @@
+name: Release (nightly)
+on:
+  schedule:
+    - cron: '23 23 * * *'
+permissions:
+  contents: read
+
+jobs:
+  check_nightly:
+    if: vars.BUILD_NIGHTLY != ''
+    runs-on: ubuntu-latest
+    outputs:
+      commit: ${{ steps.check_for_new_commits.outputs.commit }}
+    steps:
+      - uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+      - name: Check for new commits
+        id: check_for_new_commits
+        run: |
+          relevant_files=(
+            "yt_dlp/*.py"
+            ':!yt_dlp/version.py'
+            "bundle/*.py"
+            "pyproject.toml"
+            "Makefile"
+            ".github/workflows/build.yml"
+          )
+          echo "commit=$(git log --format=%H -1 --since="24 hours ago" -- "${relevant_files[@]}")" | tee "$GITHUB_OUTPUT"
+
+  release:
+    needs: [check_nightly]
+    if: ${{ needs.check_nightly.outputs.commit }}
+    uses: ./.github/workflows/release.yml
+    with:
+      prerelease: true
+      source: nightly
+    permissions:
+      contents: write
+      packages: write
+      id-token: write # mandatory for trusted publishing
+    secrets: inherit
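The nightly gate above only triggers a release when something relevant changed in the last day, using a `git log` pathspec where `':!...'` excludes a path. The same check, sketched in Python for clarity (assuming it runs inside a full-history clone, as with `fetch-depth: 0`):

```python
import subprocess

# Pathspecs mirroring `relevant_files` above; ':!yt_dlp/version.py' excludes
# the auto-generated version file so version bumps alone do not count
RELEVANT_FILES = [
    'yt_dlp/*.py', ':!yt_dlp/version.py',
    'bundle/*.py', 'pyproject.toml', 'Makefile',
    '.github/workflows/build.yml',
]


def last_relevant_commit(since='24 hours ago'):
    """Return the newest commit hash touching the relevant files, or '' if none."""
    result = subprocess.run(
        ['git', 'log', '--format=%H', '-1', f'--since={since}', '--', *RELEVANT_FILES],
        capture_output=True, text=True, check=True)
    return result.stdout.strip()


if __name__ == '__main__':
    commit = last_relevant_commit()
    print(commit or 'no new commits; skipping nightly release')
```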
387 .github/workflows/release.yml vendored Normal file
@@ -0,0 +1,387 @@
+name: Release
+on:
+  workflow_call:
+    inputs:
+      prerelease:
+        required: false
+        default: true
+        type: boolean
+      source:
+        required: false
+        default: ''
+        type: string
+      target:
+        required: false
+        default: ''
+        type: string
+      version:
+        required: false
+        default: ''
+        type: string
+  workflow_dispatch:
+    inputs:
+      source:
+        description: |
+          SOURCE of this release's updates:
+          channel, repo, tag, or channel/repo@tag
+          (default: <current_repo>)
+        required: false
+        default: ''
+        type: string
+      target:
+        description: |
+          TARGET to publish this release to:
+          channel, tag, or channel@tag
+          (default: <source> if writable else <current_repo>[@source_tag])
+        required: false
+        default: ''
+        type: string
+      version:
+        description: |
+          VERSION: yyyy.mm.dd[.rev] or rev
+          (default: auto-generated)
+        required: false
+        default: ''
+        type: string
+      prerelease:
+        description: Pre-release
+        default: false
+        type: boolean
+
+permissions:
+  contents: read
+
+jobs:
+  prepare:
+    permissions:
+      contents: write
+    runs-on: ubuntu-latest
+    outputs:
+      channel: ${{ steps.setup_variables.outputs.channel }}
+      version: ${{ steps.setup_variables.outputs.version }}
+      target_repo: ${{ steps.setup_variables.outputs.target_repo }}
+      target_repo_token: ${{ steps.setup_variables.outputs.target_repo_token }}
+      target_tag: ${{ steps.setup_variables.outputs.target_tag }}
+      pypi_project: ${{ steps.setup_variables.outputs.pypi_project }}
+      pypi_suffix: ${{ steps.setup_variables.outputs.pypi_suffix }}
+      head_sha: ${{ steps.get_target.outputs.head_sha }}
+
+    steps:
+      - uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+
+      - uses: actions/setup-python@v5
+        with:
+          python-version: "3.10"
+
+      - name: Process inputs
+        id: process_inputs
+        run: |
+          cat << EOF
+          ::group::Inputs
+          prerelease=${{ inputs.prerelease }}
+          source=${{ inputs.source }}
+          target=${{ inputs.target }}
+          version=${{ inputs.version }}
+          ::endgroup::
+          EOF
+          IFS='@' read -r source_repo source_tag <<<"${{ inputs.source }}"
+          IFS='@' read -r target_repo target_tag <<<"${{ inputs.target }}"
+          cat << EOF >> "$GITHUB_OUTPUT"
+          source_repo=${source_repo}
+          source_tag=${source_tag}
+          target_repo=${target_repo}
+          target_tag=${target_tag}
+          EOF
+
+      - name: Setup variables
+        id: setup_variables
+        env:
+          source_repo: ${{ steps.process_inputs.outputs.source_repo }}
+          source_tag: ${{ steps.process_inputs.outputs.source_tag }}
+          target_repo: ${{ steps.process_inputs.outputs.target_repo }}
+          target_tag: ${{ steps.process_inputs.outputs.target_tag }}
+        run: |
+          # unholy bash monstrosity (sincere apologies)
+          fallback_token () {
+            if ${{ !secrets.ARCHIVE_REPO_TOKEN }}; then
+              echo "::error::Repository access secret ${target_repo_token^^} not found"
+              exit 1
+            fi
+            target_repo_token=ARCHIVE_REPO_TOKEN
+            return 0
+          }
+
+          source_is_channel=0
+          [[ "${source_repo}" == 'stable' ]] && source_repo='yt-dlp/yt-dlp'
+          if [[ -z "${source_repo}" ]]; then
+            source_repo='${{ github.repository }}'
+          elif [[ '${{ vars[format('{0}_archive_repo', env.source_repo)] }}' ]]; then
+            source_is_channel=1
+            source_channel='${{ vars[format('{0}_archive_repo', env.source_repo)] }}'
+          elif [[ -z "${source_tag}" && "${source_repo}" != */* ]]; then
+            source_tag="${source_repo}"
+            source_repo='${{ github.repository }}'
+          fi
+          resolved_source="${source_repo}"
+          if [[ "${source_tag}" ]]; then
+            resolved_source="${resolved_source}@${source_tag}"
+          elif [[ "${source_repo}" == 'yt-dlp/yt-dlp' ]]; then
+            resolved_source='stable'
+          fi
+
+          revision="${{ (inputs.prerelease || !vars.PUSH_VERSION_COMMIT) && '$(date -u +"%H%M%S")' || '' }}"
+          version="$(
+            python devscripts/update-version.py \
+              -c "${resolved_source}" -r "${{ github.repository }}" ${{ inputs.version || '$revision' }} | \
+              grep -Po "version=\K\d+\.\d+\.\d+(\.\d+)?")"
+
+          if [[ "${target_repo}" ]]; then
+            if [[ -z "${target_tag}" ]]; then
+              if [[ '${{ vars[format('{0}_archive_repo', env.target_repo)] }}' ]]; then
+                target_tag="${source_tag:-${version}}"
+              else
+                target_tag="${target_repo}"
+                target_repo='${{ github.repository }}'
+              fi
+            fi
+            if [[ "${target_repo}" != '${{ github.repository}}' ]]; then
+              target_repo='${{ vars[format('{0}_archive_repo', env.target_repo)] }}'
+              target_repo_token='${{ env.target_repo }}_archive_repo_token'
+              ${{ !!secrets[format('{0}_archive_repo_token', env.target_repo)] }} || fallback_token
+              pypi_project='${{ vars[format('{0}_pypi_project', env.target_repo)] }}'
+              pypi_suffix='${{ vars[format('{0}_pypi_suffix', env.target_repo)] }}'
+            fi
+          else
+            target_tag="${source_tag:-${version}}"
+            if ((source_is_channel)); then
+              target_repo="${source_channel}"
+              target_repo_token='${{ env.source_repo }}_archive_repo_token'
+              ${{ !!secrets[format('{0}_archive_repo_token', env.source_repo)] }} || fallback_token
+              pypi_project='${{ vars[format('{0}_pypi_project', env.source_repo)] }}'
+              pypi_suffix='${{ vars[format('{0}_pypi_suffix', env.source_repo)] }}'
+            else
+              target_repo='${{ github.repository }}'
+            fi
+          fi
+
+          if [[ "${target_repo}" == '${{ github.repository }}' ]] && ${{ !inputs.prerelease }}; then
+            pypi_project='${{ vars.PYPI_PROJECT }}'
+          fi
+
+          echo "::group::Output variables"
+          cat << EOF | tee -a "$GITHUB_OUTPUT"
+          channel=${resolved_source}
+          version=${version}
+          target_repo=${target_repo}
+          target_repo_token=${target_repo_token}
+          target_tag=${target_tag}
+          pypi_project=${pypi_project}
+          pypi_suffix=${pypi_suffix}
+          EOF
+          echo "::endgroup::"
+
+      - name: Update documentation
+        env:
+          version: ${{ steps.setup_variables.outputs.version }}
+          target_repo: ${{ steps.setup_variables.outputs.target_repo }}
+        if: |
+          !inputs.prerelease && env.target_repo == github.repository
+        run: |
+          make doc
+          sed '/### /Q' Changelog.md >> ./CHANGELOG
+          echo '### ${{ env.version }}' >> ./CHANGELOG
+          python ./devscripts/make_changelog.py -vv -c >> ./CHANGELOG
+          echo >> ./CHANGELOG
+          grep -Poz '(?s)### \d+\.\d+\.\d+.+' 'Changelog.md' | head -n -1 >> ./CHANGELOG
+          cat ./CHANGELOG > Changelog.md
+
+      - name: Push to release
+        id: push_release
+        env:
+          version: ${{ steps.setup_variables.outputs.version }}
+          target_repo: ${{ steps.setup_variables.outputs.target_repo }}
+        if: |
+          !inputs.prerelease && env.target_repo == github.repository
+        run: |
+          git config --global user.name "github-actions[bot]"
+          git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
+          git add -u
+          git commit -m "Release ${{ env.version }}" \
+            -m "Created by: ${{ github.event.sender.login }}" -m ":ci skip all :ci run dl"
+          git push origin --force ${{ github.event.ref }}:release
+
+      - name: Get target commitish
+        id: get_target
+        run: |
+          echo "head_sha=$(git rev-parse HEAD)" >> "$GITHUB_OUTPUT"
+
+      - name: Update master
+        env:
+          target_repo: ${{ steps.setup_variables.outputs.target_repo }}
+        if: |
+          vars.PUSH_VERSION_COMMIT != '' && !inputs.prerelease && env.target_repo == github.repository
+        run: git push origin ${{ github.event.ref }}
+
+  build:
+    needs: prepare
+    uses: ./.github/workflows/build.yml
+    with:
+      version: ${{ needs.prepare.outputs.version }}
+      channel: ${{ needs.prepare.outputs.channel }}
+      origin: ${{ needs.prepare.outputs.target_repo }}
+    permissions:
+      contents: read
+      packages: write # For package cache
+    secrets:
+      GPG_SIGNING_KEY: ${{ secrets.GPG_SIGNING_KEY }}
+
+  publish_pypi:
+    needs: [prepare, build]
+    if: ${{ needs.prepare.outputs.pypi_project }}
+    runs-on: ubuntu-latest
+    permissions:
+      id-token: write # mandatory for trusted publishing
+
+    steps:
+      - uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+      - uses: actions/setup-python@v5
+        with:
+          python-version: "3.10"
+
+      - name: Install Requirements
+        run: |
+          sudo apt -y install pandoc man
+          python devscripts/install_deps.py -o --include build
+
+      - name: Prepare
+        env:
+          version: ${{ needs.prepare.outputs.version }}
+          suffix: ${{ needs.prepare.outputs.pypi_suffix }}
+          channel: ${{ needs.prepare.outputs.channel }}
+          target_repo: ${{ needs.prepare.outputs.target_repo }}
+          pypi_project: ${{ needs.prepare.outputs.pypi_project }}
+        run: |
+          python devscripts/update-version.py -c "${{ env.channel }}" -r "${{ env.target_repo }}" -s "${{ env.suffix }}" "${{ env.version }}"
+          python devscripts/make_lazy_extractors.py
+          sed -i -E '0,/(name = ")[^"]+(")/s//\1${{ env.pypi_project }}\2/' pyproject.toml
+
+      - name: Build
+        run: |
+          rm -rf dist/*
+          make pypi-files
+          printf '%s\n\n' \
+            'Official repository: <https://github.com/yt-dlp/yt-dlp>' \
+            '**PS**: Some links in this document will not work since this is a copy of the README.md from Github' > ./README.md.new
+          cat ./README.md >> ./README.md.new && mv -f ./README.md.new ./README.md
+          python devscripts/set-variant.py pip -M "You installed yt-dlp with pip or using the wheel from PyPi; Use that to update"
+          make clean-cache
+          python -m build --no-isolation .
+
+      - name: Publish to PyPI
+        uses: pypa/gh-action-pypi-publish@release/v1
+        with:
+          verbose: true
+
+  publish:
+    needs: [prepare, build]
+    permissions:
+      contents: write
+    runs-on: ubuntu-latest
+
+    steps:
+      - uses: actions/checkout@v4
+        with:
+          fetch-depth: 0
+      - uses: actions/download-artifact@v4
+        with:
+          path: artifact
+          pattern: build-*
+          merge-multiple: true
+      - uses: actions/setup-python@v5
+        with:
+          python-version: "3.10"
+
+      - name: Generate release notes
+        env:
+          head_sha: ${{ needs.prepare.outputs.head_sha }}
+          target_repo: ${{ needs.prepare.outputs.target_repo }}
+          target_tag: ${{ needs.prepare.outputs.target_tag }}
+        run: |
+          printf '%s' \
+            '[![Installation](https://img.shields.io/badge/-Which%20file%20should%20I%20download%3F-white.svg?style=for-the-badge)]' \
+            '(https://github.com/${{ github.repository }}#installation "Installation instructions") ' \
+            '[![Documentation](https://img.shields.io/badge/-Docs-brightgreen.svg?style=for-the-badge&logo=GitBook&labelColor=555555)]' \
+            '(https://github.com/${{ github.repository }}' \
+            '${{ env.target_repo == github.repository && format('/tree/{0}', env.target_tag) || '' }}#readme "Documentation") ' \
+            '[![Donate](https://img.shields.io/badge/_-Donate-red.svg?logo=githubsponsors&labelColor=555555&style=for-the-badge)]' \
+            '(https://github.com/yt-dlp/yt-dlp/blob/master/Collaborators.md#collaborators "Donate") ' \
+            '[![Discord](https://img.shields.io/discord/807245652072857610?color=blue&labelColor=555555&label=&logo=discord&style=for-the-badge)]' \
+            '(https://discord.gg/H5MNcFW63r "Discord") ' \
+            ${{ env.target_repo == 'yt-dlp/yt-dlp' && '\
+            "[![Nightly](https://img.shields.io/badge/Get%20nightly%20builds-purple.svg?style=for-the-badge)]" \
+            "(https://github.com/yt-dlp/yt-dlp-nightly-builds/releases/latest \"Nightly builds\") " \
+            "[![Master](https://img.shields.io/badge/Get%20master%20builds-lightblue.svg?style=for-the-badge)]" \
+            "(https://github.com/yt-dlp/yt-dlp-master-builds/releases/latest \"Master builds\")"' || '' }} > ./RELEASE_NOTES
+          printf '\n\n' >> ./RELEASE_NOTES
+          cat >> ./RELEASE_NOTES << EOF
+          #### A description of the various files are in the [README](https://github.com/${{ github.repository }}#release-files)
+          ---
+          $(python ./devscripts/make_changelog.py -vv --collapsible)
+          EOF
+          printf '%s\n\n' '**This is a pre-release build**' >> ./PRERELEASE_NOTES
+          cat ./RELEASE_NOTES >> ./PRERELEASE_NOTES
+          printf '%s\n\n' 'Generated from: https://github.com/${{ github.repository }}/commit/${{ env.head_sha }}' >> ./ARCHIVE_NOTES
+          cat ./RELEASE_NOTES >> ./ARCHIVE_NOTES
+
+      - name: Publish to archive repo
+        env:
+          GH_TOKEN: ${{ secrets[needs.prepare.outputs.target_repo_token] }}
+          GH_REPO: ${{ needs.prepare.outputs.target_repo }}
+          version: ${{ needs.prepare.outputs.version }}
+          channel: ${{ needs.prepare.outputs.channel }}
+        if: |
+          inputs.prerelease && env.GH_TOKEN != '' && env.GH_REPO != '' && env.GH_REPO != github.repository
+        run: |
+          title="${{ startswith(env.GH_REPO, 'yt-dlp/') && 'yt-dlp ' || '' }}${{ env.channel }}"
+          gh release create \
+            --notes-file ARCHIVE_NOTES \
+            --title "${title} ${{ env.version }}" \
+            ${{ env.version }} \
+            artifact/*
+
+      - name: Prune old release
+        env:
+          GH_TOKEN: ${{ github.token }}
+          version: ${{ needs.prepare.outputs.version }}
+          target_repo: ${{ needs.prepare.outputs.target_repo }}
+          target_tag: ${{ needs.prepare.outputs.target_tag }}
+        if: |
+          env.target_repo == github.repository && env.target_tag != env.version
+        run: |
+          gh release delete --yes --cleanup-tag "${{ env.target_tag }}" || true
+          git tag --delete "${{ env.target_tag }}" || true
+          sleep 5 # Enough time to cover deletion race condition
+
+      - name: Publish release
+        env:
+          GH_TOKEN: ${{ github.token }}
+          version: ${{ needs.prepare.outputs.version }}
+          target_repo: ${{ needs.prepare.outputs.target_repo }}
+          target_tag: ${{ needs.prepare.outputs.target_tag }}
+          head_sha: ${{ needs.prepare.outputs.head_sha }}
+        if: |
+          env.target_repo == github.repository
+        run: |
+          title="${{ github.repository == 'yt-dlp/yt-dlp' && 'yt-dlp ' || '' }}"
+          title+="${{ env.target_tag != env.version && format('{0} ', env.target_tag) || '' }}"
+          gh release create \
+            --notes-file ${{ inputs.prerelease && 'PRERELEASE_NOTES' || 'RELEASE_NOTES' }} \
+            --target ${{ env.head_sha }} \
+            --title "${title}${{ env.version }}" \
+            ${{ inputs.prerelease && '--prerelease' || '' }} \
+            ${{ env.target_tag }} \
+            artifact/*
|
@ -79,7 +79,7 @@ ### Are you using the latest version?
|
||||||
|
|
||||||
### Is the issue already documented?
|
### Is the issue already documented?
|
||||||
|
|
||||||
Make sure that someone has not already opened the issue you're trying to open. Search at the top of the window or browse the [GitHub Issues](https://github.com/yt-dlp/yt-dlp/search?type=Issues) of this repository. If there is an issue, feel free to write something along the lines of "This affects me as well, with version 2021.01.01. Here is some more information on the issue: ...". While some issues may be old, a new post into them often spurs rapid activity.
|
Make sure that someone has not already opened the issue you're trying to open. Search at the top of the window or browse the [GitHub Issues](https://github.com/yt-dlp/yt-dlp/search?type=Issues) of this repository. If there is an issue, subcribe to it to be notified when there is any progress. Unless you have something useful to add to the converation, please refrain from commenting.
|
||||||
|
|
||||||
Additionally, it is also helpful to see if the issue has already been documented in the [youtube-dl issue tracker](https://github.com/ytdl-org/youtube-dl/issues). If similar issues have already been reported in youtube-dl (but not in our issue tracker), links to them can be included in your issue report here.
|
Additionally, it is also helpful to see if the issue has already been documented in the [youtube-dl issue tracker](https://github.com/ytdl-org/youtube-dl/issues). If similar issues have already been reported in youtube-dl (but not in our issue tracker), links to them can be included in your issue report here.
|
||||||
|
|
||||||
|
@ -127,7 +127,7 @@ ### Are you willing to share account details if needed?
|
||||||
|
|
||||||
### Is the website primarily used for piracy?
|
### Is the website primarily used for piracy?
|
||||||
|
|
||||||
We follow [youtube-dl's policy](https://github.com/ytdl-org/youtube-dl#can-you-add-support-for-this-anime-video-site-or-site-which-shows-current-movies-for-free) to not support services that is primarily used for infringing copyright. Additionally, it has been decided to not to support porn sites that specialize in deep fake. We also cannot support any service that serves only [DRM protected content](https://en.wikipedia.org/wiki/Digital_rights_management).
|
We follow [youtube-dl's policy](https://github.com/ytdl-org/youtube-dl#can-you-add-support-for-this-anime-video-site-or-site-which-shows-current-movies-for-free) to not support services that is primarily used for infringing copyright. Additionally, it has been decided to not to support porn sites that specialize in fakes. We also cannot support any service that serves only [DRM protected content](https://en.wikipedia.org/wiki/Digital_rights_management).
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
@@ -140,12 +140,9 @@ # DEVELOPER INSTRUCTIONS
 
     python -m yt_dlp
 
-To run the test, simply invoke your favorite test runner, or execute a test file directly; any of the following work:
+To run all the available core tests, use:
 
-    python -m unittest discover
-    python test/test_download.py
-    nosetests
-    pytest
+    python devscripts/run_tests.py
 
 See item 6 of [new extractor tutorial](#adding-support-for-a-new-site) for how to run extractor specific test cases.
 
@@ -187,15 +184,21 @@ ## Adding support for a new site
     _TESTS = [{
         'url': 'https://yourextractor.com/watch/42',
         'md5': 'TODO: md5 sum of the first 10241 bytes of the video file (use --test)',
         'info_dict': {
+            # For videos, only the 'id' and 'ext' fields are required to RUN the test:
             'id': '42',
             'ext': 'mp4',
-            'title': 'Video title goes here',
-            'thumbnail': r're:^https?://.*\.jpg$',
-            # TODO more properties, either as:
-            # * A value
-            # * MD5 checksum; start the string with md5:
-            # * A regular expression; start the string with re:
-            # * Any Python type, e.g. int or float
+            # Then if the test run fails, it will output the missing/incorrect fields.
+            # Properties can be added as:
+            # * A value, e.g.
+            #     'title': 'Video title goes here',
+            # * MD5 checksum; start the string with 'md5:', e.g.
+            #     'description': 'md5:098f6bcd4621d373cade4e832627b4f6',
+            # * A regular expression; start the string with 're:', e.g.
+            #     'thumbnail': r're:^https?://.*\.jpg$',
+            # * A count of elements in a list; start the string with 'count:', e.g.
+            #     'tags': 'count:10',
+            # * Any Python type, e.g.
+            #     'view_count': int,
         }
     }]
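Combining the assertion styles listed in the new comments, a fully filled-in test entry could look like the following sketch (all field values are hypothetical, matching the `yourextractor` skeleton):

```python
_TESTS = [{
    'url': 'https://yourextractor.com/watch/42',
    'md5': '098f6bcd4621d373cade4e832627b4f6',  # MD5 of the first 10241 bytes (obtained with --test)
    'info_dict': {
        'id': '42',
        'ext': 'mp4',
        'title': 'Video title goes here',                        # plain value, compared directly
        'description': 'md5:098f6bcd4621d373cade4e832627b4f6',   # 'md5:' asserts the MD5 of the full text
        'thumbnail': r're:^https?://.*\.jpg$',                   # 're:' asserts a regular-expression match
        'tags': 'count:10',                                      # 'count:' asserts the number of list elements
        'view_count': int,                                       # a bare type asserts only the value's type
    },
}]
```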
@@ -215,14 +218,14 @@ ## Adding support for a new site
     }
 ```
 1. Add an import in [`yt_dlp/extractor/_extractors.py`](yt_dlp/extractor/_extractors.py). Note that the class name must end with `IE`.
-1. Run `python test/test_download.py TestDownload.test_YourExtractor` (note that `YourExtractor` doesn't end with `IE`). This *should fail* at first, but you can continually re-run it until you're done. If you decide to add more than one test, the tests will then be named `TestDownload.test_YourExtractor`, `TestDownload.test_YourExtractor_1`, `TestDownload.test_YourExtractor_2`, etc. Note that tests with `only_matching` key in test's dict are not counted in. You can also run all the tests in one go with `TestDownload.test_YourExtractor_all`
-1. Make sure you have atleast one test for your extractor. Even if all videos covered by the extractor are expected to be inaccessible for automated testing, tests should still be added with a `skip` parameter indicating why the particular test is disabled from running.
-1. Have a look at [`yt_dlp/extractor/common.py`](yt_dlp/extractor/common.py) for possible helper methods and a [detailed description of what your extractor should and may return](yt_dlp/extractor/common.py#L91-L426). Add tests and code for as many as you want.
+1. Run `python devscripts/run_tests.py YourExtractor`. This *may fail* at first, but you can continually re-run it until you're done. Upon failure, it will output the missing fields and/or correct values which you can copy. If you decide to add more than one test, the tests will then be named `YourExtractor`, `YourExtractor_1`, `YourExtractor_2`, etc. Note that tests with an `only_matching` key in the test's dict are not included in the count. You can also run all the tests in one go with `YourExtractor_all`
+1. Make sure you have at least one test for your extractor. Even if all videos covered by the extractor are expected to be inaccessible for automated testing, tests should still be added with a `skip` parameter indicating why the particular test is disabled from running.
+1. Have a look at [`yt_dlp/extractor/common.py`](yt_dlp/extractor/common.py) for possible helper methods and a [detailed description of what your extractor should and may return](yt_dlp/extractor/common.py#L119-L440). Add tests and code for as many as you want.
 1. Make sure your code follows [yt-dlp coding conventions](#yt-dlp-coding-conventions) and check the code with [flake8](https://flake8.pycqa.org/en/latest/index.html#quickstart):
 
        $ flake8 yt_dlp/extractor/yourextractor.py
 
-1. Make sure your code works under all [Python](https://www.python.org/) versions supported by yt-dlp, namely CPython and PyPy for Python 3.7 and above. Backward compatibility is not required for even older versions of Python.
+1. Make sure your code works under all [Python](https://www.python.org/) versions supported by yt-dlp, namely CPython and PyPy for Python 3.8 and above. Backward compatibility is not required for even older versions of Python.
 1. When the tests pass, [add](https://git-scm.com/docs/git-add) the new files, [commit](https://git-scm.com/docs/git-commit) them and [push](https://git-scm.com/docs/git-push) the result, like this:
 
        $ git add yt_dlp/extractor/_extractors.py
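A side note on the `only_matching` tests mentioned in the step above: such an entry only asserts that `_VALID_URL` matches the URL and is skipped by the download test runner, which is why it is not counted when tests are numbered. A minimal sketch (hypothetical URL):

```python
_TESTS = [{
    # Checks URL matching only; nothing is downloaded
    'url': 'https://yourextractor.com/embed/42',
    'only_matching': True,
}]
```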
@@ -246,12 +249,12 @@ ## yt-dlp coding conventions
 
 This section introduces guidelines for writing idiomatic, robust and future-proof extractor code.
 
-Extractors are very fragile by nature since they depend on the layout of the source data provided by 3rd party media hosters out of your control and this layout tends to change. As an extractor implementer your task is not only to write code that will extract media links and metadata correctly but also to minimize dependency on the source's layout and even to make the code foresee potential future changes and be ready for that. This is important because it will allow the extractor not to break on minor layout changes thus keeping old yt-dlp versions working. Even though this breakage issue may be easily fixed by a new version of yt-dlp, this could take some time, during which the the extractor will remain broken.
+Extractors are very fragile by nature since they depend on the layout of the source data provided by 3rd party media hosters out of your control and this layout tends to change. As an extractor implementer your task is not only to write code that will extract media links and metadata correctly but also to minimize dependency on the source's layout and even to make the code foresee potential future changes and be ready for that. This is important because it will allow the extractor not to break on minor layout changes thus keeping old yt-dlp versions working. Even though this breakage issue may be easily fixed by a new version of yt-dlp, this could take some time, during which the extractor will remain broken.
 
 
 ### Mandatory and optional metafields
 
-For extraction to work yt-dlp relies on metadata your extractor extracts and provides to yt-dlp expressed by an [information dictionary](yt_dlp/extractor/common.py#L91-L426) or simply *info dict*. Only the following meta fields in the *info dict* are considered mandatory for a successful extraction process by yt-dlp:
+For extraction to work yt-dlp relies on metadata your extractor extracts and provides to yt-dlp expressed by an [information dictionary](yt_dlp/extractor/common.py#L119-L440) or simply *info dict*. Only the following meta fields in the *info dict* are considered mandatory for a successful extraction process by yt-dlp:
 
 - `id` (media identifier)
 - `title` (media title)
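As a minimal, non-authoritative sketch of such an info dict (hypothetical site; the helper choices here are illustrative, see `common.py` for the full range):

```python
def _real_extract(self, url):
    video_id = self._match_id(url)
    webpage = self._download_webpage(url, video_id)

    return {
        'id': video_id,                                        # mandatory
        'title': self._og_search_title(webpage),               # mandatory
        'url': self._og_search_video_url(webpage),             # direct URL of the media
        'description': self._og_search_description(webpage),  # optional metafield
    }
```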
@@ -696,7 +699,7 @@ #### Examples
 
 ### Use convenience conversion and parsing functions
 
-Wrap all extracted numeric data into safe functions from [`yt_dlp/utils.py`](yt_dlp/utils.py): `int_or_none`, `float_or_none`. Use them for string to number conversions as well.
+Wrap all extracted numeric data into safe functions from [`yt_dlp/utils/`](yt_dlp/utils/): `int_or_none`, `float_or_none`. Use them for string to number conversions as well.
 
 Use `url_or_none` for safe URL processing.
 
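A short sketch of how these helpers behave, they return `None` instead of raising on bad input (illustrative values):

```python
from yt_dlp.utils import float_or_none, int_or_none, url_or_none

int_or_none('1024')                       # 1024
int_or_none('N/A')                        # None, rather than a ValueError
int_or_none('1024000', scale=1000)        # 1024 -- scale divides the parsed value
float_or_none('4.5')                      # 4.5
url_or_none('https://example.com/a.mp4')  # returned unchanged
url_or_none('not a url')                  # None
```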
@@ -704,7 +707,7 @@ ### Use convenience conversion and parsing functions
 
 Use `unified_strdate` for uniform `upload_date` or any `YYYYMMDD` meta field extraction, `unified_timestamp` for uniform `timestamp` extraction, `parse_filesize` for `filesize` extraction, `parse_count` for count meta fields extraction, `parse_resolution`, `parse_duration` for `duration` extraction, `parse_age_limit` for `age_limit` extraction.
 
-Explore [`yt_dlp/utils.py`](yt_dlp/utils.py) for more useful convenience functions.
+Explore [`yt_dlp/utils/`](yt_dlp/utils/) for more useful convenience functions.
 
 #### Examples
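To make the above concrete, an illustrative (unofficial) sketch of these parsing helpers with example inputs:

```python
from yt_dlp.utils import (
    parse_age_limit,
    parse_count,
    parse_duration,
    parse_filesize,
    unified_strdate,
    unified_timestamp,
)

unified_strdate('December 21, 2017')       # '20171221'
unified_timestamp('2017-12-21T17:01:00Z')  # 1513875660
parse_duration('1:30:45')                  # 5445.0 (seconds)
parse_filesize('1.5 MiB')                  # 1572864 (bytes)
parse_count('1.2M')                        # 1200000
parse_age_limit('PG-13')                   # 13
```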
165
CONTRIBUTORS

@@ -2,8 +2,8 @@ pukkandan (owner)
 shirt-dev (collaborator)
 coletdjnz/colethedj (collaborator)
 Ashish0804 (collaborator)
-nao20010128nao/Lesmiscore (collaborator)
 bashonly (collaborator)
+Grub4K (collaborator)
 h-h-h-h
 pauldubois98
 nixxo

@@ -319,7 +319,6 @@ columndeeply
 DoubleCouponDay
 Fabi019
 GautamMKGarg
-Grub4K
 itachi-19
 jeroenj
 josanabr
@@ -381,3 +380,165 @@ gschizas
 JC-Chung
 mzhou
 OndrejBakan
+ab4cbef
+aionescu
+amra
+ByteDream
+carusocr
+chexxor
+felixonmars
+FrankZ85
+FriedrichRehren
+gregsadetsky
+LeoniePhiline
+LowSuggestion912
+Matumo
+OIRNOIR
+OMEGARAZER
+oxamun
+pmitchell86
+qbnu
+qulaz
+rebane2001
+road-master
+rohieb
+sdht0
+seproDev
+Hill-98
+LXYan2333
+mushbite
+venkata-krishnas
+7vlad7
+alexklapheke
+arobase-che
+bepvte
+bergoid
+blmarket
+brandon-dacrib
+c-basalt
+CoryTibbettsDev
+Cyberes
+D0LLYNH0
+danog
+DataGhost
+falbrechtskirchinger
+foreignBlade
+garret1317
+hasezoey
+hoaluvn
+ItzMaxTV
+ivanskodje
+jo-nike
+kangalio
+linsui
+makew0rld
+menschel
+mikf
+mrscrapy
+NDagestad
+Neurognostic
+NextFire
+nick-cd
+permunkle
+pzhlkj6612
+ringus1
+rjy
+Schmoaaaaah
+sjthespian
+theperfectpunk
+toomyzoom
+truedread
+TxI5
+unbeatable-101
+vampirefrog
+vidiot720
+viktor-enzell
+zhgwn
+barthelmannk
+berkanteber
+OverlordQ
+rexlambert22
+Ti4eeT4e
+AmanSal1
+bbilly1
+meliber
+nnoboa
+rdamas
+RfadnjdExt
+urectanc
+nao20010128nao/Lesmiscore
+04-pasha-04
+aaruni96
+aky-01
+AmirAflak
+ApoorvShah111
+at-wat
+davinkevin
+demon071
+denhotte
+FinnRG
+fireattack
+Frankgoji
+GD-Slime
+hatsomatt
+ifan-t
+kshitiz305
+kylegustavo
+mabdelfattah
+nathantouze
+niemands
+Rajeshwaran2001
+RedDeffender
+Rohxn16
+sb0stn
+SevenLives
+simon300000
+snixon
+soundchaser128
+szabyg
+trainman261
+trislee
+wader
+Yalab7
+zhallgato
+zhong-yiyu
+Zprokkel
+AS6939
+drzraf
+handlerug
+jiru
+madewokherd
+xofe
+awalgarg
+midnightveil
+naginatana
+Riteo
+1100101
+aniolpages
+bartbroere
+CrendKing
+Esokrates
+HitomaruKonpaku
+LoserFox
+peci1
+saintliao
+shubhexists
+SirElderling
+almx
+elivinsky
+starius
+TravisDupes
+amir16yp
+Fymyte
+Ganesh910
+hashFactory
+kclauhk
+Kyraminol
+lstrojny
+middlingphys
+NickCis
+nicodato
+prettykool
+S-Aarab
+sonmezberkay
+TSRBerry
987
Changelog.md
File diff suppressed because it is too large
Collaborators.md

@@ -8,6 +8,7 @@ # Collaborators
 ## [pukkandan](https://github.com/pukkandan)
 
 [![ko-fi](https://img.shields.io/badge/_-Ko--fi-red.svg?logo=kofi&labelColor=555555&style=for-the-badge)](https://ko-fi.com/pukkandan)
+[![gh-sponsor](https://img.shields.io/badge/_-Github-white.svg?logo=github&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/pukkandan)
 
 * Owner of the fork
 

@@ -25,8 +26,10 @@ ## [shirt](https://github.com/shirt-dev)
 
 ## [coletdjnz](https://github.com/coletdjnz)
 
-[![gh-sponsor](https://img.shields.io/badge/_-Sponsor-red.svg?logo=githubsponsors&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/coletdjnz)
+[![gh-sponsor](https://img.shields.io/badge/_-Github-white.svg?logo=github&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/coletdjnz)
 
+* Improved plugin architecture
+* Rewrote the networking infrastructure, implemented support for `requests`
 * YouTube improvements including: age-gate bypass, private playlists, multiple-clients (to avoid throttling) and a lot of under-the-hood improvements
 * Added support for new websites YoutubeWebArchive, MainStreaming, PRX, nzherald, Mediaklikk, StarTV etc
 * Improved/fixed support for Patreon, panopto, gfycat, itv, pbs, SouthParkDE etc

@@ -42,18 +45,19 @@ ## [Ashish0804](https://github.com/Ashish0804) <sub><sup>[Inactive]</sup></sub>
 * Improved/fixed support for HiDive, HotStar, Hungama, LBRY, LinkedInLearning, Mxplayer, SonyLiv, TV2, Vimeo, VLive etc
 
 
-## [Lesmiscore](https://github.com/Lesmiscore) <sub><sup>(nao20010128nao)</sup></sub>
-
-**Bitcoin**: bc1qfd02r007cutfdjwjmyy9w23rjvtls6ncve7r3s
-**Monacoin**: mona1q3tf7dzvshrhfe3md379xtvt2n22duhglv5dskr
-
-* Download live from start to end for YouTube
-* Added support for new websites AbemaTV, mildom, PixivSketch, skeb, radiko, voicy, mirrativ, openrec, whowatch, damtomo, 17.live, mixch etc
-* Improved/fixed support for fc2, YahooJapanNews, tver, iwara etc
-
-
 ## [bashonly](https://github.com/bashonly)
 
-* `--cookies-from-browser` support for Firefox containers
-* Added support for new websites Genius, Kick, NBCStations, Triller, VideoKen etc
-* Improved/fixed support for Anvato, Brightcove, Instagram, ParamountPlus, Reddit, SlidesLive, TikTok, Twitter, Vimeo etc
+* `--update-to`, self-updater rewrite, automated/nightly/master releases
+* `--cookies-from-browser` support for Firefox containers, external downloader cookie handling overhaul
+* Added support for new websites like Dacast, Kick, NBCStations, Triller, VideoKen, Weverse, WrestleUniverse etc
+* Improved/fixed support for Anvato, Brightcove, Reddit, SlidesLive, TikTok, Twitter, Vimeo etc
+
+
+## [Grub4K](https://github.com/Grub4K)
+
+[![gh-sponsor](https://img.shields.io/badge/_-Github-white.svg?logo=github&labelColor=555555&style=for-the-badge)](https://github.com/sponsors/Grub4K) [![ko-fi](https://img.shields.io/badge/_-Ko--fi-red.svg?logo=kofi&labelColor=555555&style=for-the-badge)](https://ko-fi.com/Grub4K)
+
+* `--update-to`, self-updater rewrite, automated/nightly/master releases
+* Reworked internals like `traverse_obj`, various core refactors and bug fixes
+* Implemented proper progress reporting for parallel downloads
+* Improved/fixed/added Bundestag, crunchyroll, pr0gramm, Twitter, WrestleUniverse etc
10
MANIFEST.in

@@ -1,10 +0,0 @@
-include AUTHORS
-include Changelog.md
-include LICENSE
-include README.md
-include completions/*/*
-include supportedsites.md
-include yt-dlp.1
-include requirements.txt
-recursive-include devscripts *
-recursive-include test *
51
Makefile

@@ -6,11 +6,11 @@ doc: README.md CONTRIBUTING.md issuetemplates supportedsites
 ot: offlinetest
 tar: yt-dlp.tar.gz
 
-# Keep this list in sync with MANIFEST.in
+# Keep this list in sync with pyproject.toml includes/artifacts
 # intended use: when building a source distribution,
-# make pypi-files && python setup.py sdist
+# make pypi-files && python3 -m build -sn .
 pypi-files: AUTHORS Changelog.md LICENSE README.md README.txt supportedsites \
-	completions yt-dlp.1 requirements.txt setup.cfg devscripts/* test/*
+	completions yt-dlp.1 pyproject.toml setup.cfg devscripts/* test/*
 
 .PHONY: all clean install test tar pypi-files completions ot offlinetest codetest supportedsites
 

@@ -21,7 +21,7 @@ clean-test:
 	*.mp4 *.mpga *.oga *.ogg *.opus *.png *.sbv *.srt *.swf *.swp *.tt *.ttml *.url *.vtt *.wav *.webloc *.webm *.webp
 clean-dist:
 	rm -rf yt-dlp.1.temp.md yt-dlp.1 README.txt MANIFEST build/ dist/ .coverage cover/ yt-dlp.tar.gz completions/ \
-	yt_dlp/extractor/lazy_extractors.py *.spec CONTRIBUTING.md.tmp yt-dlp yt-dlp.exe yt_dlp.egg-info/ AUTHORS .mailmap
+	yt_dlp/extractor/lazy_extractors.py *.spec CONTRIBUTING.md.tmp yt-dlp yt-dlp.exe yt_dlp.egg-info/ AUTHORS
 clean-cache:
 	find . \( \
 		-type d -name .pytest_cache -o -type d -name __pycache__ -o -name "*.pyc" -o -name "*.class" \

@@ -38,11 +38,13 @@ MANDIR ?= $(PREFIX)/man
 SHAREDIR ?= $(PREFIX)/share
 PYTHON ?= /usr/bin/env python3
 
-# set SYSCONFDIR to /etc if PREFIX=/usr or PREFIX=/usr/local
-SYSCONFDIR = $(shell if [ $(PREFIX) = /usr -o $(PREFIX) = /usr/local ]; then echo /etc; else echo $(PREFIX)/etc; fi)
+# $(shell) and $(error) are no-ops in BSD Make and the != variable assignment operator is not supported by GNU Make <4.0
+VERSION_CHECK != echo supported
+VERSION_CHECK ?= $(error GNU Make 4+ or BSD Make is required)
+CHECK_VERSION := $(VERSION_CHECK)
 
-# set markdown input format to "markdown-smart" for pandoc version 2 and to "markdown" for pandoc prior to version 2
-MARKDOWN = $(shell if [ `pandoc -v | head -n1 | cut -d" " -f2 | head -c1` = "2" ]; then echo markdown-smart; else echo markdown; fi)
+# set markdown input format to "markdown-smart" for pandoc version 2+ and to "markdown" for pandoc prior to version 2
+MARKDOWN != if [ "`pandoc -v | head -n1 | cut -d' ' -f2 | head -c1`" -ge "2" ]; then echo markdown-smart; else echo markdown; fi
 
 install: lazy-extractors yt-dlp yt-dlp.1 completions
 	mkdir -p $(DESTDIR)$(BINDIR)

@@ -73,24 +75,24 @@ test:
 offlinetest: codetest
 	$(PYTHON) -m pytest -k "not download"
 
-# XXX: This is hard to maintain
-CODE_FOLDERS = yt_dlp yt_dlp/downloader yt_dlp/extractor yt_dlp/postprocessor yt_dlp/compat
-yt-dlp: yt_dlp/*.py yt_dlp/*/*.py
+CODE_FOLDERS != find yt_dlp -type f -name '__init__.py' -exec dirname {} \+ | grep -v '/__' | sort
+CODE_FILES != for f in $(CODE_FOLDERS) ; do echo "$$f" | sed 's,$$,/*.py,' ; done
+yt-dlp: $(CODE_FILES)
 	mkdir -p zip
 	for d in $(CODE_FOLDERS) ; do \
 		mkdir -p zip/$$d ;\
 		cp -pPR $$d/*.py zip/$$d/ ;\
 	done
-	touch -t 200001010101 zip/yt_dlp/*.py zip/yt_dlp/*/*.py
+	(cd zip && touch -t 200001010101 $(CODE_FILES))
 	mv zip/yt_dlp/__main__.py zip/
-	cd zip ; zip -q ../yt-dlp yt_dlp/*.py yt_dlp/*/*.py __main__.py
+	(cd zip && zip -q ../yt-dlp $(CODE_FILES) __main__.py)
 	rm -rf zip
 	echo '#!$(PYTHON)' > yt-dlp
 	cat yt-dlp.zip >> yt-dlp
 	rm yt-dlp.zip
 	chmod a+x yt-dlp
 
-README.md: yt_dlp/*.py yt_dlp/*/*.py devscripts/make_readme.py
+README.md: $(CODE_FILES) devscripts/make_readme.py
 	COLUMNS=80 $(PYTHON) yt_dlp/__main__.py --ignore-config --help | $(PYTHON) devscripts/make_readme.py
 
 CONTRIBUTING.md: README.md devscripts/make_contributing.py

@@ -115,19 +117,19 @@ yt-dlp.1: README.md devscripts/prepare_manpage.py
 	pandoc -s -f $(MARKDOWN) -t man yt-dlp.1.temp.md -o yt-dlp.1
 	rm -f yt-dlp.1.temp.md
 
-completions/bash/yt-dlp: yt_dlp/*.py yt_dlp/*/*.py devscripts/bash-completion.in
+completions/bash/yt-dlp: $(CODE_FILES) devscripts/bash-completion.in
 	mkdir -p completions/bash
 	$(PYTHON) devscripts/bash-completion.py
 
-completions/zsh/_yt-dlp: yt_dlp/*.py yt_dlp/*/*.py devscripts/zsh-completion.in
+completions/zsh/_yt-dlp: $(CODE_FILES) devscripts/zsh-completion.in
 	mkdir -p completions/zsh
 	$(PYTHON) devscripts/zsh-completion.py
 
-completions/fish/yt-dlp.fish: yt_dlp/*.py yt_dlp/*/*.py devscripts/fish-completion.in
+completions/fish/yt-dlp.fish: $(CODE_FILES) devscripts/fish-completion.in
 	mkdir -p completions/fish
 	$(PYTHON) devscripts/fish-completion.py
 
-_EXTRACTOR_FILES = $(shell find yt_dlp/extractor -name '*.py' -and -not -name 'lazy_extractors.py')
+_EXTRACTOR_FILES != find yt_dlp/extractor -name '*.py' -and -not -name 'lazy_extractors.py'
 yt_dlp/extractor/lazy_extractors.py: devscripts/make_lazy_extractors.py devscripts/lazy_load_template.py $(_EXTRACTOR_FILES)
 	$(PYTHON) devscripts/make_lazy_extractors.py $@
 

@@ -141,15 +143,12 @@ yt-dlp.tar.gz: all
 		--exclude '__pycache__' \
 		--exclude '.pytest_cache' \
 		--exclude '.git' \
+		--exclude '__pyinstaller' \
 		-- \
 		README.md supportedsites.md Changelog.md LICENSE \
 		CONTRIBUTING.md Collaborators.md CONTRIBUTORS AUTHORS \
-		Makefile MANIFEST.in yt-dlp.1 README.txt completions \
-		setup.py setup.cfg yt-dlp yt_dlp requirements.txt \
-		devscripts test
+		Makefile yt-dlp.1 README.txt completions .gitignore \
+		setup.cfg yt-dlp yt_dlp pyproject.toml devscripts test
 
-AUTHORS: .mailmap
-	git shortlog -s -n | cut -f2 | sort > AUTHORS
-
-.mailmap:
-	git shortlog -s -e -n | awk '!(out[$$NF]++) { $$1="";sub(/^[ \t]+/,""); print}' > .mailmap
+AUTHORS:
+	git shortlog -s -n HEAD | cut -f2 | sort > AUTHORS
293
README.md

@@ -12,7 +12,7 @@
 [![License: Unlicense](https://img.shields.io/badge/-Unlicense-blue.svg?style=for-the-badge)](LICENSE "License")
 [![CI Status](https://img.shields.io/github/actions/workflow/status/yt-dlp/yt-dlp/core.yml?branch=master&label=Tests&style=for-the-badge)](https://github.com/yt-dlp/yt-dlp/actions "CI Status")
 [![Commits](https://img.shields.io/github/commit-activity/m/yt-dlp/yt-dlp?label=commits&style=for-the-badge)](https://github.com/yt-dlp/yt-dlp/commits "Commit History")
-[![Last Commit](https://img.shields.io/github/last-commit/yt-dlp/yt-dlp/master?label=&style=for-the-badge&display_timestamp=committer)](https://github.com/yt-dlp/yt-dlp/commits "Commit History")
+[![Last Commit](https://img.shields.io/github/last-commit/yt-dlp/yt-dlp/master?label=&style=for-the-badge&display_timestamp=committer)](https://github.com/yt-dlp/yt-dlp/pulse/monthly "Last activity")
 
 </div>
 <!-- MANPAGE: END EXCLUDED SECTION -->

@@ -49,7 +49,7 @@
     * [Extractor Options](#extractor-options)
 * [CONFIGURATION](#configuration)
     * [Configuration file encoding](#configuration-file-encoding)
-    * [Authentication with .netrc file](#authentication-with-netrc-file)
+    * [Authentication with netrc](#authentication-with-netrc)
     * [Notes about environment variables](#notes-about-environment-variables)
 * [OUTPUT TEMPLATE](#output-template)
     * [Output template examples](#output-template-examples)
@@ -76,7 +76,7 @@
 
 # NEW FEATURES
 
-* Merged with **youtube-dl v2021.12.17+ [commit/195f22f](https://github.com/ytdl-org/youtube-dl/commit/195f22f)** <!--([exceptions](https://github.com/yt-dlp/yt-dlp/issues/21))--> and **youtube-dlc v2020.11.11-3+ [commit/f9401f2](https://github.com/blackjack4494/yt-dlc/commit/f9401f2a91987068139c5f757b12fc711d4c0cee)**: You get all the features and patches of [youtube-dlc](https://github.com/blackjack4494/yt-dlc) in addition to the latest [youtube-dl](https://github.com/ytdl-org/youtube-dl)
+* Forked from [**yt-dlc@f9401f2**](https://github.com/blackjack4494/yt-dlc/commit/f9401f2a91987068139c5f757b12fc711d4c0cee) and merged with [**youtube-dl@be008e6**](https://github.com/ytdl-org/youtube-dl/commit/be008e657d79832642e2158557c899249c9e31cd) ([exceptions](https://github.com/yt-dlp/yt-dlp/issues/21))
 
 * **[SponsorBlock Integration](#sponsorblock-options)**: You can mark/remove sponsor sections in YouTube videos by utilizing the [SponsorBlock](https://sponsor.ajay.app) API
 

@@ -85,11 +85,10 @@ # NEW FEATURES
 * **Merged with animelover1984/youtube-dl**: You get most of the features and improvements from [animelover1984/youtube-dl](https://github.com/animelover1984/youtube-dl) including `--write-comments`, `BiliBiliSearch`, `BilibiliChannel`, Embedding thumbnail in mp4/ogg/opus, playlist infojson etc. Note that NicoNico livestreams are not available. See [#31](https://github.com/yt-dlp/yt-dlp/pull/31) for details.
 
 * **YouTube improvements**:
-    * Supports Clips, Stories (`ytstories:<channel UCID>`), Search (including filters)**\***, YouTube Music Search, Channel-specific search, Search prefixes (`ytsearch:`, `ytsearchdate:`)**\***, Mixes, YouTube Music Albums/Channels ([except self-uploaded music](https://github.com/yt-dlp/yt-dlp/issues/723)), and Feeds (`:ytfav`, `:ytwatchlater`, `:ytsubs`, `:ythistory`, `:ytrec`, `:ytnotif`)
+    * Supports Clips, Stories (`ytstories:<channel UCID>`), Search (including filters)**\***, YouTube Music Search, Channel-specific search, Search prefixes (`ytsearch:`, `ytsearchdate:`)**\***, Mixes, and Feeds (`:ytfav`, `:ytwatchlater`, `:ytsubs`, `:ythistory`, `:ytrec`, `:ytnotif`)
     * Fix for [n-sig based throttling](https://github.com/ytdl-org/youtube-dl/issues/29326) **\***
     * Supports some (but not all) age-gated content without cookies
     * Download livestreams from the start using `--live-from-start` (*experimental*)
-    * `255kbps` audio is extracted (if available) from YouTube Music when premium cookies are given
     * Channel URLs download all uploads of the channel, including shorts and live
 
 * **Cookies from browser**: Cookies can be automatically extracted from all major web browsers using `--cookies-from-browser BROWSER[+KEYRING][:PROFILE][::CONTAINER]`

@@ -114,13 +113,15 @@ # NEW FEATURES
 
 * **Output template improvements**: Output templates can now have date-time formatting, numeric offsets, object traversal etc. See [output template](#output-template) for details. Even more advanced operations can also be done with the help of `--parse-metadata` and `--replace-in-metadata`
 
-* **Other new options**: Many new options have been added such as `--alias`, `--print`, `--concat-playlist`, `--wait-for-video`, `--retry-sleep`, `--sleep-requests`, `--convert-thumbnails`, `--force-download-archive`, `--force-overwrites`, `--break-on-reject` etc
+* **Other new options**: Many new options have been added such as `--alias`, `--print`, `--concat-playlist`, `--wait-for-video`, `--retry-sleep`, `--sleep-requests`, `--convert-thumbnails`, `--force-download-archive`, `--force-overwrites`, `--break-match-filter` etc
 
 * **Improvements**: Regex and other operators in `--format`/`--match-filter`, multiple `--postprocessor-args` and `--downloader-args`, faster archive checking, more [format selection options](#format-selection), merge multi-video/audio, multiple `--config-locations`, `--exec` at different stages, etc
 
 * **Plugins**: Extractors and PostProcessors can be loaded from an external file. See [plugins](#plugins) for details
 
-* **Self-updater**: The releases can be updated using `yt-dlp -U`
+* **Self updater**: The releases can be updated using `yt-dlp -U`, and downgraded using `--update-to` if required
+
+* **Automated builds**: [Nightly/master builds](#update-channels) can be used with `--update-to nightly` and `--update-to master`
 
 See [changelog](Changelog.md) or [commits](https://github.com/yt-dlp/yt-dlp/commits) for the full list of changes
@@ -130,6 +131,7 @@ ### Differences in default behavior
 
 Some of yt-dlp's default options are different from that of youtube-dl and youtube-dlc:
 
+* yt-dlp supports only [Python 3.8+](## "Windows 7"), and *may* remove support for more versions as they [become EOL](https://devguide.python.org/versions/#python-release-cycle); while [youtube-dl still supports Python 2.6+ and 3.2+](https://github.com/ytdl-org/youtube-dl/issues/30568#issue-1118238743)
 * The options `--auto-number` (`-A`), `--title` (`-t`) and `--literal` (`-l`), no longer work. See [removed options](#Removed) for details
 * `avconv` is not supported as an alternative to `ffmpeg`
 * yt-dlp stores config files in slightly different locations to youtube-dl. See [CONFIGURATION](#configuration) for a list of correct locations

@@ -149,19 +151,24 @@ ### Differences in default behavior
 * The upload dates extracted from YouTube are in UTC [when available](https://github.com/yt-dlp/yt-dlp/blob/89e4d86171c7b7c997c77d4714542e0383bf0db0/yt_dlp/extractor/youtube.py#L3898-L3900). Use `--compat-options no-youtube-prefer-utc-upload-date` to prefer the non-UTC upload date.
 * If `ffmpeg` is used as the downloader, the downloading and merging of formats happen in a single step when possible. Use `--compat-options no-direct-merge` to revert this
 * Thumbnail embedding in `mp4` is done with mutagen if possible. Use `--compat-options embed-thumbnail-atomicparsley` to force the use of AtomicParsley instead
-* Some private fields such as filenames are removed by default from the infojson. Use `--no-clean-infojson` or `--compat-options no-clean-infojson` to revert this
+* Some internal metadata such as filenames are removed by default from the infojson. Use `--no-clean-infojson` or `--compat-options no-clean-infojson` to revert this
 * When `--embed-subs` and `--write-subs` are used together, the subtitles are written to disk and also embedded in the media file. You can use just `--embed-subs` to embed the subs and automatically delete the separate file. See [#630 (comment)](https://github.com/yt-dlp/yt-dlp/issues/630#issuecomment-893659460) for more info. `--compat-options no-keep-subs` can be used to revert this
 * `certifi` will be used for SSL root certificates, if installed. If you want to use system certificates (e.g. self-signed), use `--compat-options no-certifi`
 * yt-dlp's sanitization of invalid characters in filenames is different/smarter than in youtube-dl. You can use `--compat-options filename-sanitization` to revert to youtube-dl's behavior
 * yt-dlp tries to parse the external downloader outputs into the standard progress output if possible (Currently implemented: [~~aria2c~~](https://github.com/yt-dlp/yt-dlp/issues/5931)). You can use `--compat-options no-external-downloader-progress` to get the downloader output as-is
+* yt-dlp versions between 2021.09.01 and 2023.01.02 apply `--match-filter` to nested playlists. This was an unintentional side-effect of [8f18ac](https://github.com/yt-dlp/yt-dlp/commit/8f18aca8717bb0dd49054555af8d386e5eda3a88) and is fixed in [d7b460](https://github.com/yt-dlp/yt-dlp/commit/d7b460d0e5fc710950582baed2e3fc616ed98a80). Use `--compat-options playlist-match-filter` to revert this
+* yt-dlp versions between 2021.11.10 and 2023.06.21 estimated `filesize_approx` values for fragmented/manifest formats. This was added for convenience in [f2fe69](https://github.com/yt-dlp/yt-dlp/commit/f2fe69c7b0d208bdb1f6292b4ae92bc1e1a7444a), but was reverted in [0dff8e](https://github.com/yt-dlp/yt-dlp/commit/0dff8e4d1e6e9fb938f4256ea9af7d81f42fd54f) due to the potentially extreme inaccuracy of the estimated values. Use `--compat-options manifest-filesize-approx` to keep extracting the estimated values
+* yt-dlp uses modern http client backends such as `requests`. Use `--compat-options prefer-legacy-http-handler` to prefer the legacy http handler (`urllib`) to be used for standard http requests.
+* The sub-module `swfinterp` is removed.
 
 For ease of use, a few more compat options are available:
 
 * `--compat-options all`: Use all compat options (Do NOT use)
-* `--compat-options youtube-dl`: Same as `--compat-options all,-multistreams`
-* `--compat-options youtube-dlc`: Same as `--compat-options all,-no-live-chat,-no-youtube-channel-redirect`
+* `--compat-options youtube-dl`: Same as `--compat-options all,-multistreams,-playlist-match-filter,-manifest-filesize-approx`
+* `--compat-options youtube-dlc`: Same as `--compat-options all,-no-live-chat,-no-youtube-channel-redirect,-playlist-match-filter,-manifest-filesize-approx`
 * `--compat-options 2021`: Same as `--compat-options 2022,no-certifi,filename-sanitization,no-youtube-prefer-utc-upload-date`
-* `--compat-options 2022`: Same as `--compat-options no-external-downloader-progress`. Use this to enable all future compat options
+* `--compat-options 2022`: Same as `--compat-options 2023,playlist-match-filter,no-external-downloader-progress`
+* `--compat-options 2023`: Same as `--compat-options prefer-legacy-http-handler,manifest-filesize-approx`. Use this to enable all future compat options
 
 
 # INSTALLATION
@@ -176,16 +183,43 @@ # INSTALLATION
 [![All versions](https://img.shields.io/badge/-All_Versions-lightgrey.svg?style=for-the-badge)](https://github.com/yt-dlp/yt-dlp/releases)
 <!-- MANPAGE: END EXCLUDED SECTION -->
 
-You can install yt-dlp using [the binaries](#release-files), [PIP](https://pypi.org/project/yt-dlp) or one using a third-party package manager. See [the wiki](https://github.com/yt-dlp/yt-dlp/wiki/Installation) for detailed instructions
+You can install yt-dlp using [the binaries](#release-files), [pip](https://pypi.org/project/yt-dlp) or a third-party package manager. See [the wiki](https://github.com/yt-dlp/yt-dlp/wiki/Installation) for detailed instructions
 
 
 ## UPDATE
-You can use `yt-dlp -U` to update if you are [using the release binaries](#release-files)
+You can use `yt-dlp -U` to update if you are using the [release binaries](#release-files)
 
-If you [installed with PIP](https://github.com/yt-dlp/yt-dlp/wiki/Installation#with-pip), simply re-run the same command that was used to install the program
+If you [installed with pip](https://github.com/yt-dlp/yt-dlp/wiki/Installation#with-pip), simply re-run the same command that was used to install the program
 
 For other third-party package managers, see [the wiki](https://github.com/yt-dlp/yt-dlp/wiki/Installation#third-party-package-managers) or refer to their documentation
 
+<a id="update-channels"/>
+
+There are currently three release channels for binaries: `stable`, `nightly` and `master`.
+
+* `stable` is the default channel, and many of its changes have been tested by users of the `nightly` and `master` channels.
+* The `nightly` channel has releases scheduled to build every day around midnight UTC, for a snapshot of the project's new patches and changes. This is the **recommended channel for regular users** of yt-dlp. The `nightly` releases are available from [yt-dlp/yt-dlp-nightly-builds](https://github.com/yt-dlp/yt-dlp-nightly-builds/releases) or as development releases of the `yt-dlp` PyPI package (which can be installed with pip's `--pre` flag).
+* The `master` channel features releases that are built after each push to the master branch, and these will have the very latest fixes and additions, but may also be more prone to regressions. They are available from [yt-dlp/yt-dlp-master-builds](https://github.com/yt-dlp/yt-dlp-master-builds/releases).
+
+When using `--update`/`-U`, a release binary will only update to its current channel.
+`--update-to CHANNEL` can be used to switch to a different channel when a newer version is available. `--update-to [CHANNEL@]TAG` can also be used to upgrade or downgrade to specific tags from a channel.
+
+You may also use `--update-to <repository>` (`<owner>/<repository>`) to update to a channel on a completely different repository. Be careful with what repository you are updating to though, there is no verification done for binaries from different repositories.
+
+Example usage:
+
+* `yt-dlp --update-to master` switches to the `master` channel and updates to its latest release
+* `yt-dlp --update-to stable@2023.07.06` upgrades/downgrades to the `stable` channel release tagged `2023.07.06`
+* `yt-dlp --update-to 2023.10.07` upgrades/downgrades to tag `2023.10.07` if it exists on the current channel
+* `yt-dlp --update-to example/yt-dlp@2023.09.24` upgrades/downgrades to the release from the `example/yt-dlp` repository, tag `2023.09.24`
+
+**Important**: Any user experiencing an issue with the `stable` release should install or update to the `nightly` release before submitting a bug report:
+```
+# To update to nightly from stable executable/binary:
+yt-dlp --update-to nightly
+
+# To install nightly with pip:
+python -m pip install -U --pre yt-dlp
+```
 
 <!-- MANPAGE: BEGIN EXCLUDED SECTION -->
 ## RELEASE FILES
@@ -218,14 +252,23 @@ #### Misc
 :---|:---
 [yt-dlp.tar.gz](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp.tar.gz)|Source tarball
 [SHA2-512SUMS](https://github.com/yt-dlp/yt-dlp/releases/latest/download/SHA2-512SUMS)|GNU-style SHA512 sums
+[SHA2-512SUMS.sig](https://github.com/yt-dlp/yt-dlp/releases/latest/download/SHA2-512SUMS.sig)|GPG signature file for SHA512 sums
 [SHA2-256SUMS](https://github.com/yt-dlp/yt-dlp/releases/latest/download/SHA2-256SUMS)|GNU-style SHA256 sums
+[SHA2-256SUMS.sig](https://github.com/yt-dlp/yt-dlp/releases/latest/download/SHA2-256SUMS.sig)|GPG signature file for SHA256 sums
+
+The public key that can be used to verify the GPG signatures is [available here](https://github.com/yt-dlp/yt-dlp/blob/master/public.key)
+Example usage:
+```
+curl -L https://github.com/yt-dlp/yt-dlp/raw/master/public.key | gpg --import
+gpg --verify SHA2-256SUMS.sig SHA2-256SUMS
+gpg --verify SHA2-512SUMS.sig SHA2-512SUMS
+```
 <!-- MANPAGE: END EXCLUDED SECTION -->
 
-**Note**: The manpages, shell completion (autocomplete) files etc. are available inside the [source tarball](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp.tar.gz)
+**Note**: The manpages, shell completion files etc. are available in the [source tarball](https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp.tar.gz)
 
 ## DEPENDENCIES
-Python versions 3.7+ (CPython and PyPy) are supported. Other versions and implementations may or may not work correctly.
+Python versions 3.8+ (CPython and PyPy) are supported. Other versions and implementations may or may not work correctly.
 
 <!-- Python 3.5+ uses VC++14 and it is already embedded in the binary created
 <!x-- https://www.microsoft.com/en-us/download/details.aspx?id=26999 --x>

@@ -238,7 +281,7 @@ ### Strongly recommended
 
 * [**ffmpeg** and **ffprobe**](https://www.ffmpeg.org) - Required for [merging separate video and audio files](#format-selection) as well as for various [post-processing](#post-processing-options) tasks. License [depends on the build](https://www.ffmpeg.org/legal.html)
 
-    There are bugs in ffmpeg that causes various issues when used alongside yt-dlp. Since ffmpeg is such an important dependency, we provide [custom builds](https://github.com/yt-dlp/FFmpeg-Builds#ffmpeg-static-auto-builds) with patches for some of these issues at [yt-dlp/FFmpeg-Builds](https://github.com/yt-dlp/FFmpeg-Builds). See [the readme](https://github.com/yt-dlp/FFmpeg-Builds#patches-applied) for details on the specific issues solved by these builds
+    There are bugs in ffmpeg that cause various issues when used alongside yt-dlp. Since ffmpeg is such an important dependency, we provide [custom builds](https://github.com/yt-dlp/FFmpeg-Builds#ffmpeg-static-auto-builds) with patches for some of these issues at [yt-dlp/FFmpeg-Builds](https://github.com/yt-dlp/FFmpeg-Builds). See [the readme](https://github.com/yt-dlp/FFmpeg-Builds#patches-applied) for details on the specific issues solved by these builds
 
 **Important**: What you need is ffmpeg *binary*, **NOT** [the python package of the same name](https://pypi.org/project/ffmpeg)
 
@ -246,18 +289,19 @@ ### Networking
|
||||||
* [**certifi**](https://github.com/certifi/python-certifi)\* - Provides Mozilla's root certificate bundle. Licensed under [MPLv2](https://github.com/certifi/python-certifi/blob/master/LICENSE)
|
* [**certifi**](https://github.com/certifi/python-certifi)\* - Provides Mozilla's root certificate bundle. Licensed under [MPLv2](https://github.com/certifi/python-certifi/blob/master/LICENSE)
|
||||||
* [**brotli**](https://github.com/google/brotli)\* or [**brotlicffi**](https://github.com/python-hyper/brotlicffi) - [Brotli](https://en.wikipedia.org/wiki/Brotli) content encoding support. Both licensed under MIT <sup>[1](https://github.com/google/brotli/blob/master/LICENSE) [2](https://github.com/python-hyper/brotlicffi/blob/master/LICENSE) </sup>
|
* [**brotli**](https://github.com/google/brotli)\* or [**brotlicffi**](https://github.com/python-hyper/brotlicffi) - [Brotli](https://en.wikipedia.org/wiki/Brotli) content encoding support. Both licensed under MIT <sup>[1](https://github.com/google/brotli/blob/master/LICENSE) [2](https://github.com/python-hyper/brotlicffi/blob/master/LICENSE) </sup>
|
||||||
* [**websockets**](https://github.com/aaugustin/websockets)\* - For downloading over websocket. Licensed under [BSD-3-Clause](https://github.com/aaugustin/websockets/blob/main/LICENSE)
|
* [**websockets**](https://github.com/aaugustin/websockets)\* - For downloading over websocket. Licensed under [BSD-3-Clause](https://github.com/aaugustin/websockets/blob/main/LICENSE)
|
||||||
|
* [**requests**](https://github.com/psf/requests)\* - HTTP library. For HTTPS proxy and persistent connections support. Licensed under [Apache-2.0](https://github.com/psf/requests/blob/main/LICENSE)
|
||||||
|
|
||||||

### Metadata

* [**mutagen**](https://github.com/quodlibet/mutagen)\* - For `--embed-thumbnail` in certain formats. Licensed under [GPLv2+](https://github.com/quodlibet/mutagen/blob/master/COPYING)
* [**AtomicParsley**](https://github.com/wez/atomicparsley) - For `--embed-thumbnail` in `mp4`/`m4a` files when `mutagen`/`ffmpeg` cannot. Licensed under [GPLv2+](https://github.com/wez/atomicparsley/blob/master/COPYING)
* [**xattr**](https://github.com/xattr/xattr), [**pyxattr**](https://github.com/iustin/pyxattr) or [**setfattr**](http://savannah.nongnu.org/projects/attr) - For writing xattr metadata (`--xattr`) on **Mac** and **BSD**. Licensed under [MIT](https://github.com/xattr/xattr/blob/master/LICENSE.txt), [LGPL2.1](https://github.com/iustin/pyxattr/blob/master/COPYING) and [GPLv2+](http://git.savannah.nongnu.org/cgit/attr.git/tree/doc/COPYING) respectively

### Misc

* [**pycryptodomex**](https://github.com/Legrandin/pycryptodome)\* - For decrypting AES-128 HLS streams and various other data. Licensed under [BSD-2-Clause](https://github.com/Legrandin/pycryptodome/blob/master/LICENSE.rst)
* [**phantomjs**](https://github.com/ariya/phantomjs) - Used in extractors where javascript needs to be run. Licensed under [BSD-3-Clause](https://github.com/ariya/phantomjs/blob/master/LICENSE.BSD)
* [**secretstorage**](https://github.com/mitya57/secretstorage)\* - For `--cookies-from-browser` to access the **Gnome** keyring while decrypting cookies of **Chromium**-based browsers on **Linux**. Licensed under [BSD-3-Clause](https://github.com/mitya57/secretstorage/blob/master/LICENSE)
* Any external downloader that you want to use with `--downloader`

### Deprecated

## COMPILE

### Standalone PyInstaller Builds
To build the standalone executable, you must have Python and `pyinstaller` (plus any of yt-dlp's [optional dependencies](#dependencies) if needed). The executable will be built for the same architecture (x86/ARM, 32/64 bit) as the Python used. You can run the following commands:

```
python3 devscripts/install_deps.py --include pyinstaller
python3 devscripts/make_lazy_extractors.py
python3 -m bundle.pyinstaller
```
On some systems, you may need to use `py` or `python` instead of `python3`.

`bundle/pyinstaller.py` accepts any arguments that can be passed to `pyinstaller`, such as `--onefile/-F` or `--onedir/-D`, which is further [documented here](https://pyinstaller.org/en/stable/usage.html#what-to-generate).

**Note**: Pyinstaller versions below 4.4 [do not support](https://github.com/pyinstaller/pyinstaller#requirements-and-tested-platforms) Python installed from the Windows store without using a virtual environment.
**Important**: Running `pyinstaller` directly **without** using `bundle/pyinstaller.py` is **not** officially supported. This may or may not work correctly.

### Platform-independent Binary (UNIX)

You will need the build tools `python` (3.8+), `zip`, `make` (GNU), `pandoc`\* and `pytest`\*.

After installing these, simply run `make`.
### Standalone Py2Exe Builds (Windows)

While we provide the option to build with [py2exe](https://www.py2exe.org), it is recommended to build [using PyInstaller](#standalone-pyinstaller-builds) instead, since the py2exe builds **cannot contain `pycryptodomex`/`certifi` and need VC++14** on the target computer to run.

If you wish to build it anyway, install Python (if it is not already installed) and you can run the following commands:

```
py devscripts/install_deps.py --include py2exe
py devscripts/make_lazy_extractors.py
py -m bundle.py2exe
```

### Related scripts

* **`devscripts/install_deps.py`** - Install dependencies for yt-dlp.
* **`devscripts/update-version.py`** - Update the version number based on the current date.
* **`devscripts/set-variant.py`** - Set the build variant of the executable.
* **`devscripts/make_changelog.py`** - Create a markdown changelog using short commit messages and update the `CONTRIBUTORS` file.
* **`devscripts/make_lazy_extractors.py`** - Create lazy extractors. Running this before building the binaries (any variant) will improve their startup performance. Set the environment variable `YTDLP_NO_LAZY_EXTRACTORS=1` if you wish to forcefully disable lazy extractor loading.

Note: See their `--help` for more info.

### Forking the project

If you fork the project on GitHub, you can run your fork's [build workflow](.github/workflows/build.yml) to automatically build the selected version(s) as artifacts. Alternatively, you can run the [release workflow](.github/workflows/release.yml) or enable the [nightly workflow](.github/workflows/release-nightly.yml) to create full (pre-)releases.

# USAGE AND OPTIONS
## General Options:
    --version                       Print program version and exit
    -U, --update                    Update this program to the latest version
    --no-update                     Do not check for updates (default)
    --update-to [CHANNEL]@[TAG]     Upgrade/downgrade to a specific version.
                                    CHANNEL can be a repository as well. CHANNEL
                                    and TAG default to "stable" and "latest"
                                    respectively if omitted; See "UPDATE" for
                                    details. Supported channels: stable,
                                    nightly, master
    -i, --ignore-errors             Ignore download and postprocessing errors.
                                    The download will be considered successful
                                    even if the postprocessing fails
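For example, `--update-to` lets you move between channels or pin a release (a sketch; the tag shown is illustrative, and available tags depend on the channel/repository):

```
# move to the latest nightly build
yt-dlp --update-to nightly

# pin to a specific tag of the stable channel
yt-dlp --update-to stable@2023.07.06
```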
                                    configuration files
    --flat-playlist                 Do not extract the videos of a playlist,
                                    only list them
    --no-flat-playlist              Fully extract the videos of a playlist
                                    (default)
    --live-from-start               Download livestreams from the start.
                                    Currently only supported for YouTube
                                    (Experimental)
    --no-wait-for-video             Do not wait for scheduled streams (default)
    --mark-watched                  Mark videos watched (even with --simulate)
    --no-mark-watched               Do not mark videos watched (default)
    --color [STREAM:]POLICY         Whether to emit color codes in output,
                                    optionally prefixed by the STREAM (stdout or
                                    stderr) to apply the setting to. Can be one
                                    of "always", "auto" (default), "never", or
                                    "no_color" (use non-color terminal
                                    sequences). Can be used multiple times
    --compat-options OPTS           Options that can help keep compatibility
                                    with youtube-dl or youtube-dlc
                                    configurations by reverting some of the

## Geo-restriction:
                                    specified by --proxy (or none, if the option
                                    is not present) is used for the actual
                                    downloading
    --xff VALUE                     How to fake X-Forwarded-For HTTP header to
                                    try bypassing geographic restriction. One of
                                    "default" (only when known to be useful),
                                    "never", an IP block in CIDR notation, or a
                                    two-letter ISO 3166-2 country code
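For instance, to fake a US-based client via the X-Forwarded-For header (whether this actually bypasses a given site's geo-restriction varies; the URL is a placeholder):

```
yt-dlp --xff US "https://www.youtube.com/watch?v=BaW_jenozKc"
```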

## Video Selection:
    -I, --playlist-items ITEM_SPEC  Comma separated playlist_index of the items
    --date DATE                     Download only videos uploaded on this date.
                                    The date can be "YYYYMMDD" or in the format
                                    [now|today|yesterday][-N[day|week|month|year]].
                                    E.g. "--date today-2weeks" downloads only
                                    videos uploaded on the same day two weeks ago
    --datebefore DATE               Download only videos uploaded on or before
                                    this date. The date formats accepted are the
                                    same as --date
                                    dogs" (caseless). Use "--match-filter -" to
                                    interactively ask whether to download each
                                    video
    --no-match-filters              Do not use any --match-filter (default)
    --break-match-filters FILTER    Same as "--match-filters" but stops the
                                    download process when a video is rejected
    --no-break-match-filters        Do not use any --break-match-filters
                                    (default)
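As a sketch of how these interact (the filter syntax follows `--match-filters`; the URL is a placeholder), the following stops the whole run as soon as a video shorter than one minute is encountered:

```
yt-dlp --break-match-filters "duration>=60" "https://www.youtube.com/user/TheLinuxFoundation/playlists"
```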
    --no-playlist                   Download only the video, if the URL refers
                                    to a video and a playlist
    --yes-playlist                  Download the playlist, if the URL refers to
    --max-downloads NUMBER          Abort after downloading NUMBER files
    --break-on-existing             Stop the download process when encountering
                                    a file that is in the archive
    --break-per-input               Alters --max-downloads, --break-on-existing,
                                    --break-match-filter, and autonumber to
                                    reset per input URL
    --no-break-per-input            --break-on-existing and similar options
                                    terminate the entire download queue
    --skip-playlist-after-errors N  Number of allowed failures until the rest of

## Download Options:
    --no-hls-use-mpegts             Do not use the mpegts container for HLS
                                    videos. This is default when not downloading
                                    live streams
    --download-sections REGEX       Download only chapters that match the
                                    regular expression. A "*" prefix denotes
                                    time-range instead of chapter. Negative
                                    timestamps are calculated from the end.
                                    "*from-url" can be used to download between
                                    the "start_time" and "end_time" extracted
                                    from the URL. Needs ffmpeg. This option can
                                    be used multiple times to download multiple
                                    sections, e.g. --download-sections
                                    "*10:15-inf" --download-sections "intro"
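Concretely (placeholder URL; both forms need ffmpeg as noted above):

```
# download only the time range 0:00-1:30
yt-dlp --download-sections "*0:00-1:30" "https://www.youtube.com/watch?v=BaW_jenozKc"

# download every chapter whose title matches "intro"
yt-dlp --download-sections "intro" "https://www.youtube.com/watch?v=BaW_jenozKc"
```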
    --downloader [PROTO:]NAME       Name or path of the external downloader to

## Filesystem Options:
                                    --write-description etc. (default)
    --no-write-playlist-metafiles   Do not write playlist metadata when using
                                    --write-info-json, --write-description etc.
    --clean-info-json               Remove some internal metadata such as
                                    filenames from the infojson (default)
    --no-clean-info-json            Write all fields to the infojson
    --write-comments                Retrieve video comments to be placed in the
                                    infojson. The comments are fetched even
                                    By default, all containers of the most
                                    recently accessed profile are used.
                                    Currently supported keyrings are: basictext,
                                    gnomekeyring, kwallet, kwallet5, kwallet6
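For instance, to load Chromium cookies that are encrypted with KWallet 5 (a sketch using the `--cookies-from-browser BROWSER[+KEYRING]` syntax with a keyring name from the list above; the URL is a placeholder):

```
yt-dlp --cookies-from-browser "chromium+kwallet5" "https://www.youtube.com/watch?v=BaW_jenozKc"
```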
    --no-cookies-from-browser       Do not load cookies from browser (default)
    --cache-dir DIR                 Location in the filesystem where yt-dlp can
                                    store some downloaded information (such as

## Verbosity and Simulation Options:
    -q, --quiet                     Activate quiet mode. If used with --verbose,
                                    print the log to stderr
    --no-quiet                      Deactivate quiet mode. (Default)
    --no-warnings                   Ignore warnings
    -s, --simulate                  Do not download the video and do not write
                                    anything to disk

## Workarounds:
    --prefer-insecure               Use an unencrypted connection to retrieve
                                    information about the video (Currently
                                    supported only for YouTube)
    --add-headers FIELD:VALUE       Specify a custom HTTP header and its value,
                                    separated by a colon ":". You can use this
                                    option multiple times
    --bidi-workaround               Work around terminals that lack

## Authentication Options:
    --netrc-location PATH           Location of .netrc authentication data;
                                    either the path or its containing directory.
                                    Defaults to ~/.netrc
    --netrc-cmd NETRC_CMD           Command to execute to get the credentials
                                    for an extractor.
    --video-password PASSWORD       Video-specific password
    --ap-mso MSO                    Adobe Pass multiple-system operator (TV
                                    provider) identifier, use --ap-list-mso for
                                    a list of available MSOs

## Post-Processing Options:
                                    that of --use-postprocessor (default:
                                    after_move). Same syntax as the output
                                    template can be used to pass any field as
                                    arguments to the command. If no fields are
                                    passed, %(filepath,_filename|)q is appended
                                    to the end of the command. This option can
                                    be used multiple times
    --no-exec                       Remove any previously defined --exec
    --convert-subs FORMAT           Convert the subtitles to another format
                                    (currently supported: ass, lrc, srt, vtt)
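For example, to print the final location of each file once post-processing is done (a sketch; `%(filepath)q` expands to the shell-quoted path, and the URL is a placeholder):

```
yt-dlp --exec 'echo %(filepath)q' "https://www.youtube.com/watch?v=BaW_jenozKc"
```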

### Configuration file encoding

If you want your file to be decoded differently, add `# coding: ENCODING` to the beginning of the file (e.g. `# coding: shift-jis`). There must be no characters before that, even spaces or BOM.

### Authentication with netrc

You may also want to configure automatic credentials storage for extractors that support authentication (by providing login and password with `--username` and `--password`) in order not to pass credentials as command line arguments on every yt-dlp execution and prevent tracking plain text passwords in the shell command history. You can achieve this using a [`.netrc` file](https://stackoverflow.com/tags/.netrc/info) on a per-extractor basis. For that you will need to create a `.netrc` file in `--netrc-location` and restrict permissions to read/write by only you:
The default location of the .netrc file is `~` (see below).

As an alternative to using the `.netrc` file, which has the disadvantage of keeping your passwords in a plain text file, you can configure a custom shell command to provide the credentials for an extractor. This is done by providing the `--netrc-cmd` parameter: it shall output the credentials in the netrc format and return `0` on success; other values will be treated as an error. `{}` in the command will be replaced by the name of the extractor to make it possible to select the credentials for the right extractor.

E.g. to use an encrypted `.netrc` file stored as `.authinfo.gpg`:
```
yt-dlp --netrc-cmd 'gpg --decrypt ~/.authinfo.gpg' https://www.youtube.com/watch?v=BaW_jenozKc
```

### Notes about environment variables
* Environment variables are normally specified as `${VARIABLE}`/`$VARIABLE` on UNIX and `%VARIABLE%` on Windows; but they are always shown as `${VARIABLE}` in this documentation
* yt-dlp also allows using UNIX-style variables on Windows for path-like options; e.g. `--output`, `--config-location`

# OUTPUT TEMPLATE

1. **Object traversal**: The dictionaries and lists available in metadata can be traversed by using a dot `.` separator; e.g. `%(tags.0)s`, `%(subtitles.en.-1.ext)s`. You can do Python slicing with colon `:`; e.g. `%(id.3:7:-1)s`, `%(formats.:.format_id)s`. Curly braces `{}` can be used to build dictionaries with only specific keys; e.g. `%(formats.:.{format_id,height})#j`. An empty field name `%()s` refers to the entire infodict; e.g. `%(.{id,title})s`. Note that not all the fields that become available using this method are listed below. Use `-j` to see such fields

1. **Arithmetic**: Simple arithmetic can be done on numeric fields using `+`, `-` and `*`. E.g. `%(playlist_index+10)03d`, `%(n_entries+1-playlist_index)d`

1. **Date/time Formatting**: Date/time fields can be formatted according to [strftime formatting](https://docs.python.org/3/library/datetime.html#strftime-and-strptime-format-codes) by specifying it separated from the field name using a `>`. E.g. `%(duration>%H-%M-%S)s`, `%(upload_date>%Y-%m-%d)s`, `%(epoch-3600>%H-%M-%S)s`

1. **Alternatives**: Alternate fields can be specified separated with a `,`. E.g. `%(release_date>%Y,upload_date>%Y|Unknown)s`

1. **Replacement**: A replacement value can be specified using a `&` separator according to the [`str.format` mini-language](https://docs.python.org/3/library/string.html#format-specification-mini-language). If the field is *not* empty, this replacement value will be used instead of the actual field content. This is done after alternate fields are considered; thus the replacement is used if *any* of the alternative fields is *not* empty. E.g. `%(chapters&has chapters|no chapters)s`, `%(title&TITLE={:>20}|NO TITLE)s`

1. **Default**: A literal default value can be specified for when the field is empty using a `|` separator. This overrides `--output-na-placeholder`. E.g. `%(uploader|Unknown)s`
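Putting several of these features together (the video ID is one of the placeholder examples used elsewhere in this document):

```
$ yt-dlp -o "%(title)s%(artist& - {}|)s [%(upload_date>%Y|unknown)s].%(ext)s" BaW_jenozKc
```

This appends ` - ARTIST` only when an artist is known, and falls back to `unknown` when the upload date is missing.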

Additionally, you can set different output templates for the various metadata files separately from the general output template by specifying the type of file followed by the template separated by a colon `:`. The different file types supported are `subtitle`, `thumbnail`, `description`, `annotation` (deprecated), `infojson`, `link`, `pl_thumbnail`, `pl_description`, `pl_infojson`, `chapter`, `pl_video`. E.g. `-o "%(title)s.%(ext)s" -o "thumbnail:%(title)s\%(title)s.%(ext)s"` will put the thumbnails in a folder with the same name as the video. If any of the templates is empty, that type of file will not be written. E.g. `--write-thumbnail -o "thumbnail:"` will write thumbnails only for playlists and not for video.

<a id="outtmpl-postprocess-note"/>

**Note**: Due to post-processing (i.e. merging etc.), the actual output filename might differ. Use `--print after_move:filepath` to get the name after all post-processing is complete.

 - `upload_date` (string): Video upload date in UTC (YYYYMMDD)
 - `release_timestamp` (numeric): UNIX timestamp of the moment the video was released
 - `release_date` (string): The date (YYYYMMDD) when the video was released in UTC
 - `release_year` (numeric): Year (YYYY) when the video or album was released
 - `modified_timestamp` (numeric): UNIX timestamp of the moment the video was last modified
 - `modified_date` (string): The date (YYYYMMDD) when the video was last modified in UTC
 - `uploader_id` (string): Nickname or id of the video uploader
 - `channel` (string): Full name of the channel the video is uploaded on
 - `channel_id` (string): Id of the channel
 - `channel_follower_count` (numeric): Number of followers of the channel
 - `channel_is_verified` (boolean): Whether the channel is verified on the platform
 - `location` (string): Physical location where the video was filmed
 - `duration` (numeric): Length of the video in seconds
 - `duration_string` (string): Length of the video (HH:mm:ss)
 - `was_live` (boolean): Whether this video was originally a live stream
 - `playable_in_embed` (string): Whether this video is allowed to play in embedded players on other sites
 - `availability` (string): Whether the video is "private", "premium_only", "subscriber_only", "needs_auth", "unlisted" or "public"
 - `media_type` (string): The type of media as classified by the site, e.g. "episode", "clip", "trailer"
 - `start_time` (numeric): Time in seconds where the reproduction should start, as specified in the URL
 - `end_time` (numeric): Time in seconds where the reproduction should end, as specified in the URL
 - `extractor` (string): Name of the extractor
 - `extractor_key` (string): Key name of the extractor
 - `epoch` (numeric): Unix epoch of when the information extraction was completed
 - `autonumber` (numeric): Number that will be increased with each download, starting at `--autonumber-start`, padded with leading zeros to 5 digits
 - `video_autonumber` (numeric): Number that will be increased with each video
 - `n_entries` (numeric): Total number of extracted items in the playlist
 - `playlist_id` (string): Identifier of the playlist that contains the video
 - `album_type` (string): Type of the album
 - `album_artist` (string): List of all artists that appeared on the album
 - `disc_number` (numeric): Number of the disc or other physical medium the track belongs to

Available only when using `--download-sections` and for `chapter:` prefix when using `--split-chapters` for videos with internal chapters:
 - `subtitles_table` (table): The subtitle format table as printed by `--list-subs`
 - `automatic_captions_table` (table): The automatic subtitle format table as printed by `--list-subs`

Available only after the video is downloaded (`post_process`/`after_move`):

 - `filepath`: Actual path of downloaded video file

Available only in `--sponsorblock-chapter-title`:

    # Download YouTube playlist videos in separate directories according to their uploaded year
    $ yt-dlp -o "%(upload_date>%Y)s/%(title)s.%(ext)s" "https://www.youtube.com/playlist?list=PLwiyx1dc3P2JR9N8gQaQN_BCvlSlap7re"

    # Prefix playlist index with " - " separator, but only if it is available
    $ yt-dlp -o "%(playlist_index&{} - |)s%(title)s.%(ext)s" BaW_jenozKc "https://www.youtube.com/user/TheLinuxFoundation/playlists"

    # Download all playlists of YouTube channel/user keeping each playlist in separate directory:
    $ yt-dlp -o "%(uploader)s/%(playlist)s/%(playlist_index)s - %(title)s.%(ext)s" "https://www.youtube.com/user/TheLinuxFoundation/playlists"

# FORMAT SELECTION

## Filtering Formats

You can also filter the video formats by putting a condition in brackets, as in `-f "best[height=720]"` (or `-f "[filesize>10M]"` since filters without a selector are interpreted as `best`).

The following numeric meta fields can be used with comparisons `<`, `<=`, `>`, `>=`, `=` (equals), `!=` (not equals):

**Note**: None of the aforementioned meta fields are guaranteed to be present since this solely depends on the metadata obtained by the particular extractor, i.e. the metadata offered by the website. Any other field made available by the extractor can also be used for filtering.

Formats for which the value is not known are excluded unless you put a question mark (`?`) after the operator. You can combine format filters, so `-f "bv[height<=?720][tbr>500]"` selects up to 720p videos (or videos where the height is not known) with a bitrate of at least 500 KBit/s. You can also use the filters with `all` to download all formats that satisfy the filter, e.g. `-f "all[vcodec=none]"` selects all audio-only formats.

Format selectors can also be grouped using parentheses; e.g. `-f "(mp4,webm)[height<480]"` will download the best pre-merged mp4 and webm formats with a height lower than 480.
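A fuller sketch combining filters with the `bv`/`ba` selectors (placeholder URL; `^=` is the starts-with comparison for string fields):

```
# best avc1 video up to 1080p, merged with the best m4a audio
$ yt-dlp -f "bv[height<=1080][vcodec^=avc1]+ba[ext=m4a]" "https://www.youtube.com/watch?v=BaW_jenozKc"
```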

## Sorting Formats

 - `source`: The preference of the source
 - `proto`: Protocol used for download (`https`/`ftps` > `http`/`ftp` > `m3u8_native`/`m3u8` > `http_dash_segments` > `websocket_frag` > `mms`/`rtsp` > `f4f`/`f4m`)
 - `vcodec`: Video Codec (`av01` > `vp9.2` > `vp9` > `h265` > `h264` > `vp8` > `h263` > `theora` > other)
 - `acodec`: Audio Codec (`flac`/`alac` > `wav`/`aiff` > `opus` > `vorbis` > `aac` > `mp4a` > `mp3` > `ac4` > `eac3` > `ac3` > `dts` > other)
 - `codec`: Equivalent to `vcodec,acodec`
 - `vext`: Video Extension (`mp4` > `mov` > `webm` > `flv` > other). If `--prefer-free-formats` is used, `webm` is preferred.
 - `aext`: Audio Extension (`m4a` > `aac` > `mp3` > `ogg` > `opus` > `webm` > other). If `--prefer-free-formats` is used, the order changes to `ogg` > `opus` > `webm` > `mp3` > `m4a` > `aac`
 - `ext`: Equivalent to `vext,aext`
 - `filesize`: Exact filesize, if known in advance
 - `fs_approx`: Approximate filesize
 - `size`: Exact filesize if available, otherwise approximate filesize
 - `height`: Height of video
 - `width`: Width of video
 - `tbr`: Total average bitrate in KBit/s
 - `vbr`: Average video bitrate in KBit/s
 - `abr`: Average audio bitrate in KBit/s
 - `br`: Average bitrate in KBit/s, `tbr`/`vbr`/`abr`
 - `asr`: Audio sample rate in Hz
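For instance, to prefer the largest resolution no higher than 1080p and then the smallest file (a sketch; `res` is the smallest-dimension field from the full field list, and `size` is listed above):

```
$ yt-dlp -S "res:1080,+size" "https://www.youtube.com/watch?v=BaW_jenozKc"
```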

**Deprecation warning**: Many of these fields have (currently undocumented) aliases that may be removed in a future version. It is recommended to use only the documented field names.

# MODIFYING METADATA

This option also has a few special uses:

* You can download an additional URL based on the metadata of the currently downloaded video. To do this, set the field `additional_urls` to the URL that you want to download. E.g. `--parse-metadata "description:(?P<additional_urls>https?://www\.vimeo\.com/\d+)"` will download the first vimeo video found in the description

* You can use this to change the metadata that is embedded in the media file. To do this, set the value of the corresponding field with a `meta_` prefix. For example, any value you set to the `meta_description` field will be added to the `description` field in the file - you can use this to set a different "description" and "synopsis". To modify the metadata of individual streams, use the `meta<n>_` prefix (e.g. `meta1_language`). Any value set to the `meta_` field will overwrite all default values.

    # Do not set any "synopsis" in the video metadata
    $ yt-dlp --parse-metadata ":(?P<meta_synopsis>)"

    # Remove "formats" field from the infojson by setting it to an empty string
    $ yt-dlp --parse-metadata "video::(?P<formats>)" --write-info-json

    # Replace all spaces and "_" in title and uploader with a `-`
    $ yt-dlp --replace-in-metadata "title,uploader" "[ _]" "-"
|
||||||
|
|
||||||
Some extractors accept additional arguments which can be passed using `--extractor-args KEY:ARGS`. `ARGS` is a `;` (semicolon) separated string of `ARG=VAL1,VAL2`. E.g. `--extractor-args "youtube:player-client=android_embedded,web;include_live_dash" --extractor-args "funimation:version=uncut"`
|
Some extractors accept additional arguments which can be passed using `--extractor-args KEY:ARGS`. `ARGS` is a `;` (semicolon) separated string of `ARG=VAL1,VAL2`. E.g. `--extractor-args "youtube:player-client=android_embedded,web;include_live_dash" --extractor-args "funimation:version=uncut"`
|
||||||
|
|
||||||
|
Note: In CLI, `ARG` can use `-` instead of `_`; e.g. `youtube:player-client"` becomes `youtube:player_client"`
|
||||||
|
|
||||||
The following extractors use this feature:
|
The following extractors use this feature:
|
||||||
|
|
||||||
#### youtube
* `lang`: Prefer translated metadata (`title`, `description` etc) of this language code (case-sensitive). By default, the video primary language metadata is preferred, with a fallback to `en` translated. See [youtube.py](https://github.com/yt-dlp/yt-dlp/blob/c26f9b991a0681fd3ea548d535919cec1fbbd430/yt_dlp/extractor/youtube.py#L381-L390) for the list of supported content language codes
* `skip`: One or more of `hls`, `dash` or `translated_subs` to skip extraction of the m3u8 manifests, dash manifests and [auto-translated subtitles](https://github.com/yt-dlp/yt-dlp/issues/4090#issuecomment-1158102032) respectively
* `player_client`: Clients to extract video data from. The main clients are `web`, `android` and `ios` with variants `_music`, `_embedded`, `_embedscreen`, `_creator` (e.g. `web_embedded`); and `mweb`, `mweb_embedscreen` and `tv_embedded` (agegate bypass) with no variants. By default, `ios,android,web` is used, but `tv_embedded` and `creator` variants are added as required for age-gated videos. Similarly, the music variants are added for `music.youtube.com` urls. You can use `all` to use all the clients, and `default` for the default clients.
* `player_skip`: Skip some network requests that are generally needed for robust extraction. One or more of `configs` (skip client configs), `webpage` (skip initial webpage), `js` (skip js player). While these options can help reduce the number of requests needed or avoid some rate-limiting, they could cause some issues. See [#860](https://github.com/yt-dlp/yt-dlp/pull/860) for more details
* `player_params`: YouTube player parameters to use for player requests. Will overwrite any default ones set by yt-dlp.
* `comment_sort`: `top` or `new` (default) - choose comment sorting mode (on YouTube's side)
* `max_comments`: Limit the number of comments to gather. Comma-separated list of integers representing `max-comments,max-parents,max-replies,max-replies-per-thread`. Default is `all,all,all,all`
    * E.g. `all,all,1000,10` will get a maximum of 1000 replies total, with up to 10 replies per thread. `1000,all,100` will get a maximum of 1000 comments, with a maximum of 100 replies total
* `formats`: Change the types of formats to return. `dashy` (convert HTTP to DASH), `duplicate` (identical content but different URLs or protocol; includes `dashy`), `incomplete` (cannot be downloaded completely - live dash and post-live m3u8)
* `innertube_host`: Innertube API host to use for all API requests; e.g. `studio.youtube.com`, `youtubei.googleapis.com`. Note that cookies exported from one subdomain will not work on others
* `innertube_key`: Innertube API key to use for all API requests
* `raise_incomplete_data`: `Incomplete Data Received` raises an error instead of reporting a warning
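For example, combining several of these youtube arguments in a single invocation (placeholder URL):

```
$ yt-dlp --extractor-args "youtube:player_client=ios;comment_sort=top;max_comments=100,all,10" --write-comments "https://www.youtube.com/watch?v=BaW_jenozKc"
```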

#### youtubetab (YouTube playlists, channels, feeds, etc.)
* `skip`: One or more of `webpage` (skip initial webpage download), `authcheck` (allow the download of playlists requiring authentication when no initial webpage is downloaded. This may cause unwanted behavior, see [#1122](https://github.com/yt-dlp/yt-dlp/pull/1122) for more details)
* `approximate_date`: Extract approximate `upload_date` and `timestamp` in flat-playlist. This may cause date-based filters to be slightly off
#### generic
* `fragment_query`: Passthrough any query in mpd/m3u8 manifest URLs to their fragments if no value is provided, or else apply the query string given as `fragment_query=VALUE`. Does not apply to ffmpeg
* `variant_query`: Passthrough the master m3u8 URL query to its variant playlist URLs if no value is provided, or else apply the query string given as `variant_query=VALUE`
* `hls_key`: An HLS AES-128 key URI *or* key (as hex), and optionally the IV (as hex), in the form of `(URI|KEY)[,IV]`; e.g. `generic:hls_key=ABCDEF1234567980,0xFEDCBA0987654321`. Passing any of these values will force usage of the native HLS downloader and override the corresponding values found in the m3u8 playlist
* `is_live`: Bypass live HLS detection and manually set `live_status` - a value of `false` will set `not_live`, any other value (or no value) will set `is_live`

#### funimation
* `language`: Audio languages to extract, e.g. `funimation:language=english,japanese`

#### hotstar
* `vcodec`: vcodec to ignore - one or more of `h264`, `h265`, `dvh265`
* `dr`: dynamic range to ignore - one or more of `sdr`, `hdr10`, `dv`

#### niconicochannelplus
* `max_comments`: Maximum number of comments to extract - default is `120`

#### tiktok
* `api_hostname`: Hostname to use for mobile API requests, e.g. `api-h2.tiktokv.com`
* `app_version`: App version to call mobile APIs with - should be set along with `manifest_app_version`, e.g. `20.2.1`
@ -1796,7 +1880,22 @@ #### rokfinchannel
|
||||||
* `tab`: Which tab to download - one of `new`, `top`, `videos`, `podcasts`, `streams`, `stacks`
|
* `tab`: Which tab to download - one of `new`, `top`, `videos`, `podcasts`, `streams`, `stacks`
|
||||||
|
|
||||||
#### twitter
* `api`: Select one of `graphql` (default), `legacy` or `syndication` as the API for tweet extraction. Has no effect if logged in

#### stacommu, wrestleuniverse
* `device_id`: UUID value assigned by the website and used to enforce device limits for paid livestream content. Can be found in browser local storage

#### twitch
* `client_id`: Client ID value to be sent with GraphQL requests, e.g. `twitch:client_id=kimne78kx3ncx6brgo4mv6wki5h1ko`

#### nhkradirulive (NHK らじる★らじる LIVE)
* `area`: Which regional variation to extract. Valid areas are: `sapporo`, `sendai`, `tokyo`, `nagoya`, `osaka`, `hiroshima`, `matsuyama`, `fukuoka`. Defaults to `tokyo`

#### nflplusreplay
* `type`: Type(s) of game replays to extract. Valid types are: `full_game`, `full_game_spanish`, `condensed_game` and `all_22`. You can use `all` to extract all available replay types, which is the default

#### jiosaavn
* `bitrate`: Audio bitrates to request. One or more of `16`, `32`, `64`, `128`, `320`. Default is `128,320`

**Note**: These options may be changed/removed in the future without concern for backward compatibility
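
For reference, these arguments can also be supplied when embedding yt-dlp, via the `extractor_args` parameter. A minimal sketch, mirroring the `{'youtube': {'skip': [...]}}` dict layout documented in `yt_dlp/YoutubeDL.py` (the URL is just the test video used elsewhere in this document):

```python
import yt_dlp

# CLI equivalent: yt-dlp --extractor-args "youtube:skip=dash,hls" URL
ydl_opts = {
    # extractor name -> argument name -> list of values
    'extractor_args': {'youtube': {'skip': ['dash', 'hls']}},
}
with yt_dlp.YoutubeDL(ydl_opts) as ydl:
    ydl.download(['https://www.youtube.com/watch?v=BaW_jenozKc'])
```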
## Installing Plugins

    * **System Plugins**
        * `/etc/yt-dlp/plugins/<package name>/yt_dlp_plugins/`
        * `/etc/yt-dlp-plugins/<package name>/yt_dlp_plugins/`
2. **Executable location**: Plugin packages can similarly be installed in a `yt-dlp-plugins` directory under the executable location (recommended for portable installations):
    * Binary: where `<root-dir>/yt-dlp.exe`, `<root-dir>/yt-dlp-plugins/<package name>/yt_dlp_plugins/`
    * Source: where `<root-dir>/yt_dlp/__main__.py`, `<root-dir>/yt-dlp-plugins/<package name>/yt_dlp_plugins/`
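
For orientation, a plugin package placed in any of the locations above exposes its modules under the `yt_dlp_plugins` namespace; a minimal sketch of an extractor plugin (the package path is from the layout above, but the extractor itself and its URL pattern are made up for illustration):

```python
# <plugin-root>/yt_dlp_plugins/extractor/sample.py
# Extractor plugins are discovered from the yt_dlp_plugins.extractor namespace
from yt_dlp.extractor.common import InfoExtractor


class SamplePluginIE(InfoExtractor):  # hypothetical extractor, not part of yt-dlp
    _VALID_URL = r'https?://(?:www\.)?example\.com/video/(?P<id>\d+)'

    def _real_extract(self, url):
        video_id = self._match_id(url)
        # ... real extraction logic would go here ...
        return {'id': video_id, 'title': f'Sample {video_id}', 'url': url}
```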

# EMBEDDING YT-DLP

    ydl.download(URLS)

Most likely, you'll want to use various options. For a list of options available, have a look at [`yt_dlp/YoutubeDL.py`](yt_dlp/YoutubeDL.py#L183) or `help(yt_dlp.YoutubeDL)` in a Python shell. If you are already familiar with the CLI, you can use [`devscripts/cli_to_api.py`](https://github.com/yt-dlp/yt-dlp/blob/master/devscripts/cli_to_api.py) to translate any CLI switches to `YoutubeDL` params.
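
For example, run from the repository root, the translation helper can be used like this (a sketch; the exact output shown in the comment is illustrative):

```python
# python devscripts/cli_to_api.py -f bestvideo --no-playlist
# prints something like: {'format': 'bestvideo', 'noplaylist': True}
from devscripts.cli_to_api import cli_to_api

print(cli_to_api(['-f', 'bestvideo', '--no-playlist']))
```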

**Tip**: If you are porting your code from youtube-dl to yt-dlp, one important point to look out for is that we do not guarantee the return value of `YoutubeDL.extract_info` to be json serializable, or even be a dictionary. It will be dictionary-like, but if you want to ensure it is a serializable dictionary, pass it through `YoutubeDL.sanitize_info` as shown in the [example below](#extracting-information)

#### Use a custom format selector

```python
import yt_dlp

URLS = ['https://www.youtube.com/watch?v=BaW_jenozKc']


def format_selector(ctx):
    """ Select the best video and the best audio that won't result in an mkv.
```

#### Redundant options

    --reject-title REGEX             --match-filter "title !~= (?i)REGEX"
    --min-views COUNT                --match-filter "view_count >=? COUNT"
    --max-views COUNT                --match-filter "view_count <=? COUNT"
    --break-on-reject                Use --break-match-filter
    --user-agent UA                  --add-header "User-Agent:UA"
    --referer URL                    --add-header "Referer:URL"
    --playlist-start NUMBER          -I NUMBER:
    --playlist-end NUMBER            -I :NUMBER
    --playlist-reverse               -I ::-1
    --no-playlist-reverse            Default
    --no-colors                      --color no_color
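
When porting such options to the Python API, `--match-filter` strings go through `yt_dlp.utils.match_filter_func`; a sketch (the filter expression is just an example):

```python
import yt_dlp
from yt_dlp.utils import match_filter_func

ydl_opts = {
    # CLI: --match-filter "view_count >=? 1000"
    'match_filter': match_filter_func('view_count >=? 1000'),
}
with yt_dlp.YoutubeDL(ydl_opts) as ydl:
    pass  # ydl.download([...])
```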

#### Not recommended

    --youtube-skip-hls-manifest      --extractor-args "youtube:skip=hls" (Alias: --no-youtube-include-hls-manifest)
    --youtube-include-dash-manifest  Default (Alias: --no-youtube-skip-dash-manifest)
    --youtube-include-hls-manifest   Default (Alias: --no-youtube-skip-hls-manifest)
    --geo-bypass                     --xff "default"
    --no-geo-bypass                  --xff "never"
    --geo-bypass-country CODE        --xff CODE
    --geo-bypass-ip-block IP_BLOCK   --xff IP_BLOCK

#### Developer options

**bundle/__init__.py** (new file, 1 line)

```python
# Empty file
```

**bundle/py2exe.py** (new executable file, 59 lines)

```python
#!/usr/bin/env python3

# Allow execution from anywhere
import os
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

import warnings

from py2exe import freeze

from devscripts.utils import read_version

VERSION = read_version()


def main():
    warnings.warn(
        'py2exe builds do not support pycryptodomex and needs VC++14 to run. '
        'It is recommended to run "pyinst.py" to build using pyinstaller instead')

    return freeze(
        console=[{
            'script': './yt_dlp/__main__.py',
            'dest_base': 'yt-dlp',
            'icon_resources': [(1, 'devscripts/logo.ico')],
        }],
        version_info={
            'version': VERSION,
            'description': 'A youtube-dl fork with additional features and patches',
            'comments': 'Official repository: <https://github.com/yt-dlp/yt-dlp>',
            'product_name': 'yt-dlp',
            'product_version': VERSION,
        },
        options={
            'bundle_files': 0,
            'compressed': 1,
            'optimize': 2,
            'dist_dir': './dist',
            'excludes': [
                # py2exe cannot import Crypto
                'Crypto',
                'Cryptodome',
                # py2exe appears to confuse this with our socks library.
                # We don't use pysocks and urllib3.contrib.socks would fail to import if tried.
                'urllib3.contrib.socks'
            ],
            'dll_excludes': ['w9xpopen.exe', 'crypt32.dll'],
            # Modules that are only imported dynamically must be added here
            'includes': ['yt_dlp.compat._legacy', 'yt_dlp.compat._deprecated',
                         'yt_dlp.utils._legacy', 'yt_dlp.utils._deprecated'],
        },
        zipfile=None,
    )


if __name__ == '__main__':
    main()
```

**pyinst.py → bundle/pyinstaller.py** (now an executable file; 34 changed lines)

```diff
@@ -4,7 +4,7 @@
 import os
 import sys
 
-sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))
+sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
 
 import platform
 
@@ -37,7 +37,7 @@ def main():
         '--icon=devscripts/logo.ico',
         '--upx-exclude=vcruntime140.dll',
         '--noconfirm',
-        *dependency_options(),
+        '--additional-hooks-dir=yt_dlp/__pyinstaller',
         *opts,
         'yt_dlp/__main__.py',
     ]
@@ -77,30 +77,6 @@ def version_to_list(version):
     return list(map(int, version_list)) + [0] * (4 - len(version_list))
 
 
-def dependency_options():
-    # Due to the current implementation, these are auto-detected, but explicitly add them just in case
-    dependencies = [pycryptodome_module(), 'mutagen', 'brotli', 'certifi', 'websockets']
-    excluded_modules = ('youtube_dl', 'youtube_dlc', 'test', 'ytdlp_plugins', 'devscripts')
-
-    yield from (f'--hidden-import={module}' for module in dependencies)
-    yield '--collect-submodules=websockets'
-    yield from (f'--exclude-module={module}' for module in excluded_modules)
-
-
-def pycryptodome_module():
-    try:
-        import Cryptodome  # noqa: F401
-    except ImportError:
-        try:
-            import Crypto  # noqa: F401
-            print('WARNING: Using Crypto since Cryptodome is not available. '
-                  'Install with: pip install pycryptodomex', file=sys.stderr)
-            return 'Crypto'
-        except ImportError:
-            pass
-    return 'Cryptodome'
-
-
 def set_version_info(exe, version):
     if OS_NAME == 'win32':
         windows_set_version(exe, version)
@@ -109,7 +85,6 @@ def set_version_info(exe, version):
 def windows_set_version(exe, version):
     from PyInstaller.utils.win32.versioninfo import (
         FixedFileInfo,
-        SetVersion,
         StringFileInfo,
         StringStruct,
         StringTable,
@@ -118,6 +93,11 @@ def windows_set_version(exe, version):
         VSVersionInfo,
     )
 
+    try:
+        from PyInstaller.utils.win32.versioninfo import SetVersion
+    except ImportError:  # Pyinstaller >= 5.8
+        from PyInstaller.utils.win32.versioninfo import write_version_info_to_executable as SetVersion
+
     version_list = version_to_list(version)
     suffix = MACHINE and f'_{MACHINE}'
     SetVersion(exe, VSVersionInfo(
```

**devscripts/changelog_override.json** (new file, 124 lines)

```json
[
    {
        "action": "add",
        "when": "29cb20bd563c02671b31dd840139e93dd37150a1",
        "short": "[priority] **A new release type has been added!**\n * [`nightly`](https://github.com/yt-dlp/yt-dlp/releases/tag/nightly) builds will be made after each push, containing the latest fixes (but also possibly bugs).\n * When using `--update`/`-U`, a release binary will only update to its current channel (either `stable` or `nightly`).\n * The `--update-to` option has been added allowing the user more control over program upgrades (or downgrades).\n * `--update-to` can change the release channel (`stable`, `nightly`) and also upgrade or downgrade to specific tags.\n * **Usage**: `--update-to CHANNEL`, `--update-to TAG`, `--update-to CHANNEL@TAG`"
    },
    {
        "action": "add",
        "when": "5038f6d713303e0967d002216e7a88652401c22a",
        "short": "[priority] **YouTube throttling fixes!**"
    },
    {
        "action": "remove",
        "when": "2e023649ea4e11151545a34dc1360c114981a236"
    },
    {
        "action": "add",
        "when": "01aba2519a0884ef17d5f85608dbd2a455577147",
        "short": "[priority] YouTube: Improved throttling and signature fixes"
    },
    {
        "action": "change",
        "when": "c86e433c35fe5da6cb29f3539eef97497f84ed38",
        "short": "[extractor/niconico:series] Fix extraction (#6898)",
        "authors": ["sqrtNOT"]
    },
    {
        "action": "change",
        "when": "69a40e4a7f6caa5662527ebd2f3c4e8aa02857a2",
        "short": "[extractor/youtube:music_search_url] Extract title (#7102)",
        "authors": ["kangalio"]
    },
    {
        "action": "change",
        "when": "8417f26b8a819cd7ffcd4e000ca3e45033e670fb",
        "short": "Add option `--color` (#6904)",
        "authors": ["Grub4K"]
    },
    {
        "action": "change",
        "when": "b4e0d75848e9447cee2cd3646ce54d4744a7ff56",
        "short": "Improve `--download-sections`\n - Support negative time-ranges\n - Add `*from-url` to obey time-ranges in URL",
        "authors": ["pukkandan"]
    },
    {
        "action": "change",
        "when": "1e75d97db21152acc764b30a688e516f04b8a142",
        "short": "[extractor/youtube] Add `ios` to default clients used\n - IOS is affected neither by 403 nor by nsig so helps mitigate them preemptively\n - IOS also has higher bit-rate 'premium' formats though they are not labeled as such",
        "authors": ["pukkandan"]
    },
    {
        "action": "change",
        "when": "f2ff0f6f1914b82d4a51681a72cc0828115dcb4a",
        "short": "[extractor/motherless] Add gallery support, fix groups (#7211)",
        "authors": ["rexlambert22", "Ti4eeT4e"]
    },
    {
        "action": "change",
        "when": "a4486bfc1dc7057efca9dd3fe70d7fa25c56f700",
        "short": "[misc] Revert \"Add automatic duplicate issue detection\"",
        "authors": ["pukkandan"]
    },
    {
        "action": "add",
        "when": "1ceb657bdd254ad961489e5060f2ccc7d556b729",
        "short": "[priority] Security: [[CVE-2023-35934](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-35934)] Fix [Cookie leak](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-v8mc-9377-rwjj)\n - `--add-header Cookie:` is deprecated and auto-scoped to input URL domains\n - Cookies are scoped when passed to external downloaders\n - Add `cookies` field to info.json and deprecate `http_headers.Cookie`"
    },
    {
        "action": "change",
        "when": "b03fa7834579a01cc5fba48c0e73488a16683d48",
        "short": "[ie/twitter] Revert 92315c03774cfabb3a921884326beb4b981f786b",
        "authors": ["pukkandan"]
    },
    {
        "action": "change",
        "when": "fcd6a76adc49d5cd8783985c7ce35384b72e545f",
        "short": "[test] Add tests for socks proxies (#7908)",
        "authors": ["coletdjnz"]
    },
    {
        "action": "change",
        "when": "4bf912282a34b58b6b35d8f7e6be535770c89c76",
        "short": "[rh:urllib] Remove dot segments during URL normalization (#7662)",
        "authors": ["coletdjnz"]
    },
    {
        "action": "change",
        "when": "59e92b1f1833440bb2190f847eb735cf0f90bc85",
        "short": "[rh:urllib] Simplify gzip decoding (#7611)",
        "authors": ["Grub4K"]
    },
    {
        "action": "add",
        "when": "c1d71d0d9f41db5e4306c86af232f5f6220a130b",
        "short": "[priority] **The minimum *recommended* Python version has been raised to 3.8**\nSince Python 3.7 has reached end-of-life, support for it will be dropped soon. [Read more](https://github.com/yt-dlp/yt-dlp/issues/7803)"
    },
    {
        "action": "add",
        "when": "61bdf15fc7400601c3da1aa7a43917310a5bf391",
        "short": "[priority] Security: [[CVE-2023-40581](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-40581)] [Prevent RCE when using `--exec` with `%q` on Windows](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-42h4-v29r-42qg)\n - The shell escape function is now using `\"\"` instead of `\\\"`.\n - `utils.Popen` has been patched to properly quote commands."
    },
    {
        "action": "change",
        "when": "8a8b54523addf46dfd50ef599761a81bc22362e6",
        "short": "[rh:requests] Add handler for `requests` HTTP library (#3668)\n\n\tAdds support for HTTPS proxies and persistent connections (keep-alive)",
        "authors": ["bashonly", "coletdjnz", "Grub4K"]
    },
    {
        "action": "add",
        "when": "1d03633c5a1621b9f3a756f0a4f9dc61fab3aeaa",
        "short": "[priority] **The release channels have been adjusted!**\n\t* [`master`](https://github.com/yt-dlp/yt-dlp-master-builds) builds are made after each push, containing the latest fixes (but also possibly bugs). This was previously the `nightly` channel.\n\t* [`nightly`](https://github.com/yt-dlp/yt-dlp-nightly-builds) builds are now made once a day, if there were any changes."
    },
    {
        "action": "add",
        "when": "f04b5bedad7b281bee9814686bba1762bae092eb",
        "short": "[priority] Security: [[CVE-2023-46121](https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2023-46121)] Patch [Generic Extractor MITM Vulnerability via Arbitrary Proxy Injection](https://github.com/yt-dlp/yt-dlp/security/advisories/GHSA-3ch3-jhc6-5r8x)\n\t- Disallow smuggling of arbitrary `http_headers`; extractors now only use specific headers"
    },
    {
        "action": "change",
        "when": "15f22b4880b6b3f71f350c64d70976ae65b9f1ca",
        "short": "[webvtt] Allow spaces before newlines for CueBlock (#7681)",
        "authors": ["TSRBerry"]
    }
]
```

**devscripts/changelog_override.schema.json** (new file, 96 lines)

```json
{
    "$schema": "http://json-schema.org/draft/2020-12/schema",
    "type": "array",
    "uniqueItems": true,
    "items": {
        "type": "object",
        "oneOf": [
            {
                "type": "object",
                "properties": {
                    "action": {
                        "enum": [
                            "add"
                        ]
                    },
                    "when": {
                        "type": "string",
                        "pattern": "^([0-9a-f]{40}|\\d{4}\\.\\d{2}\\.\\d{2})$"
                    },
                    "hash": {
                        "type": "string",
                        "pattern": "^[0-9a-f]{40}$"
                    },
                    "short": {
                        "type": "string"
                    },
                    "authors": {
                        "type": "array",
                        "items": {
                            "type": "string"
                        }
                    }
                },
                "required": [
                    "action",
                    "short"
                ]
            },
            {
                "type": "object",
                "properties": {
                    "action": {
                        "enum": [
                            "remove"
                        ]
                    },
                    "when": {
                        "type": "string",
                        "pattern": "^([0-9a-f]{40}|\\d{4}\\.\\d{2}\\.\\d{2})$"
                    },
                    "hash": {
                        "type": "string",
                        "pattern": "^[0-9a-f]{40}$"
                    }
                },
                "required": [
                    "action",
                    "hash"
                ]
            },
            {
                "type": "object",
                "properties": {
                    "action": {
                        "enum": [
                            "change"
                        ]
                    },
                    "when": {
                        "type": "string",
                        "pattern": "^([0-9a-f]{40}|\\d{4}\\.\\d{2}\\.\\d{2})$"
                    },
                    "hash": {
                        "type": "string",
                        "pattern": "^[0-9a-f]{40}$"
                    },
                    "short": {
                        "type": "string"
                    },
                    "authors": {
                        "type": "array",
                        "items": {
                            "type": "string"
                        }
                    }
                },
                "required": [
                    "action",
                    "hash",
                    "short",
                    "authors"
                ]
            }
        ]
    }
}
```
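
The override file can be checked against this schema with any JSON Schema validator; for instance (a sketch assuming the third-party `jsonschema` package, which is not a project dependency):

```python
import json

import jsonschema  # pip install jsonschema; illustration only

with open('devscripts/changelog_override.json') as f:
    overrides = json.load(f)
with open('devscripts/changelog_override.schema.json') as f:
    schema = json.load(f)

# Raises jsonschema.ValidationError if any entry violates the schema
jsonschema.validate(instance=overrides, schema=schema)
```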

**devscripts/cli_to_api.py** (new file, 48 lines)

```python
# Allow direct execution
import os
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

import yt_dlp
import yt_dlp.options

create_parser = yt_dlp.options.create_parser


def parse_patched_options(opts):
    patched_parser = create_parser()
    patched_parser.defaults.update({
        'ignoreerrors': False,
        'retries': 0,
        'fragment_retries': 0,
        'extract_flat': False,
        'concat_playlist': 'never',
    })
    yt_dlp.options.create_parser = lambda: patched_parser
    try:
        return yt_dlp.parse_options(opts)
    finally:
        yt_dlp.options.create_parser = create_parser


default_opts = parse_patched_options([]).ydl_opts


def cli_to_api(opts, cli_defaults=False):
    opts = (yt_dlp.parse_options if cli_defaults else parse_patched_options)(opts).ydl_opts

    diff = {k: v for k, v in opts.items() if default_opts[k] != v}
    if 'postprocessors' in diff:
        diff['postprocessors'] = [pp for pp in diff['postprocessors']
                                  if pp not in default_opts['postprocessors']]
    return diff


if __name__ == '__main__':
    from pprint import pprint

    print('\nThe arguments passed translate to:\n')
    pprint(cli_to_api(sys.argv[1:]))
    print('\nCombining these with the CLI defaults gives:\n')
    pprint(cli_to_api(sys.argv[1:], True))
```

**devscripts/install_deps.py** (new executable file, 66 lines)

```python
#!/usr/bin/env python3

# Allow execution from anywhere
import os
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

import argparse
import re
import subprocess

from devscripts.tomlparse import parse_toml
from devscripts.utils import read_file


def parse_args():
    parser = argparse.ArgumentParser(description='Install dependencies for yt-dlp')
    parser.add_argument(
        'input', nargs='?', metavar='TOMLFILE', default='pyproject.toml', help='Input file (default: %(default)s)')
    parser.add_argument(
        '-e', '--exclude', metavar='REQUIREMENT', action='append', help='Exclude a required dependency')
    parser.add_argument(
        '-i', '--include', metavar='GROUP', action='append', help='Include an optional dependency group')
    parser.add_argument(
        '-o', '--only-optional', action='store_true', help='Only install optional dependencies')
    parser.add_argument(
        '-p', '--print', action='store_true', help='Only print a requirements.txt to stdout')
    parser.add_argument(
        '-u', '--user', action='store_true', help='Install with pip as --user')
    return parser.parse_args()


def main():
    args = parse_args()
    toml_data = parse_toml(read_file(args.input))
    deps = toml_data['project']['dependencies']
    targets = deps.copy() if not args.only_optional else []

    for exclude in args.exclude or []:
        for dep in deps:
            simplified_dep = re.match(r'[\w-]+', dep)[0]
            if dep in targets and (exclude.lower() == simplified_dep.lower() or exclude == dep):
                targets.remove(dep)

    optional_deps = toml_data['project']['optional-dependencies']
    for include in args.include or []:
        group = optional_deps.get(include)
        if group:
            targets.extend(group)

    if args.print:
        for target in targets:
            print(target)
        return

    pip_args = [sys.executable, '-m', 'pip', 'install', '-U']
    if args.user:
        pip_args.append('--user')
    pip_args.extend(targets)

    return subprocess.call(pip_args)


if __name__ == '__main__':
    sys.exit(main())
```
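
Note that `--exclude` matching compares only the leading name of a requirement specifier; a quick standalone check of the regex used above (example specifiers are illustrative):

```python
import re

for dep in ('requests>=2.31.0,<3', 'websockets', 'brotli; implementation_name=="cpython"'):
    print(re.match(r'[\w-]+', dep)[0])
# -> requests, websockets, brotli
```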

```diff
@@ -6,6 +6,7 @@
     age_restricted,
     bug_reports_message,
     classproperty,
+    variadic,
     write_string,
 )
```

**devscripts/make_changelog.py** (new file, 503 lines)

```python
from __future__ import annotations

# Allow direct execution
import os
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

import enum
import itertools
import json
import logging
import re
from collections import defaultdict
from dataclasses import dataclass
from functools import lru_cache
from pathlib import Path

from devscripts.utils import read_file, run_process, write_file

BASE_URL = 'https://github.com'
LOCATION_PATH = Path(__file__).parent
HASH_LENGTH = 7

logger = logging.getLogger(__name__)


class CommitGroup(enum.Enum):
    PRIORITY = 'Important'
    CORE = 'Core'
    EXTRACTOR = 'Extractor'
    DOWNLOADER = 'Downloader'
    POSTPROCESSOR = 'Postprocessor'
    NETWORKING = 'Networking'
    MISC = 'Misc.'

    @classmethod
    @lru_cache
    def subgroup_lookup(cls):
        return {
            name: group
            for group, names in {
                cls.MISC: {
                    'build',
                    'ci',
                    'cleanup',
                    'devscripts',
                    'docs',
                    'test',
                },
                cls.NETWORKING: {
                    'rh',
                },
            }.items()
            for name in names
        }

    @classmethod
    @lru_cache
    def group_lookup(cls):
        result = {
            'fd': cls.DOWNLOADER,
            'ie': cls.EXTRACTOR,
            'pp': cls.POSTPROCESSOR,
            'upstream': cls.CORE,
        }
        result.update({item.name.lower(): item for item in iter(cls)})
        return result

    @classmethod
    def get(cls, value: str) -> tuple[CommitGroup | None, str | None]:
        group, _, subgroup = (group.strip().lower() for group in value.partition('/'))

        result = cls.group_lookup().get(group)
        if not result:
            if subgroup:
                return None, value
            subgroup = group
            result = cls.subgroup_lookup().get(subgroup)

        return result, subgroup or None


@dataclass
class Commit:
    hash: str | None
    short: str
    authors: list[str]

    def __str__(self):
        result = f'{self.short!r}'

        if self.hash:
            result += f' ({self.hash[:HASH_LENGTH]})'

        if self.authors:
            authors = ', '.join(self.authors)
            result += f' by {authors}'

        return result


@dataclass
class CommitInfo:
    details: str | None
    sub_details: tuple[str, ...]
    message: str
    issues: list[str]
    commit: Commit
    fixes: list[Commit]

    def key(self):
        return ((self.details or '').lower(), self.sub_details, self.message)


def unique(items):
    return sorted({item.strip().lower(): item for item in items if item}.values())


class Changelog:
    MISC_RE = re.compile(r'(?:^|\b)(?:lint(?:ing)?|misc|format(?:ting)?|fixes)(?:\b|$)', re.IGNORECASE)
    ALWAYS_SHOWN = (CommitGroup.PRIORITY,)

    def __init__(self, groups, repo, collapsible=False):
        self._groups = groups
        self._repo = repo
        self._collapsible = collapsible

    def __str__(self):
        return '\n'.join(self._format_groups(self._groups)).replace('\t', '    ')

    def _format_groups(self, groups):
        first = True
        for item in CommitGroup:
            if self._collapsible and item not in self.ALWAYS_SHOWN and first:
                first = False
                yield '\n<details><summary><h3>Changelog</h3></summary>\n'

            group = groups[item]
            if group:
                yield self.format_module(item.value, group)

        if self._collapsible:
            yield '\n</details>'

    def format_module(self, name, group):
        result = f'\n#### {name} changes\n' if name else '\n'
        return result + '\n'.join(self._format_group(group))

    def _format_group(self, group):
        sorted_group = sorted(group, key=CommitInfo.key)
        detail_groups = itertools.groupby(sorted_group, lambda item: (item.details or '').lower())
        for _, items in detail_groups:
            items = list(items)
            details = items[0].details

            if details == 'cleanup':
                items = self._prepare_cleanup_misc_items(items)

            prefix = '-'
            if details:
                if len(items) == 1:
                    prefix = f'- **{details}**:'
                else:
                    yield f'- **{details}**'
                    prefix = '\t-'

            sub_detail_groups = itertools.groupby(items, lambda item: tuple(map(str.lower, item.sub_details)))
            for sub_details, entries in sub_detail_groups:
                if not sub_details:
                    for entry in entries:
                        yield f'{prefix} {self.format_single_change(entry)}'
                    continue

                entries = list(entries)
                sub_prefix = f'{prefix} {", ".join(entries[0].sub_details)}'
                if len(entries) == 1:
                    yield f'{sub_prefix}: {self.format_single_change(entries[0])}'
                    continue

                yield sub_prefix
                for entry in entries:
                    yield f'\t{prefix} {self.format_single_change(entry)}'

    def _prepare_cleanup_misc_items(self, items):
        cleanup_misc_items = defaultdict(list)
        sorted_items = []
        for item in items:
            if self.MISC_RE.search(item.message):
                cleanup_misc_items[tuple(item.commit.authors)].append(item)
            else:
                sorted_items.append(item)

        for commit_infos in cleanup_misc_items.values():
            sorted_items.append(CommitInfo(
                'cleanup', ('Miscellaneous',), ', '.join(
                    self._format_message_link(None, info.commit.hash)
                    for info in sorted(commit_infos, key=lambda item: item.commit.hash or '')),
                [], Commit(None, '', commit_infos[0].commit.authors), []))

        return sorted_items

    def format_single_change(self, info: CommitInfo):
        message, sep, rest = info.message.partition('\n')
        if '[' not in message:
            # If the message doesn't already contain markdown links, try to add a link to the commit
            message = self._format_message_link(message, info.commit.hash)

        if info.issues:
            message = f'{message} ({self._format_issues(info.issues)})'

        if info.commit.authors:
            message = f'{message} by {self._format_authors(info.commit.authors)}'

        if info.fixes:
            fix_message = ', '.join(f'{self._format_message_link(None, fix.hash)}' for fix in info.fixes)

            authors = sorted({author for fix in info.fixes for author in fix.authors}, key=str.casefold)
            if authors != info.commit.authors:
                fix_message = f'{fix_message} by {self._format_authors(authors)}'

            message = f'{message} (With fixes in {fix_message})'

        return message if not sep else f'{message}{sep}{rest}'

    def _format_message_link(self, message, hash):
        assert message or hash, 'Improperly defined commit message or override'
        message = message if message else hash[:HASH_LENGTH]
        return f'[{message}]({self.repo_url}/commit/{hash})' if hash else message

    def _format_issues(self, issues):
        return ', '.join(f'[#{issue}]({self.repo_url}/issues/{issue})' for issue in issues)

    @staticmethod
    def _format_authors(authors):
        return ', '.join(f'[{author}]({BASE_URL}/{author})' for author in authors)

    @property
    def repo_url(self):
        return f'{BASE_URL}/{self._repo}'


class CommitRange:
    COMMAND = 'git'
    COMMIT_SEPARATOR = '-----'

    AUTHOR_INDICATOR_RE = re.compile(r'Authored by:? ', re.IGNORECASE)
    MESSAGE_RE = re.compile(r'''
        (?:\[(?P<prefix>[^\]]+)\]\ )?
        (?:(?P<sub_details>`?[\w.-]+`?): )?
        (?P<message>.+?)
        (?:\ \((?P<issues>\#\d+(?:,\ \#\d+)*)\))?
        ''', re.VERBOSE | re.DOTALL)
    EXTRACTOR_INDICATOR_RE = re.compile(r'(?:Fix|Add)\s+Extractors?', re.IGNORECASE)
    REVERT_RE = re.compile(r'(?:\[[^\]]+\]\s+)?(?i:Revert)\s+([\da-f]{40})')
    FIXES_RE = re.compile(r'(?i:Fix(?:es)?(?:\s+bugs?)?(?:\s+in|\s+for)?|Revert)\s+([\da-f]{40})')
    UPSTREAM_MERGE_RE = re.compile(r'Update to ytdl-commit-([\da-f]+)')

    def __init__(self, start, end, default_author=None):
        self._start, self._end = start, end
        self._commits, self._fixes = self._get_commits_and_fixes(default_author)
        self._commits_added = []

    def __iter__(self):
        return iter(itertools.chain(self._commits.values(), self._commits_added))

    def __len__(self):
        return len(self._commits) + len(self._commits_added)

    def __contains__(self, commit):
        if isinstance(commit, Commit):
            if not commit.hash:
                return False
            commit = commit.hash

        return commit in self._commits

    def _get_commits_and_fixes(self, default_author):
        result = run_process(
            self.COMMAND, 'log', f'--format=%H%n%s%n%b%n{self.COMMIT_SEPARATOR}',
            f'{self._start}..{self._end}' if self._start else self._end).stdout

        commits, reverts = {}, {}
        fixes = defaultdict(list)
        lines = iter(result.splitlines(False))
        for i, commit_hash in enumerate(lines):
            short = next(lines)
            skip = short.startswith('Release ') or short == '[version] update'

            authors = [default_author] if default_author else []
            for line in iter(lambda: next(lines), self.COMMIT_SEPARATOR):
                match = self.AUTHOR_INDICATOR_RE.match(line)
                if match:
                    authors = sorted(map(str.strip, line[match.end():].split(',')), key=str.casefold)

            commit = Commit(commit_hash, short, authors)
            if skip and (self._start or not i):
                logger.debug(f'Skipped commit: {commit}')
                continue
            elif skip:
                logger.debug(f'Reached Release commit, breaking: {commit}')
                break

            revert_match = self.REVERT_RE.fullmatch(commit.short)
            if revert_match:
                reverts[revert_match.group(1)] = commit
                continue

            fix_match = self.FIXES_RE.search(commit.short)
            if fix_match:
                commitish = fix_match.group(1)
                fixes[commitish].append(commit)

            commits[commit.hash] = commit

        for commitish, revert_commit in reverts.items():
            reverted = commits.pop(commitish, None)
            if reverted:
                logger.debug(f'{commitish} fully reverted {reverted}')
            else:
                commits[revert_commit.hash] = revert_commit

        for commitish, fix_commits in fixes.items():
            if commitish in commits:
                hashes = ', '.join(commit.hash[:HASH_LENGTH] for commit in fix_commits)
                logger.info(f'Found fix(es) for {commitish[:HASH_LENGTH]}: {hashes}')
                for fix_commit in fix_commits:
                    del commits[fix_commit.hash]
            else:
                logger.debug(f'Commit with fixes not in changes: {commitish[:HASH_LENGTH]}')

        return commits, fixes

    def apply_overrides(self, overrides):
        for override in overrides:
            when = override.get('when')
            if when and when not in self and when != self._start:
                logger.debug(f'Ignored {when!r} override')
                continue

            override_hash = override.get('hash') or when
            if override['action'] == 'add':
                commit = Commit(override.get('hash'), override['short'], override.get('authors') or [])
                logger.info(f'ADD {commit}')
                self._commits_added.append(commit)

            elif override['action'] == 'remove':
                if override_hash in self._commits:
                    logger.info(f'REMOVE {self._commits[override_hash]}')
                    del self._commits[override_hash]

            elif override['action'] == 'change':
                if override_hash not in self._commits:
                    continue
                commit = Commit(override_hash, override['short'], override.get('authors') or [])
                logger.info(f'CHANGE {self._commits[commit.hash]} -> {commit}')
                self._commits[commit.hash] = commit

        self._commits = {key: value for key, value in reversed(self._commits.items())}

    def groups(self):
        group_dict = defaultdict(list)
        for commit in self:
            upstream_re = self.UPSTREAM_MERGE_RE.search(commit.short)
            if upstream_re:
                commit.short = f'[upstream] Merged with youtube-dl {upstream_re.group(1)}'

            match = self.MESSAGE_RE.fullmatch(commit.short)
            if not match:
                logger.error(f'Error parsing short commit message: {commit.short!r}')
                continue

            prefix, sub_details_alt, message, issues = match.groups()
            issues = [issue.strip()[1:] for issue in issues.split(',')] if issues else []

            if prefix:
                groups, details, sub_details = zip(*map(self.details_from_prefix, prefix.split(',')))
                group = next(iter(filter(None, groups)), None)
                details = ', '.join(unique(details))
                sub_details = list(itertools.chain.from_iterable(sub_details))
            else:
                group = CommitGroup.CORE
                details = None
                sub_details = []

            if sub_details_alt:
                sub_details.append(sub_details_alt)
            sub_details = tuple(unique(sub_details))

            if not group:
                if self.EXTRACTOR_INDICATOR_RE.search(commit.short):
                    group = CommitGroup.EXTRACTOR
                    logger.error(f'Assuming [ie] group for {commit.short!r}')
                else:
                    group = CommitGroup.CORE

            commit_info = CommitInfo(
                details, sub_details, message.strip(),
                issues, commit, self._fixes[commit.hash])

            logger.debug(f'Resolved {commit.short!r} to {commit_info!r}')
            group_dict[group].append(commit_info)

        return group_dict

    @staticmethod
    def details_from_prefix(prefix):
        if not prefix:
            return CommitGroup.CORE, None, ()

        prefix, *sub_details = prefix.split(':')

        group, details = CommitGroup.get(prefix)
        if group is CommitGroup.PRIORITY and details:
            details = details.partition('/')[2].strip()

        if details and '/' in details:
            logger.error(f'Prefix is overnested, using first part: {prefix}')
            details = details.partition('/')[0].strip()

        if details == 'common':
            details = None
        elif group is CommitGroup.NETWORKING and details == 'rh':
            details = 'Request Handler'

        return group, details, sub_details


def get_new_contributors(contributors_path, commits):
    contributors = set()
    if contributors_path.exists():
        for line in read_file(contributors_path).splitlines():
            author, _, _ = line.strip().partition(' (')
            authors = author.split('/')
            contributors.update(map(str.casefold, authors))

    new_contributors = set()
    for commit in commits:
        for author in commit.authors:
            author_folded = author.casefold()
            if author_folded not in contributors:
                contributors.add(author_folded)
                new_contributors.add(author)

    return sorted(new_contributors, key=str.casefold)


if __name__ == '__main__':
    import argparse

    parser = argparse.ArgumentParser(
        description='Create a changelog markdown from a git commit range')
    parser.add_argument(
        'commitish', default='HEAD', nargs='?',
        help='The commitish to create the range from (default: %(default)s)')
    parser.add_argument(
        '-v', '--verbosity', action='count', default=0,
        help='increase verbosity (can be used twice)')
    parser.add_argument(
        '-c', '--contributors', action='store_true',
        help='update CONTRIBUTORS file (default: %(default)s)')
    parser.add_argument(
        '--contributors-path', type=Path, default=LOCATION_PATH.parent / 'CONTRIBUTORS',
        help='path to the CONTRIBUTORS file')
    parser.add_argument(
        '--no-override', action='store_true',
        help='skip override json in commit generation (default: %(default)s)')
    parser.add_argument(
        '--override-path', type=Path, default=LOCATION_PATH / 'changelog_override.json',
        help='path to the changelog_override.json file')
    parser.add_argument(
        '--default-author', default='pukkandan',
        help='the author to use without a author indicator (default: %(default)s)')
    parser.add_argument(
        '--repo', default='yt-dlp/yt-dlp',
        help='the github repository to use for the operations (default: %(default)s)')
    parser.add_argument(
        '--collapsible', action='store_true',
        help='make changelog collapsible (default: %(default)s)')
    args = parser.parse_args()

    logging.basicConfig(
        datefmt='%Y-%m-%d %H-%M-%S', format='{asctime} | {levelname:<8} | {message}',
        level=logging.WARNING - 10 * args.verbosity, style='{', stream=sys.stderr)

    commits = CommitRange(None, args.commitish, args.default_author)

    if not args.no_override:
        if args.override_path.exists():
            overrides = json.loads(read_file(args.override_path))
            commits.apply_overrides(overrides)
        else:
            logger.warning(f'File {args.override_path.as_posix()} does not exist')

    logger.info(f'Loaded {len(commits)} commits')

    new_contributors = get_new_contributors(args.contributors_path, commits)
    if new_contributors:
        if args.contributors:
            write_file(args.contributors_path, '\n'.join(new_contributors) + '\n', mode='a')
        logger.info(f'New contributors: {", ".join(new_contributors)}')

    print(Changelog(commits.groups(), args.repo, args.collapsible))
```
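
To see how commit messages are routed into changelog groups, the `MESSAGE_RE`/`CommitGroup.get` pipeline can be exercised directly (a sketch; run from the repository root, and the commit text is made up):

```python
from devscripts.make_changelog import CommitGroup, CommitRange

# '[ie/...]' prefixes resolve via the group lookup to the Extractor group
print(CommitGroup.get('ie/youtube'))  # (<CommitGroup.EXTRACTOR: 'Extractor'>, 'youtube')
# A bare 'build' prefix falls through to the subgroup lookup -> Misc.
print(CommitGroup.get('build'))       # (<CommitGroup.MISC: 'Misc.'>, 'build')

match = CommitRange.MESSAGE_RE.fullmatch('[ie/youtube] Fix signature extraction (#1234)')
print(match.group('prefix', 'message', 'issues'))
# -> ('ie/youtube', 'Fix signature extraction', '#1234')
```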

```diff
@@ -9,12 +9,7 @@
 
 import re
 
-from devscripts.utils import (
-    get_filename_args,
-    read_file,
-    read_version,
-    write_file,
-)
+from devscripts.utils import get_filename_args, read_file, write_file
 
 VERBOSE_TMPL = '''
   - type: checkboxes
@@ -24,6 +19,8 @@
     options:
       - label: Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
         required: true
+      - label: "If using API, add `'verbose': True` to `YoutubeDL` params instead"
+        required: false
       - label: Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
         required: true
   - type: textarea
@@ -33,19 +30,18 @@
       description: |
         It should start like this:
       placeholder: |
-        [debug] Command-line config: ['-vU', 'test:youtube']
-        [debug] Portable config "yt-dlp.conf": ['-i']
+        [debug] Command-line config: ['-vU', 'https://www.youtube.com/watch?v=BaW_jenozKc']
         [debug] Encodings: locale cp65001, fs utf-8, pref cp65001, out utf-8, error utf-8, screen utf-8
-        [debug] yt-dlp version %(version)s [9d339c4] (win32_exe)
+        [debug] yt-dlp version nightly@... from yt-dlp/yt-dlp [b634ba742] (win_exe)
         [debug] Python 3.8.10 (CPython 64bit) - Windows-10-10.0.22000-SP0
-        [debug] Checking exe version: ffmpeg -bsfs
-        [debug] Checking exe version: ffprobe -bsfs
         [debug] exe versions: ffmpeg N-106550-g072101bd52-20220410 (fdk,setts), ffprobe N-106624-g391ce570c8-20220415, phantomjs 2.1.1
         [debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.06.15, mutagen-1.45.1, sqlite3-2.6.0, websockets-10.3
         [debug] Proxy map: {}
-        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
-        Latest version: %(version)s, Current version: %(version)s
-        yt-dlp is up to date (%(version)s)
+        [debug] Request Handlers: urllib, requests
+        [debug] Loaded 1893 extractors
+        [debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp-nightly-builds/releases/latest
+        yt-dlp is up to date (nightly@... from yt-dlp/yt-dlp-nightly-builds)
+        [youtube] Extracting URL: https://www.youtube.com/watch?v=BaW_jenozKc
         <more lines>
       render: shell
     validations:
@@ -58,13 +54,13 @@
       label: DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
       description: Fill all fields even if you think it is irrelevant for the issue
       options:
-        - label: I understand that I will be **blocked** if I remove or skip any mandatory\\* field
+        - label: I understand that I will be **blocked** if I *intentionally* remove or skip any mandatory\\* field
           required: true
 '''.strip()
 
 
 def main():
-    fields = {'version': read_version(), 'no_skip': NO_SKIP}
+    fields = {'no_skip': NO_SKIP}
     fields['verbose'] = VERBOSE_TMPL % fields
     fields['verbose_optional'] = re.sub(r'(\n\s+validations:)?\n\s+required: true', '', fields['verbose'])
```

```diff
@@ -45,33 +45,43 @@ def apply_patch(text, patch):
 delim = f'\n{" " * switch_col_width}'
 
 PATCHES = (
-    (  # Standardize update message
+    (  # Standardize `--update` message
         r'(?m)^( -U, --update\s+).+(\n \s.+)*$',
         r'\1Update this program to the latest version',
     ),
     (  # Headings
         r'(?m)^ (\w.+\n)( (?=\w))?',
         r'## \1'
     ),
-    (  # Do not split URLs
+    (  # Fixup `--date` formatting
+        rf'(?m)( --date DATE.+({delim}[^\[]+)*)\[.+({delim}.+)*$',
+        (rf'\1[now|today|yesterday][-N[day|week|month|year]].{delim}'
+         f'E.g. "--date today-2weeks" downloads only{delim}'
+         'videos uploaded on the same day two weeks ago'),
+    ),
+    (  # Do not split URLs
         rf'({delim[:-1]})? (?P<label>\[\S+\] )?(?P<url>https?({delim})?:({delim})?/({delim})?/(({delim})?\S+)+)\s',
         lambda mobj: ''.join((delim, mobj.group('label') or '', re.sub(r'\s+', '', mobj.group('url')), '\n'))
     ),
     (  # Do not split "words"
         rf'(?m)({delim}\S+)+$',
         lambda mobj: ''.join((delim, mobj.group(0).replace(delim, '')))
     ),
     (  # Allow overshooting last line
         rf'(?m)^(?P<prev>.+)${delim}(?P<current>.+)$(?!{delim})',
         lambda mobj: (mobj.group().replace(delim, ' ')
                       if len(mobj.group()) - len(delim) + 1 <= max_width + ALLOWED_OVERSHOOT
                       else mobj.group())
     ),
     (  # Avoid newline when a space is available b/w switch and description
         DISABLE_PATCH,  # This creates issues with prepare_manpage
         r'(?m)^(\s{4}-.{%d})(%s)' % (switch_col_width - 6, delim),
         r'\1 '
     ),
+    (  # Replace brackets with a Markdown link
+        r'SponsorBlock API \((http.+)\)',
+        r'[SponsorBlock API](\1)'
+    ),
 )
 
 readme = read_file(README_FILE)
```
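
Each entry in `PATCHES` is a regex/replacement pair applied to the README text; the new SponsorBlock patch, for instance, rewrites the plain-text citation into a Markdown link (a standalone demonstration of just that pair):

```python
import re

text = 'SponsorBlock API (https://sponsor.ajay.app)'
print(re.sub(r'SponsorBlock API \((http.+)\)', r'[SponsorBlock API](\1)', text))
# -> [SponsorBlock API](https://sponsor.ajay.app)
```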

```diff
@@ -1,17 +1,4 @@
-@setlocal
 @echo off
-cd /d %~dp0..
 
-if ["%~1"]==[""] (
-    set "test_set="test""
-) else if ["%~1"]==["core"] (
-    set "test_set="-m not download""
-) else if ["%~1"]==["download"] (
-    set "test_set="-m "download""
-) else (
-    echo.Invalid test type "%~1". Use "core" ^| "download"
-    exit /b 1
-)
-
-set PYTHONWARNINGS=error
-pytest %test_set%
+>&2 echo run_tests.bat is deprecated. Please use `devscripts/run_tests.py` instead
+python %~dp0run_tests.py %~1
```

**devscripts/run_tests.py** (new executable file, 71 lines)

```python
#!/usr/bin/env python3

import argparse
import functools
import os
import re
import subprocess
import sys
from pathlib import Path


fix_test_name = functools.partial(re.compile(r'IE(_all|_\d+)?$').sub, r'\1')


def parse_args():
    parser = argparse.ArgumentParser(description='Run selected yt-dlp tests')
    parser.add_argument(
        'test', help='a extractor tests, or one of "core" or "download"', nargs='*')
    parser.add_argument(
        '-k', help='run a test matching EXPRESSION. Same as "pytest -k"', metavar='EXPRESSION')
    return parser.parse_args()


def run_tests(*tests, pattern=None, ci=False):
    run_core = 'core' in tests or (not pattern and not tests)
    run_download = 'download' in tests
    tests = list(map(fix_test_name, tests))

    arguments = ['pytest', '-Werror', '--tb=short']
    if ci:
        arguments.append('--color=yes')
    if run_core:
        arguments.extend(['-m', 'not download'])
    elif run_download:
        arguments.extend(['-m', 'download'])
    elif pattern:
        arguments.extend(['-k', pattern])
    else:
        arguments.extend(
            f'test/test_download.py::TestDownload::test_{test}' for test in tests)

    print(f'Running {arguments}', flush=True)
    try:
        return subprocess.call(arguments)
    except FileNotFoundError:
        pass

    arguments = [sys.executable, '-Werror', '-m', 'unittest']
    if run_core:
        print('"pytest" needs to be installed to run core tests', file=sys.stderr, flush=True)
        return 1
    elif run_download:
        arguments.append('test.test_download')
    elif pattern:
        arguments.extend(['-k', pattern])
    else:
        arguments.extend(
            f'test.test_download.TestDownload.test_{test}' for test in tests)

    print(f'Running {arguments}', flush=True)
    return subprocess.call(arguments)


if __name__ == '__main__':
    try:
        args = parse_args()

        os.chdir(Path(__file__).parent.parent)
        sys.exit(run_tests(*args.test, pattern=args.k, ci=bool(os.getenv('CI'))))
    except KeyboardInterrupt:
        pass
```
|
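For orientation (not part of the commit): a minimal sketch of how the new script is driven, assuming it is invoked from the repository root. run_tests() prefers pytest and only falls back to unittest when pytest is not installed:

# Illustrative only -- equivalent to `python devscripts/run_tests.py core`,
# which expands to `pytest -Werror --tb=short -m "not download"`:
from devscripts.run_tests import run_tests

exit_code = run_tests('core')          # core tests, download tests excluded
exit_code = run_tests(pattern='toml')  # same as `pytest -k toml`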
@@ -1,14 +1,4 @@
 #!/usr/bin/env sh

-if [ -z "$1" ]; then
-    test_set='test'
-elif [ "$1" = 'core' ]; then
-    test_set="-m not download"
-elif [ "$1" = 'download' ]; then
-    test_set="-m download"
-else
-    echo 'Invalid test type "'"$1"'". Use "core" | "download"'
-    exit 1
-fi
-
-python3 -bb -Werror -m pytest "$test_set"
+>&2 echo 'run_tests.sh is deprecated. Please use `devscripts/run_tests.py` instead'
+python3 devscripts/run_tests.py "$1"
devscripts/tomlparse.py (new executable file)
@@ -0,0 +1,189 @@
+#!/usr/bin/env python3
+
+"""
+Simple parser for spec compliant toml files
+
+A simple toml parser for files that comply with the spec.
+Should only be used to parse `pyproject.toml` for `install_deps.py`.
+
+IMPORTANT: INVALID FILES OR MULTILINE STRINGS ARE NOT SUPPORTED!
+"""
+
+from __future__ import annotations
+
+import datetime
+import json
+import re
+
+WS = r'(?:[\ \t]*)'
+STRING_RE = re.compile(r'"(?:\\.|[^\\"\n])*"|\'[^\'\n]*\'')
+SINGLE_KEY_RE = re.compile(rf'{STRING_RE.pattern}|[A-Za-z0-9_-]+')
+KEY_RE = re.compile(rf'{WS}(?:{SINGLE_KEY_RE.pattern}){WS}(?:\.{WS}(?:{SINGLE_KEY_RE.pattern}){WS})*')
+EQUALS_RE = re.compile(rf'={WS}')
+WS_RE = re.compile(WS)
+
+_SUBTABLE = rf'(?P<subtable>^\[(?P<is_list>\[)?(?P<path>{KEY_RE.pattern})\]\]?)'
+EXPRESSION_RE = re.compile(rf'^(?:{_SUBTABLE}|{KEY_RE.pattern}=)', re.MULTILINE)
+
+LIST_WS_RE = re.compile(rf'{WS}((#[^\n]*)?\n{WS})*')
+LEFTOVER_VALUE_RE = re.compile(r'[^,}\]\t\n#]+')
+
+
+def parse_key(value: str):
+    for match in SINGLE_KEY_RE.finditer(value):
+        if match[0][0] == '"':
+            yield json.loads(match[0])
+        elif match[0][0] == '\'':
+            yield match[0][1:-1]
+        else:
+            yield match[0]
+
+
+def get_target(root: dict, paths: list[str], is_list=False):
+    target = root
+
+    for index, key in enumerate(paths, 1):
+        use_list = is_list and index == len(paths)
+        result = target.get(key)
+        if result is None:
+            result = [] if use_list else {}
+            target[key] = result
+
+        if isinstance(result, dict):
+            target = result
+        elif use_list:
+            target = {}
+            result.append(target)
+        else:
+            target = result[-1]
+
+    assert isinstance(target, dict)
+    return target
+
+
+def parse_enclosed(data: str, index: int, end: str, ws_re: re.Pattern):
+    index += 1
+
+    if match := ws_re.match(data, index):
+        index = match.end()
+
+    while data[index] != end:
+        index = yield True, index
+
+        if match := ws_re.match(data, index):
+            index = match.end()
+
+        if data[index] == ',':
+            index += 1
+
+        if match := ws_re.match(data, index):
+            index = match.end()
+
+    assert data[index] == end
+    yield False, index + 1
+
+
+def parse_value(data: str, index: int):
+    if data[index] == '[':
+        result = []
+
+        indices = parse_enclosed(data, index, ']', LIST_WS_RE)
+        valid, index = next(indices)
+        while valid:
+            index, value = parse_value(data, index)
+            result.append(value)
+            valid, index = indices.send(index)
+
+        return index, result
+
+    if data[index] == '{':
+        result = {}
+
+        indices = parse_enclosed(data, index, '}', WS_RE)
+        valid, index = next(indices)
+        while valid:
+            valid, index = indices.send(parse_kv_pair(data, index, result))
+
+        return index, result
+
+    if match := STRING_RE.match(data, index):
+        return match.end(), json.loads(match[0]) if match[0][0] == '"' else match[0][1:-1]
+
+    match = LEFTOVER_VALUE_RE.match(data, index)
+    assert match
+    value = match[0].strip()
+    for func in [
+        int,
+        float,
+        datetime.time.fromisoformat,
+        datetime.date.fromisoformat,
+        datetime.datetime.fromisoformat,
+        {'true': True, 'false': False}.get,
+    ]:
+        try:
+            value = func(value)
+            break
+        except Exception:
+            pass
+
+    return match.end(), value
+
+
+def parse_kv_pair(data: str, index: int, target: dict):
+    match = KEY_RE.match(data, index)
+    if not match:
+        return None
+
+    *keys, key = parse_key(match[0])
+
+    match = EQUALS_RE.match(data, match.end())
+    assert match
+    index = match.end()
+
+    index, value = parse_value(data, index)
+    get_target(target, keys)[key] = value
+    return index
+
+
+def parse_toml(data: str):
+    root = {}
+    target = root
+
+    index = 0
+    while True:
+        match = EXPRESSION_RE.search(data, index)
+        if not match:
+            break
+
+        if match.group('subtable'):
+            index = match.end()
+            path, is_list = match.group('path', 'is_list')
+            target = get_target(root, list(parse_key(path)), bool(is_list))
+            continue
+
+        index = parse_kv_pair(data, match.start(), target)
+        assert index is not None
+
+    return root
+
+
+def main():
+    import argparse
+    from pathlib import Path
+
+    parser = argparse.ArgumentParser()
+    parser.add_argument('infile', type=Path, help='The TOML file to read as input')
+    args = parser.parse_args()
+
+    with args.infile.open('r', encoding='utf-8') as file:
+        data = file.read()
+
+    def default(obj):
+        if isinstance(obj, (datetime.date, datetime.time, datetime.datetime)):
+            return obj.isoformat()
+
+    print(json.dumps(parse_toml(data), default=default))
+
+
+if __name__ == '__main__':
+    main()
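A quick, illustrative sanity check of the parser above (not part of the commit; assumes the repository root as working directory so that devscripts is importable):

from devscripts.tomlparse import parse_toml

data = '''
[project]
name = "yt-dlp"
requires-python = ">=3.8"

[project.optional-dependencies]
dev = ["flake8", "isort", "pytest"]
'''

result = parse_toml(data)
assert result['project']['name'] == 'yt-dlp'
assert result['project']['optional-dependencies']['dev'] == ['flake8', 'isort', 'pytest']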
@@ -1,39 +0,0 @@
-#!/usr/bin/env python3
-
-"""
-Usage: python3 ./devscripts/update-formulae.py <path-to-formulae-rb> <version>
-version can be either 0-aligned (yt-dlp version) or normalized (PyPi version)
-"""
-
-# Allow direct execution
-import os
-import sys
-
-sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
-
-
-import json
-import re
-import urllib.request
-
-from devscripts.utils import read_file, write_file
-
-filename, version = sys.argv[1:]
-
-normalized_version = '.'.join(str(int(x)) for x in version.split('.'))
-
-pypi_release = json.loads(urllib.request.urlopen(
-    'https://pypi.org/pypi/yt-dlp/%s/json' % normalized_version
-).read().decode())
-
-tarball_file = next(x for x in pypi_release['urls'] if x['filename'].endswith('.tar.gz'))
-
-sha256sum = tarball_file['digests']['sha256']
-url = tarball_file['url']
-
-formulae_text = read_file(filename)
-
-formulae_text = re.sub(r'sha256 "[0-9a-f]*?"', 'sha256 "%s"' % sha256sum, formulae_text, count=1)
-formulae_text = re.sub(r'url "[^"]*?"', 'url "%s"' % url, formulae_text, count=1)
-
-write_file(filename, formulae_text)
@@ -7,19 +7,20 @@
 sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))


+import argparse
 import contextlib
-import subprocess
 import sys
-from datetime import datetime
+from datetime import datetime, timezone

-from devscripts.utils import read_version, write_file
+from devscripts.utils import read_version, run_process, write_file


-def get_new_version(revision):
-    version = datetime.utcnow().strftime('%Y.%m.%d')
+def get_new_version(version, revision):
+    if not version:
+        version = datetime.now(timezone.utc).strftime('%Y.%m.%d')

     if revision:
-        assert revision.isdigit(), 'Revision must be a number'
+        assert revision.isdecimal(), 'Revision must be a number'
     else:
         old_version = read_version().split('.')
         if version.split('.') == old_version[:3]:
@@ -30,27 +31,52 @@ def get_new_version(revision):


 def get_git_head():
     with contextlib.suppress(Exception):
-        sp = subprocess.Popen(['git', 'rev-parse', '--short', 'HEAD'], stdout=subprocess.PIPE)
-        return sp.communicate()[0].decode().strip() or None
+        return run_process('git', 'rev-parse', 'HEAD').stdout.strip()


-VERSION = get_new_version((sys.argv + [''])[1])
-GIT_HEAD = get_git_head()
-
-VERSION_FILE = f'''\
+VERSION_TEMPLATE = '''\
 # Autogenerated by devscripts/update-version.py

-__version__ = {VERSION!r}
+__version__ = {version!r}

-RELEASE_GIT_HEAD = {GIT_HEAD!r}
+RELEASE_GIT_HEAD = {git_head!r}

 VARIANT = None

 UPDATE_HINT = None
+
+CHANNEL = {channel!r}
+
+ORIGIN = {origin!r}
+
+_pkg_version = {package_version!r}
 '''

-write_file('yt_dlp/version.py', VERSION_FILE)
-github_output = os.getenv('GITHUB_OUTPUT')
-if github_output:
-    write_file(github_output, f'ytdlp_version={VERSION}\n', 'a')
-print(f'\nVersion = {VERSION}, Git HEAD = {GIT_HEAD}')
+if __name__ == '__main__':
+    parser = argparse.ArgumentParser(description='Update the version.py file')
+    parser.add_argument(
+        '-c', '--channel', default='stable',
+        help='Select update channel (default: %(default)s)')
+    parser.add_argument(
+        '-r', '--origin', default='local',
+        help='Select origin/repository (default: %(default)s)')
+    parser.add_argument(
+        '-s', '--suffix', default='',
+        help='Add an alphanumeric suffix to the package version, e.g. "dev"')
+    parser.add_argument(
+        '-o', '--output', default='yt_dlp/version.py',
+        help='The output file to write to (default: %(default)s)')
+    parser.add_argument(
+        'version', nargs='?', default=None,
+        help='A version or revision to use instead of generating one')
+    args = parser.parse_args()
+
+    git_head = get_git_head()
+    version = (
+        args.version if args.version and '.' in args.version
+        else get_new_version(None, args.version))
+    write_file(args.output, VERSION_TEMPLATE.format(
+        version=version, git_head=git_head, channel=args.channel, origin=args.origin,
+        package_version=f'{version}{args.suffix}'))
+
+    print(f'version={version} ({args.channel}), head={git_head}')
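For illustration (not part of the commit), VERSION_TEMPLATE rendered by e.g. `devscripts/update-version.py -c stable -r yt-dlp/yt-dlp` would produce a yt_dlp/version.py along these lines; the date and hash are placeholders:

# Autogenerated by devscripts/update-version.py

__version__ = '2023.12.30'

RELEASE_GIT_HEAD = '0123456789abcdef0123456789abcdef01234567'  # placeholder hash

VARIANT = None

UPDATE_HINT = None

CHANNEL = 'stable'

ORIGIN = 'yt-dlp/yt-dlp'

_pkg_version = '2023.12.30'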
@@ -1,5 +1,6 @@
 import argparse
 import functools
+import subprocess


 def read_file(fname):
@@ -12,10 +13,11 @@ def write_file(fname, content, mode='w'):
     return f.write(content)


-# Get the version without importing the package
-def read_version(fname='yt_dlp/version.py'):
-    exec(compile(read_file(fname), fname, 'exec'))
-    return locals()['__version__']
+def read_version(fname='yt_dlp/version.py', varname='__version__'):
+    """Get the version without importing the package"""
+    items = {}
+    exec(compile(read_file(fname), fname, 'exec'), items)
+    return items[varname]


 def get_filename_args(has_infile=False, default_outfile=None):
@@ -33,3 +35,13 @@ def get_filename_args(has_infile=False, default_outfile=None):

 def compose_functions(*functions):
     return lambda x: functools.reduce(lambda y, f: f(y), functions, x)
+
+
+def run_process(*args, **kwargs):
+    kwargs.setdefault('text', True)
+    kwargs.setdefault('check', True)
+    kwargs.setdefault('capture_output', True)
+    if kwargs['text']:
+        kwargs.setdefault('encoding', 'utf-8')
+        kwargs.setdefault('errors', 'replace')
+    return subprocess.run(args, **kwargs)
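Illustrative usage of the new run_process() helper (not part of the commit). Because it defaults to text=True, check=True and capture_output=True, callers get decoded output and an exception on non-zero exit without any boilerplate:

from devscripts.utils import run_process

head = run_process('git', 'rev-parse', 'HEAD').stdout.strip()  # full commit hash, mirroring get_git_head() above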
public.key (new file)
@@ -0,0 +1,29 @@
+-----BEGIN PGP PUBLIC KEY BLOCK-----
+
+mQINBGP78C4BEAD0rF9zjGPAt0thlt5C1ebzccAVX7Nb1v+eqQjk+WEZdTETVCg3
+WAM5ngArlHdm/fZqzUgO+pAYrB60GKeg7ffUDf+S0XFKEZdeRLYeAaqqKhSibVal
+DjvOBOztu3W607HLETQAqA7wTPuIt2WqmpL60NIcyr27LxqmgdN3mNvZ2iLO+bP0
+nKR/C+PgE9H4ytywDa12zMx6PmZCnVOOOu6XZEFmdUxxdQ9fFDqd9LcBKY2LDOcS
+Yo1saY0YWiZWHtzVoZu1kOzjnS5Fjq/yBHJLImDH7pNxHm7s/PnaurpmQFtDFruk
+t+2lhDnpKUmGr/I/3IHqH/X+9nPoS4uiqQ5HpblB8BK+4WfpaiEg75LnvuOPfZIP
+KYyXa/0A7QojMwgOrD88ozT+VCkKkkJ+ijXZ7gHNjmcBaUdKK7fDIEOYI63Lyc6Q
+WkGQTigFffSUXWHDCO9aXNhP3ejqFWgGMtCUsrbkcJkWuWY7q5ARy/05HbSM3K4D
+U9eqtnxmiV1WQ8nXuI9JgJQRvh5PTkny5LtxqzcmqvWO9TjHBbrs14BPEO9fcXxK
+L/CFBbzXDSvvAgArdqqlMoncQ/yicTlfL6qzJ8EKFiqW14QMTdAn6SuuZTodXCTi
+InwoT7WjjuFPKKdvfH1GP4bnqdzTnzLxCSDIEtfyfPsIX+9GI7Jkk/zZjQARAQAB
+tDdTaW1vbiBTYXdpY2tpICh5dC1kbHAgc2lnbmluZyBrZXkpIDxjb250YWN0QGdy
+dWI0ay54eXo+iQJOBBMBCgA4FiEErAy75oSNaoc0ZK9OV89lkztadYEFAmP78C4C
+GwMFCwkIBwIGFQoJCAsCBBYCAwECHgECF4AACgkQV89lkztadYEVqQ//cW7TxhXg
+7Xbh2EZQzXml0egn6j8QaV9KzGragMiShrlvTO2zXfLXqyizrFP4AspgjSn/4NrI
+8mluom+Yi+qr7DXT4BjQqIM9y3AjwZPdywe912Lxcw52NNoPZCm24I9T7ySc8lmR
+FQvZC0w4H/VTNj/2lgJ1dwMflpwvNRiWa5YzcFGlCUeDIPskLx9++AJE+xwU3LYm
+jQQsPBqpHHiTBEJzMLl+rfd9Fg4N+QNzpFkTDW3EPerLuvJniSBBwZthqxeAtw4M
+UiAXh6JvCc2hJkKCoygRfM281MeolvmsGNyQm+axlB0vyldiPP6BnaRgZlx+l6MU
+cPqgHblb7RW5j9lfr6OYL7SceBIHNv0CFrt1OnkGo/tVMwcs8LH3Ae4a7UJlIceL
+V54aRxSsZU7w4iX+PB79BWkEsQzwKrUuJVOeL4UDwWajp75OFaUqbS/slDDVXvK5
+OIeuth3mA/adjdvgjPxhRQjA3l69rRWIJDrqBSHldmRsnX6cvXTDy8wSXZgy51lP
+m4IVLHnCy9m4SaGGoAsfTZS0cC9FgjUIyTyrq9M67wOMpUxnuB0aRZgJE1DsI23E
+qdvcSNVlO+39xM/KPWUEh6b83wMn88QeW+DCVGWACQq5N3YdPnAJa50617fGbY6I
+gXIoRHXkDqe23PZ/jURYCv0sjVtjPoVC+bg=
+=bJkn
+-----END PGP PUBLIC KEY BLOCK-----
pyproject.toml
@@ -1,5 +1,120 @@
 [build-system]
-build-backend = 'setuptools.build_meta'
-# https://github.com/yt-dlp/yt-dlp/issues/5941
-# https://github.com/pypa/distutils/issues/17
-requires = ['setuptools > 50']
+requires = ["hatchling"]
+build-backend = "hatchling.build"
+
+[project]
+name = "yt-dlp"
+maintainers = [
+    {name = "pukkandan", email = "pukkandan.ytdlp@gmail.com"},
+    {name = "Grub4K", email = "contact@grub4k.xyz"},
+    {name = "bashonly", email = "bashonly@protonmail.com"},
+]
+description = "A youtube-dl fork with additional features and patches"
+readme = "README.md"
+requires-python = ">=3.8"
+keywords = [
+    "youtube-dl",
+    "video-downloader",
+    "youtube-downloader",
+    "sponsorblock",
+    "youtube-dlc",
+    "yt-dlp",
+]
+license = {file = "LICENSE"}
+classifiers = [
+    "Topic :: Multimedia :: Video",
+    "Development Status :: 5 - Production/Stable",
+    "Environment :: Console",
+    "Programming Language :: Python",
+    "Programming Language :: Python :: 3 :: Only",
+    "Programming Language :: Python :: 3.8",
+    "Programming Language :: Python :: 3.9",
+    "Programming Language :: Python :: 3.10",
+    "Programming Language :: Python :: 3.11",
+    "Programming Language :: Python :: 3.12",
+    "Programming Language :: Python :: Implementation",
+    "Programming Language :: Python :: Implementation :: CPython",
+    "Programming Language :: Python :: Implementation :: PyPy",
+    "License :: OSI Approved :: The Unlicense (Unlicense)",
+    "Operating System :: OS Independent",
+]
+dynamic = ["version"]
+dependencies = [
+    "brotli; implementation_name=='cpython'",
+    "brotlicffi; implementation_name!='cpython'",
+    "certifi",
+    "mutagen",
+    "pycryptodomex",
+    "requests>=2.31.0,<3",
+    "urllib3>=1.26.17,<3",
+    "websockets>=12.0",
+]
+
+[project.optional-dependencies]
+secretstorage = [
+    "cffi",
+    "secretstorage",
+]
+build = [
+    "build",
+    "hatchling",
+    "pip",
+    "wheel",
+]
+dev = [
+    "flake8",
+    "isort",
+    "pytest",
+]
+pyinstaller = ["pyinstaller>=6.3"]
+py2exe = ["py2exe>=0.12"]
+
+[project.urls]
+Documentation = "https://github.com/yt-dlp/yt-dlp#readme"
+Repository = "https://github.com/yt-dlp/yt-dlp"
+Tracker = "https://github.com/yt-dlp/yt-dlp/issues"
+Funding = "https://github.com/yt-dlp/yt-dlp/blob/master/Collaborators.md#collaborators"
+
+[project.scripts]
+yt-dlp = "yt_dlp:main"
+
+[project.entry-points.pyinstaller40]
+hook-dirs = "yt_dlp.__pyinstaller:get_hook_dirs"
+
+[tool.hatch.build.targets.sdist]
+include = [
+    "/yt_dlp",
+    "/devscripts",
+    "/test",
+    "/.gitignore",  # included by default, needed for auto-excludes
+    "/Changelog.md",
+    "/LICENSE",  # included as license
+    "/pyproject.toml",  # included by default
+    "/README.md",  # included as readme
+    "/setup.cfg",
+    "/supportedsites.md",
+]
+exclude = ["/yt_dlp/__pyinstaller"]
+artifacts = [
+    "/yt_dlp/extractor/lazy_extractors.py",
+    "/completions",
+    "/AUTHORS",  # included by default
+    "/README.txt",
+    "/yt-dlp.1",
+]
+
+[tool.hatch.build.targets.wheel]
+packages = ["yt_dlp"]
+exclude = ["/yt_dlp/__pyinstaller"]
+artifacts = ["/yt_dlp/extractor/lazy_extractors.py"]
+
+[tool.hatch.build.targets.wheel.shared-data]
+"completions/bash/yt-dlp" = "share/bash-completion/completions/yt-dlp"
+"completions/zsh/_yt-dlp" = "share/zsh/site-functions/_yt-dlp"
+"completions/fish/yt-dlp.fish" = "share/fish/vendor_completions.d/yt-dlp.fish"
+"README.txt" = "share/doc/yt_dlp/README.txt"
+"yt-dlp.1" = "share/man/man1/yt-dlp.1"
+
+[tool.hatch.version]
+path = "yt_dlp/version.py"
+pattern = "_pkg_version = '(?P<version>[^']+)'"
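A sketch (not part of the commit) of what the [tool.hatch.version] section above does: hatchling reads the given path and extracts the version with the given pattern, roughly equivalent to:

import re

with open('yt_dlp/version.py', encoding='utf-8') as f:
    version = re.search(r"_pkg_version = '(?P<version>[^']+)'", f.read()).group('version')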
@@ -1,6 +0,0 @@
-mutagen
-pycryptodomex
-websockets
-brotli; platform_python_implementation=='CPython'
-brotlicffi; platform_python_implementation!='CPython'
-certifi
@@ -1,7 +1,3 @@
-[wheel]
-universal = true
-
-
 [flake8]
 exclude = build,venv,.tox,.git,.pytest_cache
 ignore = E402,E501,E731,E741,W503
@@ -26,7 +22,7 @@ markers =

 [tox:tox]
 skipsdist = true
-envlist = py{36,37,38,39,310,311},pypy{36,37,38,39}
+envlist = py{38,39,310,311,312},pypy{38,39,310}
 skip_missing_interpreters = true

 [testenv]  # tox
@@ -39,7 +35,7 @@ setenv =


 [isort]
-py_version = 37
+py_version = 38
 multi_line_output = VERTICAL_HANGING_INDENT
 line_length = 80
 reverse_relative = true
setup.py (deleted)
@@ -1,172 +0,0 @@
-#!/usr/bin/env python3
-
-# Allow execution from anywhere
-import os
-import sys
-
-sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))
-
-import subprocess
-import warnings
-
-try:
-    from setuptools import Command, find_packages, setup
-    setuptools_available = True
-except ImportError:
-    from distutils.core import Command, setup
-    setuptools_available = False
-
-from devscripts.utils import read_file, read_version
-
-VERSION = read_version()
-
-DESCRIPTION = 'A youtube-dl fork with additional features and patches'
-
-LONG_DESCRIPTION = '\n\n'.join((
-    'Official repository: <https://github.com/yt-dlp/yt-dlp>',
-    '**PS**: Some links in this document will not work since this is a copy of the README.md from Github',
-    read_file('README.md')))
-
-REQUIREMENTS = read_file('requirements.txt').splitlines()
-
-
-def packages():
-    if setuptools_available:
-        return find_packages(exclude=('youtube_dl', 'youtube_dlc', 'test', 'ytdlp_plugins', 'devscripts'))
-
-    return [
-        'yt_dlp', 'yt_dlp.extractor', 'yt_dlp.downloader', 'yt_dlp.postprocessor', 'yt_dlp.compat',
-    ]
-
-
-def py2exe_params():
-    warnings.warn(
-        'py2exe builds do not support pycryptodomex and needs VC++14 to run. '
-        'It is recommended to run "pyinst.py" to build using pyinstaller instead')
-
-    return {
-        'console': [{
-            'script': './yt_dlp/__main__.py',
-            'dest_base': 'yt-dlp',
-            'icon_resources': [(1, 'devscripts/logo.ico')],
-        }],
-        'version_info': {
-            'version': VERSION,
-            'description': DESCRIPTION,
-            'comments': LONG_DESCRIPTION.split('\n')[0],
-            'product_name': 'yt-dlp',
-            'product_version': VERSION,
-        },
-        'options': {
-            'bundle_files': 0,
-            'compressed': 1,
-            'optimize': 2,
-            'dist_dir': './dist',
-            'excludes': ['Crypto', 'Cryptodome'],  # py2exe cannot import Crypto
-            'dll_excludes': ['w9xpopen.exe', 'crypt32.dll'],
-            # Modules that are only imported dynamically must be added here
-            'includes': ['yt_dlp.compat._legacy'],
-        },
-        'zipfile': None,
-    }
-
-
-def build_params():
-    files_spec = [
-        ('share/bash-completion/completions', ['completions/bash/yt-dlp']),
-        ('share/zsh/site-functions', ['completions/zsh/_yt-dlp']),
-        ('share/fish/vendor_completions.d', ['completions/fish/yt-dlp.fish']),
-        ('share/doc/yt_dlp', ['README.txt']),
-        ('share/man/man1', ['yt-dlp.1'])
-    ]
-    data_files = []
-    for dirname, files in files_spec:
-        resfiles = []
-        for fn in files:
-            if not os.path.exists(fn):
-                warnings.warn(f'Skipping file {fn} since it is not present. Try running " make pypi-files " first')
-            else:
-                resfiles.append(fn)
-        data_files.append((dirname, resfiles))
-
-    params = {'data_files': data_files}
-
-    if setuptools_available:
-        params['entry_points'] = {'console_scripts': ['yt-dlp = yt_dlp:main']}
-    else:
-        params['scripts'] = ['yt-dlp']
-    return params
-
-
-class build_lazy_extractors(Command):
-    description = 'Build the extractor lazy loading module'
-    user_options = []
-
-    def initialize_options(self):
-        pass
-
-    def finalize_options(self):
-        pass
-
-    def run(self):
-        if self.dry_run:
-            print('Skipping build of lazy extractors in dry run mode')
-            return
-        subprocess.run([sys.executable, 'devscripts/make_lazy_extractors.py'])
-
-
-def main():
-    if sys.argv[1:2] == ['py2exe']:
-        params = py2exe_params()
-        try:
-            from py2exe import freeze
-        except ImportError:
-            import py2exe  # noqa: F401
-            warnings.warn('You are using an outdated version of py2exe. Support for this version will be removed in the future')
-            params['console'][0].update(params.pop('version_info'))
-            params['options'] = {'py2exe': params.pop('options')}
-        else:
-            return freeze(**params)
-    else:
-        params = build_params()
-
-    setup(
-        name='yt-dlp',
-        version=VERSION,
-        maintainer='pukkandan',
-        maintainer_email='pukkandan.ytdlp@gmail.com',
-        description=DESCRIPTION,
-        long_description=LONG_DESCRIPTION,
-        long_description_content_type='text/markdown',
-        url='https://github.com/yt-dlp/yt-dlp',
-        packages=packages(),
-        install_requires=REQUIREMENTS,
-        python_requires='>=3.7',
-        project_urls={
-            'Documentation': 'https://github.com/yt-dlp/yt-dlp#readme',
-            'Source': 'https://github.com/yt-dlp/yt-dlp',
-            'Tracker': 'https://github.com/yt-dlp/yt-dlp/issues',
-            'Funding': 'https://github.com/yt-dlp/yt-dlp/blob/master/Collaborators.md#collaborators',
-        },
-        classifiers=[
-            'Topic :: Multimedia :: Video',
-            'Development Status :: 5 - Production/Stable',
-            'Environment :: Console',
-            'Programming Language :: Python',
-            'Programming Language :: Python :: 3.7',
-            'Programming Language :: Python :: 3.8',
-            'Programming Language :: Python :: 3.9',
-            'Programming Language :: Python :: 3.10',
-            'Programming Language :: Python :: 3.11',
-            'Programming Language :: Python :: Implementation',
-            'Programming Language :: Python :: Implementation :: CPython',
-            'Programming Language :: Python :: Implementation :: PyPy',
-            'License :: Public Domain',
-            'Operating System :: OS Independent',
-        ],
-        cmdclass={'build_lazy_extractors': build_lazy_extractors},
-        **params
-    )
-
-
-main()
File diff suppressed because it is too large
test/conftest.py (new file)
@@ -0,0 +1,26 @@
+import functools
+import inspect
+
+import pytest
+
+from yt_dlp.networking import RequestHandler
+from yt_dlp.networking.common import _REQUEST_HANDLERS
+from yt_dlp.utils._utils import _YDLLogger as FakeLogger
+
+
+@pytest.fixture
+def handler(request):
+    RH_KEY = request.param
+    if inspect.isclass(RH_KEY) and issubclass(RH_KEY, RequestHandler):
+        handler = RH_KEY
+    elif RH_KEY in _REQUEST_HANDLERS:
+        handler = _REQUEST_HANDLERS[RH_KEY]
+    else:
+        pytest.skip(f'{RH_KEY} request handler is not available')
+
+    return functools.partial(handler, logger=FakeLogger)
+
+
+def validate_and_send(rh, req):
+    rh.validate(req)
+    return rh.send(req)
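A hypothetical test module showing how the handler fixture above is meant to be consumed, via pytest's indirect parametrization (the handler names here are illustrative):

import pytest


@pytest.mark.parametrize('handler', ['Urllib', 'Requests'], indirect=True)
def test_handler_available(handler):
    # `handler` is a functools.partial over the RequestHandler class,
    # pre-bound to the fake logger from conftest.py
    with handler() as rh:
        assert rh.RH_NAME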
@@ -10,7 +10,7 @@
 import yt_dlp.extractor
 from yt_dlp import YoutubeDL
 from yt_dlp.compat import compat_os_name
-from yt_dlp.utils import preferredencoding, write_string
+from yt_dlp.utils import preferredencoding, try_call, write_string, find_available_port

 if 'pytest' in sys.modules:
     import pytest
@@ -194,8 +194,8 @@ def sanitize_got_info_dict(got_dict):
     'formats', 'thumbnails', 'subtitles', 'automatic_captions', 'comments', 'entries',

     # Auto-generated
-    'autonumber', 'playlist', 'format_index', 'video_ext', 'audio_ext', 'duration_string', 'epoch',
-    'fulltitle', 'extractor', 'extractor_key', 'filepath', 'infojson_filename', 'original_url', 'n_entries',
+    'autonumber', 'playlist', 'format_index', 'video_ext', 'audio_ext', 'duration_string', 'epoch', 'n_entries',
+    'fulltitle', 'extractor', 'extractor_key', 'filename', 'filepath', 'infojson_filename', 'original_url',

     # Only live_status needs to be checked
     'is_live', 'was_live',
@@ -214,14 +214,19 @@ def sanitize(key, value):

     test_info_dict = {
         key: sanitize(key, value) for key, value in got_dict.items()
-        if value is not None and key not in IGNORED_FIELDS and not any(
-            key.startswith(f'{prefix}_') for prefix in IGNORED_PREFIXES)
+        if value is not None and key not in IGNORED_FIELDS and (
+            not any(key.startswith(f'{prefix}_') for prefix in IGNORED_PREFIXES)
+            or key == '_old_archive_ids')
     }

     # display_id may be generated from id
     if test_info_dict.get('display_id') == test_info_dict.get('id'):
         test_info_dict.pop('display_id')

+    # release_year may be generated from release_date
+    if try_call(lambda: test_info_dict['release_year'] == int(test_info_dict['release_date'][:4])):
+        test_info_dict.pop('release_year')
+
     # Check url for flat entries
     if got_dict.get('_type', 'video') != 'video' and got_dict.get('url'):
         test_info_dict['url'] = got_dict['url']
@@ -324,3 +329,8 @@ def http_server_port(httpd):
     else:
         sock = httpd.socket
     return sock.getsockname()[1]
+
+
+def verify_address_availability(address):
+    if find_available_port(address) is None:
+        pytest.skip(f'Unable to bind to source address {address} (address may not exist)')
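A hypothetical caller of the new verify_address_availability() helper; instead of erroring, a test is skipped when the machine cannot bind the requested source address:

from test.helper import verify_address_availability


def test_source_address():
    verify_address_availability('127.0.0.2')  # skip if the alias address is unavailable
    ...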
@@ -69,6 +69,7 @@ def test_opengraph(self):
             <meta name="og:test1" content='foo > < bar'/>
             <meta name="og:test2" content="foo >//< bar"/>
             <meta property=og-test3 content='Ill-formatted opengraph'/>
+            <meta property=og:test4 content=unquoted-value/>
             '''
         self.assertEqual(ie._og_search_title(html), 'Foo')
         self.assertEqual(ie._og_search_description(html), 'Some video\'s description ')
@@ -81,6 +82,7 @@ def test_opengraph(self):
         self.assertEqual(ie._og_search_property(('test0', 'test1'), html), 'foo > < bar')
         self.assertRaises(RegexNotFoundError, ie._og_search_property, 'test0', html, None, fatal=True)
         self.assertRaises(RegexNotFoundError, ie._og_search_property, ('test0', 'test00'), html, None, fatal=True)
+        self.assertEqual(ie._og_search_property('test4', html), 'unquoted-value')

     def test_html_search_meta(self):
         ie = self.ie
@@ -916,8 +918,6 @@ def test_parse_m3u8_formats(self):
                'acodec': 'mp4a.40.2',
                'video_ext': 'mp4',
                'audio_ext': 'none',
-                'vbr': 263.851,
-                'abr': 0,
            }, {
                'format_id': '577',
                'format_index': None,
@@ -935,8 +935,6 @@ def test_parse_m3u8_formats(self):
                'acodec': 'mp4a.40.2',
                'video_ext': 'mp4',
                'audio_ext': 'none',
-                'vbr': 577.61,
-                'abr': 0,
            }, {
                'format_id': '915',
                'format_index': None,
@@ -954,8 +952,6 @@ def test_parse_m3u8_formats(self):
                'acodec': 'mp4a.40.2',
                'video_ext': 'mp4',
                'audio_ext': 'none',
-                'vbr': 915.905,
-                'abr': 0,
            }, {
                'format_id': '1030',
                'format_index': None,
@@ -973,8 +969,6 @@ def test_parse_m3u8_formats(self):
                'acodec': 'mp4a.40.2',
                'video_ext': 'mp4',
                'audio_ext': 'none',
-                'vbr': 1030.138,
-                'abr': 0,
            }, {
                'format_id': '1924',
                'format_index': None,
@@ -992,8 +986,6 @@ def test_parse_m3u8_formats(self):
                'acodec': 'mp4a.40.2',
                'video_ext': 'mp4',
                'audio_ext': 'none',
-                'vbr': 1924.009,
-                'abr': 0,
            }],
            {
                'en': [{
@@ -1578,6 +1570,7 @@ def test_parse_ism_formats(self):
                'vcodec': 'none',
                'acodec': 'AACL',
                'protocol': 'ism',
+                'audio_channels': 2,
                '_download_params': {
                    'stream_type': 'audio',
                    'duration': 8880746666,
@@ -1591,9 +1584,6 @@ def test_parse_ism_formats(self):
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                },
-                'audio_ext': 'isma',
-                'video_ext': 'none',
-                'abr': 128,
            }, {
                'format_id': 'video-100',
                'url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/Manifest',
@@ -1617,9 +1607,6 @@ def test_parse_ism_formats(self):
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                },
-                'video_ext': 'ismv',
-                'audio_ext': 'none',
-                'vbr': 100,
            }, {
                'format_id': 'video-326',
                'url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/Manifest',
@@ -1643,9 +1630,6 @@ def test_parse_ism_formats(self):
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                },
-                'video_ext': 'ismv',
-                'audio_ext': 'none',
-                'vbr': 326,
            }, {
                'format_id': 'video-698',
                'url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/Manifest',
@@ -1669,9 +1653,6 @@ def test_parse_ism_formats(self):
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                },
-                'video_ext': 'ismv',
-                'audio_ext': 'none',
-                'vbr': 698,
            }, {
                'format_id': 'video-1493',
                'url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/Manifest',
@@ -1695,9 +1676,6 @@ def test_parse_ism_formats(self):
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                },
-                'video_ext': 'ismv',
-                'audio_ext': 'none',
-                'vbr': 1493,
            }, {
                'format_id': 'video-4482',
                'url': 'https://sdn-global-streaming-cache-3qsdn.akamaized.net/stream/3144/files/17/07/672975/3144-kZT4LWMQw6Rh7Kpd.ism/Manifest',
@@ -1721,9 +1699,6 @@ def test_parse_ism_formats(self):
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                },
-                'video_ext': 'ismv',
-                'audio_ext': 'none',
-                'vbr': 4482,
            }],
            {
                'eng': [
@@ -1747,34 +1722,6 @@ def test_parse_ism_formats(self):
            'ec-3_test',
            'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
            [{
-                'format_id': 'audio_deu_1-224',
-                'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
-                'manifest_url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
-                'ext': 'isma',
-                'tbr': 224,
-                'asr': 48000,
-                'vcodec': 'none',
-                'acodec': 'EC-3',
-                'protocol': 'ism',
-                '_download_params':
-                {
-                    'stream_type': 'audio',
-                    'duration': 370000000,
-                    'timescale': 10000000,
-                    'width': 0,
-                    'height': 0,
-                    'fourcc': 'EC-3',
-                    'language': 'deu',
-                    'codec_private_data': '00063F000000AF87FBA7022DFB42A4D405CD93843BDD0700200F00',
-                    'sampling_rate': 48000,
-                    'channels': 6,
-                    'bits_per_sample': 16,
-                    'nal_unit_length_field': 4
-                },
-                'audio_ext': 'isma',
-                'video_ext': 'none',
-                'abr': 224,
-            }, {
                'format_id': 'audio_deu-127',
                'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
                'manifest_url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@@ -1784,8 +1731,9 @@ def test_parse_ism_formats(self):
                'vcodec': 'none',
                'acodec': 'AACL',
                'protocol': 'ism',
-                '_download_params':
-                {
+                'language': 'deu',
+                'audio_channels': 2,
+                '_download_params': {
                    'stream_type': 'audio',
                    'duration': 370000000,
                    'timescale': 10000000,
@@ -1799,9 +1747,32 @@ def test_parse_ism_formats(self):
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                },
-                'audio_ext': 'isma',
-                'video_ext': 'none',
-                'abr': 127,
+            }, {
+                'format_id': 'audio_deu_1-224',
+                'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
+                'manifest_url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
+                'ext': 'isma',
+                'tbr': 224,
+                'asr': 48000,
+                'vcodec': 'none',
+                'acodec': 'EC-3',
+                'protocol': 'ism',
+                'language': 'deu',
+                'audio_channels': 6,
+                '_download_params': {
+                    'stream_type': 'audio',
+                    'duration': 370000000,
+                    'timescale': 10000000,
+                    'width': 0,
+                    'height': 0,
+                    'fourcc': 'EC-3',
+                    'language': 'deu',
+                    'codec_private_data': '00063F000000AF87FBA7022DFB42A4D405CD93843BDD0700200F00',
+                    'sampling_rate': 48000,
+                    'channels': 6,
+                    'bits_per_sample': 16,
+                    'nal_unit_length_field': 4
+                },
            }, {
                'format_id': 'video_deu-23',
                'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@@ -1813,8 +1784,8 @@ def test_parse_ism_formats(self):
                'vcodec': 'AVC1',
                'acodec': 'none',
                'protocol': 'ism',
-                '_download_params':
-                {
+                'language': 'deu',
+                '_download_params': {
                    'stream_type': 'video',
                    'duration': 370000000,
                    'timescale': 10000000,
@@ -1827,9 +1798,6 @@ def test_parse_ism_formats(self):
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                },
-                'video_ext': 'ismv',
-                'audio_ext': 'none',
-                'vbr': 23,
            }, {
                'format_id': 'video_deu-403',
                'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@@ -1841,8 +1809,8 @@ def test_parse_ism_formats(self):
                'vcodec': 'AVC1',
                'acodec': 'none',
                'protocol': 'ism',
-                '_download_params':
-                {
+                'language': 'deu',
+                '_download_params': {
                    'stream_type': 'video',
                    'duration': 370000000,
                    'timescale': 10000000,
@@ -1855,9 +1823,6 @@ def test_parse_ism_formats(self):
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                },
-                'video_ext': 'ismv',
-                'audio_ext': 'none',
-                'vbr': 403,
            }, {
                'format_id': 'video_deu-680',
                'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@@ -1869,8 +1834,8 @@ def test_parse_ism_formats(self):
                'vcodec': 'AVC1',
                'acodec': 'none',
                'protocol': 'ism',
-                '_download_params':
-                {
+                'language': 'deu',
+                '_download_params': {
                    'stream_type': 'video',
                    'duration': 370000000,
                    'timescale': 10000000,
@@ -1883,9 +1848,6 @@ def test_parse_ism_formats(self):
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                },
-                'video_ext': 'ismv',
-                'audio_ext': 'none',
-                'vbr': 680,
            }, {
                'format_id': 'video_deu-1253',
                'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@@ -1897,8 +1859,9 @@ def test_parse_ism_formats(self):
                'vcodec': 'AVC1',
                'acodec': 'none',
                'protocol': 'ism',
-                '_download_params':
-                {
+                'vbr': 1253,
+                'language': 'deu',
+                '_download_params': {
                    'stream_type': 'video',
                    'duration': 370000000,
                    'timescale': 10000000,
@@ -1911,9 +1874,6 @@ def test_parse_ism_formats(self):
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                },
-                'video_ext': 'ismv',
-                'audio_ext': 'none',
-                'vbr': 1253,
            }, {
                'format_id': 'video_deu-2121',
                'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@@ -1925,8 +1885,8 @@ def test_parse_ism_formats(self):
                'vcodec': 'AVC1',
                'acodec': 'none',
                'protocol': 'ism',
-                '_download_params':
-                {
+                'language': 'deu',
+                '_download_params': {
                    'stream_type': 'video',
                    'duration': 370000000,
                    'timescale': 10000000,
@@ -1939,9 +1899,6 @@ def test_parse_ism_formats(self):
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                },
-                'video_ext': 'ismv',
-                'audio_ext': 'none',
-                'vbr': 2121,
            }, {
                'format_id': 'video_deu-3275',
                'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@@ -1953,8 +1910,8 @@ def test_parse_ism_formats(self):
                'vcodec': 'AVC1',
                'acodec': 'none',
                'protocol': 'ism',
-                '_download_params':
-                {
+                'language': 'deu',
+                '_download_params': {
                    'stream_type': 'video',
                    'duration': 370000000,
                    'timescale': 10000000,
@@ -1967,9 +1924,6 @@ def test_parse_ism_formats(self):
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                },
-                'video_ext': 'ismv',
-                'audio_ext': 'none',
-                'vbr': 3275,
            }, {
                'format_id': 'video_deu-5300',
                'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@@ -1981,8 +1935,8 @@ def test_parse_ism_formats(self):
                'vcodec': 'AVC1',
                'acodec': 'none',
                'protocol': 'ism',
-                '_download_params':
-                {
+                'language': 'deu',
+                '_download_params': {
                    'stream_type': 'video',
                    'duration': 370000000,
                    'timescale': 10000000,
@@ -1995,9 +1949,6 @@ def test_parse_ism_formats(self):
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                },
-                'video_ext': 'ismv',
-                'audio_ext': 'none',
-                'vbr': 5300,
            }, {
                'format_id': 'video_deu-8079',
                'url': 'https://smstr01.dmm.t-online.de/smooth24/smoothstream_m1/streaming/sony/9221438342941275747/636887760842957027/25_km_h-Trailer-9221571562372022953_deu_20_1300k_HD_H_264_ISMV.ism/Manifest',
@@ -2009,8 +1960,8 @@ def test_parse_ism_formats(self):
                'vcodec': 'AVC1',
                'acodec': 'none',
                'protocol': 'ism',
-                '_download_params':
-                {
+                'language': 'deu',
+                '_download_params': {
                    'stream_type': 'video',
                    'duration': 370000000,
                    'timescale': 10000000,
@@ -2023,9 +1974,6 @@ def test_parse_ism_formats(self):
                    'bits_per_sample': 16,
                    'nal_unit_length_field': 4
                },
-                'video_ext': 'ismv',
-                'audio_ext': 'none',
-                'vbr': 8079,
            }],
            {},
        ),
test/test_YoutubeDL.py

@@ -10,9 +10,8 @@
 import copy
 import json
-import urllib.error
 
-from test.helper import FakeYDL, assertRegexpMatches
+from test.helper import FakeYDL, assertRegexpMatches, try_rm
 from yt_dlp import YoutubeDL
 from yt_dlp.compat import compat_os_name
 from yt_dlp.extractor import YoutubeIE
@@ -25,6 +24,7 @@
     int_or_none,
     match_filter_func,
 )
+from yt_dlp.utils.traversal import traverse_obj
 
 TEST_URL = 'http://localhost/sample.mp4'
 
@@ -140,6 +140,8 @@ def test(inp, *expected, multi=False):
         test('example-with-dashes', 'example-with-dashes')
         test('all', '2', '47', '45', 'example-with-dashes', '35')
         test('mergeall', '2+47+45+example-with-dashes+35', multi=True)
+        # See: https://github.com/yt-dlp/yt-dlp/pulls/8797
+        test('7_a/worst', '35')
 
     def test_format_selection_audio(self):
         formats = [
@@ -669,7 +671,7 @@ def test(tmpl, expected, *, info=None, **params):
             for (name, got), expect in zip((('outtmpl', out), ('filename', fname)), expected):
                 if callable(expect):
                     self.assertTrue(expect(got), f'Wrong {name} from {tmpl}')
-                else:
+                elif expect is not None:
                     self.assertEqual(got, expect, f'Wrong {name} from {tmpl}')
 
         # Side-effects
@@ -684,7 +686,8 @@ def test(tmpl, expected, *, info=None, **params):
         test('%(id)s.%(ext)s', '1234.mp4')
         test('%(duration_string)s', ('27:46:40', '27-46-40'))
         test('%(resolution)s', '1080p')
-        test('%(playlist_index)s', '001')
+        test('%(playlist_index|)s', '001')
+        test('%(playlist_index&{}!)s', '1!')
         test('%(playlist_autonumber)s', '02')
         test('%(autonumber)s', '00001')
         test('%(autonumber+2)03d', '005', autonumber_start=3)
@@ -727,7 +730,7 @@ def expect_same_infodict(out):
                 self.assertEqual(got_dict.get(info_field), expected, info_field)
             return True
 
-        test('%()j', (expect_same_infodict, str))
+        test('%()j', (expect_same_infodict, None))
 
         # NA placeholder
         NA_TEST_OUTTMPL = '%(uploader_date)s-%(width)d-%(x|def)s-%(id)s.%(ext)s'
@@ -755,20 +758,23 @@ def expect_same_infodict(out):
         test('%(ext)c', 'm')
         test('%(id)d %(id)r', "1234 '1234'")
         test('%(id)r %(height)r', "'1234' 1080")
+        test('%(title5)a %(height)a', (R"'\xe1\xe9\xed \U0001d400' 1080", None))
         test('%(ext)s-%(ext|def)d', 'mp4-def')
-        test('%(width|0)04d', '0000')
-        test('a%(width|)d', 'a', outtmpl_na_placeholder='none')
+        test('%(width|0)04d', '0')
+        test('a%(width|b)d', 'ab', outtmpl_na_placeholder='none')
 
         FORMATS = self.outtmpl_info['formats']
-        sanitize = lambda x: x.replace(':', '：').replace('"', '＂').replace('\n', ' ')
 
         # Custom type casting
         test('%(formats.:.id)l', 'id 1, id 2, id 3')
         test('%(formats.:.id)#l', ('id 1\nid 2\nid 3', 'id 1 id 2 id 3'))
         test('%(ext)l', 'mp4')
         test('%(formats.:.id) 18l', '  id 1, id 2, id 3')
-        test('%(formats)j', (json.dumps(FORMATS), sanitize(json.dumps(FORMATS))))
-        test('%(formats)#j', (json.dumps(FORMATS, indent=4), sanitize(json.dumps(FORMATS, indent=4))))
+        test('%(formats)j', (json.dumps(FORMATS), None))
+        test('%(formats)#j', (
+            json.dumps(FORMATS, indent=4),
+            json.dumps(FORMATS, indent=4).replace(':', '：').replace('"', '＂').replace('\n', ' ')
+        ))
         test('%(title5).3B', 'á')
         test('%(title5)U', 'áéí 𝐀')
         test('%(title5)#U', 'a\u0301e\u0301i\u0301 𝐀')
@@ -780,9 +786,9 @@ def expect_same_infodict(out):
         test('%(title4)#S', 'foo_bar_test')
         test('%(title4).10S', ('foo "bar" ', 'foo "bar"' + ('#' if compat_os_name == 'nt' else ' ')))
         if compat_os_name == 'nt':
-            test('%(title4)q', ('"foo \\"bar\\" test"', '＂foo ⧹"bar⧹" test＂'))
-            test('%(formats.:.id)#q', ('"id 1" "id 2" "id 3"', '＂id 1＂ ＂id 2＂ ＂id 3＂'))
-            test('%(formats.0.id)#q', ('"id 1"', '＂id 1＂'))
+            test('%(title4)q', ('"foo ""bar"" test"', None))
+            test('%(formats.:.id)#q', ('"id 1" "id 2" "id 3"', None))
+            test('%(formats.0.id)#q', ('"id 1"', None))
         else:
             test('%(title4)q', ('\'foo "bar" test\'', '\'foo "bar" test\''))
             test('%(formats.:.id)#q', "'id 1' 'id 2' 'id 3'")
@@ -793,8 +799,9 @@ def expect_same_infodict(out):
         test('%(title|%)s %(title|%%)s', '% %%')
         test('%(id+1-height+3)05d', '00158')
         test('%(width+100)05d', 'NA')
-        test('%(formats.0) 15s', ('% 15s' % FORMATS[0], '% 15s' % sanitize(str(FORMATS[0]))))
-        test('%(formats.0)r', (repr(FORMATS[0]), sanitize(repr(FORMATS[0]))))
+        test('%(filesize*8)d', '8192')
+        test('%(formats.0) 15s', ('% 15s' % FORMATS[0], None))
+        test('%(formats.0)r', (repr(FORMATS[0]), None))
         test('%(height.0)03d', '001')
         test('%(-height.0)04d', '-001')
         test('%(formats.-1.id)s', FORMATS[-1]['id'])
@@ -806,7 +813,7 @@ def expect_same_infodict(out):
         out = json.dumps([{'id': f['id'], 'height.:2': str(f['height'])[:2]}
                           if 'height' in f else {'id': f['id']}
                           for f in FORMATS])
-        test('%(formats.:.{id,height.:2})j', (out, sanitize(out)))
+        test('%(formats.:.{id,height.:2})j', (out, None))
         test('%(formats.:.{id,height}.id)l', ', '.join(f['id'] for f in FORMATS))
         test('%(.{id,title})j', ('{"id": "1234"}', '{"id": "1234"}'))
 
@@ -822,6 +829,11 @@ def expect_same_infodict(out):
         test('%(title&foo|baz)s.bar', 'baz.bar')
         test('%(x,id&foo|baz)s.bar', 'foo.bar')
         test('%(x,title&foo|baz)s.bar', 'baz.bar')
+        test('%(id&a\nb|)s', ('a\nb', 'a b'))
+        test('%(id&hi {:>10} {}|)s', 'hi       1234 1234')
+        test(R'%(id&{0} {}|)s', 'NA')
+        test(R'%(id&{0.1}|)s', 'NA')
+        test('%(height&{:,d})S', '1,080')
 
         # Laziness
         def gen():
@@ -867,12 +879,12 @@ def test_postprocessors(self):
 
         class SimplePP(PostProcessor):
             def run(self, info):
-                with open(audiofile, 'wt') as f:
+                with open(audiofile, 'w') as f:
                     f.write('EXAMPLE')
                 return [info['filepath']], info
 
         def run_pp(params, PP):
-            with open(filename, 'wt') as f:
+            with open(filename, 'w') as f:
                 f.write('EXAMPLE')
             ydl = YoutubeDL(params)
             ydl.add_post_processor(PP())
@@ -891,7 +903,7 @@ def run_pp(params, PP):
 
         class ModifierPP(PostProcessor):
             def run(self, info):
-                with open(info['filepath'], 'wt') as f:
+                with open(info['filepath'], 'w') as f:
                     f.write('MODIFIED')
                 return [], info
 
@@ -1093,11 +1105,6 @@ def test_selection(params, expected_ids, evaluate_all=False):
         test_selection({'playlist_items': '-15::2'}, INDICES[1::2], True)
         test_selection({'playlist_items': '-15::15'}, [], True)
 
-    def test_urlopen_no_file_protocol(self):
-        # see https://github.com/ytdl-org/youtube-dl/issues/8227
-        ydl = YDL()
-        self.assertRaises(urllib.error.URLError, ydl.urlopen, 'file:///etc/passwd')
-
     def test_do_not_override_ie_key_in_url_transparent(self):
         ydl = YDL()
 
@@ -1211,6 +1218,129 @@ def _real_extract(self, url):
         self.assertEqual(downloaded['extractor'], 'Video')
         self.assertEqual(downloaded['extractor_key'], 'Video')
 
+    def test_header_cookies(self):
+        from http.cookiejar import Cookie
+
+        ydl = FakeYDL()
+        ydl.report_warning = lambda *_, **__: None
+
+        def cookie(name, value, version=None, domain='', path='', secure=False, expires=None):
+            return Cookie(
+                version or 0, name, value, None, False,
+                domain, bool(domain), bool(domain), path, bool(path),
+                secure, expires, False, None, None, rest={})
+
+        _test_url = 'https://yt.dlp/test'
+
+        def test(encoded_cookies, cookies, *, headers=False, round_trip=None, error_re=None):
+            def _test():
+                ydl.cookiejar.clear()
+                ydl._load_cookies(encoded_cookies, autoscope=headers)
+                if headers:
+                    ydl._apply_header_cookies(_test_url)
+                data = {'url': _test_url}
+                ydl._calc_headers(data)
+                self.assertCountEqual(
+                    map(vars, ydl.cookiejar), map(vars, cookies),
+                    'Extracted cookiejar.Cookie is not the same')
+                if not headers:
+                    self.assertEqual(
+                        data.get('cookies'), round_trip or encoded_cookies,
+                        'Cookie is not the same as round trip')
+                ydl.__dict__['_YoutubeDL__header_cookies'] = []
+
+            with self.subTest(msg=encoded_cookies):
+                if not error_re:
+                    _test()
+                    return
+                with self.assertRaisesRegex(Exception, error_re):
+                    _test()
+
+        test('test=value; Domain=.yt.dlp', [cookie('test', 'value', domain='.yt.dlp')])
+        test('test=value', [cookie('test', 'value')], error_re=r'Unscoped cookies are not allowed')
+        test('cookie1=value1; Domain=.yt.dlp; Path=/test; cookie2=value2; Domain=.yt.dlp; Path=/', [
+            cookie('cookie1', 'value1', domain='.yt.dlp', path='/test'),
+            cookie('cookie2', 'value2', domain='.yt.dlp', path='/')])
+        test('test=value; Domain=.yt.dlp; Path=/test; Secure; Expires=9999999999', [
+            cookie('test', 'value', domain='.yt.dlp', path='/test', secure=True, expires=9999999999)])
+        test('test="value; "; path=/test; domain=.yt.dlp', [
+            cookie('test', 'value; ', domain='.yt.dlp', path='/test')],
+            round_trip='test="value\\073 "; Domain=.yt.dlp; Path=/test')
+        test('name=; Domain=.yt.dlp', [cookie('name', '', domain='.yt.dlp')],
+             round_trip='name=""; Domain=.yt.dlp')
+
+        test('test=value', [cookie('test', 'value', domain='.yt.dlp')], headers=True)
+        test('cookie1=value; Domain=.yt.dlp; cookie2=value', [], headers=True, error_re=r'Invalid syntax')
+        ydl.deprecated_feature = ydl.report_error
+        test('test=value', [], headers=True, error_re=r'Passing cookies as a header is a potential security risk')
+
+    def test_infojson_cookies(self):
+        TEST_FILE = 'test_infojson_cookies.info.json'
+        TEST_URL = 'https://example.com/example.mp4'
+        COOKIES = 'a=b; Domain=.example.com; c=d; Domain=.example.com'
+        COOKIE_HEADER = {'Cookie': 'a=b; c=d'}
+
+        ydl = FakeYDL()
+        ydl.process_info = lambda x: ydl._write_info_json('test', x, TEST_FILE)
+
+        def make_info(info_header_cookies=False, fmts_header_cookies=False, cookies_field=False):
+            fmt = {'url': TEST_URL}
+            if fmts_header_cookies:
+                fmt['http_headers'] = COOKIE_HEADER
+            if cookies_field:
+                fmt['cookies'] = COOKIES
+            return _make_result([fmt], http_headers=COOKIE_HEADER if info_header_cookies else None)
+
+        def test(initial_info, note):
+            result = {}
+            result['processed'] = ydl.process_ie_result(initial_info)
+            self.assertTrue(ydl.cookiejar.get_cookies_for_url(TEST_URL),
+                            msg=f'No cookies set in cookiejar after initial process when {note}')
+            ydl.cookiejar.clear()
+            with open(TEST_FILE) as infojson:
+                result['loaded'] = ydl.sanitize_info(json.load(infojson), True)
+            result['final'] = ydl.process_ie_result(result['loaded'].copy(), download=False)
+            self.assertTrue(ydl.cookiejar.get_cookies_for_url(TEST_URL),
+                            msg=f'No cookies set in cookiejar after final process when {note}')
+            ydl.cookiejar.clear()
+            for key in ('processed', 'loaded', 'final'):
+                info = result[key]
+                self.assertIsNone(
+                    traverse_obj(info, ((None, ('formats', 0)), 'http_headers', 'Cookie'), casesense=False, get_all=False),
+                    msg=f'Cookie header not removed in {key} result when {note}')
+                self.assertEqual(
+                    traverse_obj(info, ((None, ('formats', 0)), 'cookies'), get_all=False), COOKIES,
+                    msg=f'No cookies field found in {key} result when {note}')
+
+        test({'url': TEST_URL, 'http_headers': COOKIE_HEADER, 'id': '1', 'title': 'x'}, 'no formats field')
+        test(make_info(info_header_cookies=True), 'info_dict header cookies')
+        test(make_info(fmts_header_cookies=True), 'format header cookies')
+        test(make_info(info_header_cookies=True, fmts_header_cookies=True), 'info_dict and format header cookies')
+        test(make_info(info_header_cookies=True, fmts_header_cookies=True, cookies_field=True), 'all cookies fields')
+        test(make_info(cookies_field=True), 'cookies format field')
+        test({'url': TEST_URL, 'cookies': COOKIES, 'id': '1', 'title': 'x'}, 'info_dict cookies field only')
+
+        try_rm(TEST_FILE)
+
+    def test_add_headers_cookie(self):
+        def check_for_cookie_header(result):
+            return traverse_obj(result, ((None, ('formats', 0)), 'http_headers', 'Cookie'), casesense=False, get_all=False)
+
+        ydl = FakeYDL({'http_headers': {'Cookie': 'a=b'}})
+        ydl._apply_header_cookies(_make_result([])['webpage_url'])  # Scope to input webpage URL: .example.com
+
+        fmt = {'url': 'https://example.com/video.mp4'}
+        result = ydl.process_ie_result(_make_result([fmt]), download=False)
+        self.assertIsNone(check_for_cookie_header(result), msg='http_headers cookies in result info_dict')
+        self.assertEqual(result.get('cookies'), 'a=b; Domain=.example.com', msg='No cookies were set in cookies field')
+        self.assertIn('a=b', ydl.cookiejar.get_cookie_header(fmt['url']), msg='No cookies were set in cookiejar')
+
+        fmt = {'url': 'https://wrong.com/video.mp4'}
+        result = ydl.process_ie_result(_make_result([fmt]), download=False)
+        self.assertIsNone(check_for_cookie_header(result), msg='http_headers cookies for wrong domain')
+        self.assertFalse(result.get('cookies'), msg='Cookies set in cookies field for wrong domain')
+        self.assertFalse(ydl.cookiejar.get_cookie_header(fmt['url']), msg='Cookies set in cookiejar for wrong domain')
+
 
 if __name__ == '__main__':
    unittest.main()
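
The new cookie tests above assert on `http.cookiejar.Cookie` attributes via `vars()`. For reference, a standalone sketch of building the same domain-scoped cookie shape with only the standard library (the helper name is illustrative):

    # Sketch: construct a domain-scoped cookie like the test's cookie() helper.
    from http.cookiejar import Cookie, CookieJar

    def make_cookie(name, value, domain='', path='', secure=False, expires=None):
        # Positional layout follows http.cookiejar.Cookie:
        # version, name, value, port, port_specified, domain, domain_specified,
        # domain_initial_dot, path, path_specified, secure, expires, discard,
        # comment, comment_url, rest
        return Cookie(
            0, name, value, None, False,
            domain, bool(domain), domain.startswith('.'), path, bool(path),
            secure, expires, False, None, None, {})

    jar = CookieJar()
    jar.set_cookie(make_cookie('test', 'value', domain='.yt.dlp', path='/'))
    print([f'{c.domain}{c.path}: {c.name}={c.value}' for c in jar])
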
test/test_YoutubeDLCookieJar.py

@@ -11,16 +11,16 @@
 import re
 import tempfile
 
-from yt_dlp.utils import YoutubeDLCookieJar
+from yt_dlp.cookies import YoutubeDLCookieJar
 
 
 class TestYoutubeDLCookieJar(unittest.TestCase):
     def test_keep_session_cookies(self):
         cookiejar = YoutubeDLCookieJar('./test/testdata/cookies/session_cookies.txt')
-        cookiejar.load(ignore_discard=True, ignore_expires=True)
+        cookiejar.load()
         tf = tempfile.NamedTemporaryFile(delete=False)
         try:
-            cookiejar.save(filename=tf.name, ignore_discard=True, ignore_expires=True)
+            cookiejar.save(filename=tf.name)
             temp = tf.read().decode()
             self.assertTrue(re.search(
                 r'www\.foobar\.foobar\s+FALSE\s+/\s+TRUE\s+0\s+YoutubeDLExpiresEmpty\s+YoutubeDLExpiresEmptyValue', temp))
@@ -32,7 +32,7 @@ def test_keep_session_cookies(self):
 
     def test_strip_httponly_prefix(self):
         cookiejar = YoutubeDLCookieJar('./test/testdata/cookies/httponly_cookies.txt')
-        cookiejar.load(ignore_discard=True, ignore_expires=True)
+        cookiejar.load()
 
         def assert_cookie_has_value(key):
             self.assertEqual(cookiejar._cookies['www.foobar.foobar']['/'][key].value, key + '_VALUE')
@@ -42,11 +42,25 @@ def assert_cookie_has_value(key):
 
     def test_malformed_cookies(self):
         cookiejar = YoutubeDLCookieJar('./test/testdata/cookies/malformed_cookies.txt')
-        cookiejar.load(ignore_discard=True, ignore_expires=True)
+        cookiejar.load()
         # Cookies should be empty since all malformed cookie file entries
         # will be ignored
         self.assertFalse(cookiejar._cookies)
 
+    def test_get_cookie_header(self):
+        cookiejar = YoutubeDLCookieJar('./test/testdata/cookies/httponly_cookies.txt')
+        cookiejar.load()
+        header = cookiejar.get_cookie_header('https://www.foobar.foobar')
+        self.assertIn('HTTPONLY_COOKIE', header)
+
+    def test_get_cookies_for_url(self):
+        cookiejar = YoutubeDLCookieJar('./test/testdata/cookies/session_cookies.txt')
+        cookiejar.load()
+        cookies = cookiejar.get_cookies_for_url('https://www.foobar.foobar/')
+        self.assertEqual(len(cookies), 2)
+        cookies = cookiejar.get_cookies_for_url('https://foobar.foobar/')
+        self.assertFalse(cookies)
+
 
 if __name__ == '__main__':
    unittest.main()
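
With `YoutubeDLCookieJar` re-homed in `yt_dlp.cookies` and `load()`/`save()` keeping session and expired entries by default, the two new helpers can be used directly. A short sketch, assuming a Netscape-format `cookies.txt` in the working directory:

    # Sketch: load a Netscape cookies.txt and query it per URL.
    from yt_dlp.cookies import YoutubeDLCookieJar

    jar = YoutubeDLCookieJar('cookies.txt')  # path is an assumption for this example
    jar.load()  # no more ignore_discard/ignore_expires boilerplate

    # A single "name=value; name2=value2" string, ready for a Cookie header
    print(jar.get_cookie_header('https://www.example.com/'))

    # The matching Cookie objects themselves, already filtered by domain/path
    for c in jar.get_cookies_for_url('https://www.example.com/'):
        print(c.name, c.value, c.domain)
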
test/test_aes.py

@@ -26,7 +26,7 @@
     key_expansion,
     pad_block,
 )
-from yt_dlp.dependencies import Cryptodome_AES
+from yt_dlp.dependencies import Cryptodome
 from yt_dlp.utils import bytes_to_intlist, intlist_to_bytes
 
 # the encrypted data can be generate with 'devscripts/generate_aes_testdata.py'
@@ -48,7 +48,7 @@ def test_cbc_decrypt(self):
         data = b'\x97\x92+\xe5\x0b\xc3\x18\x91ky9m&\xb3\xb5@\xe6\x27\xc2\x96.\xc8u\x88\xab9-[\x9e|\xf1\xcd'
         decrypted = intlist_to_bytes(aes_cbc_decrypt(bytes_to_intlist(data), self.key, self.iv))
         self.assertEqual(decrypted.rstrip(b'\x08'), self.secret_msg)
-        if Cryptodome_AES:
+        if Cryptodome.AES:
             decrypted = aes_cbc_decrypt_bytes(data, intlist_to_bytes(self.key), intlist_to_bytes(self.iv))
             self.assertEqual(decrypted.rstrip(b'\x08'), self.secret_msg)
 
@@ -78,7 +78,7 @@ def test_gcm_decrypt(self):
         decrypted = intlist_to_bytes(aes_gcm_decrypt_and_verify(
             bytes_to_intlist(data), self.key, bytes_to_intlist(authentication_tag), self.iv[:12]))
         self.assertEqual(decrypted.rstrip(b'\x08'), self.secret_msg)
-        if Cryptodome_AES:
+        if Cryptodome.AES:
             decrypted = aes_gcm_decrypt_and_verify_bytes(
                 data, intlist_to_bytes(self.key), authentication_tag, intlist_to_bytes(self.iv[:12]))
            self.assertEqual(decrypted.rstrip(b'\x08'), self.secret_msg)
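
Only the feature gate changed here: the optional dependency is now exposed as a `Cryptodome` namespace with an `.AES` attribute. A sketch of the dual-path check the tests exercise, with an illustrative key/IV/ciphertext:

    # Sketch: pure-Python CBC decryption, cross-checked against the
    # Cryptodome-backed bytes variant only when the dependency is installed.
    from yt_dlp.aes import aes_cbc_decrypt, aes_cbc_decrypt_bytes
    from yt_dlp.dependencies import Cryptodome
    from yt_dlp.utils import bytes_to_intlist, intlist_to_bytes

    key = list(range(16))   # illustrative 16-byte key
    iv = list(range(16))    # illustrative 16-byte IV
    data = bytes(16)        # illustrative single-block ciphertext

    plain = intlist_to_bytes(aes_cbc_decrypt(bytes_to_intlist(data), key, iv))
    if Cryptodome.AES:  # same gate as the updated tests
        fast = aes_cbc_decrypt_bytes(data, intlist_to_bytes(key), intlist_to_bytes(iv))
        assert fast == plain
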
test/test_age_restriction.py

@@ -10,6 +10,7 @@
 
 from test.helper import is_download_test, try_rm
 from yt_dlp import YoutubeDL
+from yt_dlp.utils import DownloadError
 
 
 def _download_restricted(url, filename, age):
@@ -25,10 +26,14 @@ def _download_restricted(url, filename, age):
     ydl.add_default_info_extractors()
     json_filename = os.path.splitext(filename)[0] + '.info.json'
     try_rm(json_filename)
-    ydl.download([url])
-    res = os.path.exists(json_filename)
-    try_rm(json_filename)
-    return res
+    try:
+        ydl.download([url])
+    except DownloadError:
+        pass
+    else:
+        return os.path.exists(json_filename)
+    finally:
+        try_rm(json_filename)
 
 
 @is_download_test
@@ -38,12 +43,12 @@ def _assert_restricted(self, url, filename, age, old_age=None):
         self.assertFalse(_download_restricted(url, filename, age))
 
     def test_youtube(self):
-        self._assert_restricted('07FYdnEawAQ', '07FYdnEawAQ.mp4', 10)
+        self._assert_restricted('HtVdAasjOgU', 'HtVdAasjOgU.mp4', 10)
 
     def test_youporn(self):
         self._assert_restricted(
-            'http://www.youporn.com/watch/505835/sex-ed-is-it-safe-to-masturbate-daily/',
-            '505835.mp4', 2, old_age=25)
+            'https://www.youporn.com/watch/16715086/sex-ed-in-detention-18-asmr/',
+            '16715086.mp4', 2, old_age=25)
 
 
 if __name__ == '__main__':
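
The `_download_restricted` rewrite is a textbook use of `try/except/else/finally`: a `DownloadError` is an acceptable outcome for an age-gated video, `else` runs only when the download raised nothing, and `finally` removes the info-JSON on every path, including the early `return`. Condensed:

    # Sketch of the control flow above (generic names; the real code catches
    # only DownloadError and checks for the written .info.json file).
    def attempt(download, succeeded, cleanup):
        try:
            download()
        except Exception:
            pass                 # a refused download is a valid result (returns None)
        else:
            return succeeded()   # reached only if download() did not raise
        finally:
            cleanup()            # runs on success, failure, and early return
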
test/test_compat.py

@@ -9,15 +9,16 @@
 
 
 import struct
-import urllib.parse
 
 from yt_dlp import compat
+from yt_dlp.compat import urllib  # isort: split
 from yt_dlp.compat import (
     compat_etree_fromstring,
     compat_expanduser,
     compat_urllib_parse_unquote,
     compat_urllib_parse_urlencode,
 )
+from yt_dlp.compat.urllib.request import getproxies
 
 
 class TestCompat(unittest.TestCase):
@@ -28,8 +29,10 @@ def test_compat_passthrough(self):
         with self.assertWarns(DeprecationWarning):
             compat.WINDOWS_VT_MODE
 
-        # TODO: Test submodule
-        # compat.asyncio.events  # Must not raise error
+        self.assertEqual(urllib.request.getproxies, getproxies)
+
+        with self.assertWarns(DeprecationWarning):
+            compat.compat_pycrypto_AES  # Must not raise error
 
     def test_compat_expanduser(self):
        old_home = os.environ.get('HOME')
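
The passthrough being asserted (`yt_dlp.compat.urllib.request.getproxies` is literally `urllib.request.getproxies`, and legacy `compat_*` names still resolve with a `DeprecationWarning`) is the kind of thing typically built on a PEP 562 module `__getattr__`. An illustrative sketch of the mechanism, not yt-dlp's exact code:

    # Sketch: a compat module that re-exports stdlib names and warns on
    # deprecated aliases (place in e.g. mypkg/compat.py; names illustrative).
    import urllib.request
    import warnings

    _deprecated = {
        'compat_getproxies': urllib.request.getproxies,
    }

    def __getattr__(name):  # PEP 562: invoked for names not found normally
        if name in _deprecated:
            warnings.warn(f'{name} is deprecated', DeprecationWarning, stacklevel=2)
            return _deprecated[name]
        raise AttributeError(f'module {__name__!r} has no attribute {name!r}')
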
test/test_cookies.py

@@ -49,32 +49,38 @@ def test_get_desktop_environment(self):
         """ based on https://chromium.googlesource.com/chromium/src/+/refs/heads/main/base/nix/xdg_util_unittest.cc """
         test_cases = [
             ({}, _LinuxDesktopEnvironment.OTHER),
+            ({'DESKTOP_SESSION': 'my_custom_de'}, _LinuxDesktopEnvironment.OTHER),
+            ({'XDG_CURRENT_DESKTOP': 'my_custom_de'}, _LinuxDesktopEnvironment.OTHER),
 
             ({'DESKTOP_SESSION': 'gnome'}, _LinuxDesktopEnvironment.GNOME),
             ({'DESKTOP_SESSION': 'mate'}, _LinuxDesktopEnvironment.GNOME),
-            ({'DESKTOP_SESSION': 'kde4'}, _LinuxDesktopEnvironment.KDE),
-            ({'DESKTOP_SESSION': 'kde'}, _LinuxDesktopEnvironment.KDE),
+            ({'DESKTOP_SESSION': 'kde4'}, _LinuxDesktopEnvironment.KDE4),
+            ({'DESKTOP_SESSION': 'kde'}, _LinuxDesktopEnvironment.KDE3),
             ({'DESKTOP_SESSION': 'xfce'}, _LinuxDesktopEnvironment.XFCE),
 
             ({'GNOME_DESKTOP_SESSION_ID': 1}, _LinuxDesktopEnvironment.GNOME),
-            ({'KDE_FULL_SESSION': 1}, _LinuxDesktopEnvironment.KDE),
+            ({'KDE_FULL_SESSION': 1}, _LinuxDesktopEnvironment.KDE3),
+            ({'KDE_FULL_SESSION': 1, 'DESKTOP_SESSION': 'kde4'}, _LinuxDesktopEnvironment.KDE4),
 
             ({'XDG_CURRENT_DESKTOP': 'X-Cinnamon'}, _LinuxDesktopEnvironment.CINNAMON),
+            ({'XDG_CURRENT_DESKTOP': 'Deepin'}, _LinuxDesktopEnvironment.DEEPIN),
             ({'XDG_CURRENT_DESKTOP': 'GNOME'}, _LinuxDesktopEnvironment.GNOME),
             ({'XDG_CURRENT_DESKTOP': 'GNOME:GNOME-Classic'}, _LinuxDesktopEnvironment.GNOME),
             ({'XDG_CURRENT_DESKTOP': 'GNOME : GNOME-Classic'}, _LinuxDesktopEnvironment.GNOME),
 
             ({'XDG_CURRENT_DESKTOP': 'Unity', 'DESKTOP_SESSION': 'gnome-fallback'}, _LinuxDesktopEnvironment.GNOME),
-            ({'XDG_CURRENT_DESKTOP': 'KDE', 'KDE_SESSION_VERSION': '5'}, _LinuxDesktopEnvironment.KDE),
-            ({'XDG_CURRENT_DESKTOP': 'KDE'}, _LinuxDesktopEnvironment.KDE),
+            ({'XDG_CURRENT_DESKTOP': 'KDE', 'KDE_SESSION_VERSION': '5'}, _LinuxDesktopEnvironment.KDE5),
+            ({'XDG_CURRENT_DESKTOP': 'KDE', 'KDE_SESSION_VERSION': '6'}, _LinuxDesktopEnvironment.KDE6),
+            ({'XDG_CURRENT_DESKTOP': 'KDE'}, _LinuxDesktopEnvironment.KDE4),
             ({'XDG_CURRENT_DESKTOP': 'Pantheon'}, _LinuxDesktopEnvironment.PANTHEON),
+            ({'XDG_CURRENT_DESKTOP': 'UKUI'}, _LinuxDesktopEnvironment.UKUI),
             ({'XDG_CURRENT_DESKTOP': 'Unity'}, _LinuxDesktopEnvironment.UNITY),
             ({'XDG_CURRENT_DESKTOP': 'Unity:Unity7'}, _LinuxDesktopEnvironment.UNITY),
             ({'XDG_CURRENT_DESKTOP': 'Unity:Unity8'}, _LinuxDesktopEnvironment.UNITY),
         ]
 
         for env, expected_desktop_environment in test_cases:
-            self.assertEqual(_get_linux_desktop_environment(env), expected_desktop_environment)
+            self.assertEqual(_get_linux_desktop_environment(env, Logger()), expected_desktop_environment)
 
     def test_chrome_cookie_decryptor_linux_derive_key(self):
        key = LinuxChromeCookieDecryptor.derive_key(b'abc')
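
The cookie extraction code now tells KDE 3/4/5/6 apart (the wallet service differs per major version), which is why `_get_linux_desktop_environment` grew a logger argument and the `KDE_SESSION_VERSION` cases above. A simplified sketch of the disambiguation, not the exact yt-dlp implementation:

    # Sketch: version-aware KDE detection from environment variables.
    def detect_kde(env):
        if env.get('XDG_CURRENT_DESKTOP') == 'KDE':
            version = env.get('KDE_SESSION_VERSION')
            if version == '6':
                return 'KDE6'
            if version == '5':
                return 'KDE5'
            return 'KDE4'  # the new default asserted in the test above
        if env.get('DESKTOP_SESSION') == 'kde4':
            return 'KDE4'
        if env.get('DESKTOP_SESSION') == 'kde' or env.get('KDE_FULL_SESSION'):
            return 'KDE3'  # legacy sessions
        return None

    assert detect_kde({'XDG_CURRENT_DESKTOP': 'KDE', 'KDE_SESSION_VERSION': '6'}) == 'KDE6'
    assert detect_kde({'KDE_FULL_SESSION': 1}) == 'KDE3'
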
test/test_download.py

@@ -10,10 +10,7 @@
 
 import collections
 import hashlib
-import http.client
 import json
-import socket
-import urllib.error
 
 from test.helper import (
     assertGreaterEqual,
@@ -29,10 +26,12 @@
 
 import yt_dlp.YoutubeDL  # isort: split
 from yt_dlp.extractor import get_info_extractor
+from yt_dlp.networking.exceptions import HTTPError, TransportError
 from yt_dlp.utils import (
     DownloadError,
     ExtractorError,
     UnavailableVideoError,
+    YoutubeDLError,
     format_bytes,
     join_nonempty,
 )
@@ -102,6 +101,8 @@ def print_skipping(reason):
             print_skipping('IE marked as not _WORKING')
 
         for tc in test_cases:
+            if tc.get('expected_exception'):
+                continue
            info_dict = tc.get('info_dict', {})
            params = tc.get('params', {})
            if not info_dict.get('id'):
@@ -141,6 +142,17 @@ def get_tc_filename(tc):
 
        res_dict = None
 
+        def match_exception(err):
+            expected_exception = test_case.get('expected_exception')
+            if not expected_exception:
+                return False
+            if err.__class__.__name__ == expected_exception:
+                return True
+            for exc in err.exc_info:
+                if exc.__class__.__name__ == expected_exception:
+                    return True
+            return False
+
        def try_rm_tcs_files(tcs=None):
            if tcs is None:
                tcs = test_cases
@@ -162,8 +174,9 @@ def try_rm_tcs_files(tcs=None):
                        force_generic_extractor=params.get('force_generic_extractor', False))
                except (DownloadError, ExtractorError) as err:
                    # Check if the exception is not a network related one
-                    if (err.exc_info[0] not in (urllib.error.URLError, socket.timeout, UnavailableVideoError, http.client.BadStatusLine)
-                            or (err.exc_info[0] == urllib.error.HTTPError and err.exc_info[1].code == 503)):
+                    if not isinstance(err.exc_info[1], (TransportError, UnavailableVideoError)) or (isinstance(err.exc_info[1], HTTPError) and err.exc_info[1].status == 503):
+                        if match_exception(err):
+                            return
                        err.msg = f'{getattr(err, "msg", err)} ({tname})'
                        raise
 
@@ -174,6 +187,10 @@ def try_rm_tcs_files(tcs=None):
                        print(f'Retrying: {try_num} failed tries\n\n##########\n\n')
 
                    try_num += 1
+                except YoutubeDLError as err:
+                    if match_exception(err):
+                        return
+                    raise
                else:
                    break
 
@@ -249,7 +266,7 @@ def try_rm_tcs_files(tcs=None):
                # extractor returns full results even with extract_flat
                res_tcs = [{'info_dict': e} for e in res_dict['entries']]
                try_rm_tcs_files(res_tcs)
-
+        ydl.close()
 
        return test_template
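
`expected_exception` turns a download test into a negative test: extraction must fail with the named error, matched by class name against the wrapping `YoutubeDLError` and everything in its `exc_info`. A sketch of how an extractor test case would opt in (field values illustrative):

    # Sketch: a _TESTS entry that expects extraction to raise instead of download.
    _TESTS = [{
        'url': 'https://example.com/removed-video',       # illustrative URL
        'info_dict': {'id': 'removed-video', 'ext': 'mp4'},
        'expected_exception': 'ExtractorError',           # matched by class name
    }]
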
139
test/test_downloader_external.py
Normal file

@@ -0,0 +1,139 @@
+#!/usr/bin/env python3
+
+# Allow direct execution
+import os
+import sys
+import unittest
+
+sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
+
+import http.cookiejar
+
+from test.helper import FakeYDL
+from yt_dlp.downloader.external import (
+    Aria2cFD,
+    AxelFD,
+    CurlFD,
+    FFmpegFD,
+    HttpieFD,
+    WgetFD,
+)
+
+TEST_COOKIE = {
+    'version': 0,
+    'name': 'test',
+    'value': 'ytdlp',
+    'port': None,
+    'port_specified': False,
+    'domain': '.example.com',
+    'domain_specified': True,
+    'domain_initial_dot': False,
+    'path': '/',
+    'path_specified': True,
+    'secure': False,
+    'expires': None,
+    'discard': False,
+    'comment': None,
+    'comment_url': None,
+    'rest': {},
+}
+
+TEST_INFO = {'url': 'http://www.example.com/'}
+
+
+class TestHttpieFD(unittest.TestCase):
+    def test_make_cmd(self):
+        with FakeYDL() as ydl:
+            downloader = HttpieFD(ydl, {})
+            self.assertEqual(
+                downloader._make_cmd('test', TEST_INFO),
+                ['http', '--download', '--output', 'test', 'http://www.example.com/'])
+
+            # Test cookie header is added
+            ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
+            self.assertEqual(
+                downloader._make_cmd('test', TEST_INFO),
+                ['http', '--download', '--output', 'test', 'http://www.example.com/', 'Cookie:test=ytdlp'])
+
+
+class TestAxelFD(unittest.TestCase):
+    def test_make_cmd(self):
+        with FakeYDL() as ydl:
+            downloader = AxelFD(ydl, {})
+            self.assertEqual(
+                downloader._make_cmd('test', TEST_INFO),
+                ['axel', '-o', 'test', '--', 'http://www.example.com/'])
+
+            # Test cookie header is added
+            ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
+            self.assertEqual(
+                downloader._make_cmd('test', TEST_INFO),
+                ['axel', '-o', 'test', '-H', 'Cookie: test=ytdlp', '--max-redirect=0', '--', 'http://www.example.com/'])
+
+
+class TestWgetFD(unittest.TestCase):
+    def test_make_cmd(self):
+        with FakeYDL() as ydl:
+            downloader = WgetFD(ydl, {})
+            self.assertNotIn('--load-cookies', downloader._make_cmd('test', TEST_INFO))
+            # Test cookiejar tempfile arg is added
+            ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
+            self.assertIn('--load-cookies', downloader._make_cmd('test', TEST_INFO))
+
+
+class TestCurlFD(unittest.TestCase):
+    def test_make_cmd(self):
+        with FakeYDL() as ydl:
+            downloader = CurlFD(ydl, {})
+            self.assertNotIn('--cookie', downloader._make_cmd('test', TEST_INFO))
+            # Test cookie header is added
+            ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
+            self.assertIn('--cookie', downloader._make_cmd('test', TEST_INFO))
+            self.assertIn('test=ytdlp', downloader._make_cmd('test', TEST_INFO))
+
+
+class TestAria2cFD(unittest.TestCase):
+    def test_make_cmd(self):
+        with FakeYDL() as ydl:
+            downloader = Aria2cFD(ydl, {})
+            downloader._make_cmd('test', TEST_INFO)
+            self.assertFalse(hasattr(downloader, '_cookies_tempfile'))
+
+            # Test cookiejar tempfile arg is added
+            ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
+            cmd = downloader._make_cmd('test', TEST_INFO)
+            self.assertIn(f'--load-cookies={downloader._cookies_tempfile}', cmd)
+
+
+@unittest.skipUnless(FFmpegFD.available(), 'ffmpeg not found')
+class TestFFmpegFD(unittest.TestCase):
+    _args = []
+
+    def _test_cmd(self, args):
+        self._args = args
+
+    def test_make_cmd(self):
+        with FakeYDL() as ydl:
+            downloader = FFmpegFD(ydl, {})
+            downloader._debug_cmd = self._test_cmd
+
+            downloader._call_downloader('test', {**TEST_INFO, 'ext': 'mp4'})
+            self.assertEqual(self._args, [
+                'ffmpeg', '-y', '-hide_banner', '-i', 'http://www.example.com/',
+                '-c', 'copy', '-f', 'mp4', 'file:test'])
+
+            # Test cookies arg is added
+            ydl.cookiejar.set_cookie(http.cookiejar.Cookie(**TEST_COOKIE))
+            downloader._call_downloader('test', {**TEST_INFO, 'ext': 'mp4'})
+            self.assertEqual(self._args, [
+                'ffmpeg', '-y', '-hide_banner', '-cookies', 'test=ytdlp; path=/; domain=.example.com;\r\n',
+                '-i', 'http://www.example.com/', '-c', 'copy', '-f', 'mp4', 'file:test'])
+
+            # Test with non-url input (ffmpeg reads from stdin '-' for websockets)
+            downloader._call_downloader('test', {'url': 'x', 'ext': 'mp4'})
+            self.assertEqual(self._args, [
+                'ffmpeg', '-y', '-hide_banner', '-i', 'x', '-c', 'copy', '-f', 'mp4', 'file:test'])
+
+
+if __name__ == '__main__':
+    unittest.main()
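
Each downloader above receives cookies in its native dialect: httpie/axel as a header argument, curl via `--cookie`, wget/aria2c via a `--load-cookies` tempfile, ffmpeg via `-cookies`. A sketch of inspecting one generated command line outside the test class, reusing the `TEST_COOKIE` shape:

    # Sketch: watch CurlFD fold a jar cookie into its argv.
    import http.cookiejar

    from test.helper import FakeYDL  # test-suite helper, as used above
    from yt_dlp.downloader.external import CurlFD

    with FakeYDL() as ydl:
        ydl.cookiejar.set_cookie(http.cookiejar.Cookie(
            0, 'test', 'ytdlp', None, False, '.example.com', True, False,
            '/', True, False, None, False, None, None, {}))
        print(CurlFD(ydl, {})._make_cmd('out.mp4', {'url': 'http://www.example.com/'}))
        # expected to contain: ['--cookie', 'test=ytdlp', ...]
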
test/test_downloader_http.py

@@ -16,6 +16,7 @@
 from yt_dlp import YoutubeDL
 from yt_dlp.downloader.http import HttpFD
 from yt_dlp.utils import encodeFilename
+from yt_dlp.utils._utils import _YDLLogger as FakeLogger
 
 TEST_DIR = os.path.dirname(os.path.abspath(__file__))
 
@@ -67,17 +68,6 @@ def do_GET(self):
             assert False
 
 
-class FakeLogger:
-    def debug(self, msg):
-        pass
-
-    def warning(self, msg):
-        pass
-
-    def error(self, msg):
-        pass
-
-
 class TestHttpFD(unittest.TestCase):
     def setUp(self):
        self.httpd = http.server.HTTPServer(
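
Dropping the local no-op `FakeLogger` in favour of `_YDLLogger` means the HTTP downloader tests now log through the same adapter yt-dlp uses internally. A simplified sketch of the adapter idea; the real class lives in `yt_dlp.utils._utils` and wraps a `YoutubeDL` instance:

    # Sketch: a logger adapter that is a silent no-op when nothing is wrapped.
    class YDLLoggerSketch:
        def __init__(self, ydl=None):
            self._ydl = ydl  # None -> every channel becomes a no-op

        def debug(self, message):
            if self._ydl:
                self._ydl.write_debug(message)

        def warning(self, message):
            if self._ydl:
                self._ydl.report_warning(message)

        def error(self, message):
            if self._ydl:
                self._ydl.report_error(message)
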
test/test_http.py

@@ -1,192 +0,0 @@
-#!/usr/bin/env python3
-
-# Allow direct execution
-import os
-import sys
-import unittest
-
-sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
-
-
-import http.server
-import ssl
-import threading
-import urllib.request
-
-from test.helper import http_server_port
-from yt_dlp import YoutubeDL
-
-TEST_DIR = os.path.dirname(os.path.abspath(__file__))
-
-
-class HTTPTestRequestHandler(http.server.BaseHTTPRequestHandler):
-    def log_message(self, format, *args):
-        pass
-
-    def do_GET(self):
-        if self.path == '/video.html':
-            self.send_response(200)
-            self.send_header('Content-Type', 'text/html; charset=utf-8')
-            self.end_headers()
-            self.wfile.write(b'<html><video src="/vid.mp4" /></html>')
-        elif self.path == '/vid.mp4':
-            self.send_response(200)
-            self.send_header('Content-Type', 'video/mp4')
-            self.end_headers()
-            self.wfile.write(b'\x00\x00\x00\x00\x20\x66\x74[video]')
-        elif self.path == '/%E4%B8%AD%E6%96%87.html':
-            self.send_response(200)
-            self.send_header('Content-Type', 'text/html; charset=utf-8')
-            self.end_headers()
-            self.wfile.write(b'<html><video src="/vid.mp4" /></html>')
-        else:
-            assert False
-
-
-class FakeLogger:
-    def debug(self, msg):
-        pass
-
-    def warning(self, msg):
-        pass
-
-    def error(self, msg):
-        pass
-
-
-class TestHTTP(unittest.TestCase):
-    def setUp(self):
-        self.httpd = http.server.HTTPServer(
-            ('127.0.0.1', 0), HTTPTestRequestHandler)
-        self.port = http_server_port(self.httpd)
-        self.server_thread = threading.Thread(target=self.httpd.serve_forever)
-        self.server_thread.daemon = True
-        self.server_thread.start()
-
-
-class TestHTTPS(unittest.TestCase):
-    def setUp(self):
-        certfn = os.path.join(TEST_DIR, 'testcert.pem')
-        self.httpd = http.server.HTTPServer(
-            ('127.0.0.1', 0), HTTPTestRequestHandler)
-        sslctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
-        sslctx.load_cert_chain(certfn, None)
-        self.httpd.socket = sslctx.wrap_socket(self.httpd.socket, server_side=True)
-        self.port = http_server_port(self.httpd)
-        self.server_thread = threading.Thread(target=self.httpd.serve_forever)
-        self.server_thread.daemon = True
-        self.server_thread.start()
-
-    def test_nocheckcertificate(self):
-        ydl = YoutubeDL({'logger': FakeLogger()})
-        self.assertRaises(
-            Exception,
-            ydl.extract_info, 'https://127.0.0.1:%d/video.html' % self.port)
-
-        ydl = YoutubeDL({'logger': FakeLogger(), 'nocheckcertificate': True})
-        r = ydl.extract_info('https://127.0.0.1:%d/video.html' % self.port)
-        self.assertEqual(r['url'], 'https://127.0.0.1:%d/vid.mp4' % self.port)
-
-
-class TestClientCert(unittest.TestCase):
-    def setUp(self):
-        certfn = os.path.join(TEST_DIR, 'testcert.pem')
-        self.certdir = os.path.join(TEST_DIR, 'testdata', 'certificate')
-        cacertfn = os.path.join(self.certdir, 'ca.crt')
-        self.httpd = http.server.HTTPServer(('127.0.0.1', 0), HTTPTestRequestHandler)
-        sslctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
-        sslctx.verify_mode = ssl.CERT_REQUIRED
-        sslctx.load_verify_locations(cafile=cacertfn)
-        sslctx.load_cert_chain(certfn, None)
-        self.httpd.socket = sslctx.wrap_socket(self.httpd.socket, server_side=True)
-        self.port = http_server_port(self.httpd)
-        self.server_thread = threading.Thread(target=self.httpd.serve_forever)
-        self.server_thread.daemon = True
-        self.server_thread.start()
-
-    def _run_test(self, **params):
-        ydl = YoutubeDL({
-            'logger': FakeLogger(),
-            # Disable client-side validation of unacceptable self-signed testcert.pem
-            # The test is of a check on the server side, so unaffected
-            'nocheckcertificate': True,
-            **params,
-        })
-        r = ydl.extract_info('https://127.0.0.1:%d/video.html' % self.port)
-        self.assertEqual(r['url'], 'https://127.0.0.1:%d/vid.mp4' % self.port)
-
-    def test_certificate_combined_nopass(self):
-        self._run_test(client_certificate=os.path.join(self.certdir, 'clientwithkey.crt'))
-
-    def test_certificate_nocombined_nopass(self):
-        self._run_test(client_certificate=os.path.join(self.certdir, 'client.crt'),
-                       client_certificate_key=os.path.join(self.certdir, 'client.key'))
-
-    def test_certificate_combined_pass(self):
-        self._run_test(client_certificate=os.path.join(self.certdir, 'clientwithencryptedkey.crt'),
-                       client_certificate_password='foobar')
-
-    def test_certificate_nocombined_pass(self):
-        self._run_test(client_certificate=os.path.join(self.certdir, 'client.crt'),
-                       client_certificate_key=os.path.join(self.certdir, 'clientencrypted.key'),
-                       client_certificate_password='foobar')
-
-
-def _build_proxy_handler(name):
-    class HTTPTestRequestHandler(http.server.BaseHTTPRequestHandler):
-        proxy_name = name
-
-        def log_message(self, format, *args):
-            pass
-
-        def do_GET(self):
-            self.send_response(200)
-            self.send_header('Content-Type', 'text/plain; charset=utf-8')
-            self.end_headers()
-            self.wfile.write(f'{self.proxy_name}: {self.path}'.encode())
-    return HTTPTestRequestHandler
-
-
-class TestProxy(unittest.TestCase):
-    def setUp(self):
-        self.proxy = http.server.HTTPServer(
-            ('127.0.0.1', 0), _build_proxy_handler('normal'))
-        self.port = http_server_port(self.proxy)
-        self.proxy_thread = threading.Thread(target=self.proxy.serve_forever)
-        self.proxy_thread.daemon = True
-        self.proxy_thread.start()
-
-        self.geo_proxy = http.server.HTTPServer(
-            ('127.0.0.1', 0), _build_proxy_handler('geo'))
-        self.geo_port = http_server_port(self.geo_proxy)
-        self.geo_proxy_thread = threading.Thread(target=self.geo_proxy.serve_forever)
-        self.geo_proxy_thread.daemon = True
-        self.geo_proxy_thread.start()
-
-    def test_proxy(self):
-        geo_proxy = f'127.0.0.1:{self.geo_port}'
-        ydl = YoutubeDL({
-            'proxy': f'127.0.0.1:{self.port}',
-            'geo_verification_proxy': geo_proxy,
-        })
-        url = 'http://foo.com/bar'
-        response = ydl.urlopen(url).read().decode()
-        self.assertEqual(response, f'normal: {url}')
-
-        req = urllib.request.Request(url)
-        req.add_header('Ytdl-request-proxy', geo_proxy)
-        response = ydl.urlopen(req).read().decode()
-        self.assertEqual(response, f'geo: {url}')
-
-    def test_proxy_with_idn(self):
-        ydl = YoutubeDL({
-            'proxy': f'127.0.0.1:{self.port}',
-        })
-        url = 'http://中文.tw/'
-        response = ydl.urlopen(url).read().decode()
-        # b'xn--fiq228c' is '中文'.encode('idna')
-        self.assertEqual(response, 'normal: http://xn--fiq228c.tw/')
-
-
-if __name__ == '__main__':
-    unittest.main()
test/test_jsinterp.py

@@ -8,410 +8,372 @@
 sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
 
 import math
-import re
 
 from yt_dlp.jsinterp import JS_Undefined, JSInterpreter
 
 
+class NaN:
+    pass
+
+
 class TestJSInterpreter(unittest.TestCase):
+    def _test(self, jsi_or_code, expected, func='f', args=()):
+        if isinstance(jsi_or_code, str):
+            jsi_or_code = JSInterpreter(jsi_or_code)
+        got = jsi_or_code.call_function(func, *args)
+        if expected is NaN:
+            self.assertTrue(math.isnan(got), f'{got} is not NaN')
+        else:
+            self.assertEqual(got, expected)
+
     def test_basic(self):
-        jsi = JSInterpreter('function x(){;}')
-        self.assertEqual(jsi.call_function('x'), None)
+        jsi = JSInterpreter('function f(){;}')
+        self.assertEqual(repr(jsi.extract_function('f')), 'F<f>')
+        self._test(jsi, None)
 
-        jsi = JSInterpreter('function x3(){return 42;}')
-        self.assertEqual(jsi.call_function('x3'), 42)
+        self._test('function f(){return 42;}', 42)
+        self._test('function f(){42}', None)
+        self._test('var f = function(){return 42;}', 42)
 
-        jsi = JSInterpreter('function x3(){42}')
-        self.assertEqual(jsi.call_function('x3'), None)
+    def test_add(self):
+        self._test('function f(){return 42 + 7;}', 49)
+        self._test('function f(){return 42 + undefined;}', NaN)
+        self._test('function f(){return 42 + null;}', 42)
 
-        jsi = JSInterpreter('var x5 = function(){return 42;}')
-        self.assertEqual(jsi.call_function('x5'), 42)
+    def test_sub(self):
+        self._test('function f(){return 42 - 7;}', 35)
+        self._test('function f(){return 42 - undefined;}', NaN)
+        self._test('function f(){return 42 - null;}', 42)
+
+    def test_mul(self):
+        self._test('function f(){return 42 * 7;}', 294)
+        self._test('function f(){return 42 * undefined;}', NaN)
+        self._test('function f(){return 42 * null;}', 0)
+
+    def test_div(self):
+        jsi = JSInterpreter('function f(a, b){return a / b;}')
+        self._test(jsi, NaN, args=(0, 0))
+        self._test(jsi, NaN, args=(JS_Undefined, 1))
+        self._test(jsi, float('inf'), args=(2, 0))
+        self._test(jsi, 0, args=(0, 3))
+
+    def test_mod(self):
+        self._test('function f(){return 42 % 7;}', 0)
+        self._test('function f(){return 42 % 0;}', NaN)
+        self._test('function f(){return 42 % undefined;}', NaN)
+
+    def test_exp(self):
+        self._test('function f(){return 42 ** 2;}', 1764)
+        self._test('function f(){return 42 ** undefined;}', NaN)
+        self._test('function f(){return 42 ** null;}', 1)
+        self._test('function f(){return undefined ** 42;}', NaN)
 
     def test_calc(self):
-        jsi = JSInterpreter('function x4(a){return 2*a+1;}')
-        self.assertEqual(jsi.call_function('x4', 3), 7)
+        self._test('function f(a){return 2*a+1;}', 7, args=[3])
 
     def test_empty_return(self):
-        jsi = JSInterpreter('function f(){return; y()}')
-        self.assertEqual(jsi.call_function('f'), None)
+        self._test('function f(){return; y()}', None)
 
     def test_morespace(self):
-        jsi = JSInterpreter('function x (a) { return 2 * a + 1 ; }')
-        self.assertEqual(jsi.call_function('x', 3), 7)
-
-        jsi = JSInterpreter('function f () { x = 2 ; return x; }')
-        self.assertEqual(jsi.call_function('f'), 2)
+        self._test('function f (a) { return 2 * a + 1 ; }', 7, args=[3])
+        self._test('function f () { x = 2 ; return x; }', 2)
 
     def test_strange_chars(self):
-        jsi = JSInterpreter('function $_xY1 ($_axY1) { var $_axY2 = $_axY1 + 1; return $_axY2; }')
-        self.assertEqual(jsi.call_function('$_xY1', 20), 21)
+        self._test('function $_xY1 ($_axY1) { var $_axY2 = $_axY1 + 1; return $_axY2; }',
+                   21, args=[20], func='$_xY1')
 
     def test_operators(self):
-        jsi = JSInterpreter('function f(){return 1 << 5;}')
-        self.assertEqual(jsi.call_function('f'), 32)
-
-        jsi = JSInterpreter('function f(){return 2 ** 5}')
-        self.assertEqual(jsi.call_function('f'), 32)
-
-        jsi = JSInterpreter('function f(){return 19 & 21;}')
-        self.assertEqual(jsi.call_function('f'), 17)
-
-        jsi = JSInterpreter('function f(){return 11 >> 2;}')
-        self.assertEqual(jsi.call_function('f'), 2)
-
-        jsi = JSInterpreter('function f(){return []? 2+3: 4;}')
-        self.assertEqual(jsi.call_function('f'), 5)
-
-        jsi = JSInterpreter('function f(){return 1 == 2}')
-        self.assertEqual(jsi.call_function('f'), False)
-
-        jsi = JSInterpreter('function f(){return 0 && 1 || 2;}')
-        self.assertEqual(jsi.call_function('f'), 2)
-
-        jsi = JSInterpreter('function f(){return 0 ?? 42;}')
-        self.assertEqual(jsi.call_function('f'), 0)
-
-        jsi = JSInterpreter('function f(){return "life, the universe and everything" < 42;}')
-        self.assertFalse(jsi.call_function('f'))
+        self._test('function f(){return 1 << 5;}', 32)
+        self._test('function f(){return 2 ** 5}', 32)
+        self._test('function f(){return 19 & 21;}', 17)
+        self._test('function f(){return 11 >> 2;}', 2)
+        self._test('function f(){return []? 2+3: 4;}', 5)
+        self._test('function f(){return 1 == 2}', False)
+        self._test('function f(){return 0 && 1 || 2;}', 2)
+        self._test('function f(){return 0 ?? 42;}', 0)
+        self._test('function f(){return "life, the universe and everything" < 42;}', False)
 
     def test_array_access(self):
-        jsi = JSInterpreter('function f(){var x = [1,2,3]; x[0] = 4; x[0] = 5; x[2.0] = 7; return x;}')
-        self.assertEqual(jsi.call_function('f'), [5, 2, 7])
+        self._test('function f(){var x = [1,2,3]; x[0] = 4; x[0] = 5; x[2.0] = 7; return x;}', [5, 2, 7])
 
     def test_parens(self):
-        jsi = JSInterpreter('function f(){return (1) + (2) * ((( (( (((((3)))))) )) ));}')
-        self.assertEqual(jsi.call_function('f'), 7)
-
-        jsi = JSInterpreter('function f(){return (1 + 2) * 3;}')
-        self.assertEqual(jsi.call_function('f'), 9)
+        self._test('function f(){return (1) + (2) * ((( (( (((((3)))))) )) ));}', 7)
+        self._test('function f(){return (1 + 2) * 3;}', 9)
 
     def test_quotes(self):
-        jsi = JSInterpreter(R'function f(){return "a\"\\("}')
-        self.assertEqual(jsi.call_function('f'), R'a"\(')
+        self._test(R'function f(){return "a\"\\("}', R'a"\(')
 
     def test_assignments(self):
-        jsi = JSInterpreter('function f(){var x = 20; x = 30 + 1; return x;}')
-        self.assertEqual(jsi.call_function('f'), 31)
-
-        jsi = JSInterpreter('function f(){var x = 20; x += 30 + 1; return x;}')
-        self.assertEqual(jsi.call_function('f'), 51)
-
-        jsi = JSInterpreter('function f(){var x = 20; x -= 30 + 1; return x;}')
-        self.assertEqual(jsi.call_function('f'), -11)
+        self._test('function f(){var x = 20; x = 30 + 1; return x;}', 31)
+        self._test('function f(){var x = 20; x += 30 + 1; return x;}', 51)
+        self._test('function f(){var x = 20; x -= 30 + 1; return x;}', -11)
 
+    @unittest.skip('Not implemented')
     def test_comments(self):
-        'Skipping: Not yet fully implemented'
-        return
-        jsi = JSInterpreter('''
-            function x() {
-                var x = /* 1 + */ 2;
-                var y = /* 30
-                * 40 */ 50;
-                return x + y;
-            }
-        ''')
-        self.assertEqual(jsi.call_function('x'), 52)
+        self._test('''
+            function f() {
+                var x = /* 1 + */ 2;
+                var y = /* 30
+                * 40 */ 50;
+                return x + y;
+            }
+        ''', 52)
 
-        jsi = JSInterpreter('''
-            function f() {
-                var x = "/*";
-                var y = 1 /* comment */ + 2;
-                return y;
-            }
-        ''')
-        self.assertEqual(jsi.call_function('f'), 3)
+        self._test('''
+            function f() {
+                var x = "/*";
+                var y = 1 /* comment */ + 2;
+                return y;
+            }
+        ''', 3)
 
     def test_precedence(self):
-        jsi = JSInterpreter('''
-            function x() {
-                var a = [10, 20, 30, 40, 50];
-                var b = 6;
-                a[0]=a[b%a.length];
-                return a;
-            }''')
-        self.assertEqual(jsi.call_function('x'), [20, 20, 30, 40, 50])
+        self._test('''
+            function f() {
+                var a = [10, 20, 30, 40, 50];
+                var b = 6;
+                a[0]=a[b%a.length];
+                return a;
+            }
+        ''', [20, 20, 30, 40, 50])
 
     def test_builtins(self):
-        jsi = JSInterpreter('''
-            function x() { return NaN }
-        ''')
-        self.assertTrue(math.isnan(jsi.call_function('x')))
+        self._test('function f() { return NaN }', NaN)
 
-        jsi = JSInterpreter('''
-            function x() { return new Date('Wednesday 31 December 1969 18:01:26 MDT') - 0; }
-        ''')
-        self.assertEqual(jsi.call_function('x'), 86000)
-        jsi = JSInterpreter('''
-            function x(dt) { return new Date(dt) - 0; }
+    def test_date(self):
+        self._test('function f() { return new Date("Wednesday 31 December 1969 18:01:26 MDT") - 0; }', 86000)
+
+        jsi = JSInterpreter('function f(dt) { return new Date(dt) - 0; }')
+        self._test(jsi, 86000, args=['Wednesday 31 December 1969 18:01:26 MDT'])
|
self._test(jsi, 86000, args=['12/31/1969 18:01:26 MDT']) # m/d/y
|
||||||
''')
|
self._test(jsi, 0, args=['1 January 1970 00:00:00 UTC'])
|
||||||
self.assertEqual(jsi.call_function('x', 'Wednesday 31 December 1969 18:01:26 MDT'), 86000)
|
|
||||||
|
|
||||||
def test_call(self):
|
def test_call(self):
|
||||||
jsi = JSInterpreter('''
|
jsi = JSInterpreter('''
|
||||||
function x() { return 2; }
|
function x() { return 2; }
|
||||||
function y(a) { return x() + (a?a:0); }
|
function y(a) { return x() + (a?a:0); }
|
||||||
function z() { return y(3); }
|
function z() { return y(3); }
|
||||||
''')
|
''')
|
||||||
self.assertEqual(jsi.call_function('z'), 5)
|
self._test(jsi, 5, func='z')
|
||||||
self.assertEqual(jsi.call_function('y'), 2)
|
self._test(jsi, 2, func='y')
|
||||||
|
|
||||||
|
def test_if(self):
|
||||||
|
self._test('''
|
||||||
|
function f() {
|
||||||
|
let a = 9;
|
||||||
|
if (0==0) {a++}
|
||||||
|
return a
|
||||||
|
}
|
||||||
|
''', 10)
|
||||||
|
|
||||||
|
self._test('''
|
||||||
|
function f() {
|
||||||
|
if (0==0) {return 10}
|
||||||
|
}
|
||||||
|
''', 10)
|
||||||
|
|
||||||
|
self._test('''
|
||||||
|
function f() {
|
||||||
|
if (0!=0) {return 1}
|
||||||
|
else {return 10}
|
||||||
|
}
|
||||||
|
''', 10)
|
||||||
|
|
||||||
|
""" # Unsupported
|
||||||
|
self._test('''
|
||||||
|
function f() {
|
||||||
|
if (0!=0) {return 1}
|
||||||
|
else if (1==0) {return 2}
|
||||||
|
else {return 10}
|
||||||
|
}
|
||||||
|
''', 10)
|
||||||
|
"""
|
||||||
|
|
||||||
def test_for_loop(self):
|
def test_for_loop(self):
|
||||||
jsi = JSInterpreter('''
|
self._test('function f() { a=0; for (i=0; i-10; i++) {a++} return a }', 10)
|
||||||
function x() { a=0; for (i=0; i-10; i++) {a++} return a }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), 10)
|
|
||||||
|
|
||||||
def test_switch(self):
|
def test_switch(self):
|
||||||
jsi = JSInterpreter('''
|
jsi = JSInterpreter('''
|
||||||
function x(f) { switch(f){
|
function f(x) { switch(x){
|
||||||
case 1:f+=1;
|
case 1:x+=1;
|
||||||
case 2:f+=2;
|
case 2:x+=2;
|
||||||
case 3:f+=3;break;
|
case 3:x+=3;break;
|
||||||
case 4:f+=4;
|
case 4:x+=4;
|
||||||
default:f=0;
|
default:x=0;
|
||||||
} return f }
|
} return x }
|
||||||
''')
|
''')
|
||||||
self.assertEqual(jsi.call_function('x', 1), 7)
|
self._test(jsi, 7, args=[1])
|
||||||
self.assertEqual(jsi.call_function('x', 3), 6)
|
self._test(jsi, 6, args=[3])
|
||||||
self.assertEqual(jsi.call_function('x', 5), 0)
|
self._test(jsi, 0, args=[5])
|
||||||
|
|
||||||
def test_switch_default(self):
|
def test_switch_default(self):
|
||||||
jsi = JSInterpreter('''
|
jsi = JSInterpreter('''
|
||||||
function x(f) { switch(f){
|
function f(x) { switch(x){
|
||||||
case 2: f+=2;
|
case 2: x+=2;
|
||||||
default: f-=1;
|
default: x-=1;
|
||||||
case 5:
|
case 5:
|
||||||
case 6: f+=6;
|
case 6: x+=6;
|
||||||
case 0: break;
|
case 0: break;
|
||||||
case 1: f+=1;
|
case 1: x+=1;
|
||||||
} return f }
|
} return x }
|
||||||
''')
|
''')
|
||||||
self.assertEqual(jsi.call_function('x', 1), 2)
|
self._test(jsi, 2, args=[1])
|
||||||
self.assertEqual(jsi.call_function('x', 5), 11)
|
self._test(jsi, 11, args=[5])
|
||||||
self.assertEqual(jsi.call_function('x', 9), 14)
|
self._test(jsi, 14, args=[9])
|
||||||
|
|
||||||
def test_try(self):
|
def test_try(self):
|
||||||
jsi = JSInterpreter('''
|
self._test('function f() { try{return 10} catch(e){return 5} }', 10)
|
||||||
function x() { try{return 10} catch(e){return 5} }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), 10)
|
|
||||||
|
|
||||||
def test_catch(self):
|
def test_catch(self):
|
||||||
jsi = JSInterpreter('''
|
self._test('function f() { try{throw 10} catch(e){return 5} }', 5)
|
||||||
function x() { try{throw 10} catch(e){return 5} }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), 5)
|
|
||||||
|
|
||||||
def test_finally(self):
|
def test_finally(self):
|
||||||
jsi = JSInterpreter('''
|
self._test('function f() { try{throw 10} finally {return 42} }', 42)
|
||||||
function x() { try{throw 10} finally {return 42} }
|
self._test('function f() { try{throw 10} catch(e){return 5} finally {return 42} }', 42)
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), 42)
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { try{throw 10} catch(e){return 5} finally {return 42} }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), 42)
|
|
||||||
|
|
||||||
def test_nested_try(self):
|
def test_nested_try(self):
|
||||||
jsi = JSInterpreter('''
|
self._test('''
|
||||||
function x() {try {
|
function f() {try {
|
||||||
try{throw 10} finally {throw 42}
|
try{throw 10} finally {throw 42}
|
||||||
} catch(e){return 5} }
|
} catch(e){return 5} }
|
||||||
''')
|
''', 5)
|
||||||
self.assertEqual(jsi.call_function('x'), 5)
|
|
||||||
|
|
||||||
def test_for_loop_continue(self):
|
def test_for_loop_continue(self):
|
||||||
jsi = JSInterpreter('''
|
self._test('function f() { a=0; for (i=0; i-10; i++) { continue; a++ } return a }', 0)
|
||||||
function x() { a=0; for (i=0; i-10; i++) { continue; a++ } return a }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), 0)
|
|
||||||
|
|
||||||
def test_for_loop_break(self):
|
def test_for_loop_break(self):
|
||||||
jsi = JSInterpreter('''
|
self._test('function f() { a=0; for (i=0; i-10; i++) { break; a++ } return a }', 0)
|
||||||
function x() { a=0; for (i=0; i-10; i++) { break; a++ } return a }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), 0)
|
|
||||||
|
|
||||||
def test_for_loop_try(self):
|
def test_for_loop_try(self):
|
||||||
jsi = JSInterpreter('''
|
self._test('''
|
||||||
function x() {
|
function f() {
|
||||||
for (i=0; i-10; i++) { try { if (i == 5) throw i} catch {return 10} finally {break} };
|
for (i=0; i-10; i++) { try { if (i == 5) throw i} catch {return 10} finally {break} };
|
||||||
return 42 }
|
return 42 }
|
||||||
''')
|
''', 42)
|
||||||
self.assertEqual(jsi.call_function('x'), 42)
|
|
||||||
|
|
||||||
def test_literal_list(self):
|
def test_literal_list(self):
|
||||||
jsi = JSInterpreter('''
|
self._test('function f() { return [1, 2, "asdf", [5, 6, 7]][3] }', [5, 6, 7])
|
||||||
function x() { return [1, 2, "asdf", [5, 6, 7]][3] }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), [5, 6, 7])
|
|
||||||
|
|
||||||
def test_comma(self):
|
def test_comma(self):
|
||||||
jsi = JSInterpreter('''
|
self._test('function f() { a=5; a -= 1, a+=3; return a }', 7)
|
||||||
function x() { a=5; a -= 1, a+=3; return a }
|
self._test('function f() { a=5; return (a -= 1, a+=3, a); }', 7)
|
||||||
''')
|
self._test('function f() { return (l=[0,1,2,3], function(a, b){return a+b})((l[1], l[2]), l[3]) }', 5)
|
||||||
self.assertEqual(jsi.call_function('x'), 7)
|
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { a=5; return (a -= 1, a+=3, a); }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), 7)
|
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { return (l=[0,1,2,3], function(a, b){return a+b})((l[1], l[2]), l[3]) }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), 5)
|
|
||||||
|
|
||||||
def test_void(self):
|
def test_void(self):
|
||||||
jsi = JSInterpreter('''
|
self._test('function f() { return void 42; }', None)
|
||||||
function x() { return void 42; }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), None)
|
|
||||||
|
|
||||||
def test_return_function(self):
|
def test_return_function(self):
|
||||||
jsi = JSInterpreter('''
|
jsi = JSInterpreter('''
|
||||||
function x() { return [1, function(){return 1}][1] }
|
function f() { return [1, function(){return 1}][1] }
|
||||||
''')
|
''')
|
||||||
self.assertEqual(jsi.call_function('x')([]), 1)
|
self.assertEqual(jsi.call_function('f')([]), 1)
|
||||||
|
|
||||||
def test_null(self):
|
def test_null(self):
|
||||||
jsi = JSInterpreter('''
|
self._test('function f() { return null; }', None)
|
||||||
function x() { return null; }
|
self._test('function f() { return [null > 0, null < 0, null == 0, null === 0]; }',
|
||||||
''')
|
[False, False, False, False])
|
||||||
self.assertEqual(jsi.call_function('x'), None)
|
self._test('function f() { return [null >= 0, null <= 0]; }', [True, True])
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { return [null > 0, null < 0, null == 0, null === 0]; }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), [False, False, False, False])
|
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { return [null >= 0, null <= 0]; }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), [True, True])
|
|
||||||
|
|
||||||
def test_undefined(self):
|
def test_undefined(self):
|
||||||
jsi = JSInterpreter('''
|
self._test('function f() { return undefined === undefined; }', True)
|
||||||
function x() { return undefined === undefined; }
|
self._test('function f() { return undefined; }', JS_Undefined)
|
||||||
''')
|
self._test('function f() {return undefined ?? 42; }', 42)
|
||||||
self.assertEqual(jsi.call_function('x'), True)
|
self._test('function f() { let v; return v; }', JS_Undefined)
|
||||||
|
self._test('function f() { let v; return v**0; }', 1)
|
||||||
|
self._test('function f() { let v; return [v>42, v<=42, v&&42, 42&&v]; }',
|
||||||
|
[False, False, JS_Undefined, JS_Undefined])
|
||||||
|
|
||||||
|
self._test('''
|
||||||
|
function f() { return [
|
||||||
|
undefined === undefined,
|
||||||
|
undefined == undefined,
|
||||||
|
undefined == null,
|
||||||
|
undefined < undefined,
|
||||||
|
undefined > undefined,
|
||||||
|
undefined === 0,
|
||||||
|
undefined == 0,
|
||||||
|
undefined < 0,
|
||||||
|
undefined > 0,
|
||||||
|
undefined >= 0,
|
||||||
|
undefined <= 0,
|
||||||
|
undefined > null,
|
||||||
|
undefined < null,
|
||||||
|
undefined === null
|
||||||
|
]; }
|
||||||
|
''', list(map(bool, (1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0))))
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
jsi = JSInterpreter('''
|
||||||
function x() { return undefined; }
|
function f() { let v; return [42+v, v+42, v**42, 42**v, 0**v]; }
|
||||||
''')
|
''')
|
||||||
self.assertEqual(jsi.call_function('x'), JS_Undefined)
|
for y in jsi.call_function('f'):
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { let v; return v; }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), JS_Undefined)
|
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { return [undefined === undefined, undefined == undefined, undefined < undefined, undefined > undefined]; }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), [True, True, False, False])
|
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { return [undefined === 0, undefined == 0, undefined < 0, undefined > 0]; }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), [False, False, False, False])
|
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { return [undefined >= 0, undefined <= 0]; }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), [False, False])
|
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { return [undefined > null, undefined < null, undefined == null, undefined === null]; }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), [False, False, True, False])
|
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { return [undefined === null, undefined == null, undefined < null, undefined > null]; }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), [False, True, False, False])
|
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { let v; return [42+v, v+42, v**42, 42**v, 0**v]; }
|
|
||||||
''')
|
|
||||||
for y in jsi.call_function('x'):
|
|
||||||
self.assertTrue(math.isnan(y))
|
self.assertTrue(math.isnan(y))
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { let v; return v**0; }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), 1)
|
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { let v; return [v>42, v<=42, v&&42, 42&&v]; }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), [False, False, JS_Undefined, JS_Undefined])
|
|
||||||
|
|
||||||
jsi = JSInterpreter('function x(){return undefined ?? 42; }')
|
|
||||||
self.assertEqual(jsi.call_function('x'), 42)
|
|
||||||
|
|
||||||
def test_object(self):
|
def test_object(self):
|
||||||
jsi = JSInterpreter('''
|
self._test('function f() { return {}; }', {})
|
||||||
function x() { return {}; }
|
self._test('function f() { let a = {m1: 42, m2: 0 }; return [a["m1"], a.m2]; }', [42, 0])
|
||||||
''')
|
self._test('function f() { let a; return a?.qq; }', JS_Undefined)
|
||||||
self.assertEqual(jsi.call_function('x'), {})
|
self._test('function f() { let a = {m1: 42, m2: 0 }; return a?.qq; }', JS_Undefined)
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { let a = {m1: 42, m2: 0 }; return [a["m1"], a.m2]; }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), [42, 0])
|
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { let a; return a?.qq; }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), JS_Undefined)
|
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
|
||||||
function x() { let a = {m1: 42, m2: 0 }; return a?.qq; }
|
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), JS_Undefined)
|
|
||||||
|
|
||||||
def test_regex(self):
|
def test_regex(self):
|
||||||
jsi = JSInterpreter('''
|
self._test('function f() { let a=/,,[/,913,/](,)}/; }', None)
|
||||||
function x() { let a=/,,[/,913,/](,)}/; }
|
self._test('function f() { let a=/,,[/,913,/](,)}/; return a; }', R'/,,[/,913,/](,)}/0')
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x'), None)
|
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
R''' # We are not compiling regex
|
||||||
function x() { let a=/,,[/,913,/](,)}/; return a; }
|
jsi = JSInterpreter('function f() { let a=/,,[/,913,/](,)}/; return a; }')
|
||||||
''')
|
self.assertIsInstance(jsi.call_function('f'), re.Pattern)
|
||||||
self.assertIsInstance(jsi.call_function('x'), re.Pattern)
|
|
||||||
|
|
||||||
jsi = JSInterpreter('''
|
jsi = JSInterpreter('function f() { let a=/,,[/,913,/](,)}/i; return a; }')
|
||||||
function x() { let a=/,,[/,913,/](,)}/i; return a; }
|
self.assertEqual(jsi.call_function('f').flags & re.I, re.I)
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x').flags & re.I, re.I)
|
|
||||||
|
|
||||||
jsi = JSInterpreter(R'''
|
jsi = JSInterpreter(R'function f() { let a=/,][}",],()}(\[)/; return a; }')
|
||||||
function x() { let a=/,][}",],()}(\[)/; return a; }
|
self.assertEqual(jsi.call_function('f').pattern, r',][}",],()}(\[)')
|
||||||
''')
|
|
||||||
self.assertEqual(jsi.call_function('x').pattern, r',][}",],()}(\[)')
|
|
||||||
|
|
||||||
jsi = JSInterpreter(R'''
|
jsi = JSInterpreter(R'function f() { let a=[/[)\\]/]; return a[0]; }')
|
||||||
function x() { let a=[/[)\\]/]; return a[0]; }
|
self.assertEqual(jsi.call_function('f').pattern, r'[)\\]')
|
||||||
''')
|
'''
|
||||||
self.assertEqual(jsi.call_function('x').pattern, r'[)\\]')
|
|
||||||
|
@unittest.skip('Not implemented')
|
||||||
|
def test_replace(self):
|
||||||
|
self._test('function f() { let a="data-name".replace("data-", ""); return a }',
|
||||||
|
'name')
|
||||||
|
self._test('function f() { let a="data-name".replace(new RegExp("^.+-"), ""); return a; }',
|
||||||
|
'name')
|
||||||
|
self._test('function f() { let a="data-name".replace(/^.+-/, ""); return a; }',
|
||||||
|
'name')
|
||||||
|
self._test('function f() { let a="data-name".replace(/a/g, "o"); return a; }',
|
||||||
|
'doto-nome')
|
||||||
|
self._test('function f() { let a="data-name".replaceAll("a", "o"); return a; }',
|
||||||
|
'doto-nome')
|
||||||
|
|
||||||
def test_char_code_at(self):
|
def test_char_code_at(self):
|
||||||
jsi = JSInterpreter('function x(i){return "test".charCodeAt(i)}')
|
jsi = JSInterpreter('function f(i){return "test".charCodeAt(i)}')
|
||||||
self.assertEqual(jsi.call_function('x', 0), 116)
|
self._test(jsi, 116, args=[0])
|
||||||
self.assertEqual(jsi.call_function('x', 1), 101)
|
self._test(jsi, 101, args=[1])
|
||||||
self.assertEqual(jsi.call_function('x', 2), 115)
|
self._test(jsi, 115, args=[2])
|
||||||
self.assertEqual(jsi.call_function('x', 3), 116)
|
self._test(jsi, 116, args=[3])
|
||||||
self.assertEqual(jsi.call_function('x', 4), None)
|
self._test(jsi, None, args=[4])
|
||||||
self.assertEqual(jsi.call_function('x', 'not_a_number'), 116)
|
self._test(jsi, 116, args=['not_a_number'])
|
||||||
|
|
||||||
def test_bitwise_operators_overflow(self):
|
def test_bitwise_operators_overflow(self):
|
||||||
jsi = JSInterpreter('function x(){return -524999584 << 5}')
|
self._test('function f(){return -524999584 << 5}', 379882496)
|
||||||
self.assertEqual(jsi.call_function('x'), 379882496)
|
self._test('function f(){return 1236566549 << 5}', 915423904)
|
||||||
|
|
||||||
jsi = JSInterpreter('function x(){return 1236566549 << 5}')
|
def test_bitwise_operators_typecast(self):
|
||||||
self.assertEqual(jsi.call_function('x'), 915423904)
|
self._test('function f(){return null << 5}', 0)
|
||||||
|
self._test('function f(){return undefined >> 5}', 0)
|
||||||
|
self._test('function f(){return 42 << NaN}', 42)
|
||||||
|
|
||||||
|
def test_negative(self):
|
||||||
|
self._test('function f(){return 2 * -2.0 ;}', -4)
|
||||||
|
self._test('function f(){return 2 - - -2 ;}', 0)
|
||||||
|
self._test('function f(){return 2 - - - -2 ;}', 4)
|
||||||
|
self._test('function f(){return 2 - + + - -2;}', 0)
|
||||||
|
self._test('function f(){return 2 + - + - -2;}', 0)
|
||||||
|
|
||||||
|
@unittest.skip('Not implemented')
|
||||||
|
def test_packed(self):
|
||||||
|
jsi = JSInterpreter('''function f(p,a,c,k,e,d){while(c--)if(k[c])p=p.replace(new RegExp('\\b'+c.toString(a)+'\\b','g'),k[c]);return p}''')
self.assertEqual(jsi.call_function('f', '''h 7=g("1j");7.7h({7g:[{33:"w://7f-7e-7d-7c.v.7b/7a/79/78/77/76.74?t=73&s=2s&e=72&f=2t&71=70.0.0.1&6z=6y&6x=6w"}],6v:"w://32.v.u/6u.31",16:"r%",15:"r%",6t:"6s",6r:"",6q:"l",6p:"l",6o:"6n",6m:\'6l\',6k:"6j",9:[{33:"/2u?b=6i&n=50&6h=w://32.v.u/6g.31",6f:"6e"}],1y:{6d:1,6c:\'#6b\',6a:\'#69\',68:"67",66:30,65:r,},"64":{63:"%62 2m%m%61%5z%5y%5x.u%5w%5v%5u.2y%22 2k%m%1o%22 5t%m%1o%22 5s%m%1o%22 2j%m%5r%22 16%m%5q%22 15%m%5p%22 5o%2z%5n%5m%2z",5l:"w://v.u/d/1k/5k.2y",5j:[]},\'5i\':{"5h":"5g"},5f:"5e",5d:"w://v.u",5c:{},5b:l,1x:[0.25,0.50,0.75,1,1.25,1.5,2]});h 1m,1n,5a;h 59=0,58=0;h 7=g("1j");h 2x=0,57=0,56=0;$.55({54:{\'53-52\':\'2i-51\'}});7.j(\'4z\',6(x){c(5>0&&x.1l>=5&&1n!=1){1n=1;$(\'q.4y\').4x(\'4w\')}});7.j(\'13\',6(x){2x=x.1l});7.j(\'2g\',6(x){2w(x)});7.j(\'4v\',6(){$(\'q.2v\').4u()});6 2w(x){$(\'q.2v\').4t();c(1m)19;1m=1;17=0;c(4s.4r===l){17=1}$.4q(\'/2u?b=4p&2l=1k&4o=2t-4n-4m-2s-4l&4k=&4j=&4i=&17=\'+17,6(2r){$(\'#4h\').4g(2r)});$(\'.3-8-4f-4e:4d("4c")\').2h(6(e){2q();g().4b(0);g().4a(l)});6 2q(){h $14=$("<q />").2p({1l:"49",16:"r%",15:"r%",48:0,2n:0,2o:47,46:"45(10%, 10%, 10%, 0.4)","44-43":"42"});$("<41 />").2p({16:"60%",15:"60%",2o:40,"3z-2n":"3y"}).3x({\'2m\':\'/?b=3w&2l=1k\',\'2k\':\'0\',\'2j\':\'2i\'}).2f($14);$14.2h(6(){$(3v).3u();g().2g()});$14.2f($(\'#1j\'))}g().13(0);}6 3t(){h 9=7.1b(2e);2d.2c(9);c(9.n>1){1r(i=0;i<9.n;i++){c(9[i].1a==2e){2d.2c(\'!!=\'+i);7.1p(i)}}}}7.j(\'3s\',6(){g().1h("/2a/3r.29","3q 10 28",6(){g().13(g().27()+10)},"2b");$("q[26=2b]").23().21(\'.3-20-1z\');g().1h("/2a/3p.29","3o 10 28",6(){h 12=g().27()-10;c(12<0)12=0;g().13(12)},"24");$("q[26=24]").23().21(\'.3-20-1z\');});6 1i(){}7.j(\'3n\',6(){1i()});7.j(\'3m\',6(){1i()});7.j("k",6(y){h 9=7.1b();c(9.n<2)19;$(\'.3-8-3l-3k\').3j(6(){$(\'#3-8-a-k\').1e(\'3-8-a-z\');$(\'.3-a-k\').p(\'o-1f\',\'11\')});7.1h("/3i/3h.3g","3f 3e",6(){$(\'.3-1w\').3d(\'3-8-1v\');$(\'.3-8-1y, .3-8-1x\').p(\'o-1g\',\'11\');c($(\'.3-1w\').3c(\'3-8-1v\')){$(\'.3-a-k\').p(\'o-1g\',\'l\');$(\'.3-a-k\').p(\'o-1f\',\'l\');$(\'.3-8-a\').1e(\'3-8-a-z\');$(\'.3-8-a:1u\').3b(\'3-8-a-z\')}3a{$(\'.3-a-k\').p(\'o-1g\',\'11\');$(\'.3-a-k\').p(\'o-1f\',\'11\');$(\'.3-8-a:1u\').1e(\'3-8-a-z\')}},"39");7.j("38",6(y){1d.37(\'1c\',y.9[y.36].1a)});c(1d.1t(\'1c\')){35("1s(1d.1t(\'1c\'));",34)}});h 18;6 1s(1q){h 
9=7.1b();c(9.n>1){1r(i=0;i<9.n;i++){c(9[i].1a==1q){c(i==18){19}18=i;7.1p(i)}}}}',36,270,'|||jw|||function|player|settings|tracks|submenu||if||||jwplayer|var||on|audioTracks|true|3D|length|aria|attr|div|100|||sx|filemoon|https||event|active||false|tt|seek|dd|height|width|adb|current_audio|return|name|getAudioTracks|default_audio|localStorage|removeClass|expanded|checked|addButton|callMeMaybe|vplayer|0fxcyc2ajhp1|position|vvplay|vvad|220|setCurrentAudioTrack|audio_name|for|audio_set|getItem|last|open|controls|playbackRates|captions|rewind|icon|insertAfter||detach|ff00||button|getPosition|sec|png|player8|ff11|log|console|track_name|appendTo|play|click|no|scrolling|frameborder|file_code|src|top|zIndex|css|showCCform|data|1662367683|383371|dl|video_ad|doPlay|prevt|mp4|3E||jpg|thumbs|file|300|setTimeout|currentTrack|setItem|audioTrackChanged|dualSound|else|addClass|hasClass|toggleClass|Track|Audio|svg|dualy|images|mousedown|buttons|topbar|playAttemptFailed|beforePlay|Rewind|fr|Forward|ff|ready|set_audio_track|remove|this|upload_srt|prop|50px|margin|1000001|iframe|center|align|text|rgba|background|1000000|left|absolute|pause|setCurrentCaptions|Upload|contains|item|content|html|fviews|referer|prem|embed|3e57249ef633e0d03bf76ceb8d8a4b65|216|83|hash|view|get|TokenZir|window|hide|show|complete|slow|fadeIn|video_ad_fadein|time||cache|Cache|Content|headers|ajaxSetup|v2done|tott|vastdone2|vastdone1|vvbefore|playbackRateControls|cast|aboutlink|FileMoon|abouttext|UHD|1870|qualityLabels|sites|GNOME_POWER|link|2Fiframe|3C|allowfullscreen|22360|22640|22no|marginheight|marginwidth|2FGNOME_POWER|2F0fxcyc2ajhp1|2Fe|2Ffilemoon|2F|3A||22https|3Ciframe|code|sharing|fontOpacity|backgroundOpacity|Tahoma|fontFamily|303030|backgroundColor|FFFFFF|color|userFontScale|thumbnails|kind|0fxcyc2ajhp10000|url|get_slides|start|startparam|none|preload|html5|primary|hlshtml|androidhls|duration|uniform|stretching|0fxcyc2ajhp1_xt|image|2048|sp|6871|asn|127|srv|43200|_g3XlBcu2lmD9oDexD2NLWSmah2Nu3XcDrl93m9PwXY|m3u8||master|0fxcyc2ajhp1_x|00076|01|hls2|to|s01|delivery|storage|moon|sources|setup'''.split('|')))

if __name__ == '__main__':
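Note on the refactor above: most assertions now go through a `_test` helper defined earlier in this test file (not visible in this hunk). Inferred from the call sites, it accepts either raw JS source or an already-built `JSInterpreter`, the expected value, and optional `args`/`func`. The following is only a minimal sketch of such a helper under those assumptions, not the committed implementation:

    import math

    from yt_dlp.jsinterp import JSInterpreter

    def _test(self, code_or_jsi, expected, func='f', args=()):
        # Accept raw JS source or a prepared interpreter, matching call sites
        # such as self._test('function f(){...}', 42) and self._test(jsi, 5, func='z')
        jsi = JSInterpreter(code_or_jsi) if isinstance(code_or_jsi, str) else code_or_jsi
        got = jsi.call_function(func, *args)
        if isinstance(expected, float) and math.isnan(expected):
            self.assertTrue(math.isnan(got))  # NaN != NaN, so assertEqual would always fail
        else:
            self.assertEqual(got, expected)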
1584
test/test_networking.py
Normal file
File diff suppressed because it is too large
208
test/test_networking_utils.py
Normal file
@@ -0,0 +1,208 @@
#!/usr/bin/env python3

# Allow direct execution
import os
import sys

import pytest

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

import io
import random
import ssl

from yt_dlp.cookies import YoutubeDLCookieJar
from yt_dlp.dependencies import certifi
from yt_dlp.networking import Response
from yt_dlp.networking._helper import (
    InstanceStoreMixin,
    add_accept_encoding_header,
    get_redirect_method,
    make_socks_proxy_opts,
    select_proxy,
    ssl_load_certs,
)
from yt_dlp.networking.exceptions import (
    HTTPError,
    IncompleteRead,
)
from yt_dlp.socks import ProxyType
from yt_dlp.utils.networking import HTTPHeaderDict

TEST_DIR = os.path.dirname(os.path.abspath(__file__))


class TestNetworkingUtils:

    def test_select_proxy(self):
        proxies = {
            'all': 'socks5://example.com',
            'http': 'http://example.com:1080',
            'no': 'bypass.example.com,yt-dl.org'
        }

        assert select_proxy('https://example.com', proxies) == proxies['all']
        assert select_proxy('http://example.com', proxies) == proxies['http']
        assert select_proxy('http://bypass.example.com', proxies) is None
        assert select_proxy('https://yt-dl.org', proxies) is None

    @pytest.mark.parametrize('socks_proxy,expected', [
        ('socks5h://example.com', {
            'proxytype': ProxyType.SOCKS5,
            'addr': 'example.com',
            'port': 1080,
            'rdns': True,
            'username': None,
            'password': None
        }),
        ('socks5://user:@example.com:5555', {
            'proxytype': ProxyType.SOCKS5,
            'addr': 'example.com',
            'port': 5555,
            'rdns': False,
            'username': 'user',
            'password': ''
        }),
        ('socks4://u%40ser:pa%20ss@127.0.0.1:1080', {
            'proxytype': ProxyType.SOCKS4,
            'addr': '127.0.0.1',
            'port': 1080,
            'rdns': False,
            'username': 'u@ser',
            'password': 'pa ss'
        }),
        ('socks4a://:pa%20ss@127.0.0.1', {
            'proxytype': ProxyType.SOCKS4A,
            'addr': '127.0.0.1',
            'port': 1080,
            'rdns': True,
            'username': '',
            'password': 'pa ss'
        })
    ])
    def test_make_socks_proxy_opts(self, socks_proxy, expected):
        assert make_socks_proxy_opts(socks_proxy) == expected

    def test_make_socks_proxy_unknown(self):
        with pytest.raises(ValueError, match='Unknown SOCKS proxy version: socks'):
            make_socks_proxy_opts('socks://127.0.0.1')

    @pytest.mark.skipif(not certifi, reason='certifi is not installed')
    def test_load_certifi(self):
        context_certifi = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        context_certifi.load_verify_locations(cafile=certifi.where())
        context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        ssl_load_certs(context, use_certifi=True)
        assert context.get_ca_certs() == context_certifi.get_ca_certs()

        context_default = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        context_default.load_default_certs()
        context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        ssl_load_certs(context, use_certifi=False)
        assert context.get_ca_certs() == context_default.get_ca_certs()

        if context_default.get_ca_certs() == context_certifi.get_ca_certs():
            pytest.skip('System uses certifi as default. The test is not valid')

    @pytest.mark.parametrize('method,status,expected', [
        ('GET', 303, 'GET'),
        ('HEAD', 303, 'HEAD'),
        ('PUT', 303, 'GET'),
        ('POST', 301, 'GET'),
        ('HEAD', 301, 'HEAD'),
        ('POST', 302, 'GET'),
        ('HEAD', 302, 'HEAD'),
        ('PUT', 302, 'PUT'),
        ('POST', 308, 'POST'),
        ('POST', 307, 'POST'),
        ('HEAD', 308, 'HEAD'),
        ('HEAD', 307, 'HEAD'),
    ])
    def test_get_redirect_method(self, method, status, expected):
        assert get_redirect_method(method, status) == expected

    @pytest.mark.parametrize('headers,supported_encodings,expected', [
        ({'Accept-Encoding': 'br'}, ['gzip', 'br'], {'Accept-Encoding': 'br'}),
        ({}, ['gzip', 'br'], {'Accept-Encoding': 'gzip, br'}),
        ({'Content-type': 'application/json'}, [], {'Content-type': 'application/json', 'Accept-Encoding': 'identity'}),
    ])
    def test_add_accept_encoding_header(self, headers, supported_encodings, expected):
        headers = HTTPHeaderDict(headers)
        add_accept_encoding_header(headers, supported_encodings)
        assert headers == HTTPHeaderDict(expected)


class TestInstanceStoreMixin:

    class FakeInstanceStoreMixin(InstanceStoreMixin):
        def _create_instance(self, **kwargs):
            return random.randint(0, 1000000)

        def _close_instance(self, instance):
            pass

    def test_mixin(self):
        mixin = self.FakeInstanceStoreMixin()
        assert mixin._get_instance(d={'a': 1, 'b': 2, 'c': {'d', 4}}) == mixin._get_instance(d={'a': 1, 'b': 2, 'c': {'d', 4}})

        assert mixin._get_instance(d={'a': 1, 'b': 2, 'c': {'e', 4}}) != mixin._get_instance(d={'a': 1, 'b': 2, 'c': {'d', 4}})

        assert mixin._get_instance(d={'a': 1, 'b': 2, 'c': {'d', 4}}) != mixin._get_instance(d={'a': 1, 'b': 2, 'g': {'d', 4}})

        assert mixin._get_instance(d={'a': 1}, e=[1, 2, 3]) == mixin._get_instance(d={'a': 1}, e=[1, 2, 3])

        assert mixin._get_instance(d={'a': 1}, e=[1, 2, 3]) != mixin._get_instance(d={'a': 1}, e=[1, 2, 3, 4])

        cookiejar = YoutubeDLCookieJar()
        assert mixin._get_instance(b=[1, 2], c=cookiejar) == mixin._get_instance(b=[1, 2], c=cookiejar)

        assert mixin._get_instance(b=[1, 2], c=cookiejar) != mixin._get_instance(b=[1, 2], c=YoutubeDLCookieJar())

        # Different order
        assert mixin._get_instance(c=cookiejar, b=[1, 2]) == mixin._get_instance(b=[1, 2], c=cookiejar)

        m = mixin._get_instance(t=1234)
        assert mixin._get_instance(t=1234) == m
        mixin._clear_instances()
        assert mixin._get_instance(t=1234) != m


class TestNetworkingExceptions:

    @staticmethod
    def create_response(status):
        return Response(fp=io.BytesIO(b'test'), url='http://example.com', headers={'tesT': 'test'}, status=status)

    def test_http_error(self):

        response = self.create_response(403)
        error = HTTPError(response)

        assert error.status == 403
        assert str(error) == error.msg == 'HTTP Error 403: Forbidden'
        assert error.reason == response.reason
        assert error.response is response

        data = error.response.read()
        assert data == b'test'
        assert repr(error) == '<HTTPError 403: Forbidden>'

    def test_redirect_http_error(self):
        response = self.create_response(301)
        error = HTTPError(response, redirect_loop=True)
        assert str(error) == error.msg == 'HTTP Error 301: Moved Permanently (redirect loop detected)'
        assert error.reason == 'Moved Permanently'

    def test_incomplete_read_error(self):
        error = IncompleteRead(4, 3, cause='test')
        assert isinstance(error, IncompleteRead)
        assert repr(error) == '<IncompleteRead: 4 bytes read, 3 more expected>'
        assert str(error) == error.msg == '4 bytes read, 3 more expected'
        assert error.partial == 4
        assert error.expected == 3
        assert error.cause == 'test'

        error = IncompleteRead(3)
        assert repr(error) == '<IncompleteRead: 3 bytes read>'
        assert str(error) == '3 bytes read'
|
@ -1,113 +1,476 @@
|
||||||
#!/usr/bin/env python3
|
#!/usr/bin/env python3
|
||||||
|
|
||||||
# Allow direct execution
|
# Allow direct execution
|
||||||
import os
|
import os
|
||||||
import sys
|
import sys
|
||||||
|
import threading
|
||||||
import unittest
|
import unittest
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
|
||||||
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
|
||||||
|
|
||||||
|
import abc
|
||||||
|
import contextlib
|
||||||
|
import enum
|
||||||
|
import functools
|
||||||
|
import http.server
|
||||||
|
import json
|
||||||
import random
|
import random
|
||||||
import subprocess
|
import socket
|
||||||
import urllib.request
|
import struct
|
||||||
|
import time
|
||||||
|
from socketserver import (
|
||||||
|
BaseRequestHandler,
|
||||||
|
StreamRequestHandler,
|
||||||
|
ThreadingTCPServer,
|
||||||
|
)
|
||||||
|
|
||||||
from test.helper import FakeYDL, get_params, is_download_test
|
from test.helper import http_server_port, verify_address_availability
|
||||||
|
from yt_dlp.networking import Request
|
||||||
|
from yt_dlp.networking.exceptions import ProxyError, TransportError
|
||||||
|
from yt_dlp.socks import (
|
||||||
|
SOCKS4_REPLY_VERSION,
|
||||||
|
SOCKS4_VERSION,
|
||||||
|
SOCKS5_USER_AUTH_SUCCESS,
|
||||||
|
SOCKS5_USER_AUTH_VERSION,
|
||||||
|
SOCKS5_VERSION,
|
||||||
|
Socks5AddressType,
|
||||||
|
Socks5Auth,
|
||||||
|
)
|
||||||
|
|
||||||
|
SOCKS5_USER_AUTH_FAILURE = 0x1
|
||||||
|
|
||||||
|
|
||||||
@is_download_test
|
class Socks4CD(enum.IntEnum):
|
||||||
class TestMultipleSocks(unittest.TestCase):
|
REQUEST_GRANTED = 90
|
||||||
@staticmethod
|
REQUEST_REJECTED_OR_FAILED = 91
|
||||||
def _check_params(attrs):
|
REQUEST_REJECTED_CANNOT_CONNECT_TO_IDENTD = 92
|
||||||
params = get_params()
|
REQUEST_REJECTED_DIFFERENT_USERID = 93
|
||||||
for attr in attrs:
|
|
||||||
if attr not in params:
|
|
||||||
print('Missing %s. Skipping.' % attr)
|
class Socks5Reply(enum.IntEnum):
|
||||||
|
SUCCEEDED = 0x0
|
||||||
|
GENERAL_FAILURE = 0x1
|
||||||
|
CONNECTION_NOT_ALLOWED = 0x2
|
||||||
|
NETWORK_UNREACHABLE = 0x3
|
||||||
|
HOST_UNREACHABLE = 0x4
|
||||||
|
CONNECTION_REFUSED = 0x5
|
||||||
|
TTL_EXPIRED = 0x6
|
||||||
|
COMMAND_NOT_SUPPORTED = 0x7
|
||||||
|
ADDRESS_TYPE_NOT_SUPPORTED = 0x8
|
||||||
|
|
||||||
|
|
||||||
|
class SocksTestRequestHandler(BaseRequestHandler):
|
||||||
|
|
||||||
|
def __init__(self, *args, socks_info=None, **kwargs):
|
||||||
|
self.socks_info = socks_info
|
||||||
|
super().__init__(*args, **kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
class SocksProxyHandler(BaseRequestHandler):
|
||||||
|
def __init__(self, request_handler_class, socks_server_kwargs, *args, **kwargs):
|
||||||
|
self.socks_kwargs = socks_server_kwargs or {}
|
||||||
|
self.request_handler_class = request_handler_class
|
||||||
|
super().__init__(*args, **kwargs)
|
||||||
|
|
||||||
|
|
||||||
|
class Socks5ProxyHandler(StreamRequestHandler, SocksProxyHandler):
|
||||||
|
|
||||||
|
# SOCKS5 protocol https://tools.ietf.org/html/rfc1928
|
||||||
|
# SOCKS5 username/password authentication https://tools.ietf.org/html/rfc1929
|
||||||
|
|
||||||
|
def handle(self):
|
||||||
|
sleep = self.socks_kwargs.get('sleep')
|
||||||
|
if sleep:
|
||||||
|
time.sleep(sleep)
|
||||||
|
version, nmethods = self.connection.recv(2)
|
||||||
|
assert version == SOCKS5_VERSION
|
||||||
|
methods = list(self.connection.recv(nmethods))
|
||||||
|
|
||||||
|
auth = self.socks_kwargs.get('auth')
|
||||||
|
|
||||||
|
if auth is not None and Socks5Auth.AUTH_USER_PASS not in methods:
|
||||||
|
self.connection.sendall(struct.pack('!BB', SOCKS5_VERSION, Socks5Auth.AUTH_NO_ACCEPTABLE))
|
||||||
|
self.server.close_request(self.request)
|
||||||
|
return
|
||||||
|
|
||||||
|
elif Socks5Auth.AUTH_USER_PASS in methods:
|
||||||
|
self.connection.sendall(struct.pack("!BB", SOCKS5_VERSION, Socks5Auth.AUTH_USER_PASS))
|
||||||
|
|
||||||
|
_, user_len = struct.unpack('!BB', self.connection.recv(2))
|
||||||
|
username = self.connection.recv(user_len).decode()
|
||||||
|
pass_len = ord(self.connection.recv(1))
|
||||||
|
password = self.connection.recv(pass_len).decode()
|
||||||
|
|
||||||
|
if username == auth[0] and password == auth[1]:
|
||||||
|
self.connection.sendall(struct.pack('!BB', SOCKS5_USER_AUTH_VERSION, SOCKS5_USER_AUTH_SUCCESS))
|
||||||
|
else:
|
||||||
|
self.connection.sendall(struct.pack('!BB', SOCKS5_USER_AUTH_VERSION, SOCKS5_USER_AUTH_FAILURE))
|
||||||
|
self.server.close_request(self.request)
|
||||||
return
|
return
|
||||||
return params
|
|
||||||
|
|
||||||
def test_proxy_http(self):
|
elif Socks5Auth.AUTH_NONE in methods:
|
||||||
params = self._check_params(['primary_proxy', 'primary_server_ip'])
|
self.connection.sendall(struct.pack('!BB', SOCKS5_VERSION, Socks5Auth.AUTH_NONE))
|
||||||
if params is None:
|
else:
|
||||||
return
|
self.connection.sendall(struct.pack('!BB', SOCKS5_VERSION, Socks5Auth.AUTH_NO_ACCEPTABLE))
|
||||||
ydl = FakeYDL({
|
self.server.close_request(self.request)
|
||||||
'proxy': params['primary_proxy']
|
|
||||||
})
|
|
||||||
self.assertEqual(
|
|
||||||
ydl.urlopen('http://yt-dl.org/ip').read().decode(),
|
|
||||||
params['primary_server_ip'])
|
|
||||||
|
|
||||||
def test_proxy_https(self):
|
|
||||||
params = self._check_params(['primary_proxy', 'primary_server_ip'])
|
|
||||||
if params is None:
|
|
||||||
return
|
|
||||||
ydl = FakeYDL({
|
|
||||||
'proxy': params['primary_proxy']
|
|
||||||
})
|
|
||||||
self.assertEqual(
|
|
||||||
ydl.urlopen('https://yt-dl.org/ip').read().decode(),
|
|
||||||
params['primary_server_ip'])
|
|
||||||
|
|
||||||
def test_secondary_proxy_http(self):
|
|
||||||
params = self._check_params(['secondary_proxy', 'secondary_server_ip'])
|
|
||||||
if params is None:
|
|
||||||
return
|
|
||||||
ydl = FakeYDL()
|
|
||||||
req = urllib.request.Request('http://yt-dl.org/ip')
|
|
||||||
req.add_header('Ytdl-request-proxy', params['secondary_proxy'])
|
|
||||||
self.assertEqual(
|
|
||||||
ydl.urlopen(req).read().decode(),
|
|
||||||
params['secondary_server_ip'])
|
|
||||||
|
|
||||||
def test_secondary_proxy_https(self):
|
|
||||||
params = self._check_params(['secondary_proxy', 'secondary_server_ip'])
|
|
||||||
if params is None:
|
|
||||||
return
|
|
||||||
ydl = FakeYDL()
|
|
||||||
req = urllib.request.Request('https://yt-dl.org/ip')
|
|
||||||
req.add_header('Ytdl-request-proxy', params['secondary_proxy'])
|
|
||||||
self.assertEqual(
|
|
||||||
ydl.urlopen(req).read().decode(),
|
|
||||||
params['secondary_server_ip'])
|
|
||||||
|
|
||||||
|
|
||||||
@is_download_test
|
|
||||||
class TestSocks(unittest.TestCase):
|
|
||||||
_SKIP_SOCKS_TEST = True
|
|
||||||
|
|
||||||
def setUp(self):
|
|
||||||
if self._SKIP_SOCKS_TEST:
|
|
||||||
return
|
return
|
||||||
|
|
||||||
self.port = random.randint(20000, 30000)
|
version, command, _, address_type = struct.unpack('!BBBB', self.connection.recv(4))
|
||||||
self.server_process = subprocess.Popen([
|
socks_info = {
|
||||||
'srelay', '-f', '-i', '127.0.0.1:%d' % self.port],
|
'version': version,
|
||||||
stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
|
'auth_methods': methods,
|
||||||
|
'command': command,
|
||||||
|
'client_address': self.client_address,
|
||||||
|
'ipv4_address': None,
|
||||||
|
'domain_address': None,
|
||||||
|
'ipv6_address': None,
|
||||||
|
}
|
||||||
|
if address_type == Socks5AddressType.ATYP_IPV4:
|
||||||
|
socks_info['ipv4_address'] = socket.inet_ntoa(self.connection.recv(4))
|
||||||
|
elif address_type == Socks5AddressType.ATYP_DOMAINNAME:
|
||||||
|
socks_info['domain_address'] = self.connection.recv(ord(self.connection.recv(1))).decode()
|
||||||
|
elif address_type == Socks5AddressType.ATYP_IPV6:
|
||||||
|
socks_info['ipv6_address'] = socket.inet_ntop(socket.AF_INET6, self.connection.recv(16))
|
||||||
|
else:
|
||||||
|
self.server.close_request(self.request)
|
||||||
|
|
||||||
def tearDown(self):
|
socks_info['port'] = struct.unpack('!H', self.connection.recv(2))[0]
|
||||||
if self._SKIP_SOCKS_TEST:
|
|
||||||
|
# dummy response, the returned IP is just a placeholder
|
||||||
|
self.connection.sendall(struct.pack(
|
||||||
|
'!BBBBIH', SOCKS5_VERSION, self.socks_kwargs.get('reply', Socks5Reply.SUCCEEDED), 0x0, 0x1, 0x7f000001, 40000))
|
||||||
|
|
||||||
|
self.request_handler_class(self.request, self.client_address, self.server, socks_info=socks_info)
|
||||||
|
|
||||||
|
|
||||||
|
class Socks4ProxyHandler(StreamRequestHandler, SocksProxyHandler):
|
||||||
|
|
||||||
|
# SOCKS4 protocol http://www.openssh.com/txt/socks4.protocol
|
||||||
|
# SOCKS4A protocol http://www.openssh.com/txt/socks4a.protocol
|
||||||
|
|
||||||
|
def _read_until_null(self):
|
||||||
|
return b''.join(iter(functools.partial(self.connection.recv, 1), b'\x00'))
|
||||||
|
|
||||||
|
def handle(self):
|
||||||
|
sleep = self.socks_kwargs.get('sleep')
|
||||||
|
if sleep:
|
||||||
|
time.sleep(sleep)
|
||||||
|
socks_info = {
|
||||||
|
'version': SOCKS4_VERSION,
|
||||||
|
'command': None,
|
||||||
|
'client_address': self.client_address,
|
||||||
|
'ipv4_address': None,
|
||||||
|
'port': None,
|
||||||
|
'domain_address': None,
|
||||||
|
}
|
||||||
|
version, command, dest_port, dest_ip = struct.unpack('!BBHI', self.connection.recv(8))
|
||||||
|
socks_info['port'] = dest_port
|
||||||
|
socks_info['command'] = command
|
||||||
|
if version != SOCKS4_VERSION:
|
||||||
|
self.server.close_request(self.request)
|
||||||
|
return
|
||||||
|
use_remote_dns = False
|
||||||
|
if 0x0 < dest_ip <= 0xFF:
|
||||||
|
use_remote_dns = True
|
||||||
|
else:
|
||||||
|
socks_info['ipv4_address'] = socket.inet_ntoa(struct.pack("!I", dest_ip))
|
||||||
|
|
||||||
|
user_id = self._read_until_null().decode()
|
||||||
|
if user_id != (self.socks_kwargs.get('user_id') or ''):
|
||||||
|
self.connection.sendall(struct.pack(
|
||||||
|
'!BBHI', SOCKS4_REPLY_VERSION, Socks4CD.REQUEST_REJECTED_DIFFERENT_USERID, 0x00, 0x00000000))
|
||||||
|
self.server.close_request(self.request)
|
||||||
return
|
return
|
||||||
|
|
||||||
self.server_process.terminate()
|
if use_remote_dns:
|
||||||
self.server_process.communicate()
|
socks_info['domain_address'] = self._read_until_null().decode()
|
||||||
|
|
||||||
def _get_ip(self, protocol):
|
# dummy response, the returned IP is just a placeholder
|
||||||
if self._SKIP_SOCKS_TEST:
|
self.connection.sendall(
|
||||||
return '127.0.0.1'
|
struct.pack(
|
||||||
|
'!BBHI', SOCKS4_REPLY_VERSION,
|
||||||
|
self.socks_kwargs.get('cd_reply', Socks4CD.REQUEST_GRANTED), 40000, 0x7f000001))
|
||||||
|
|
||||||
ydl = FakeYDL({
|
self.request_handler_class(self.request, self.client_address, self.server, socks_info=socks_info)
|
||||||
'proxy': '%s://127.0.0.1:%d' % (protocol, self.port),
|
|
||||||
})
|
|
||||||
return ydl.urlopen('http://yt-dl.org/ip').read().decode()
|
|
||||||
|
|
||||||
def test_socks4(self):
|
|
||||||
self.assertTrue(isinstance(self._get_ip('socks4'), str))
|
|
||||||
|
|
||||||
def test_socks4a(self):
|
class IPv6ThreadingTCPServer(ThreadingTCPServer):
|
||||||
self.assertTrue(isinstance(self._get_ip('socks4a'), str))
|
address_family = socket.AF_INET6
|
||||||
|
|
||||||
def test_socks5(self):
|
|
||||||
self.assertTrue(isinstance(self._get_ip('socks5'), str))
|
class SocksHTTPTestRequestHandler(http.server.BaseHTTPRequestHandler, SocksTestRequestHandler):
|
||||||
|
def do_GET(self):
|
||||||
|
if self.path == '/socks_info':
|
||||||
|
payload = json.dumps(self.socks_info.copy())
|
||||||
|
self.send_response(200)
|
||||||
|
self.send_header('Content-Type', 'application/json; charset=utf-8')
|
||||||
|
self.send_header('Content-Length', str(len(payload)))
|
||||||
|
self.end_headers()
|
||||||
|
self.wfile.write(payload.encode())
|
||||||
|
|
||||||
|
|
||||||
|
class SocksWebSocketTestRequestHandler(SocksTestRequestHandler):
|
||||||
|
def handle(self):
|
||||||
|
import websockets.sync.server
|
||||||
|
protocol = websockets.ServerProtocol()
|
||||||
|
connection = websockets.sync.server.ServerConnection(socket=self.request, protocol=protocol, close_timeout=0)
|
||||||
|
connection.handshake()
|
||||||
|
connection.send(json.dumps(self.socks_info))
|
||||||
|
connection.close()
|
||||||
|
|
||||||
|
|
||||||
|
@contextlib.contextmanager
|
||||||
|
def socks_server(socks_server_class, request_handler, bind_ip=None, **socks_server_kwargs):
|
||||||
|
server = server_thread = None
|
||||||
|
try:
|
||||||
|
bind_address = bind_ip or '127.0.0.1'
|
||||||
|
server_type = ThreadingTCPServer if '.' in bind_address else IPv6ThreadingTCPServer
|
||||||
|
server = server_type(
|
||||||
|
(bind_address, 0), functools.partial(socks_server_class, request_handler, socks_server_kwargs))
|
||||||
|
server_port = http_server_port(server)
|
||||||
|
server_thread = threading.Thread(target=server.serve_forever)
|
||||||
|
server_thread.daemon = True
|
||||||
|
server_thread.start()
|
||||||
|
if '.' not in bind_address:
|
||||||
|
yield f'[{bind_address}]:{server_port}'
|
||||||
|
else:
|
||||||
|
yield f'{bind_address}:{server_port}'
|
||||||
|
finally:
|
||||||
|
server.shutdown()
|
||||||
|
server.server_close()
|
||||||
|
server_thread.join(2.0)
|
||||||
|
|
||||||
|
|
||||||
|
class SocksProxyTestContext(abc.ABC):
|
||||||
|
REQUEST_HANDLER_CLASS = None
|
||||||
|
|
||||||
|
def socks_server(self, server_class, *args, **kwargs):
|
||||||
|
return socks_server(server_class, self.REQUEST_HANDLER_CLASS, *args, **kwargs)
|
||||||
|
|
||||||
|
@abc.abstractmethod
|
||||||
|
def socks_info_request(self, handler, target_domain=None, target_port=None, **req_kwargs) -> dict:
|
||||||
|
"""return a dict of socks_info"""
|
||||||
|
|
||||||
|
|
||||||
|
class HTTPSocksTestProxyContext(SocksProxyTestContext):
|
||||||
|
REQUEST_HANDLER_CLASS = SocksHTTPTestRequestHandler
|
||||||
|
|
||||||
|
def socks_info_request(self, handler, target_domain=None, target_port=None, **req_kwargs):
|
||||||
|
request = Request(f'http://{target_domain or "127.0.0.1"}:{target_port or "40000"}/socks_info', **req_kwargs)
|
||||||
|
handler.validate(request)
|
||||||
|
return json.loads(handler.send(request).read().decode())
|
||||||
|
|
||||||
|
|
||||||
|
class WebSocketSocksTestProxyContext(SocksProxyTestContext):
|
||||||
|
REQUEST_HANDLER_CLASS = SocksWebSocketTestRequestHandler
|
||||||
|
|
||||||
|
def socks_info_request(self, handler, target_domain=None, target_port=None, **req_kwargs):
|
||||||
|
request = Request(f'ws://{target_domain or "127.0.0.1"}:{target_port or "40000"}', **req_kwargs)
|
||||||
|
handler.validate(request)
|
||||||
|
ws = handler.send(request)
|
||||||
|
ws.send('socks_info')
|
||||||
|
socks_info = ws.recv()
|
||||||
|
ws.close()
|
||||||
|
return json.loads(socks_info)
|
||||||
|
|
||||||
|
|
||||||
|
CTX_MAP = {
|
||||||
|
'http': HTTPSocksTestProxyContext,
|
||||||
|
'ws': WebSocketSocksTestProxyContext,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture(scope='module')
|
||||||
|
def ctx(request):
|
||||||
|
return CTX_MAP[request.param]()
|
||||||
|
|
||||||
|
|
||||||
|
class TestSocks4Proxy:
|
||||||
|
@pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
|
||||||
|
def test_socks4_no_auth(self, handler, ctx):
|
||||||
|
with handler() as rh:
|
||||||
|
with ctx.socks_server(Socks4ProxyHandler) as server_address:
|
||||||
|
response = ctx.socks_info_request(
|
||||||
|
rh, proxies={'all': f'socks4://{server_address}'})
|
||||||
|
assert response['version'] == 4
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
|
||||||
|
def test_socks4_auth(self, handler, ctx):
|
||||||
|
with handler() as rh:
|
||||||
|
with ctx.socks_server(Socks4ProxyHandler, user_id='user') as server_address:
|
||||||
|
with pytest.raises(ProxyError):
|
||||||
|
ctx.socks_info_request(rh, proxies={'all': f'socks4://{server_address}'})
|
||||||
|
response = ctx.socks_info_request(
|
||||||
|
rh, proxies={'all': f'socks4://user:@{server_address}'})
|
||||||
|
assert response['version'] == 4
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
|
||||||
|
def test_socks4a_ipv4_target(self, handler, ctx):
|
||||||
|
with ctx.socks_server(Socks4ProxyHandler) as server_address:
|
||||||
|
with handler(proxies={'all': f'socks4a://{server_address}'}) as rh:
|
||||||
|
response = ctx.socks_info_request(rh, target_domain='127.0.0.1')
|
||||||
|
assert response['version'] == 4
|
||||||
|
assert (response['ipv4_address'] == '127.0.0.1') != (response['domain_address'] == '127.0.0.1')
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
|
||||||
|
def test_socks4a_domain_target(self, handler, ctx):
|
||||||
|
with ctx.socks_server(Socks4ProxyHandler) as server_address:
|
||||||
|
with handler(proxies={'all': f'socks4a://{server_address}'}) as rh:
|
||||||
|
response = ctx.socks_info_request(rh, target_domain='localhost')
|
||||||
|
assert response['version'] == 4
|
||||||
|
assert response['ipv4_address'] is None
|
||||||
|
assert response['domain_address'] == 'localhost'
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
|
||||||
|
def test_ipv4_client_source_address(self, handler, ctx):
|
||||||
|
with ctx.socks_server(Socks4ProxyHandler) as server_address:
|
||||||
|
source_address = f'127.0.0.{random.randint(5, 255)}'
|
||||||
|
verify_address_availability(source_address)
|
||||||
|
with handler(proxies={'all': f'socks4://{server_address}'},
|
||||||
|
source_address=source_address) as rh:
|
||||||
|
response = ctx.socks_info_request(rh)
|
||||||
|
assert response['client_address'][0] == source_address
|
||||||
|
assert response['version'] == 4
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
|
||||||
|
@pytest.mark.parametrize('reply_code', [
|
||||||
|
Socks4CD.REQUEST_REJECTED_OR_FAILED,
|
||||||
|
Socks4CD.REQUEST_REJECTED_CANNOT_CONNECT_TO_IDENTD,
|
||||||
|
Socks4CD.REQUEST_REJECTED_DIFFERENT_USERID,
|
||||||
|
])
|
||||||
|
def test_socks4_errors(self, handler, ctx, reply_code):
|
||||||
|
with ctx.socks_server(Socks4ProxyHandler, cd_reply=reply_code) as server_address:
|
||||||
|
with handler(proxies={'all': f'socks4://{server_address}'}) as rh:
|
||||||
|
with pytest.raises(ProxyError):
|
||||||
|
ctx.socks_info_request(rh)
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
|
||||||
|
def test_ipv6_socks4_proxy(self, handler, ctx):
|
||||||
|
with ctx.socks_server(Socks4ProxyHandler, bind_ip='::1') as server_address:
|
||||||
|
with handler(proxies={'all': f'socks4://{server_address}'}) as rh:
|
||||||
|
response = ctx.socks_info_request(rh, target_domain='127.0.0.1')
|
||||||
|
assert response['client_address'][0] == '::1'
|
||||||
|
assert response['ipv4_address'] == '127.0.0.1'
|
||||||
|
assert response['version'] == 4
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
|
||||||
|
def test_timeout(self, handler, ctx):
|
||||||
|
with ctx.socks_server(Socks4ProxyHandler, sleep=2) as server_address:
|
||||||
|
with handler(proxies={'all': f'socks4://{server_address}'}, timeout=0.5) as rh:
|
||||||
|
with pytest.raises(TransportError):
|
||||||
|
ctx.socks_info_request(rh)
|
||||||
|
|
||||||
|
|
||||||
|
class TestSocks5Proxy:
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
|
||||||
|
def test_socks5_no_auth(self, handler, ctx):
|
||||||
|
with ctx.socks_server(Socks5ProxyHandler) as server_address:
|
||||||
|
with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
|
||||||
|
response = ctx.socks_info_request(rh)
|
||||||
|
assert response['auth_methods'] == [0x0]
|
||||||
|
assert response['version'] == 5
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
|
||||||
|
def test_socks5_user_pass(self, handler, ctx):
|
||||||
|
with ctx.socks_server(Socks5ProxyHandler, auth=('test', 'testpass')) as server_address:
|
||||||
|
with handler() as rh:
|
||||||
|
with pytest.raises(ProxyError):
|
||||||
|
ctx.socks_info_request(rh, proxies={'all': f'socks5://{server_address}'})
|
||||||
|
|
||||||
|
response = ctx.socks_info_request(
|
||||||
|
rh, proxies={'all': f'socks5://test:testpass@{server_address}'})
|
||||||
|
|
||||||
|
assert response['auth_methods'] == [Socks5Auth.AUTH_NONE, Socks5Auth.AUTH_USER_PASS]
|
||||||
|
assert response['version'] == 5
|
||||||
|
|
    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
    def test_socks5_ipv4_target(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler) as server_address:
            with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
                response = ctx.socks_info_request(rh, target_domain='127.0.0.1')
                assert response['ipv4_address'] == '127.0.0.1'
                assert response['version'] == 5

    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
    def test_socks5_domain_target(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler) as server_address:
            with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
                response = ctx.socks_info_request(rh, target_domain='localhost')
                assert (response['ipv4_address'] == '127.0.0.1') != (response['ipv6_address'] == '::1')
                assert response['version'] == 5

    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
    def test_socks5h_domain_target(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler) as server_address:
            with handler(proxies={'all': f'socks5h://{server_address}'}) as rh:
                response = ctx.socks_info_request(rh, target_domain='localhost')
                assert response['ipv4_address'] is None
                assert response['domain_address'] == 'localhost'
                assert response['version'] == 5

    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
    def test_socks5h_ip_target(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler) as server_address:
            with handler(proxies={'all': f'socks5h://{server_address}'}) as rh:
                response = ctx.socks_info_request(rh, target_domain='127.0.0.1')
                assert response['ipv4_address'] == '127.0.0.1'
                assert response['domain_address'] is None
                assert response['version'] == 5
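The socks5/socks5h distinction these four tests pin down is who resolves the hostname: with socks5:// the client resolves 'localhost' itself and sends an IP address (ATYP 0x01 or 0x04), while socks5h:// forwards the raw domain (ATYP 0x03) for the proxy to resolve. A sketch of how a client might encode the destination field (encode_socks5_dst is illustrative, not the suite's helper):

import ipaddress
import socket

def encode_socks5_dst(host: str, remote_dns: bool) -> bytes:
    # Illustrative only: pick the ATYP based on who performs DNS resolution
    try:
        ip = ipaddress.ip_address(host)
    except ValueError:
        if remote_dns:  # socks5h: send the domain itself
            name = host.encode()
            return bytes([0x03, len(name)]) + name
        ip = ipaddress.ip_address(socket.gethostbyname(host))  # socks5: resolve locally
    atyp = 0x01 if ip.version == 4 else 0x04
    return bytes([atyp]) + ip.packed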
    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
    def test_socks5_ipv6_destination(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler) as server_address:
            with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
                response = ctx.socks_info_request(rh, target_domain='[::1]')
                assert response['ipv6_address'] == '::1'
                assert response['version'] == 5

    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
    def test_ipv6_socks5_proxy(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler, bind_ip='::1') as server_address:
            with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
                response = ctx.socks_info_request(rh, target_domain='127.0.0.1')
                assert response['client_address'][0] == '::1'
                assert response['ipv4_address'] == '127.0.0.1'
                assert response['version'] == 5

    # XXX: is there any feasible way of testing IPv6 source addresses?
    # Same would go for non-proxy source_address test...
    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
    def test_ipv4_client_source_address(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler) as server_address:
            source_address = f'127.0.0.{random.randint(5, 255)}'
            verify_address_availability(source_address)
            with handler(proxies={'all': f'socks5://{server_address}'}, source_address=source_address) as rh:
                response = ctx.socks_info_request(rh)
                assert response['client_address'][0] == source_address
                assert response['version'] == 5
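source_address works because the TCP socket to the proxy is bound to the chosen local address before connecting, so the proxy observes it as the client address. Roughly, using only the stdlib (a sketch, not the handlers' shared code):

import socket

def connect_from(proxy_host: str, proxy_port: int, source_address: str) -> socket.socket:
    # Illustrative only: bind the local end (port 0 = any free port) before connecting
    return socket.create_connection((proxy_host, proxy_port), source_address=(source_address, 0))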
    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Requests', 'http'), ('Websockets', 'ws')], indirect=True)
    @pytest.mark.parametrize('reply_code', [
        Socks5Reply.GENERAL_FAILURE,
        Socks5Reply.CONNECTION_NOT_ALLOWED,
        Socks5Reply.NETWORK_UNREACHABLE,
        Socks5Reply.HOST_UNREACHABLE,
        Socks5Reply.CONNECTION_REFUSED,
        Socks5Reply.TTL_EXPIRED,
        Socks5Reply.COMMAND_NOT_SUPPORTED,
        Socks5Reply.ADDRESS_TYPE_NOT_SUPPORTED,
    ])
    def test_socks5_errors(self, handler, ctx, reply_code):
        with ctx.socks_server(Socks5ProxyHandler, reply=reply_code) as server_address:
            with handler(proxies={'all': f'socks5://{server_address}'}) as rh:
                with pytest.raises(ProxyError):
                    ctx.socks_info_request(rh)

    @pytest.mark.parametrize('handler,ctx', [('Urllib', 'http'), ('Websockets', 'ws')], indirect=True)
    def test_timeout(self, handler, ctx):
        with ctx.socks_server(Socks5ProxyHandler, sleep=2) as server_address:
            with handler(proxies={'all': f'socks5://{server_address}'}, timeout=1) as rh:
                with pytest.raises(TransportError):
                    ctx.socks_info_request(rh)


if __name__ == '__main__':
228
test/test_update.py
Normal file

@ -0,0 +1,228 @@
#!/usr/bin/env python3

# Allow direct execution
import os
import sys
import unittest

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))


from test.helper import FakeYDL, report_warning
from yt_dlp.update import UpdateInfo, Updater


# XXX: Keep in sync with yt_dlp.update.UPDATE_SOURCES
TEST_UPDATE_SOURCES = {
    'stable': 'yt-dlp/yt-dlp',
    'nightly': 'yt-dlp/yt-dlp-nightly-builds',
    'master': 'yt-dlp/yt-dlp-master-builds',
}

TEST_API_DATA = {
    'yt-dlp/yt-dlp/latest': {
        'tag_name': '2023.12.31',
        'target_commitish': 'bbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbbb',
        'name': 'yt-dlp 2023.12.31',
        'body': 'BODY',
    },
    'yt-dlp/yt-dlp-nightly-builds/latest': {
        'tag_name': '2023.12.31.123456',
        'target_commitish': 'master',
        'name': 'yt-dlp nightly 2023.12.31.123456',
        'body': 'Generated from: https://github.com/yt-dlp/yt-dlp/commit/cccccccccccccccccccccccccccccccccccccccc',
    },
    'yt-dlp/yt-dlp-master-builds/latest': {
        'tag_name': '2023.12.31.987654',
        'target_commitish': 'master',
        'name': 'yt-dlp master 2023.12.31.987654',
        'body': 'Generated from: https://github.com/yt-dlp/yt-dlp/commit/dddddddddddddddddddddddddddddddddddddddd',
    },
    'yt-dlp/yt-dlp/tags/testing': {
        'tag_name': 'testing',
        'target_commitish': '9999999999999999999999999999999999999999',
        'name': 'testing',
        'body': 'BODY',
    },
    'fork/yt-dlp/latest': {
        'tag_name': '2050.12.31',
        'target_commitish': 'eeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeee',
        'name': '2050.12.31',
        'body': 'BODY',
    },
    'fork/yt-dlp/tags/pr0000': {
        'tag_name': 'pr0000',
        'target_commitish': 'ffffffffffffffffffffffffffffffffffffffff',
        'name': 'pr1234 2023.11.11.000000',
        'body': 'BODY',
    },
    'fork/yt-dlp/tags/pr1234': {
        'tag_name': 'pr1234',
        'target_commitish': '0000000000000000000000000000000000000000',
        'name': 'pr1234 2023.12.31.555555',
        'body': 'BODY',
    },
    'fork/yt-dlp/tags/pr9999': {
        'tag_name': 'pr9999',
        'target_commitish': '1111111111111111111111111111111111111111',
        'name': 'pr9999',
        'body': 'BODY',
    },
    'fork/yt-dlp-satellite/tags/pr987': {
        'tag_name': 'pr987',
        'target_commitish': 'master',
        'name': 'pr987',
        'body': 'Generated from: https://github.com/yt-dlp/yt-dlp/commit/2222222222222222222222222222222222222222',
    },
}

TEST_LOCKFILE_COMMENT = '# This file is used for regulating self-update'

TEST_LOCKFILE_V1 = r'''%s
lock 2022.08.18.36 .+ Python 3\.6
lock 2023.11.16 (?!win_x86_exe).+ Python 3\.7
lock 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
''' % TEST_LOCKFILE_COMMENT

TEST_LOCKFILE_V2_TMPL = r'''%s
lockV2 yt-dlp/yt-dlp 2022.08.18.36 .+ Python 3\.6
lockV2 yt-dlp/yt-dlp 2023.11.16 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp 2023.11.16 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp-nightly-builds 2023.11.15.232826 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 (?!win_x86_exe).+ Python 3\.7
lockV2 yt-dlp/yt-dlp-master-builds 2023.11.15.232812 win_x86_exe .+ Windows-(?:Vista|2008Server)
'''

TEST_LOCKFILE_V2 = TEST_LOCKFILE_V2_TMPL % TEST_LOCKFILE_COMMENT

TEST_LOCKFILE_ACTUAL = TEST_LOCKFILE_V2_TMPL % TEST_LOCKFILE_V1.rstrip('\n')

TEST_LOCKFILE_FORK = r'''%s# Test if a fork blocks updates to non-numeric tags
lockV2 fork/yt-dlp pr0000 .+ Python 3.6
lockV2 fork/yt-dlp pr1234 (?!win_x86_exe).+ Python 3\.7
lockV2 fork/yt-dlp pr1234 win_x86_exe .+ Windows-(?:Vista|2008Server)
lockV2 fork/yt-dlp pr9999 .+ Python 3.11
''' % TEST_LOCKFILE_ACTUAL
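To unpack the format these constants exercise: each `lock`/`lockV2` line pairs a version with a regex, and if a client's identifier string ('variant channel Python x.y ... - platform') matches the regex, that client may not update past the paired version. A rough sketch of the matching idea, under this simplified reading (illustrative only; the real logic lives in yt_dlp.update.Updater._process_update_spec):

import re

def max_allowed_version(spec, identifier, repo='yt-dlp/yt-dlp'):
    # Illustrative only: find a version lock whose regex applies to `identifier`
    for line in spec.splitlines():
        if line.startswith('lockV2 '):
            _, line_repo, version, pattern = line.split(maxsplit=3)
            if line_repo != repo:
                continue
        elif line.startswith('lock '):
            _, version, pattern = line.split(maxsplit=2)
        else:
            continue  # comments and blank lines
        if re.match(pattern, identifier):
            return version  # the client may not update beyond this tag
    return None  # unrestricted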

class FakeUpdater(Updater):
    current_version = '2022.01.01'
    current_commit = 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa'

    _channel = 'stable'
    _origin = 'yt-dlp/yt-dlp'
    _update_sources = TEST_UPDATE_SOURCES

    def _download_update_spec(self, *args, **kwargs):
        return TEST_LOCKFILE_ACTUAL

    def _call_api(self, tag):
        tag = f'tags/{tag}' if tag != 'latest' else tag
        return TEST_API_DATA[f'{self.requested_repo}/{tag}']

    def _report_error(self, msg, *args, **kwargs):
        report_warning(msg)

class TestUpdate(unittest.TestCase):
    maxDiff = None

    def test_update_spec(self):
        ydl = FakeYDL()
        updater = FakeUpdater(ydl, 'stable')

        def test(lockfile, identifier, input_tag, expect_tag, exact=False, repo='yt-dlp/yt-dlp'):
            updater._identifier = identifier
            updater._exact = exact
            updater.requested_repo = repo
            result = updater._process_update_spec(lockfile, input_tag)
            self.assertEqual(
                result, expect_tag,
                f'{identifier!r} requesting {repo}@{input_tag} (exact={exact}) '
                f'returned {result!r} instead of {expect_tag!r}')

        for lockfile in (TEST_LOCKFILE_V1, TEST_LOCKFILE_V2, TEST_LOCKFILE_ACTUAL, TEST_LOCKFILE_FORK):
            # Normal operation
            test(lockfile, 'zip Python 3.12.0', '2023.12.31', '2023.12.31')
            test(lockfile, 'zip stable Python 3.12.0', '2023.12.31', '2023.12.31', exact=True)
            # Python 3.6 --update should update only to its lock
            test(lockfile, 'zip Python 3.6.0', '2023.11.16', '2022.08.18.36')
            # --update-to an exact version later than the lock should return None
            test(lockfile, 'zip stable Python 3.6.0', '2023.11.16', None, exact=True)
            # Python 3.7 should be able to update to its lock
            test(lockfile, 'zip Python 3.7.0', '2023.11.16', '2023.11.16')
            test(lockfile, 'zip stable Python 3.7.1', '2023.11.16', '2023.11.16', exact=True)
            # Non-win_x86_exe builds on py3.7 must be locked
            test(lockfile, 'zip Python 3.7.1', '2023.12.31', '2023.11.16')
            test(lockfile, 'zip stable Python 3.7.1', '2023.12.31', None, exact=True)
            test(  # Windows Vista w/ win_x86_exe must be locked
                lockfile, 'win_x86_exe stable Python 3.7.9 (CPython x86 32bit) - Windows-Vista-6.0.6003-SP2',
                '2023.12.31', '2023.11.16')
            test(  # Windows 2008Server w/ win_x86_exe must be locked
                lockfile, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-2008Server',
                '2023.12.31', None, exact=True)
            test(  # Windows 7 w/ win_x86_exe py3.7 build should be able to update beyond lock
                lockfile, 'win_x86_exe stable Python 3.7.9 (CPython x86 32bit) - Windows-7-6.1.7601-SP1',
                '2023.12.31', '2023.12.31')
            test(  # Windows 8.1 w/ '2008Server' in platform string should be able to update beyond lock
                lockfile, 'win_x86_exe Python 3.7.9 (CPython x86 32bit) - Windows-post2008Server-6.2.9200',
                '2023.12.31', '2023.12.31', exact=True)

        # Forks can block updates to non-numeric tags rather than lock
        test(TEST_LOCKFILE_FORK, 'zip Python 3.6.3', 'pr0000', None, repo='fork/yt-dlp')
        test(TEST_LOCKFILE_FORK, 'zip stable Python 3.7.4', 'pr0000', 'pr0000', repo='fork/yt-dlp')
        test(TEST_LOCKFILE_FORK, 'zip stable Python 3.7.4', 'pr1234', None, repo='fork/yt-dlp')
        test(TEST_LOCKFILE_FORK, 'zip Python 3.8.1', 'pr1234', 'pr1234', repo='fork/yt-dlp', exact=True)
        test(
            TEST_LOCKFILE_FORK, 'win_x86_exe stable Python 3.7.9 (CPython x86 32bit) - Windows-Vista-6.0.6003-SP2',
            'pr1234', None, repo='fork/yt-dlp')
        test(
            TEST_LOCKFILE_FORK, 'win_x86_exe stable Python 3.7.9 (CPython x86 32bit) - Windows-7-6.1.7601-SP1',
            '2023.12.31', '2023.12.31', repo='fork/yt-dlp')
        test(TEST_LOCKFILE_FORK, 'zip Python 3.11.2', 'pr9999', None, repo='fork/yt-dlp', exact=True)
        test(TEST_LOCKFILE_FORK, 'zip stable Python 3.12.0', 'pr9999', 'pr9999', repo='fork/yt-dlp')

    def test_query_update(self):
        ydl = FakeYDL()

        def test(target, expected, current_version=None, current_commit=None, identifier=None):
            updater = FakeUpdater(ydl, target)
            if current_version:
                updater.current_version = current_version
            if current_commit:
                updater.current_commit = current_commit
            updater._identifier = identifier or 'zip'
            update_info = updater.query_update(_output=True)
            self.assertDictEqual(
                update_info.__dict__ if update_info else {}, expected.__dict__ if expected else {})

        test('yt-dlp/yt-dlp@latest', UpdateInfo(
            '2023.12.31', version='2023.12.31', requested_version='2023.12.31', commit='b' * 40))
        test('yt-dlp/yt-dlp-nightly-builds@latest', UpdateInfo(
            '2023.12.31.123456', version='2023.12.31.123456', requested_version='2023.12.31.123456', commit='c' * 40))
        test('yt-dlp/yt-dlp-master-builds@latest', UpdateInfo(
            '2023.12.31.987654', version='2023.12.31.987654', requested_version='2023.12.31.987654', commit='d' * 40))
        test('fork/yt-dlp@latest', UpdateInfo(
            '2050.12.31', version='2050.12.31', requested_version='2050.12.31', commit='e' * 40))
        test('fork/yt-dlp@pr0000', UpdateInfo(
            'pr0000', version='2023.11.11.000000', requested_version='2023.11.11.000000', commit='f' * 40))
        test('fork/yt-dlp@pr1234', UpdateInfo(
            'pr1234', version='2023.12.31.555555', requested_version='2023.12.31.555555', commit='0' * 40))
        test('fork/yt-dlp@pr9999', UpdateInfo(
            'pr9999', version=None, requested_version=None, commit='1' * 40))
        test('fork/yt-dlp-satellite@pr987', UpdateInfo(
            'pr987', version=None, requested_version=None, commit='2' * 40))
        test('yt-dlp/yt-dlp', None, current_version='2024.01.01')
        test('stable', UpdateInfo(
            '2023.12.31', version='2023.12.31', requested_version='2023.12.31', commit='b' * 40))
        test('nightly', UpdateInfo(
            '2023.12.31.123456', version='2023.12.31.123456', requested_version='2023.12.31.123456', commit='c' * 40))
        test('master', UpdateInfo(
            '2023.12.31.987654', version='2023.12.31.987654', requested_version='2023.12.31.987654', commit='d' * 40))
        test('testing', None, current_commit='9' * 40)
        test('testing', UpdateInfo('testing', commit='9' * 40))


if __name__ == '__main__':
    unittest.main()
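The targets fed to FakeUpdater mix channels ('stable'), repositories ('fork/yt-dlp') and tags ('fork/yt-dlp@pr1234'). One plausible way to split such a target apart (an illustrative sketch only; the shipped resolver in yt_dlp.update has additional fallbacks, e.g. a bare token like 'testing' can also name a tag on the configured origin):

def split_update_target(target, update_sources, default_tag='latest'):
    # Illustrative only: 'stable', 'owner/repo', 'channel@tag' or 'owner/repo@tag'
    repo_or_channel, _, tag = target.partition('@')
    repo = update_sources.get(repo_or_channel, repo_or_channel)  # channel -> repo
    return repo, tag or default_tag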

@ -1,30 +0,0 @@
#!/usr/bin/env python3

# Allow direct execution
import os
import sys
import unittest

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))


import json

from yt_dlp.update import rsa_verify


class TestUpdate(unittest.TestCase):
    def test_rsa_verify(self):
        UPDATES_RSA_KEY = (0x9d60ee4d8f805312fdb15a62f87b95bd66177b91df176765d13514a0f1754bcd2057295c5b6f1d35daa6742c3ffc9a82d3e118861c207995a8031e151d863c9927e304576bc80692bc8e094896fcf11b66f3e29e04e3a71e9a11558558acea1840aec37fc396fb6b65dc81a1c4144e03bd1c011de62e3f1357b327d08426fe93, 65537)
        with open(os.path.join(os.path.dirname(os.path.abspath(__file__)), 'versions.json'), 'rb') as f:
            versions_info = f.read().decode()
        versions_info = json.loads(versions_info)
        signature = versions_info['signature']
        del versions_info['signature']
        self.assertTrue(rsa_verify(
            json.dumps(versions_info, sort_keys=True).encode(),
            signature, UPDATES_RSA_KEY))


if __name__ == '__main__':
    unittest.main()
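For reference on what is being deleted: rsa_verify implemented a bare-bones RSA signature check, raising the hex signature to the public exponent modulo n and comparing the recovered value against a digest of the payload. A schematic reconstruction (illustrative only; the youtube-dl-era helper differed in hash and padding details):

import hashlib

def rsa_verify_sketch(content: bytes, signature_hex: str, key) -> bool:
    # Illustrative only: textbook RSA — recover the signed digest and compare
    n, e = key
    recovered = pow(int(signature_hex, 16), e, n)
    expected = int(hashlib.sha256(content).hexdigest(), 16)
    return recovered == expected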

@ -5,6 +5,7 @@
import re
import sys
import unittest
import warnings

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

@ -13,6 +14,7 @@
import io
import itertools
import json
import subprocess
import xml.etree.ElementTree

from yt_dlp.compat import (

@ -27,6 +29,7 @@
    InAdvancePagedList,
    LazyList,
    OnDemandPagedList,
    Popen,
    age_restricted,
    args_to_str,
    base_url,

@ -46,10 +49,9 @@
    encode_base_n,
    encode_compat_str,
    encodeFilename,
    escape_rfc3986,
    escape_url,
    expand_path,
    extract_attributes,
    extract_basic_auth,
    find_xpath_attr,
    fix_xml_ampersands,
    float_or_none,

@ -102,15 +104,16 @@
    sanitize_filename,
    sanitize_path,
    sanitize_url,
    sanitized_Request,
    shell_quote,
    smuggle_url,
    str_or_none,
    str_to_int,
    strip_jsonp,
    strip_or_none,
    subtitles_filename,
    timeconvert,
    traverse_obj,
    try_call,
    unescapeHTML,
    unified_strdate,
    unified_timestamp,

@ -122,12 +125,19 @@
    urlencode_postdata,
    urljoin,
    urshift,
    variadic,
    version_tuple,
    xpath_attr,
    xpath_element,
    xpath_text,
    xpath_with_ns,
)
from yt_dlp.utils.networking import (
    HTTPHeaderDict,
    escape_rfc3986,
    normalize_url,
    remove_dot_segments,
)


class TestUtil(unittest.TestCase):
@ -254,15 +264,6 @@ def test_sanitize_url(self):
        self.assertEqual(sanitize_url('https://foo.bar'), 'https://foo.bar')
        self.assertEqual(sanitize_url('foo bar'), 'foo bar')

    def test_extract_basic_auth(self):
        auth_header = lambda url: sanitized_Request(url).get_header('Authorization')
        self.assertFalse(auth_header('http://foo.bar'))
        self.assertFalse(auth_header('http://:foo.bar'))
        self.assertEqual(auth_header('http://@foo.bar'), 'Basic Og==')
        self.assertEqual(auth_header('http://:pass@foo.bar'), 'Basic OnBhc3M=')
        self.assertEqual(auth_header('http://user:@foo.bar'), 'Basic dXNlcjo=')
        self.assertEqual(auth_header('http://user:pass@foo.bar'), 'Basic dXNlcjpwYXNz')

    def test_expand_path(self):
        def env(var):
            return f'%{var}%' if sys.platform == 'win32' else f'${var}'

@ -659,6 +660,8 @@ def test_parse_duration(self):
        self.assertEqual(parse_duration('P0Y0M0DT0H4M20.880S'), 260.88)
        self.assertEqual(parse_duration('01:02:03:050'), 3723.05)
        self.assertEqual(parse_duration('103:050'), 103.05)
        self.assertEqual(parse_duration('1HR 3MIN'), 3780)
        self.assertEqual(parse_duration('2hrs 3mins'), 7380)

    def test_fix_xml_ampersands(self):
        self.assertEqual(

@ -935,24 +938,45 @@ def test_escape_rfc3986(self):
        self.assertEqual(escape_rfc3986('foo bar'), 'foo%20bar')
        self.assertEqual(escape_rfc3986('foo%20bar'), 'foo%20bar')

    def test_escape_url(self):
    def test_normalize_url(self):
        self.assertEqual(
            escape_url('http://wowza.imust.org/srv/vod/telemb/new/UPLOAD/UPLOAD/20224_IncendieHavré_FD.mp4'),
            normalize_url('http://wowza.imust.org/srv/vod/telemb/new/UPLOAD/UPLOAD/20224_IncendieHavré_FD.mp4'),
            'http://wowza.imust.org/srv/vod/telemb/new/UPLOAD/UPLOAD/20224_IncendieHavre%CC%81_FD.mp4'
        )
        self.assertEqual(
            escape_url('http://www.ardmediathek.de/tv/Sturm-der-Liebe/Folge-2036-Zu-Mann-und-Frau-erklärt/Das-Erste/Video?documentId=22673108&bcastId=5290'),
            normalize_url('http://www.ardmediathek.de/tv/Sturm-der-Liebe/Folge-2036-Zu-Mann-und-Frau-erklärt/Das-Erste/Video?documentId=22673108&bcastId=5290'),
            'http://www.ardmediathek.de/tv/Sturm-der-Liebe/Folge-2036-Zu-Mann-und-Frau-erkl%C3%A4rt/Das-Erste/Video?documentId=22673108&bcastId=5290'
        )
        self.assertEqual(
            escape_url('http://тест.рф/фрагмент'),
            normalize_url('http://тест.рф/фрагмент'),
            'http://xn--e1aybc.xn--p1ai/%D1%84%D1%80%D0%B0%D0%B3%D0%BC%D0%B5%D0%BD%D1%82'
        )
        self.assertEqual(
            escape_url('http://тест.рф/абв?абв=абв#абв'),
            normalize_url('http://тест.рф/абв?абв=абв#абв'),
            'http://xn--e1aybc.xn--p1ai/%D0%B0%D0%B1%D0%B2?%D0%B0%D0%B1%D0%B2=%D0%B0%D0%B1%D0%B2#%D0%B0%D0%B1%D0%B2'
        )
        self.assertEqual(escape_url('http://vimeo.com/56015672#at=0'), 'http://vimeo.com/56015672#at=0')
        self.assertEqual(normalize_url('http://vimeo.com/56015672#at=0'), 'http://vimeo.com/56015672#at=0')

        self.assertEqual(normalize_url('http://www.example.com/../a/b/../c/./d.html'), 'http://www.example.com/a/c/d.html')

    def test_remove_dot_segments(self):
        self.assertEqual(remove_dot_segments('/a/b/c/./../../g'), '/a/g')
        self.assertEqual(remove_dot_segments('mid/content=5/../6'), 'mid/6')
        self.assertEqual(remove_dot_segments('/ad/../cd'), '/cd')
        self.assertEqual(remove_dot_segments('/ad/../cd/'), '/cd/')
        self.assertEqual(remove_dot_segments('/..'), '/')
        self.assertEqual(remove_dot_segments('/./'), '/')
        self.assertEqual(remove_dot_segments('/./a'), '/a')
        self.assertEqual(remove_dot_segments('/abc/./.././d/././e/.././f/./../../ghi'), '/ghi')
        self.assertEqual(remove_dot_segments('/'), '/')
        self.assertEqual(remove_dot_segments('/t'), '/t')
        self.assertEqual(remove_dot_segments('t'), 't')
        self.assertEqual(remove_dot_segments(''), '')
        self.assertEqual(remove_dot_segments('/../a/b/c'), '/a/b/c')
        self.assertEqual(remove_dot_segments('../a'), 'a')
        self.assertEqual(remove_dot_segments('./a'), 'a')
        self.assertEqual(remove_dot_segments('.'), '')
        self.assertEqual(remove_dot_segments('////'), '////')
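These cases track RFC 3986 section 5.2.4: walk the path, dropping '.' segments and letting '..' pop the previously emitted segment. A compact sketch that satisfies the cases above (illustrative, not the shipped implementation in yt_dlp.utils.networking):

def remove_dot_segments_sketch(path):
    # Illustrative only: RFC 3986 section 5.2.4 "remove_dot_segments"
    output = []
    while path:
        if path.startswith('../'):
            path = path[3:]
        elif path.startswith('./'):
            path = path[2:]
        elif path.startswith('/./'):
            path = path[2:]
        elif path == '/.':
            path = '/'
        elif path.startswith('/../'):
            path = path[3:]
            if output:
                output.pop()  # '..' removes the last emitted segment
        elif path == '/..':
            path = '/'
            if output:
                output.pop()
        elif path in ('.', '..'):
            path = ''
        else:
            # move the first segment (with its leading '/', if any) to the output
            start = 1 if path.startswith('/') else 0
            cut = path.find('/', start)
            if cut == -1:
                output.append(path)
                path = ''
            else:
                output.append(path[:cut])
                path = path[cut:]
    return ''.join(output)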

    def test_js_to_json_vars_strings(self):
        self.assertDictEqual(

@ -1185,10 +1209,28 @@ def test_js_to_json_edgecases(self):
        on = js_to_json('\'"\\""\'')
        self.assertEqual(json.loads(on), '"""', msg='Unnecessary quote escape should be escaped')

        on = js_to_json('[new Date("spam"), \'("eggs")\']')
        self.assertEqual(json.loads(on), ['spam', '("eggs")'], msg='Date regex should match a single string')

    def test_js_to_json_malformed(self):
        self.assertEqual(js_to_json('42a1'), '42"a1"')
        self.assertEqual(js_to_json('42a-1'), '42"a"-1')

    def test_js_to_json_template_literal(self):
        self.assertEqual(js_to_json('`Hello ${name}`', {'name': '"world"'}), '"Hello world"')
        self.assertEqual(js_to_json('`${name}${name}`', {'name': '"X"'}), '"XX"')
        self.assertEqual(js_to_json('`${name}${name}`', {'name': '5'}), '"55"')
        self.assertEqual(js_to_json('`${name}"${name}"`', {'name': '5'}), '"5\\"5\\""')
        self.assertEqual(js_to_json('`${name}`', {}), '"name"')

    def test_js_to_json_common_constructors(self):
        self.assertEqual(json.loads(js_to_json('new Map([["a", 5]])')), {'a': 5})
        self.assertEqual(json.loads(js_to_json('Array(5, 10)')), [5, 10])
        self.assertEqual(json.loads(js_to_json('new Array(15,5)')), [15, 5])
        self.assertEqual(json.loads(js_to_json('new Map([Array(5, 10),new Array(15,5)])')), {'5': 10, '15': 5})
        self.assertEqual(json.loads(js_to_json('new Date("123")')), "123")
        self.assertEqual(json.loads(js_to_json('new Date(\'2023-10-19\')')), "2023-10-19")
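The template-literal cases spell out the contract: `${name}` is substituted from the vars mapping, whose values are already JSON-encoded (so '"world"' interpolates as the text world), and an unknown name falls back to the identifier itself. A toy version of just that substitution step (illustrative; the real js_to_json handles far more than this):

import json
import re

def interp_template_literal(code, variables=None):
    # Illustrative only: turn `...${name}...` into a JSON string literal
    variables = variables or {}

    def sub(match):
        name = match.group(1)
        value = variables.get(name, json.dumps(name))  # unknown names become their own text
        return json.loads(value) if value.startswith('"') else value

    body = re.sub(r'\$\{([^}]*)\}', sub, code[1:-1])  # strip the backticks
    return json.dumps(body)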

    def test_extract_attributes(self):
        self.assertEqual(extract_attributes('<e x="y">'), {'x': 'y'})
        self.assertEqual(extract_attributes("<e x='y'>"), {'x': 'y'})

@ -1824,6 +1866,8 @@ def test_iri_to_uri(self):
    def test_clean_podcast_url(self):
        self.assertEqual(clean_podcast_url('https://www.podtrac.com/pts/redirect.mp3/chtbl.com/track/5899E/traffic.megaphone.fm/HSW7835899191.mp3'), 'https://traffic.megaphone.fm/HSW7835899191.mp3')
        self.assertEqual(clean_podcast_url('https://play.podtrac.com/npr-344098539/edge1.pod.npr.org/anon.npr-podcasts/podcast/npr/waitwait/2020/10/20201003_waitwait_wwdtmpodcast201003-015621a5-f035-4eca-a9a1-7c118d90bc3c.mp3'), 'https://edge1.pod.npr.org/anon.npr-podcasts/podcast/npr/waitwait/2020/10/20201003_waitwait_wwdtmpodcast201003-015621a5-f035-4eca-a9a1-7c118d90bc3c.mp3')
        self.assertEqual(clean_podcast_url('https://pdst.fm/e/2.gum.fm/chtbl.com/track/chrt.fm/track/34D33/pscrb.fm/rss/p/traffic.megaphone.fm/ITLLC7765286967.mp3?updated=1687282661'), 'https://traffic.megaphone.fm/ITLLC7765286967.mp3?updated=1687282661')
        self.assertEqual(clean_podcast_url('https://pdst.fm/e/https://mgln.ai/e/441/www.buzzsprout.com/1121972/13019085-ep-252-the-deep-life-stack.mp3'), 'https://www.buzzsprout.com/1121972/13019085-ep-252-the-deep-life-stack.mp3')

    def test_LazyList(self):
        it = list(range(10))

@ -1966,6 +2010,35 @@ def test_get_compatible_ext(self):
        self.assertEqual(get_compatible_ext(
            vcodecs=['av1'], acodecs=['mp4a'], vexts=['webm'], aexts=['m4a'], preferences=('webm', 'mkv')), 'mkv')

    def test_try_call(self):
        def total(*x, **kwargs):
            return sum(x) + sum(kwargs.values())

        self.assertEqual(try_call(None), None,
                         msg='not a fn should give None')
        self.assertEqual(try_call(lambda: 1), 1,
                         msg='int fn with no expected_type should give int')
        self.assertEqual(try_call(lambda: 1, expected_type=int), 1,
                         msg='int fn with expected_type int should give int')
        self.assertEqual(try_call(lambda: 1, expected_type=dict), None,
                         msg='int fn with wrong expected_type should give None')
        self.assertEqual(try_call(total, args=(0, 1, 0, ), expected_type=int), 1,
                         msg='fn should accept arglist')
        self.assertEqual(try_call(total, kwargs={'a': 0, 'b': 1, 'c': 0}, expected_type=int), 1,
                         msg='fn should accept kwargs')
        self.assertEqual(try_call(lambda: 1, expected_type=dict), None,
                         msg='int fn with no expected_type should give None')
        self.assertEqual(try_call(lambda x: {}, total, args=(42, ), expected_type=int), 42,
                         msg='expect first int result with expected_type int')
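The behaviour pinned down here suggests the shape of the helper: call each candidate in turn, swallow the common exceptions, and return the first result that satisfies expected_type. A sketch consistent with these assertions (illustrative; the shipped version lives in yt_dlp.utils):

def try_call_sketch(*funcs, expected_type=None, args=[], kwargs={}):
    # Illustrative only: first non-failing result matching expected_type wins
    for f in funcs:
        try:
            val = f(*args, **kwargs)
        except (AttributeError, KeyError, TypeError, IndexError, ValueError, ZeroDivisionError):
            continue  # a failing candidate is simply skipped
        if expected_type is None or isinstance(val, expected_type):
            return val
    return None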

    def test_variadic(self):
        self.assertEqual(variadic(None), (None, ))
        self.assertEqual(variadic('spam'), ('spam', ))
        self.assertEqual(variadic('spam', allowed_types=dict), 'spam')
        with warnings.catch_warnings():
            warnings.simplefilter('ignore')
            self.assertEqual(variadic('spam', allowed_types=[dict]), 'spam')
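variadic normalises "one or many" arguments: scalars, and instances of allowed_types (strings by default), get wrapped in a tuple, while genuine iterables pass through untouched. A sketch matching the assertions above (illustrative; the real helper also deprecates non-tuple allowed_types, hence the warnings block):

from collections.abc import Iterable

def variadic_sketch(x, allowed_types=(str, bytes, dict)):
    # Illustrative only: wrap scalars so callers can always iterate
    if isinstance(x, allowed_types) or not isinstance(x, Iterable):
        return (x,)
    return x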

    def test_traverse_obj(self):
        _TEST_DATA = {
            100: 100,

@ -1999,8 +2072,8 @@ def test_traverse_obj(self):

        # Test Ellipsis behavior
        self.assertCountEqual(traverse_obj(_TEST_DATA, ...),
                              (item for item in _TEST_DATA.values() if item is not None),
                              (item for item in _TEST_DATA.values() if item not in (None, {})),
                              msg='`...` should give all values except `None`')
                              msg='`...` should give all non discarded values')
        self.assertCountEqual(traverse_obj(_TEST_DATA, ('urls', 0, ...)), _TEST_DATA['urls'][0].values(),
                              msg='`...` selection for dicts should select all values')
        self.assertEqual(traverse_obj(_TEST_DATA, (..., ..., 'url')),

@ -2008,6 +2081,8 @@ def test_traverse_obj(self):
                         msg='nested `...` queries should work')
        self.assertCountEqual(traverse_obj(_TEST_DATA, (..., ..., 'index')), range(4),
                              msg='`...` query result should be flattened')
        self.assertEqual(traverse_obj(iter(range(4)), ...), list(range(4)),
                         msg='`...` should accept iterables')

        # Test function as key
        self.assertEqual(traverse_obj(_TEST_DATA, lambda x, y: x == 'urls' and isinstance(y, list)),

@ -2015,6 +2090,44 @@ def test_traverse_obj(self):
                         msg='function as query key should perform a filter based on (key, value)')
        self.assertCountEqual(traverse_obj(_TEST_DATA, lambda _, x: isinstance(x[0], str)), {'str'},
                              msg='exceptions in the query function should be catched')
        self.assertEqual(traverse_obj(iter(range(4)), lambda _, x: x % 2 == 0), [0, 2],
                         msg='function key should accept iterables')
        if __debug__:
            with self.assertRaises(Exception, msg='Wrong function signature should raise in debug'):
                traverse_obj(_TEST_DATA, lambda a: ...)
            with self.assertRaises(Exception, msg='Wrong function signature should raise in debug'):
                traverse_obj(_TEST_DATA, lambda a, b, c: ...)

        # Test set as key (transformation/type, like `expected_type`)
        self.assertEqual(traverse_obj(_TEST_DATA, (..., {str.upper}, )), ['STR'],
                         msg='Function in set should be a transformation')
        self.assertEqual(traverse_obj(_TEST_DATA, (..., {str})), ['str'],
                         msg='Type in set should be a type filter')
        self.assertEqual(traverse_obj(_TEST_DATA, {dict}), _TEST_DATA,
                         msg='A single set should be wrapped into a path')
        self.assertEqual(traverse_obj(_TEST_DATA, (..., {str.upper})), ['STR'],
                         msg='Transformation function should not raise')
        self.assertEqual(traverse_obj(_TEST_DATA, (..., {str_or_none})),
                         [item for item in map(str_or_none, _TEST_DATA.values()) if item is not None],
                         msg='Function in set should be a transformation')
        self.assertEqual(traverse_obj(_TEST_DATA, ('fail', {lambda _: 'const'})), 'const',
                         msg='Function in set should always be called')
        if __debug__:
            with self.assertRaises(Exception, msg='Sets with length != 1 should raise in debug'):
                traverse_obj(_TEST_DATA, set())
            with self.assertRaises(Exception, msg='Sets with length != 1 should raise in debug'):
                traverse_obj(_TEST_DATA, {str.upper, str})

        # Test `slice` as a key
        _SLICE_DATA = [0, 1, 2, 3, 4]
        self.assertEqual(traverse_obj(_TEST_DATA, ('dict', slice(1))), None,
                         msg='slice on a dictionary should not throw')
        self.assertEqual(traverse_obj(_SLICE_DATA, slice(1)), _SLICE_DATA[:1],
                         msg='slice key should apply slice to sequence')
        self.assertEqual(traverse_obj(_SLICE_DATA, slice(1, 2)), _SLICE_DATA[1:2],
                         msg='slice key should apply slice to sequence')
        self.assertEqual(traverse_obj(_SLICE_DATA, slice(1, 4, 2)), _SLICE_DATA[1:4:2],
                         msg='slice key should apply slice to sequence')
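A usage note on the set-as-key form just tested: a one-element set inside a path applies a transformation (or type filter) to whatever the path has matched so far, which is how extractor code chains cleanup steps. A typical pattern, with hypothetical data, assuming the semantics asserted above:

from yt_dlp.utils import int_or_none, traverse_obj

data = {'items': [{'id': '10'}, {'id': 'x'}, {'id': '30'}]}
# Branch over all items, pull 'id', coerce with int_or_none; None results are dropped
ids = traverse_obj(data, ('items', ..., 'id', {int_or_none}))
assert ids == [10, 30]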

        # Test alternative paths
        self.assertEqual(traverse_obj(_TEST_DATA, 'fail', 'str'), 'str',

@ -2060,15 +2173,23 @@ def test_traverse_obj(self):
                         {0: ['https://www.example.com/1', 'https://www.example.com/0']},
                         msg='tripple nesting in dict path should be treated as branches')
        self.assertEqual(traverse_obj(_TEST_DATA, {0: 'fail'}), {},
                         msg='remove `None` values when dict key')
                         msg='remove `None` values when top level dict key fails')
        self.assertEqual(traverse_obj(_TEST_DATA, {0: 'fail'}, default=...), {0: ...},
                         msg='do not remove `None` values if `default`')
                         msg='use `default` if key fails and `default`')
        self.assertEqual(traverse_obj(_TEST_DATA, {0: 'dict'}), {0: {}},
        self.assertEqual(traverse_obj(_TEST_DATA, {0: 'dict'}), {},
                         msg='do not remove empty values when dict key')
                         msg='remove empty values when dict key')
        self.assertEqual(traverse_obj(_TEST_DATA, {0: 'dict'}, default=...), {0: {}},
        self.assertEqual(traverse_obj(_TEST_DATA, {0: 'dict'}, default=...), {0: ...},
                         msg='do not remove empty values when dict key and a default')
                         msg='use `default` when dict key and `default`')
        self.assertEqual(traverse_obj(_TEST_DATA, {0: ('dict', ...)}), {0: []},
        self.assertEqual(traverse_obj(_TEST_DATA, {0: {0: 'fail'}}), {},
                         msg='if branch in dict key not successful, return `[]`')
                         msg='remove empty values when nested dict key fails')
        self.assertEqual(traverse_obj(None, {0: 'fail'}), {},
                         msg='default to dict if pruned')
        self.assertEqual(traverse_obj(None, {0: 'fail'}, default=...), {0: ...},
                         msg='default to dict if pruned and default is given')
        self.assertEqual(traverse_obj(_TEST_DATA, {0: {0: 'fail'}}, default=...), {0: {0: ...}},
                         msg='use nested `default` when nested dict key fails and `default`')
        self.assertEqual(traverse_obj(_TEST_DATA, {0: ('dict', ...)}), {},
                         msg='remove key if branch in dict key not successful')

        # Testing default parameter behavior
        _DEFAULT_DATA = {'None': None, 'int': 0, 'list': []}

@ -2092,20 +2213,55 @@ def test_traverse_obj(self):
                         msg='if branched but not successful return `[]`, not `default`')
        self.assertEqual(traverse_obj(_DEFAULT_DATA, ('list', ...)), [],
                         msg='if branched but object is empty return `[]`, not `default`')
        self.assertEqual(traverse_obj(None, ...), [],
                         msg='if branched but object is `None` return `[]`, not `default`')
        self.assertEqual(traverse_obj({0: None}, (0, ...)), [],
                         msg='if branched but state is `None` return `[]`, not `default`')

        branching_paths = [
            ('fail', ...),
            (..., 'fail'),
            100 * ('fail',) + (...,),
            (...,) + 100 * ('fail',),
        ]
        for branching_path in branching_paths:
            self.assertEqual(traverse_obj({}, branching_path), [],
                             msg='if branched but state is `None`, return `[]` (not `default`)')
            self.assertEqual(traverse_obj({}, 'fail', branching_path), [],
                             msg='if branching in last alternative and previous did not match, return `[]` (not `default`)')
            self.assertEqual(traverse_obj({0: 'x'}, 0, branching_path), 'x',
                             msg='if branching in last alternative and previous did match, return single value')
            self.assertEqual(traverse_obj({0: 'x'}, branching_path, 0), 'x',
                             msg='if branching in first alternative and non-branching path does match, return single value')
            self.assertEqual(traverse_obj({}, branching_path, 'fail'), None,
                             msg='if branching in first alternative and non-branching path does not match, return `default`')

        # Testing expected_type behavior
        _EXPECTED_TYPE_DATA = {'str': 'str', 'int': 0}
        self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, 'str', expected_type=str), 'str',
                         msg='accept matching `expected_type` type')
        self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, 'str', expected_type=str),
                         'str', msg='accept matching `expected_type` type')
        self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, 'str', expected_type=int), None,
                         msg='reject non matching `expected_type` type')
        self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, 'str', expected_type=int),
                         None, msg='reject non matching `expected_type` type')
        self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, 'int', expected_type=lambda x: str(x)), '0',
                         msg='transform type using type function')
        self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, 'int', expected_type=lambda x: str(x)),
                         '0', msg='transform type using type function')
        self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, 'str',
                         expected_type=lambda _: 1 / 0), None,
                         msg='wrap expected_type fuction in try_call')
        self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, 'str', expected_type=lambda _: 1 / 0),
                         None, msg='wrap expected_type fuction in try_call')
        self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, ..., expected_type=str), ['str'],
                         msg='eliminate items that expected_type fails on')
        self.assertEqual(traverse_obj(_EXPECTED_TYPE_DATA, ..., expected_type=str),
                         ['str'], msg='eliminate items that expected_type fails on')
        self.assertEqual(traverse_obj(_TEST_DATA, {0: 100, 1: 1.2}, expected_type=int),
                         {0: 100}, msg='type as expected_type should filter dict values')
        self.assertEqual(traverse_obj(_TEST_DATA, {0: 100, 1: 1.2, 2: 'None'}, expected_type=str_or_none),
                         {0: '100', 1: '1.2'}, msg='function as expected_type should transform dict values')
        self.assertEqual(traverse_obj(_TEST_DATA, ({0: 1.2}, 0, {int_or_none}), expected_type=int),
                         1, msg='expected_type should not filter non final dict values')
        self.assertEqual(traverse_obj(_TEST_DATA, {0: {0: 100, 1: 'str'}}, expected_type=int),
                         {0: {0: 100}}, msg='expected_type should transform deep dict values')
        self.assertEqual(traverse_obj(_TEST_DATA, [({0: '...'}, {0: '...'})], expected_type=type(...)),
                         [{0: ...}, {0: ...}], msg='expected_type should transform branched dict values')
        self.assertEqual(traverse_obj({1: {3: 4}}, [(1, 2), 3], expected_type=int),
                         [4], msg='expected_type regression for type matching in tuple branching')
        self.assertEqual(traverse_obj(_TEST_DATA, ['data', ...], expected_type=int),
                         [], msg='expected_type regression for type matching in dict result')

        # Test get_all behavior
        _GET_ALL_DATA = {'key': [0, 1, 2]}

@ -2145,31 +2301,23 @@ def test_traverse_obj(self):
                         traverse_string=True), '.',
                         msg='traverse into converted data if `traverse_string`')
        self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, ('str', ...),
                         traverse_string=True), list('str'),
                         traverse_string=True), 'str',
                         msg='`...` branching into string should result in list')
                         msg='`...` should result in string (same value) if `traverse_string`')
        self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, ('str', slice(0, None, 2)),
                         traverse_string=True), 'sr',
                         msg='`slice` should result in string if `traverse_string`')
        self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, ('str', lambda i, v: i or v == "s"),
                         traverse_string=True), 'str',
                         msg='function should result in string if `traverse_string`')
        self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, ('str', (0, 2)),
                         traverse_string=True), ['s', 'r'],
                         msg='branching into string should result in list')
                         msg='branching should result in list if `traverse_string`')
        self.assertEqual(traverse_obj(_TRAVERSE_STRING_DATA, ('str', lambda _, x: x),
                         traverse_string=True), list('str'),
                         msg='function branching into string should result in list')
        self.assertEqual(traverse_obj({}, (0, ...), traverse_string=True), [],
                         msg='branching should result in list if `traverse_string`')
        self.assertEqual(traverse_obj({}, (0, lambda x, y: True), traverse_string=True), [],
                         msg='branching should result in list if `traverse_string`')
        self.assertEqual(traverse_obj({}, (0, slice(1)), traverse_string=True), [],
                         msg='branching should result in list if `traverse_string`')

        # Test is_user_input behavior
        _IS_USER_INPUT_DATA = {'range8': list(range(8))}
        self.assertEqual(traverse_obj(_IS_USER_INPUT_DATA, ('range8', '3'),
                         is_user_input=True), 3,
                         msg='allow for string indexing if `is_user_input`')
        self.assertCountEqual(traverse_obj(_IS_USER_INPUT_DATA, ('range8', '3:'),
                              is_user_input=True), tuple(range(8))[3:],
                              msg='allow for string slice if `is_user_input`')
        self.assertCountEqual(traverse_obj(_IS_USER_INPUT_DATA, ('range8', ':4:2'),
                              is_user_input=True), tuple(range(8))[:4:2],
                              msg='allow step in string slice if `is_user_input`')
        self.assertCountEqual(traverse_obj(_IS_USER_INPUT_DATA, ('range8', ':'),
                              is_user_input=True), range(8),
                              msg='`:` should be treated as `...` if `is_user_input`')
        with self.assertRaises(TypeError, msg='too many params should result in error'):
            traverse_obj(_IS_USER_INPUT_DATA, ('range8', ':::'), is_user_input=True)

        # Test re.Match as input obj
        mobj = re.fullmatch(r'0(12)(?P<group>3)(4)?', '0123')

@ -2189,6 +2337,120 @@ def test_traverse_obj(self):
                         msg='failing str key on a `re.Match` should return `default`')
        self.assertEqual(traverse_obj(mobj, 8), None,
                         msg='failing int key on a `re.Match` should return `default`')
        self.assertEqual(traverse_obj(mobj, lambda k, _: k in (0, 'group')), ['0123', '3'],
                         msg='function on a `re.Match` should give group name as well')

        # Test xml.etree.ElementTree.Element as input obj
        etree = xml.etree.ElementTree.fromstring('''<?xml version="1.0"?>
        <data>
            <country name="Liechtenstein">
                <rank>1</rank>
                <year>2008</year>
                <gdppc>141100</gdppc>
                <neighbor name="Austria" direction="E"/>
                <neighbor name="Switzerland" direction="W"/>
            </country>
            <country name="Singapore">
                <rank>4</rank>
                <year>2011</year>
                <gdppc>59900</gdppc>
                <neighbor name="Malaysia" direction="N"/>
            </country>
            <country name="Panama">
                <rank>68</rank>
                <year>2011</year>
                <gdppc>13600</gdppc>
                <neighbor name="Costa Rica" direction="W"/>
                <neighbor name="Colombia" direction="E"/>
            </country>
        </data>''')
        self.assertEqual(traverse_obj(etree, ''), etree,
                         msg='empty str key should return the element itself')
        self.assertEqual(traverse_obj(etree, 'country'), list(etree),
                         msg='str key should lead all children with that tag name')
        self.assertEqual(traverse_obj(etree, ...), list(etree),
                         msg='`...` as key should return all children')
        self.assertEqual(traverse_obj(etree, lambda _, x: x[0].text == '4'), [etree[1]],
                         msg='function as key should get element as value')
        self.assertEqual(traverse_obj(etree, lambda i, _: i == 1), [etree[1]],
                         msg='function as key should get index as key')
        self.assertEqual(traverse_obj(etree, 0), etree[0],
                         msg='int key should return the nth child')
        self.assertEqual(traverse_obj(etree, './/neighbor/@name'),
                         ['Austria', 'Switzerland', 'Malaysia', 'Costa Rica', 'Colombia'],
                         msg='`@<attribute>` at end of path should give that attribute')
        self.assertEqual(traverse_obj(etree, '//neighbor/@fail'), [None, None, None, None, None],
                         msg='`@<nonexistant>` at end of path should give `None`')
        self.assertEqual(traverse_obj(etree, ('//neighbor/@', 2)), {'name': 'Malaysia', 'direction': 'N'},
                         msg='`@` should give the full attribute dict')
        self.assertEqual(traverse_obj(etree, '//year/text()'), ['2008', '2011', '2011'],
                         msg='`text()` at end of path should give the inner text')
        self.assertEqual(traverse_obj(etree, '//*[@direction]/@direction'), ['E', 'W', 'N', 'W', 'E'],
                         msg='full python xpath features should be supported')
        self.assertEqual(traverse_obj(etree, (0, '@name')), 'Liechtenstein',
                         msg='special transformations should act on current element')
        self.assertEqual(traverse_obj(etree, ('country', 0, ..., 'text()', {int_or_none})), [1, 2008, 141100],
                         msg='special transformations should act on current element')

    def test_http_header_dict(self):
        headers = HTTPHeaderDict()
        headers['ytdl-test'] = b'0'
        self.assertEqual(list(headers.items()), [('Ytdl-Test', '0')])
        headers['ytdl-test'] = 1
        self.assertEqual(list(headers.items()), [('Ytdl-Test', '1')])
        headers['Ytdl-test'] = '2'
        self.assertEqual(list(headers.items()), [('Ytdl-Test', '2')])
        self.assertTrue('ytDl-Test' in headers)
        self.assertEqual(str(headers), str(dict(headers)))
        self.assertEqual(repr(headers), str(dict(headers)))

        headers.update({'X-dlp': 'data'})
        self.assertEqual(set(headers.items()), {('Ytdl-Test', '2'), ('X-Dlp', 'data')})
        self.assertEqual(dict(headers), {'Ytdl-Test': '2', 'X-Dlp': 'data'})
        self.assertEqual(len(headers), 2)
        self.assertEqual(headers.copy(), headers)
        headers2 = HTTPHeaderDict({'X-dlp': 'data3'}, **headers, **{'X-dlp': 'data2'})
        self.assertEqual(set(headers2.items()), {('Ytdl-Test', '2'), ('X-Dlp', 'data2')})
        self.assertEqual(len(headers2), 2)
        headers2.clear()
        self.assertEqual(len(headers2), 0)

        # ensure we prefer latter headers
        headers3 = HTTPHeaderDict({'Ytdl-TeSt': 1}, {'Ytdl-test': 2})
        self.assertEqual(set(headers3.items()), {('Ytdl-Test', '2')})
        del headers3['ytdl-tesT']
        self.assertEqual(dict(headers3), {})

        headers4 = HTTPHeaderDict({'ytdl-test': 'data;'})
        self.assertEqual(set(headers4.items()), {('Ytdl-Test', 'data;')})

        # common mistake: strip whitespace from values
        # https://github.com/yt-dlp/yt-dlp/issues/8729
        headers5 = HTTPHeaderDict({'ytdl-test': ' data; '})
        self.assertEqual(set(headers5.items()), {('Ytdl-Test', 'data;')})
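The behaviour being pinned down — title-cased canonical keys, values coerced to stripped strings, and the last writer winning — is the classic case-insensitive header mapping. A minimal sketch of the storage trick (illustrative only, not the yt_dlp.utils.networking implementation):

class CaseInsensitiveHeaders(dict):
    # Illustrative only: canonicalize keys on every access so lookups ignore case
    def __init__(self, *args, **kwargs):
        super().__init__()
        for mapping in (*args, kwargs):
            for key, value in mapping.items():
                self[key] = value  # route through __setitem__, latter mappings win

    def __setitem__(self, key, value):
        if isinstance(value, bytes):
            value = value.decode('latin-1')
        super().__setitem__(key.title(), str(value).strip())

    def __getitem__(self, key):
        return super().__getitem__(key.title())

    def __delitem__(self, key):
        super().__delitem__(key.title())

    def __contains__(self, key):
        return super().__contains__(str(key).title())

headers = CaseInsensitiveHeaders({'ytdl-TeSt': b' 1 '}, {'YTDL-test': 2})
assert dict(headers) == {'Ytdl-Test': '2'}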

    def test_extract_basic_auth(self):
        assert extract_basic_auth('http://:foo.bar') == ('http://:foo.bar', None)
        assert extract_basic_auth('http://foo.bar') == ('http://foo.bar', None)
        assert extract_basic_auth('http://@foo.bar') == ('http://foo.bar', 'Basic Og==')
        assert extract_basic_auth('http://:pass@foo.bar') == ('http://foo.bar', 'Basic OnBhc3M=')
        assert extract_basic_auth('http://user:@foo.bar') == ('http://foo.bar', 'Basic dXNlcjo=')
        assert extract_basic_auth('http://user:pass@foo.bar') == ('http://foo.bar', 'Basic dXNlcjpwYXNz')
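extract_basic_auth splits user:pass@ credentials out of the URL and hands them back as a ready-made Authorization header value; base64('user:pass') is 'dXNlcjpwYXNz', hence the expected strings. A sketch of the mechanics (illustrative, not the shipped implementation):

import base64
import urllib.parse

def extract_basic_auth_sketch(url):
    # Illustrative only: pull userinfo out of the netloc and re-encode it
    parts = urllib.parse.urlsplit(url)
    if parts.username is None and parts.password is None:
        return url, None  # no userinfo present (note: ':foo.bar' has no '@')
    auth = f'{parts.username or ""}:{parts.password or ""}'
    clean = parts._replace(netloc=parts.netloc.rpartition('@')[2])
    token = base64.b64encode(auth.encode()).decode()
    return urllib.parse.urlunsplit(clean), f'Basic {token}'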

    @unittest.skipUnless(compat_os_name == 'nt', 'Only relevant on Windows')
    def test_Popen_windows_escaping(self):
        def run_shell(args):
            stdout, stderr, error = Popen.run(
                args, text=True, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
            assert not stderr
            assert not error
            return stdout

        # Test escaping
        assert run_shell(['echo', 'test"&']) == '"test""&"\n'
        # Test if delayed expansion is disabled
        assert run_shell(['echo', '^!']) == '"^!"\n'
        assert run_shell('echo "^!"') == '"^!"\n'


if __name__ == '__main__':
383
test/test_websockets.py
Normal file

@ -0,0 +1,383 @@
#!/usr/bin/env python3

# Allow direct execution
import os
import sys

import pytest

from test.helper import verify_address_availability

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

import http.client
import http.cookiejar
import http.server
import json
import random
import ssl
import threading

from yt_dlp import socks
from yt_dlp.cookies import YoutubeDLCookieJar
from yt_dlp.dependencies import websockets
from yt_dlp.networking import Request
from yt_dlp.networking.exceptions import (
    CertificateVerifyError,
    HTTPError,
    ProxyError,
    RequestError,
    SSLError,
    TransportError,
)
from yt_dlp.utils.networking import HTTPHeaderDict

from test.conftest import validate_and_send

TEST_DIR = os.path.dirname(os.path.abspath(__file__))

def websocket_handler(websocket):
    for message in websocket:
        if isinstance(message, bytes):
            if message == b'bytes':
                return websocket.send('2')
        elif isinstance(message, str):
            if message == 'headers':
                return websocket.send(json.dumps(dict(websocket.request.headers)))
            elif message == 'path':
                return websocket.send(websocket.request.path)
            elif message == 'source_address':
                return websocket.send(websocket.remote_address[0])
            elif message == 'str':
                return websocket.send('1')
        return websocket.send(message)
||||||
|
def process_request(self, request):
|
||||||
|
if request.path.startswith('/gen_'):
|
||||||
|
status = http.HTTPStatus(int(request.path[5:]))
|
||||||
|
if 300 <= status.value <= 300:
|
||||||
|
return websockets.http11.Response(
|
||||||
|
status.value, status.phrase, websockets.datastructures.Headers([('Location', '/')]), b'')
|
||||||
|
return self.protocol.reject(status.value, status.phrase)
|
||||||
|
return self.protocol.accept(request)
|
||||||
|
|
||||||
|
|
||||||
|
def create_websocket_server(**ws_kwargs):
|
||||||
|
import websockets.sync.server
|
||||||
|
wsd = websockets.sync.server.serve(websocket_handler, '127.0.0.1', 0, process_request=process_request, **ws_kwargs)
|
||||||
|
ws_port = wsd.socket.getsockname()[1]
|
||||||
|
ws_server_thread = threading.Thread(target=wsd.serve_forever)
|
||||||
|
ws_server_thread.daemon = True
|
||||||
|
ws_server_thread.start()
|
||||||
|
return ws_server_thread, ws_port
|
||||||
|
|
||||||
|
|
||||||
|
def create_ws_websocket_server():
|
||||||
|
return create_websocket_server()
|
||||||
|
|
||||||
|
|
||||||
|
def create_wss_websocket_server():
|
||||||
|
certfn = os.path.join(TEST_DIR, 'testcert.pem')
|
||||||
|
sslctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
|
||||||
|
sslctx.load_cert_chain(certfn, None)
|
||||||
|
return create_websocket_server(ssl_context=sslctx)
|
||||||
|
|
||||||
|
|
||||||
|
MTLS_CERT_DIR = os.path.join(TEST_DIR, 'testdata', 'certificate')
|
||||||
|
|
||||||
|
|
||||||
|
def create_mtls_wss_websocket_server():
|
||||||
|
certfn = os.path.join(TEST_DIR, 'testcert.pem')
|
||||||
|
cacertfn = os.path.join(MTLS_CERT_DIR, 'ca.crt')
|
||||||
|
|
||||||
|
sslctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
|
||||||
|
sslctx.verify_mode = ssl.CERT_REQUIRED
|
||||||
|
sslctx.load_verify_locations(cafile=cacertfn)
|
||||||
|
sslctx.load_cert_chain(certfn, None)
|
||||||
|
|
||||||
|
return create_websocket_server(ssl_context=sslctx)
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.skipif(not websockets, reason='websockets must be installed to test websocket request handlers')
|
||||||
|
class TestWebsSocketRequestHandlerConformance:
|
||||||
|
@classmethod
|
||||||
|
def setup_class(cls):
|
||||||
|
cls.ws_thread, cls.ws_port = create_ws_websocket_server()
|
||||||
|
cls.ws_base_url = f'ws://127.0.0.1:{cls.ws_port}'
|
||||||
|
|
||||||
|
cls.wss_thread, cls.wss_port = create_wss_websocket_server()
|
||||||
|
cls.wss_base_url = f'wss://127.0.0.1:{cls.wss_port}'
|
||||||
|
|
||||||
|
cls.bad_wss_thread, cls.bad_wss_port = create_websocket_server(ssl_context=ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER))
|
||||||
|
cls.bad_wss_host = f'wss://127.0.0.1:{cls.bad_wss_port}'
|
||||||
|
|
||||||
|
cls.mtls_wss_thread, cls.mtls_wss_port = create_mtls_wss_websocket_server()
|
||||||
|
cls.mtls_wss_base_url = f'wss://127.0.0.1:{cls.mtls_wss_port}'
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
|
||||||
|
def test_basic_websockets(self, handler):
|
||||||
|
with handler() as rh:
|
||||||
|
ws = validate_and_send(rh, Request(self.ws_base_url))
|
||||||
|
assert 'upgrade' in ws.headers
|
||||||
|
assert ws.status == 101
|
||||||
|
ws.send('foo')
|
||||||
|
assert ws.recv() == 'foo'
|
||||||
|
ws.close()
|
||||||
|
|
||||||
|
# https://www.rfc-editor.org/rfc/rfc6455.html#section-5.6
|
||||||
|
@pytest.mark.parametrize('msg,opcode', [('str', 1), (b'bytes', 2)])
|
||||||
|
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
|
||||||
|
def test_send_types(self, handler, msg, opcode):
|
||||||
|
with handler() as rh:
|
||||||
|
ws = validate_and_send(rh, Request(self.ws_base_url))
|
||||||
|
ws.send(msg)
|
||||||
|
assert int(ws.recv()) == opcode
|
||||||
|
ws.close()
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
|
||||||
|
def test_verify_cert(self, handler):
|
||||||
|
with handler() as rh:
|
||||||
|
with pytest.raises(CertificateVerifyError):
|
||||||
|
validate_and_send(rh, Request(self.wss_base_url))
|
||||||
|
|
||||||
|
with handler(verify=False) as rh:
|
||||||
|
ws = validate_and_send(rh, Request(self.wss_base_url))
|
||||||
|
assert ws.status == 101
|
||||||
|
ws.close()
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
|
||||||
|
def test_ssl_error(self, handler):
|
||||||
|
with handler(verify=False) as rh:
|
||||||
|
with pytest.raises(SSLError, match=r'ssl(?:v3|/tls) alert handshake failure') as exc_info:
|
||||||
|
validate_and_send(rh, Request(self.bad_wss_host))
|
||||||
|
assert not issubclass(exc_info.type, CertificateVerifyError)
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
|
||||||
|
@pytest.mark.parametrize('path,expected', [
|
||||||
|
# Unicode characters should be encoded with uppercase percent-encoding
|
||||||
|
('/中文', '/%E4%B8%AD%E6%96%87'),
|
||||||
|
# don't normalize existing percent encodings
|
||||||
|
('/%c7%9f', '/%c7%9f'),
|
||||||
|
])
|
||||||
|
def test_percent_encode(self, handler, path, expected):
|
||||||
|
with handler() as rh:
|
||||||
|
ws = validate_and_send(rh, Request(f'{self.ws_base_url}{path}'))
|
||||||
|
ws.send('path')
|
||||||
|
assert ws.recv() == expected
|
||||||
|
assert ws.status == 101
|
||||||
|
ws.close()
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
|
||||||
|
def test_remove_dot_segments(self, handler):
|
||||||
|
with handler() as rh:
|
||||||
|
# This isn't a comprehensive test,
|
||||||
|
# but it should be enough to check whether the handler is removing dot segments
|
||||||
|
ws = validate_and_send(rh, Request(f'{self.ws_base_url}/a/b/./../../test'))
|
||||||
|
assert ws.status == 101
|
||||||
|
ws.send('path')
|
||||||
|
assert ws.recv() == '/test'
|
||||||
|
ws.close()
|
||||||
|
|
||||||
|
# We are restricted to known HTTP status codes in http.HTTPStatus
|
||||||
|
# Redirects are not supported for websockets
|
||||||
|
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
|
||||||
|
@pytest.mark.parametrize('status', (200, 204, 301, 302, 303, 400, 500, 511))
|
||||||
|
def test_raise_http_error(self, handler, status):
|
||||||
|
with handler() as rh:
|
||||||
|
with pytest.raises(HTTPError) as exc_info:
|
||||||
|
validate_and_send(rh, Request(f'{self.ws_base_url}/gen_{status}'))
|
||||||
|
assert exc_info.value.status == status
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
|
||||||
|
@pytest.mark.parametrize('params,extensions', [
|
||||||
|
({'timeout': 0.00001}, {}),
|
||||||
|
({}, {'timeout': 0.00001}),
|
||||||
|
])
|
||||||
|
def test_timeout(self, handler, params, extensions):
|
||||||
|
with handler(**params) as rh:
|
||||||
|
with pytest.raises(TransportError):
|
||||||
|
validate_and_send(rh, Request(self.ws_base_url, extensions=extensions))
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
|
||||||
|
def test_cookies(self, handler):
|
||||||
|
cookiejar = YoutubeDLCookieJar()
|
||||||
|
cookiejar.set_cookie(http.cookiejar.Cookie(
|
||||||
|
version=0, name='test', value='ytdlp', port=None, port_specified=False,
|
||||||
|
domain='127.0.0.1', domain_specified=True, domain_initial_dot=False, path='/',
|
||||||
|
path_specified=True, secure=False, expires=None, discard=False, comment=None,
|
||||||
|
comment_url=None, rest={}))
|
||||||
|
|
||||||
|
with handler(cookiejar=cookiejar) as rh:
|
||||||
|
ws = validate_and_send(rh, Request(self.ws_base_url))
|
||||||
|
ws.send('headers')
|
||||||
|
assert json.loads(ws.recv())['cookie'] == 'test=ytdlp'
|
||||||
|
ws.close()
|
||||||
|
|
||||||
|
with handler() as rh:
|
||||||
|
ws = validate_and_send(rh, Request(self.ws_base_url))
|
||||||
|
ws.send('headers')
|
||||||
|
assert 'cookie' not in json.loads(ws.recv())
|
||||||
|
ws.close()
|
||||||
|
|
||||||
|
ws = validate_and_send(rh, Request(self.ws_base_url, extensions={'cookiejar': cookiejar}))
|
||||||
|
ws.send('headers')
|
||||||
|
assert json.loads(ws.recv())['cookie'] == 'test=ytdlp'
|
||||||
|
ws.close()
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
|
||||||
|
def test_source_address(self, handler):
|
||||||
|
source_address = f'127.0.0.{random.randint(5, 255)}'
|
||||||
|
verify_address_availability(source_address)
|
||||||
|
with handler(source_address=source_address) as rh:
|
||||||
|
ws = validate_and_send(rh, Request(self.ws_base_url))
|
||||||
|
ws.send('source_address')
|
||||||
|
assert source_address == ws.recv()
|
||||||
|
ws.close()
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
|
||||||
|
def test_response_url(self, handler):
|
||||||
|
with handler() as rh:
|
||||||
|
url = f'{self.ws_base_url}/something'
|
||||||
|
ws = validate_and_send(rh, Request(url))
|
||||||
|
assert ws.url == url
|
||||||
|
ws.close()
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
|
||||||
|
def test_request_headers(self, handler):
|
||||||
|
with handler(headers=HTTPHeaderDict({'test1': 'test', 'test2': 'test2'})) as rh:
|
||||||
|
# Global Headers
|
||||||
|
ws = validate_and_send(rh, Request(self.ws_base_url))
|
||||||
|
ws.send('headers')
|
||||||
|
headers = HTTPHeaderDict(json.loads(ws.recv()))
|
||||||
|
assert headers['test1'] == 'test'
|
||||||
|
ws.close()
|
||||||
|
|
||||||
|
# Per request headers, merged with global
|
||||||
|
ws = validate_and_send(rh, Request(
|
||||||
|
self.ws_base_url, headers={'test2': 'changed', 'test3': 'test3'}))
|
||||||
|
ws.send('headers')
|
||||||
|
headers = HTTPHeaderDict(json.loads(ws.recv()))
|
||||||
|
assert headers['test1'] == 'test'
|
||||||
|
assert headers['test2'] == 'changed'
|
||||||
|
assert headers['test3'] == 'test3'
|
||||||
|
ws.close()
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('client_cert', (
|
||||||
|
{'client_certificate': os.path.join(MTLS_CERT_DIR, 'clientwithkey.crt')},
|
||||||
|
{
|
||||||
|
'client_certificate': os.path.join(MTLS_CERT_DIR, 'client.crt'),
|
||||||
|
'client_certificate_key': os.path.join(MTLS_CERT_DIR, 'client.key'),
|
||||||
|
},
|
||||||
|
{
|
||||||
|
'client_certificate': os.path.join(MTLS_CERT_DIR, 'clientwithencryptedkey.crt'),
|
||||||
|
'client_certificate_password': 'foobar',
|
||||||
|
},
|
||||||
|
{
|
||||||
|
'client_certificate': os.path.join(MTLS_CERT_DIR, 'client.crt'),
|
||||||
|
'client_certificate_key': os.path.join(MTLS_CERT_DIR, 'clientencrypted.key'),
|
||||||
|
'client_certificate_password': 'foobar',
|
||||||
|
}
|
||||||
|
))
|
||||||
|
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
|
||||||
|
def test_mtls(self, handler, client_cert):
|
||||||
|
with handler(
|
||||||
|
# Disable client-side validation of unacceptable self-signed testcert.pem
|
||||||
|
# The test is of a check on the server side, so unaffected
|
||||||
|
verify=False,
|
||||||
|
client_cert=client_cert
|
||||||
|
) as rh:
|
||||||
|
validate_and_send(rh, Request(self.mtls_wss_base_url)).close()
|
||||||
|
|
||||||
|
|
||||||
|
def create_fake_ws_connection(raised):
|
||||||
|
import websockets.sync.client
|
||||||
|
|
||||||
|
class FakeWsConnection(websockets.sync.client.ClientConnection):
|
||||||
|
def __init__(self, *args, **kwargs):
|
||||||
|
class FakeResponse:
|
||||||
|
body = b''
|
||||||
|
headers = {}
|
||||||
|
status_code = 101
|
||||||
|
reason_phrase = 'test'
|
||||||
|
|
||||||
|
self.response = FakeResponse()
|
||||||
|
|
||||||
|
def send(self, *args, **kwargs):
|
||||||
|
raise raised()
|
||||||
|
|
||||||
|
def recv(self, *args, **kwargs):
|
||||||
|
raise raised()
|
||||||
|
|
||||||
|
def close(self, *args, **kwargs):
|
||||||
|
return
|
||||||
|
|
||||||
|
return FakeWsConnection()
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('handler', ['Websockets'], indirect=True)
|
||||||
|
class TestWebsocketsRequestHandler:
|
||||||
|
@pytest.mark.parametrize('raised,expected', [
|
||||||
|
# https://websockets.readthedocs.io/en/stable/reference/exceptions.html
|
||||||
|
(lambda: websockets.exceptions.InvalidURI(msg='test', uri='test://'), RequestError),
|
||||||
|
# Requires a response object. Should be covered by HTTP error tests.
|
||||||
|
# (lambda: websockets.exceptions.InvalidStatus(), TransportError),
|
||||||
|
(lambda: websockets.exceptions.InvalidHandshake(), TransportError),
|
||||||
|
# These are subclasses of InvalidHandshake
|
||||||
|
(lambda: websockets.exceptions.InvalidHeader(name='test'), TransportError),
|
||||||
|
(lambda: websockets.exceptions.NegotiationError(), TransportError),
|
||||||
|
# Catch-all
|
||||||
|
(lambda: websockets.exceptions.WebSocketException(), TransportError),
|
||||||
|
(lambda: TimeoutError(), TransportError),
|
||||||
|
# These may be raised by our create_connection implementation, which should also be caught
|
||||||
|
(lambda: OSError(), TransportError),
|
||||||
|
(lambda: ssl.SSLError(), SSLError),
|
||||||
|
(lambda: ssl.SSLCertVerificationError(), CertificateVerifyError),
|
||||||
|
(lambda: socks.ProxyError(), ProxyError),
|
||||||
|
])
|
||||||
|
def test_request_error_mapping(self, handler, monkeypatch, raised, expected):
|
||||||
|
import websockets.sync.client
|
||||||
|
|
||||||
|
import yt_dlp.networking._websockets
|
||||||
|
with handler() as rh:
|
||||||
|
def fake_connect(*args, **kwargs):
|
||||||
|
raise raised()
|
||||||
|
monkeypatch.setattr(yt_dlp.networking._websockets, 'create_connection', lambda *args, **kwargs: None)
|
||||||
|
monkeypatch.setattr(websockets.sync.client, 'connect', fake_connect)
|
||||||
|
with pytest.raises(expected) as exc_info:
|
||||||
|
rh.send(Request('ws://fake-url'))
|
||||||
|
assert exc_info.type is expected
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('raised,expected,match', [
|
||||||
|
# https://websockets.readthedocs.io/en/stable/reference/sync/client.html#websockets.sync.client.ClientConnection.send
|
||||||
|
(lambda: websockets.exceptions.ConnectionClosed(None, None), TransportError, None),
|
||||||
|
(lambda: RuntimeError(), TransportError, None),
|
||||||
|
(lambda: TimeoutError(), TransportError, None),
|
||||||
|
(lambda: TypeError(), RequestError, None),
|
||||||
|
(lambda: socks.ProxyError(), ProxyError, None),
|
||||||
|
# Catch-all
|
||||||
|
(lambda: websockets.exceptions.WebSocketException(), TransportError, None),
|
||||||
|
])
|
||||||
|
def test_ws_send_error_mapping(self, handler, monkeypatch, raised, expected, match):
|
||||||
|
from yt_dlp.networking._websockets import WebsocketsResponseAdapter
|
||||||
|
ws = WebsocketsResponseAdapter(create_fake_ws_connection(raised), url='ws://fake-url')
|
||||||
|
with pytest.raises(expected, match=match) as exc_info:
|
||||||
|
ws.send('test')
|
||||||
|
assert exc_info.type is expected
|
||||||
|
|
||||||
|
@pytest.mark.parametrize('raised,expected,match', [
|
||||||
|
# https://websockets.readthedocs.io/en/stable/reference/sync/client.html#websockets.sync.client.ClientConnection.recv
|
||||||
|
(lambda: websockets.exceptions.ConnectionClosed(None, None), TransportError, None),
|
||||||
|
(lambda: RuntimeError(), TransportError, None),
|
||||||
|
(lambda: TimeoutError(), TransportError, None),
|
||||||
|
(lambda: socks.ProxyError(), ProxyError, None),
|
||||||
|
# Catch-all
|
||||||
|
(lambda: websockets.exceptions.WebSocketException(), TransportError, None),
|
||||||
|
])
|
||||||
|
def test_ws_recv_error_mapping(self, handler, monkeypatch, raised, expected, match):
|
||||||
|
from yt_dlp.networking._websockets import WebsocketsResponseAdapter
|
||||||
|
ws = WebsocketsResponseAdapter(create_fake_ws_connection(raised), url='ws://fake-url')
|
||||||
|
with pytest.raises(expected, match=match) as exc_info:
|
||||||
|
ws.recv()
|
||||||
|
assert exc_info.type is expected
|
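As an aside, the echo handler above can be driven directly with the websockets sync client (part of the websockets package since v11). A minimal sketch, assuming it is run from the repo root so that create_ws_websocket_server is importable:

    import json

    import websockets.sync.client

    from test.test_websockets import create_ws_websocket_server

    _, port = create_ws_websocket_server()
    with websockets.sync.client.connect(f'ws://127.0.0.1:{port}') as ws:
        ws.send('headers')            # ask the handler to reflect the request headers
        print(json.loads(ws.recv()))  # handler replies with a JSON dict, then returns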
@@ -62,10 +62,19 @@
         'https://s.ytimg.com/yts/jsbin/html5player-en_US-vflKjOTVq/html5player.js',
         '312AA52209E3623129A412D56A40F11CB0AF14AE.3EE09501CB14E3BCDC3B2AE808BF3F1D14E7FBF12',
         '112AA5220913623229A412D56A40F11CB0AF14AE.3EE0950FCB14EEBCDC3B2AE808BF331D14E7FBF3',
-    )
+    ),
+    (
+        'https://www.youtube.com/s/player/6ed0d907/player_ias.vflset/en_US/base.js',
+        '2aq0aqSyOoJXtK73m-uME_jv7-pT15gOFC02RFkGMqWpzEICs69VdbwQ0LDp1v7j8xx92efCJlYFYb1sUkkBSPOlPmXgIARw8JQ0qOAOAA',
+        'AOq0QJ8wRAIgXmPlOPSBkkUs1bYFYlJCfe29xx8j7v1pDL2QwbdV96sCIEzpWqMGkFR20CFOg51Tp-7vj_EMu-m37KtXJoOySqa0',
+    ),
 ]

 _NSIG_TESTS = [
+    (
+        'https://www.youtube.com/s/player/7862ca1f/player_ias.vflset/en_US/base.js',
+        'X_LCxVDjAavgE5t', 'yxJ1dM6iz5ogUg',
+    ),
     (
         'https://www.youtube.com/s/player/9216d1f7/player_ias.vflset/en_US/base.js',
         'SLp9F5bwjAdhE9F-', 'gWnb9IK2DJ8Q1w',
@@ -134,6 +143,26 @@
         'https://www.youtube.com/s/player/7a062b77/player_ias.vflset/en_US/base.js',
         'NRcE3y3mVtm_cV-W', 'VbsCYUATvqlt5w',
     ),
+    (
+        'https://www.youtube.com/s/player/dac945fd/player_ias.vflset/en_US/base.js',
+        'o8BkRxXhuYsBCWi6RplPdP', '3Lx32v_hmzTm6A',
+    ),
+    (
+        'https://www.youtube.com/s/player/6f20102c/player_ias.vflset/en_US/base.js',
+        'lE8DhoDmKqnmJJ', 'pJTTX6XyJP2BYw',
+    ),
+    (
+        'https://www.youtube.com/s/player/cfa9e7cb/player_ias.vflset/en_US/base.js',
+        'aCi3iElgd2kq0bxVbQ', 'QX1y8jGb2IbZ0w',
+    ),
+    (
+        'https://www.youtube.com/s/player/8c7583ff/player_ias.vflset/en_US/base.js',
+        '1wWCVpRR96eAmMI87L', 'KSkWAVv1ZQxC3A',
+    ),
+    (
+        'https://www.youtube.com/s/player/b7910ca8/player_ias.vflset/en_US/base.js',
+        '_hXMCwMt9qE310D', 'LoZMgkkofRMCZQ',
+    ),
 ]
@@ -210,7 +239,7 @@ def n_sig(jscode, sig_input):
 make_sig_test = t_factory(
-    'signature', signature, re.compile(r'.*-(?P<id>[a-zA-Z0-9_-]+)(?:/watch_as3|/html5player)?\.[a-z]+$'))
+    'signature', signature, re.compile(r'.*(?:-|/player/)(?P<id>[a-zA-Z0-9_-]+)(?:/.+\.js|(?:/watch_as3|/html5player)?\.[a-z]+)$'))
 for test_spec in _SIG_TESTS:
     make_sig_test(*test_spec)
@@ -1,34 +0,0 @@
-{
-    "latest": "2013.01.06",
-    "signature": "72158cdba391628569ffdbea259afbcf279bbe3d8aeb7492690735dc1cfa6afa754f55c61196f3871d429599ab22f2667f1fec98865527b32632e7f4b3675a7ef0f0fbe084d359256ae4bba68f0d33854e531a70754712f244be71d4b92e664302aa99653ee4df19800d955b6c4149cd2b3f24288d6e4b40b16126e01f4c8ce6",
-    "versions": {
-        "2013.01.02": {
-            "bin": [
-                "http://youtube-dl.org/downloads/2013.01.02/youtube-dl",
-                "f5b502f8aaa77675c4884938b1e4871ebca2611813a0c0e74f60c0fbd6dcca6b"
-            ],
-            "exe": [
-                "http://youtube-dl.org/downloads/2013.01.02/youtube-dl.exe",
-                "75fa89d2ce297d102ff27675aa9d92545bbc91013f52ec52868c069f4f9f0422"
-            ],
-            "tar": [
-                "http://youtube-dl.org/downloads/2013.01.02/youtube-dl-2013.01.02.tar.gz",
-                "6a66d022ac8e1c13da284036288a133ec8dba003b7bd3a5179d0c0daca8c8196"
-            ]
-        },
-        "2013.01.06": {
-            "bin": [
-                "http://youtube-dl.org/downloads/2013.01.06/youtube-dl",
-                "64b6ed8865735c6302e836d4d832577321b4519aa02640dc508580c1ee824049"
-            ],
-            "exe": [
-                "http://youtube-dl.org/downloads/2013.01.06/youtube-dl.exe",
-                "58609baf91e4389d36e3ba586e21dab882daaaee537e4448b1265392ae86ff84"
-            ],
-            "tar": [
-                "http://youtube-dl.org/downloads/2013.01.06/youtube-dl-2013.01.06.tar.gz",
-                "fe77ab20a95d980ed17a659aa67e371fdd4d656d19c4c7950e7b720b0c2f1a86"
-            ]
-        }
-    }
-}
@@ -1 +1 @@
-@py -bb -Werror -Xdev "%~dp0yt_dlp\__main__.py" %*
+@py -Werror -Xdev "%~dp0yt_dlp\__main__.py" %*
@@ -1,2 +1,2 @@
 #!/usr/bin/env sh
-exec "${PYTHON:-python3}" -bb -Werror -Xdev "$(dirname "$(realpath "$0")")/yt_dlp/__main__.py" "$@"
+exec "${PYTHON:-python3}" -Werror -Xdev "$(dirname "$(realpath "$0")")/yt_dlp/__main__.py" "$@"
(File diff suppressed because it is too large)
@@ -1,8 +1,8 @@
-try:
-    import contextvars  # noqa: F401
-except Exception:
-    raise Exception(
-        f'You are using an unsupported version of Python. Only Python versions 3.7 and above are supported by yt-dlp')  # noqa: F541
+import sys
+
+if sys.version_info < (3, 8):
+    raise ImportError(
+        f'You are using an unsupported version of Python. Only Python versions 3.8 and above are supported by yt-dlp')  # noqa: F541

 __license__ = 'Public Domain'
@@ -12,7 +12,7 @@
 import optparse
 import os
 import re
-import sys
+import traceback

 from .compat import compat_shlex_quote
 from .cookies import SUPPORTED_BROWSERS, SUPPORTED_KEYRINGS
@@ -56,11 +56,11 @@
     read_stdin,
     render_table,
     setproctitle,
-    std_headers,
     traverse_obj,
     variadic,
     write_string,
 )
+from .utils.networking import std_headers
 from .YoutubeDL import YoutubeDL

 _IN_CLI = False
@@ -73,14 +73,16 @@ def _exit(status=0, *args):


 def get_urls(urls, batchfile, verbose):
-    # Batch file verification
+    """
+    @param verbose -1: quiet, 0: normal, 1: verbose
+    """
     batch_urls = []
     if batchfile is not None:
         try:
             batch_urls = read_batch_urls(
-                read_stdin('URLs') if batchfile == '-'
+                read_stdin(None if verbose == -1 else 'URLs') if batchfile == '-'
                 else open(expand_path(batchfile), encoding='utf-8', errors='ignore'))
-            if verbose:
+            if verbose == 1:
                 write_string('[debug] Batch file urls: ' + repr(batch_urls) + '\n')
         except OSError:
             _exit(f'ERROR: batch file {batchfile} could not be read')
@@ -187,8 +189,8 @@ def validate_minmax(min_val, max_val, min_name, max_name=None):
         raise ValueError(f'{max_name} "{max_val}" must be must be greater than or equal to {min_name} "{min_val}"')

     # Usernames and passwords
-    validate(not opts.usenetrc or (opts.username is None and opts.password is None),
-             '.netrc', msg='using {name} conflicts with giving username/password')
+    validate(sum(map(bool, (opts.usenetrc, opts.netrc_cmd, opts.username))) <= 1, '.netrc',
+             msg='{name}, netrc command and username/password are mutually exclusive options')
     validate(opts.password is None or opts.username is not None, 'account username', msg='{name} missing')
     validate(opts.ap_password is None or opts.ap_username is not None,
              'TV Provider account username', msg='{name} missing')
@@ -318,31 +320,50 @@ def validate_outtmpl(tmpl, msg):
     if outtmpl_default == '':
         opts.skip_download = None
         del opts.outtmpl['default']
-    if outtmpl_default and not os.path.splitext(outtmpl_default)[1] and opts.extractaudio:
-        raise ValueError(
-            'Cannot download a video and extract audio into the same file! '
-            f'Use "{outtmpl_default}.%(ext)s" instead of "{outtmpl_default}" as the output template')

-    def parse_chapters(name, value):
-        chapters, ranges = [], []
+    def parse_chapters(name, value, advanced=False):
         parse_timestamp = lambda x: float('inf') if x in ('inf', 'infinite') else parse_duration(x)
-        for regex in value or []:
-            if regex.startswith('*'):
-                for range_ in map(str.strip, regex[1:].split(',')):
-                    mobj = range_ != '-' and re.fullmatch(r'([^-]+)?\s*-\s*([^-]+)?', range_)
-                    dur = mobj and (parse_timestamp(mobj.group(1) or '0'), parse_timestamp(mobj.group(2) or 'inf'))
-                    if None in (dur or [None]):
-                        raise ValueError(f'invalid {name} time range "{regex}". Must be of the form "*start-end"')
-                    ranges.append(dur)
-                continue
-            try:
-                chapters.append(re.compile(regex))
-            except re.error as err:
-                raise ValueError(f'invalid {name} regex "{regex}" - {err}')
-        return chapters, ranges
+        TIMESTAMP_RE = r'''(?x)(?:
+            (?P<start_sign>-?)(?P<start>[^-]+)
+        )?\s*-\s*(?:
+            (?P<end_sign>-?)(?P<end>[^-]+)
+        )?'''

-    opts.remove_chapters, opts.remove_ranges = parse_chapters('--remove-chapters', opts.remove_chapters)
-    opts.download_ranges = download_range_func(*parse_chapters('--download-sections', opts.download_ranges))
+        chapters, ranges, from_url = [], [], False
+        for regex in value or []:
+            if advanced and regex == '*from-url':
+                from_url = True
+                continue
+            elif not regex.startswith('*'):
+                try:
+                    chapters.append(re.compile(regex))
+                except re.error as err:
+                    raise ValueError(f'invalid {name} regex "{regex}" - {err}')
+                continue
+
+            for range_ in map(str.strip, regex[1:].split(',')):
+                mobj = range_ != '-' and re.fullmatch(TIMESTAMP_RE, range_)
+                dur = mobj and [parse_timestamp(mobj.group('start') or '0'), parse_timestamp(mobj.group('end') or 'inf')]
+                signs = mobj and (mobj.group('start_sign'), mobj.group('end_sign'))
+
+                err = None
+                if None in (dur or [None]):
+                    err = 'Must be of the form "*start-end"'
+                elif not advanced and any(signs):
+                    err = 'Negative timestamps are not allowed'
+                else:
+                    dur[0] *= -1 if signs[0] else 1
+                    dur[1] *= -1 if signs[1] else 1
+                    if dur[1] == float('-inf'):
+                        err = '"-inf" is not a valid end'
+                if err:
+                    raise ValueError(f'invalid {name} time range "{regex}". {err}')
+                ranges.append(dur)
+
+        return chapters, ranges, from_url
+
+    opts.remove_chapters, opts.remove_ranges, _ = parse_chapters('--remove-chapters', opts.remove_chapters)
+    opts.download_ranges = download_range_func(*parse_chapters('--download-sections', opts.download_ranges, True))

     # Cookies from browser
     if opts.cookiesfrombrowser:
@@ -400,14 +421,19 @@ def metadataparser_actions(f):
         except Exception as err:
             raise ValueError(f'Invalid playlist-items {opts.playlist_items!r}: {err}')

-    geo_bypass_code = opts.geo_bypass_ip_block or opts.geo_bypass_country
-    if geo_bypass_code is not None:
+    opts.geo_bypass_country, opts.geo_bypass_ip_block = None, None
+    if opts.geo_bypass.lower() not in ('default', 'never'):
         try:
-            GeoUtils.random_ipv4(geo_bypass_code)
+            GeoUtils.random_ipv4(opts.geo_bypass)
         except Exception:
-            raise ValueError('unsupported geo-bypass country or ip-block')
+            raise ValueError(f'Unsupported --xff "{opts.geo_bypass}"')
+        if len(opts.geo_bypass) == 2:
+            opts.geo_bypass_country = opts.geo_bypass
+        else:
+            opts.geo_bypass_ip_block = opts.geo_bypass
+    opts.geo_bypass = opts.geo_bypass.lower() != 'never'

-    opts.match_filter = match_filter_func(opts.match_filter)
+    opts.match_filter = match_filter_func(opts.match_filter, opts.breaking_match_filter)

     if opts.download_archive is not None:
         opts.download_archive = expand_path(opts.download_archive)
@@ -434,6 +460,10 @@ def metadataparser_actions(f):
     elif ed and proto == 'default':
         default_downloader = ed.get_basename()

+    for policy in opts.color.values():
+        if policy not in ('always', 'auto', 'no_color', 'never'):
+            raise ValueError(f'"{policy}" is not a valid color policy')
+
     warnings, deprecation_warnings = [], []

     # Common mistake: -f best
@@ -693,7 +723,7 @@ def get_postprocessors(opts):
 def parse_options(argv=None):
     """@returns ParsedOptions(parser, opts, urls, ydl_opts)"""
     parser, opts, urls = parseOpts(argv)
-    urls = get_urls(urls, opts.batchfile, opts.verbose)
+    urls = get_urls(urls, opts.batchfile, -1 if opts.quiet and not opts.verbose else opts.verbose)

     set_compat_opts(opts)
     try:
@@ -708,6 +738,8 @@ def parse_options(argv=None):
         'dumpjson', 'dump_single_json', 'getdescription', 'getduration', 'getfilename',
         'getformat', 'getid', 'getthumbnail', 'gettitle', 'geturl'
     ))
+    if opts.quiet is None:
+        opts.quiet = any_getting or opts.print_json or bool(opts.forceprint)

     playlist_pps = [pp for pp in postprocessors if pp.get('when') == 'playlist']
     write_playlist_infojson = (opts.writeinfojson and not opts.clean_infojson
@@ -733,6 +765,7 @@ def parse_options(argv=None):
     return ParsedOptions(parser, opts, urls, {
         'usenetrc': opts.usenetrc,
         'netrc_location': opts.netrc_location,
+        'netrc_cmd': opts.netrc_cmd,
         'username': opts.username,
         'password': opts.password,
         'twofactor': opts.twofactor,
@@ -743,7 +776,7 @@ def parse_options(argv=None):
         'client_certificate': opts.client_certificate,
         'client_certificate_key': opts.client_certificate_key,
         'client_certificate_password': opts.client_certificate_password,
-        'quiet': opts.quiet or any_getting or opts.print_json or bool(opts.forceprint),
+        'quiet': opts.quiet,
         'no_warnings': opts.no_warnings,
         'forceurl': opts.geturl,
         'forcetitle': opts.gettitle,
@@ -890,7 +923,7 @@ def parse_options(argv=None):
         'playlist_items': opts.playlist_items,
         'xattr_set_filesize': opts.xattr_set_filesize,
         'match_filter': opts.match_filter,
-        'no_color': opts.no_color,
+        'color': opts.color,
         'ffmpeg_location': opts.ffmpeg_location,
         'hls_prefer_native': opts.hls_prefer_native,
         'hls_use_mpegts': opts.hls_use_mpegts,
@@ -934,14 +967,18 @@ def _real_main(argv=None):
     if opts.rm_cachedir:
         ydl.cache.remove()

-    updater = Updater(ydl)
-    if opts.update_self and updater.update() and actual_use:
-        if updater.cmd:
-            return updater.restart()
-        # This code is reachable only for zip variant in py < 3.10
-        # It makes sense to exit here, but the old behavior is to continue
-        ydl.report_warning('Restart yt-dlp to use the updated version')
-        # return 100, 'ERROR: The program must exit for the update to complete'
+    try:
+        updater = Updater(ydl, opts.update_self)
+        if opts.update_self and updater.update() and actual_use:
+            if updater.cmd:
+                return updater.restart()
+            # This code is reachable only for zip variant in py < 3.10
+            # It makes sense to exit here, but the old behavior is to continue
+            ydl.report_warning('Restart yt-dlp to use the updated version')
+            # return 100, 'ERROR: The program must exit for the update to complete'
+    except Exception:
+        traceback.print_exc()
+        ydl._download_retcode = 100

     if not actual_use:
         if pre_process:
@@ -955,6 +992,8 @@ def _real_main(argv=None):
     parser.destroy()
     try:
         if opts.load_info_filename is not None:
+            if all_urls:
+                ydl.report_warning('URLs are ignored due to --load-info-json')
             return ydl.download_with_info_file(expand_path(opts.load_info_filename))
         else:
             return ydl.download(all_urls)
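For illustration, the TIMESTAMP_RE added above splits a --download-sections spec into optionally signed start/end halves (the signs are what enable the new negative, from-the-end timestamps in advanced mode). A standalone sketch with hypothetical inputs:

    import re

    TIMESTAMP_RE = r'''(?x)(?:
        (?P<start_sign>-?)(?P<start>[^-]+)
    )?\s*-\s*(?:
        (?P<end_sign>-?)(?P<end>[^-]+)
    )?'''

    mobj = re.fullmatch(TIMESTAMP_RE, '10:15-inf')
    print(mobj.group('start'), mobj.group('end'))  # 10:15 inf

    mobj = re.fullmatch(TIMESTAMP_RE, '-10:15-inf')  # e.g. the last 10:15 of a video
    print(mobj.group('start_sign'), mobj.group('start'), mobj.group('end'))  # - 10:15 inf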
yt_dlp/__pyinstaller/__init__.py (new file, 5 lines)
@@ -0,0 +1,5 @@
import os


def get_hook_dirs():
    return [os.path.dirname(__file__)]
yt_dlp/__pyinstaller/hook-yt_dlp.py (new file, 34 lines)
@@ -0,0 +1,34 @@
import sys

from PyInstaller.utils.hooks import collect_submodules


def pycryptodome_module():
    try:
        import Cryptodome  # noqa: F401
    except ImportError:
        try:
            import Crypto  # noqa: F401
            print('WARNING: Using Crypto since Cryptodome is not available. '
                  'Install with: pip install pycryptodomex', file=sys.stderr)
            return 'Crypto'
        except ImportError:
            pass
    return 'Cryptodome'


def get_hidden_imports():
    yield from ('yt_dlp.compat._legacy', 'yt_dlp.compat._deprecated')
    yield from ('yt_dlp.utils._legacy', 'yt_dlp.utils._deprecated')
    yield pycryptodome_module()
    # Only `websockets` is required, others are collected just in case
    for module in ('websockets', 'requests', 'urllib3'):
        yield from collect_submodules(module)
    # These are auto-detected, but explicitly add them just in case
    yield from ('mutagen', 'brotli', 'certifi', 'secretstorage')


hiddenimports = list(get_hidden_imports())
print(f'Adding imports: {hiddenimports}')

excludedimports = ['youtube_dl', 'youtube_dlc', 'test', 'ytdlp_plugins', 'devscripts']
@@ -2,17 +2,17 @@
 from math import ceil

 from .compat import compat_ord
-from .dependencies import Cryptodome_AES
+from .dependencies import Cryptodome
 from .utils import bytes_to_intlist, intlist_to_bytes

-if Cryptodome_AES:
+if Cryptodome.AES:
     def aes_cbc_decrypt_bytes(data, key, iv):
         """ Decrypt bytes with AES-CBC using pycryptodome """
-        return Cryptodome_AES.new(key, Cryptodome_AES.MODE_CBC, iv).decrypt(data)
+        return Cryptodome.AES.new(key, Cryptodome.AES.MODE_CBC, iv).decrypt(data)

     def aes_gcm_decrypt_and_verify_bytes(data, key, tag, nonce):
         """ Decrypt bytes with AES-GCM using pycryptodome """
-        return Cryptodome_AES.new(key, Cryptodome_AES.MODE_GCM, nonce).decrypt_and_verify(data, tag)
+        return Cryptodome.AES.new(key, Cryptodome.AES.MODE_GCM, nonce).decrypt_and_verify(data, tag)

 else:
     def aes_cbc_decrypt_bytes(data, key, iv):
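For context, Cryptodome.AES here is pycryptodome(x)'s AES module. A minimal roundtrip sketch of the CBC decrypt that aes_cbc_decrypt_bytes() delegates to (assumes pycryptodomex is installed; key/IV values are purely illustrative):

    from Cryptodome.Cipher import AES

    key = b'0123456789abcdef'  # 16-byte AES-128 key (illustrative)
    iv = b'fedcba9876543210'   # 16-byte IV (illustrative)

    ct = AES.new(key, AES.MODE_CBC, iv).encrypt(b'sixteen byte msg')
    assert AES.new(key, AES.MODE_CBC, iv).decrypt(ct) == b'sixteen byte msg'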
@@ -1,5 +1,4 @@
 import contextlib
-import errno
 import json
 import os
 import re
@@ -39,11 +38,7 @@ def store(self, section, key, data, dtype='json'):

         fn = self._get_cache_fn(section, key, dtype)
         try:
-            try:
-                os.makedirs(os.path.dirname(fn))
-            except OSError as ose:
-                if ose.errno != errno.EEXIST:
-                    raise
+            os.makedirs(os.path.dirname(fn), exist_ok=True)
             self._ydl.write_debug(f'Saving {section}.{key} to cache')
             write_json_file({'yt-dlp_version': __version__, 'data': data}, fn)
         except Exception:
yt_dlp/casefold.py (new file, 5 lines)
@@ -0,0 +1,5 @@
import warnings

warnings.warn(DeprecationWarning(f'{__name__} is deprecated'))

casefold = str.casefold
@@ -1,14 +1,11 @@
 import os
 import sys
-import warnings
 import xml.etree.ElementTree as etree

-from ._deprecated import *  # noqa: F401, F403
 from .compat_utils import passthrough_module

-# XXX: Implement this the same way as other DeprecationWarnings without circular import
-passthrough_module(__name__, '._legacy', callback=lambda attr: warnings.warn(
-    DeprecationWarning(f'{__name__}.{attr} is deprecated'), stacklevel=3))
+passthrough_module(__name__, '._deprecated')
+del passthrough_module


 # HTMLParseError has been deprecated in Python 3.3 and removed in
@@ -33,7 +30,7 @@ def compat_etree_fromstring(text):
 if compat_os_name == 'nt':
     def compat_shlex_quote(s):
         import re
-        return s if re.match(r'^[-_\w./]+$', s) else '"%s"' % s.replace('"', '\\"')
+        return s if re.match(r'^[-_\w./]+$', s) else s.replace('"', '""').join('""')
 else:
     from shlex import quote as compat_shlex_quote  # noqa: F401
@@ -72,7 +69,11 @@ def compat_expanduser(path):
     compat_expanduser = os.path.expanduser


-# NB: Add modules that are imported dynamically here so that PyInstaller can find them
-# See https://github.com/pyinstaller/pyinstaller-hooks-contrib/issues/438
-if False:
-    from . import _legacy  # noqa: F401
+def urllib_req_to_req(urllib_request):
+    """Convert urllib Request to a networking Request"""
+    from ..networking import Request
+    from ..utils.networking import HTTPHeaderDict
+    return Request(
+        urllib_request.get_full_url(), data=urllib_request.data, method=urllib_request.get_method(),
+        headers=HTTPHeaderDict(urllib_request.headers, urllib_request.unredirected_hdrs),
+        extensions={'timeout': urllib_request.timeout} if hasattr(urllib_request, 'timeout') else None)
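The new Windows branch of compat_shlex_quote doubles embedded double quotes instead of backslash-escaping them, which is what cmd.exe actually expects (and what test_Popen_windows_escaping asserts). A standalone sketch mirroring the replacement line:

    import re

    def quote_nt(s):  # mirrors the new compat_shlex_quote body for 'nt'
        return s if re.match(r'^[-_\w./]+$', s) else s.replace('"', '""').join('""')

    print(quote_nt('simple.txt'))  # simple.txt  (safe, left unquoted)
    print(quote_nt('test"&'))      # "test""&"   (matches the Popen test expectation)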
@@ -1,4 +1,12 @@
 """Deprecated - New code should avoid these"""
+import warnings
+
+from .compat_utils import passthrough_module
+
+# XXX: Implement this the same way as other DeprecationWarnings without circular import
+passthrough_module(__name__, '.._legacy', callback=lambda attr: warnings.warn(
+    DeprecationWarning(f'{__name__}.{attr} is deprecated'), stacklevel=6))
+del passthrough_module

 import base64
 import urllib.error
@@ -8,7 +16,6 @@

 compat_b64decode = base64.b64decode

-compat_HTTPError = urllib.error.HTTPError
 compat_urlparse = urllib.parse
 compat_parse_qs = urllib.parse.parse_qs
 compat_urllib_parse_unquote = urllib.parse.unquote
@@ -1,5 +1,6 @@
 """ Do not use! """

+import base64
 import collections
 import ctypes
 import getpass
@@ -15,12 +16,12 @@
 import shutil
 import socket
 import struct
+import subprocess
 import tokenize
 import urllib.error
 import urllib.parse
 import urllib.request
 import xml.etree.ElementTree as etree
-from subprocess import DEVNULL

 # isort: split
 import asyncio  # noqa: F401
@@ -29,10 +30,12 @@
 from re import Pattern as compat_Pattern  # noqa: F401
 from re import match as compat_Match  # noqa: F401

+from . import compat_expanduser, compat_HTMLParseError, compat_realpath
 from .compat_utils import passthrough_module
-from ..dependencies import Cryptodome_AES as compat_pycrypto_AES  # noqa: F401
 from ..dependencies import brotli as compat_brotli  # noqa: F401
 from ..dependencies import websockets as compat_websockets  # noqa: F401
+from ..dependencies.Cryptodome import AES as compat_pycrypto_AES  # noqa: F401
+from ..networking.exceptions import HTTPError as compat_HTTPError  # noqa: F401

 passthrough_module(__name__, '...utils', ('WINDOWS_VT_MODE', 'windows_enable_vt_mode'))
@@ -47,23 +50,25 @@ def compat_setenv(key, value, env=os.environ):
     env[key] = value


+compat_base64_b64decode = base64.b64decode
 compat_basestring = str
 compat_casefold = str.casefold
 compat_chr = chr
 compat_collections_abc = collections.abc
-compat_cookiejar = http.cookiejar
+compat_cookiejar = compat_http_cookiejar = http.cookiejar
-compat_cookiejar_Cookie = http.cookiejar.Cookie
+compat_cookiejar_Cookie = compat_http_cookiejar_Cookie = http.cookiejar.Cookie
-compat_cookies = http.cookies
+compat_cookies = compat_http_cookies = http.cookies
-compat_cookies_SimpleCookie = http.cookies.SimpleCookie
+compat_cookies_SimpleCookie = compat_http_cookies_SimpleCookie = http.cookies.SimpleCookie
-compat_etree_Element = etree.Element
+compat_etree_Element = compat_xml_etree_ElementTree_Element = etree.Element
-compat_etree_register_namespace = etree.register_namespace
+compat_etree_register_namespace = compat_xml_etree_register_namespace = etree.register_namespace
 compat_filter = filter
 compat_get_terminal_size = shutil.get_terminal_size
 compat_getenv = os.getenv
-compat_getpass = getpass.getpass
+compat_getpass = compat_getpass_getpass = getpass.getpass
 compat_html_entities = html.entities
 compat_html_entities_html5 = html.entities.html5
-compat_HTMLParser = html.parser.HTMLParser
+compat_html_parser_HTMLParseError = compat_HTMLParseError
+compat_HTMLParser = compat_html_parser_HTMLParser = html.parser.HTMLParser
 compat_http_client = http.client
 compat_http_server = http.server
 compat_input = input
@@ -72,16 +77,20 @@ def compat_setenv(key, value, env=os.environ):
 compat_kwargs = lambda kwargs: kwargs
 compat_map = map
 compat_numeric_types = (int, float, complex)
+compat_os_path_expanduser = compat_expanduser
+compat_os_path_realpath = compat_realpath
 compat_print = print
 compat_shlex_split = shlex.split
 compat_socket_create_connection = socket.create_connection
 compat_Struct = struct.Struct
 compat_struct_pack = struct.pack
 compat_struct_unpack = struct.unpack
-compat_subprocess_get_DEVNULL = lambda: DEVNULL
+compat_subprocess_get_DEVNULL = lambda: subprocess.DEVNULL
 compat_tokenize_tokenize = tokenize.tokenize
 compat_urllib_error = urllib.error
+compat_urllib_HTTPError = compat_HTTPError
 compat_urllib_parse = urllib.parse
+compat_urllib_parse_parse_qs = urllib.parse.parse_qs
 compat_urllib_parse_quote = urllib.parse.quote
 compat_urllib_parse_quote_plus = urllib.parse.quote_plus
 compat_urllib_parse_unquote_plus = urllib.parse.unquote_plus
@@ -90,8 +99,10 @@ def compat_setenv(key, value, env=os.environ):
 compat_urllib_request = urllib.request
 compat_urllib_request_DataHandler = urllib.request.DataHandler
 compat_urllib_response = urllib.response
-compat_urlretrieve = urllib.request.urlretrieve
+compat_urlretrieve = compat_urllib_request_urlretrieve = urllib.request.urlretrieve
-compat_xml_parse_error = etree.ParseError
+compat_xml_parse_error = compat_xml_etree_ElementTree_ParseError = etree.ParseError
 compat_xpath = lambda xpath: xpath
 compat_zip = zip
 workaround_optparse_bug9161 = lambda: None
+
+legacy = []
@@ -1,5 +1,6 @@
 import collections
 import contextlib
+import functools
 import importlib
 import sys
 import types
@@ -10,61 +11,73 @@
 def get_package_info(module):
-    parent = module.__name__.split('.')[0]
-    parent_module = None
-    with contextlib.suppress(ImportError):
-        parent_module = importlib.import_module(parent)
-
-    for attr in ('__version__', 'version_string', 'version'):
-        version = getattr(parent_module, attr, None)
-        if version is not None:
-            break
-    return _Package(getattr(module, '_yt_dlp__identifier', parent), str(version))
+    return _Package(
+        name=getattr(module, '_yt_dlp__identifier', module.__name__),
+        version=str(next(filter(None, (
+            getattr(module, attr, None)
+            for attr in ('_yt_dlp__version', '__version__', 'version_string', 'version')
+        )), None)))


 def _is_package(module):
-    try:
-        module.__getattribute__('__path__')
-    except AttributeError:
-        return False
-    return True
+    return '__path__' in vars(module)


-def passthrough_module(parent, child, allowed_attributes=None, *, callback=lambda _: None):
-    parent_module = importlib.import_module(parent)
-    child_module = None  # Import child module only as needed
-
-    class PassthroughModule(types.ModuleType):
-        def __getattr__(self, attr):
-            if _is_package(parent_module):
-                with contextlib.suppress(ImportError):
-                    return importlib.import_module(f'.{attr}', parent)
-
-            ret = self.__from_child(attr)
-            if ret is _NO_ATTRIBUTE:
-                raise AttributeError(f'module {parent} has no attribute {attr}')
-            callback(attr)
-            return ret
-
-        def __from_child(self, attr):
-            if allowed_attributes is None:
-                if attr.startswith('__') and attr.endswith('__'):
-                    return _NO_ATTRIBUTE
-            elif attr not in allowed_attributes:
-                return _NO_ATTRIBUTE
-
-            nonlocal child_module
-            child_module = child_module or importlib.import_module(child, parent)
-
-            with contextlib.suppress(AttributeError):
-                return getattr(child_module, attr)
-
-            if _is_package(child_module):
-                with contextlib.suppress(ImportError):
-                    return importlib.import_module(f'.{attr}', child)
-
-            return _NO_ATTRIBUTE
-
-    # Python 3.6 does not have module level __getattr__
-    # https://peps.python.org/pep-0562/
-    sys.modules[parent].__class__ = PassthroughModule
+def _is_dunder(name):
+    return name.startswith('__') and name.endswith('__')
+
+
+class EnhancedModule(types.ModuleType):
+    def __bool__(self):
+        return vars(self).get('__bool__', lambda: True)()
+
+    def __getattribute__(self, attr):
+        try:
+            ret = super().__getattribute__(attr)
+        except AttributeError:
+            if _is_dunder(attr):
+                raise
+            getter = getattr(self, '__getattr__', None)
+            if not getter:
+                raise
+            ret = getter(attr)
+        return ret.fget() if isinstance(ret, property) else ret
+
+
+def passthrough_module(parent, child, allowed_attributes=(..., ), *, callback=lambda _: None):
+    """Passthrough parent module into a child module, creating the parent if necessary"""
+    def __getattr__(attr):
+        if _is_package(parent):
+            with contextlib.suppress(ModuleNotFoundError):
+                return importlib.import_module(f'.{attr}', parent.__name__)
+
+        ret = from_child(attr)
+        if ret is _NO_ATTRIBUTE:
+            raise AttributeError(f'module {parent.__name__} has no attribute {attr}')
+        callback(attr)
+        return ret
+
+    @functools.lru_cache(maxsize=None)
+    def from_child(attr):
+        nonlocal child
+        if attr not in allowed_attributes:
+            if ... not in allowed_attributes or _is_dunder(attr):
+                return _NO_ATTRIBUTE
+
+        if isinstance(child, str):
+            child = importlib.import_module(child, parent.__name__)
+
+        if _is_package(child):
+            with contextlib.suppress(ImportError):
+                return passthrough_module(f'{parent.__name__}.{attr}',
+                                          importlib.import_module(f'.{attr}', child.__name__))
+
+        with contextlib.suppress(AttributeError):
+            return getattr(child, attr)
+
+        return _NO_ATTRIBUTE
+
+    parent = sys.modules.get(parent, types.ModuleType(parent))
+    parent.__class__ = EnhancedModule
+    parent.__getattr__ = __getattr__
+    return parent
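A self-contained sketch of how the rewritten passthrough_module behaves (hypothetical module names; this mirrors the deprecation-shim pattern used by compat/_deprecated.py above, and assumes this commit's yt_dlp is importable):

    import types

    from yt_dlp.compat.compat_utils import passthrough_module

    legacy = types.ModuleType('pkg.legacy')  # stand-in child holding deprecated names
    legacy.OLD_CONSTANT = 42

    # Attribute misses on `pkg` now fall through to `legacy`, invoking the callback
    pkg = passthrough_module('pkg', legacy,
                             callback=lambda attr: print(f'pkg.{attr} is deprecated'))
    print(pkg.OLD_CONSTANT)  # prints the deprecation note, then 42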
@@ -10,17 +10,3 @@
     cache  # >= 3.9
 except NameError:
     cache = lru_cache(maxsize=None)
-
-try:
-    cached_property  # >= 3.8
-except NameError:
-    class cached_property:
-        def __init__(self, func):
-            update_wrapper(self, func)
-            self.func = func
-
-        def __get__(self, instance, _):
-            if instance is None:
-                return self
-            setattr(instance, self.func.__name__, self.func(instance))
-            return getattr(instance, self.func.__name__)
yt_dlp/compat/types.py (new file, 13 lines)
@@ -0,0 +1,13 @@
# flake8: noqa: F405
from types import *  # noqa: F403

from .compat_utils import passthrough_module

passthrough_module(__name__, 'types')
del passthrough_module

try:
    # NB: pypy has builtin NoneType, so checking NameError won't work
    from types import NoneType  # >= 3.10
except ImportError:
    NoneType = type(None)
yt_dlp/compat/urllib/__init__.py (new file, 10 lines)
@@ -0,0 +1,10 @@
+# flake8: noqa: F405
+from urllib import *  # noqa: F403
+
+del request  # noqa: F821
+from . import request  # noqa: F401
+
+from ..compat_utils import passthrough_module
+
+passthrough_module(__name__, 'urllib')
+del passthrough_module
yt_dlp/compat/urllib/request.py (new file, 40 lines)
@@ -0,0 +1,40 @@
+# flake8: noqa: F405
+from urllib.request import *  # noqa: F403
+
+from ..compat_utils import passthrough_module
+
+passthrough_module(__name__, 'urllib.request')
+del passthrough_module
+
+
+from .. import compat_os_name
+
+if compat_os_name == 'nt':
+    # On older python versions, proxies are extracted from Windows registry erroneously. [1]
+    # If the https proxy in the registry does not have a scheme, urllib will incorrectly add https:// to it. [2]
+    # It is unlikely that the user has actually set it to be https, so we should be fine to safely downgrade
+    # it to http on these older python versions to avoid issues
+    # This also applies for ftp proxy type, as ftp:// proxy scheme is not supported.
+    # 1: https://github.com/python/cpython/issues/86793
+    # 2: https://github.com/python/cpython/blob/51f1ae5ceb0673316c4e4b0175384e892e33cc6e/Lib/urllib/request.py#L2683-L2698
+    import sys
+    from urllib.request import getproxies_environment, getproxies_registry
+
+    def getproxies_registry_patched():
+        proxies = getproxies_registry()
+        if (
+            sys.version_info >= (3, 10, 5)  # https://docs.python.org/3.10/whatsnew/changelog.html#python-3-10-5-final
+            or (3, 9, 13) <= sys.version_info < (3, 10)  # https://docs.python.org/3.9/whatsnew/changelog.html#python-3-9-13-final
+        ):
+            return proxies
+
+        for scheme in ('https', 'ftp'):
+            if scheme in proxies and proxies[scheme].startswith(f'{scheme}://'):
+                proxies[scheme] = 'http' + proxies[scheme][len(scheme):]
+
+        return proxies
+
+    def getproxies():
+        return getproxies_environment() or getproxies_registry_patched()
+
+del compat_os_name
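The scheme downgrade at the end of `getproxies_registry_patched` is plain string surgery; the same transformation in isolation, with invented sample proxy values:

# Registry-style proxy mapping, as getproxies_registry() might return it
proxies = {'https': 'https://127.0.0.1:8080', 'ftp': 'ftp://127.0.0.1:8080'}

for scheme in ('https', 'ftp'):
    # An '<scheme>://' prefix here came from urllib, not from the user
    if scheme in proxies and proxies[scheme].startswith(f'{scheme}://'):
        proxies[scheme] = 'http' + proxies[scheme][len(scheme):]

print(proxies)  # {'https': 'http://127.0.0.1:8080', 'ftp': 'http://127.0.0.1:8080'}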
yt_dlp/cookies.py
@@ -1,7 +1,10 @@
 import base64
+import collections
 import contextlib
+import glob
 import http.cookiejar
 import http.cookies
+import io
 import json
 import os
 import re
@@ -11,6 +14,7 @@
 import sys
 import tempfile
 import time
+import urllib.request
 from datetime import datetime, timedelta, timezone
 from enum import Enum, auto
 from hashlib import pbkdf2_hmac
@@ -20,6 +24,8 @@
     aes_gcm_decrypt_and_verify_bytes,
     unpad_pkcs7,
 )
+from .compat import functools  # isort: split
+from .compat import compat_os_name
 from .dependencies import (
     _SECRETSTORAGE_UNAVAILABLE_REASON,
     secretstorage,
@@ -27,37 +33,26 @@
 )
 from .minicurses import MultilinePrinter, QuietMultilinePrinter
 from .utils import (
+    DownloadError,
     Popen,
-    YoutubeDLCookieJar,
     error_to_str,
     expand_path,
     is_path_like,
+    sanitize_url,
+    str_or_none,
     try_call,
+    write_string,
 )
+from .utils._utils import _YDLLogger
+from .utils.networking import normalize_url
 
 CHROMIUM_BASED_BROWSERS = {'brave', 'chrome', 'chromium', 'edge', 'opera', 'vivaldi'}
 SUPPORTED_BROWSERS = CHROMIUM_BASED_BROWSERS | {'firefox', 'safari'}
 
 
-class YDLLogger:
-    def __init__(self, ydl=None):
-        self._ydl = ydl
-
-    def debug(self, message):
-        if self._ydl:
-            self._ydl.write_debug(message)
-
-    def info(self, message):
-        if self._ydl:
-            self._ydl.to_screen(f'[Cookies] {message}')
-
-    def warning(self, message, only_once=False):
-        if self._ydl:
-            self._ydl.report_warning(message, only_once)
-
-    def error(self, message):
-        if self._ydl:
-            self._ydl.report_error(message)
+class YDLLogger(_YDLLogger):
+    def warning(self, message, only_once=False):  # compat
+        return super().warning(message, once=only_once)
 
 
 class ProgressBar(MultilinePrinter):
     _DELAY, _timer = 0.1, 0
@@ -105,7 +100,7 @@ def load_cookies(cookie_file, browser_specification, ydl):
 
         jar = YoutubeDLCookieJar(cookie_file)
         if not is_filename or os.access(cookie_file, os.R_OK):
-            jar.load(ignore_discard=True, ignore_expires=True)
+            jar.load()
         cookie_jars.append(jar)
 
     return _merge_cookie_jars(cookie_jars)
@@ -130,13 +125,14 @@ def _extract_firefox_cookies(profile, container, logger):
         return YoutubeDLCookieJar()
 
     if profile is None:
-        search_root = _firefox_browser_dir()
+        search_roots = list(_firefox_browser_dirs())
     elif _is_path(profile):
-        search_root = profile
+        search_roots = [profile]
     else:
-        search_root = os.path.join(_firefox_browser_dir(), profile)
+        search_roots = [os.path.join(path, profile) for path in _firefox_browser_dirs()]
+    search_root = ', '.join(map(repr, search_roots))
 
-    cookie_database_path = _find_most_recently_used_file(search_root, 'cookies.sqlite', logger)
+    cookie_database_path = _newest(_firefox_cookie_dbs(search_roots))
     if cookie_database_path is None:
         raise FileNotFoundError(f'could not find firefox cookies database in {search_root}')
     logger.debug(f'Extracting cookies from: "{cookie_database_path}"')
|
@ -146,7 +142,7 @@ def _extract_firefox_cookies(profile, container, logger):
|
||||||
containers_path = os.path.join(os.path.dirname(cookie_database_path), 'containers.json')
|
containers_path = os.path.join(os.path.dirname(cookie_database_path), 'containers.json')
|
||||||
if not os.path.isfile(containers_path) or not os.access(containers_path, os.R_OK):
|
if not os.path.isfile(containers_path) or not os.access(containers_path, os.R_OK):
|
||||||
raise FileNotFoundError(f'could not read containers.json in {search_root}')
|
raise FileNotFoundError(f'could not read containers.json in {search_root}')
|
||||||
with open(containers_path) as containers:
|
with open(containers_path, encoding='utf8') as containers:
|
||||||
identities = json.load(containers).get('identities', [])
|
identities = json.load(containers).get('identities', [])
|
||||||
container_id = next((context.get('userContextId') for context in identities if container in (
|
container_id = next((context.get('userContextId') for context in identities if container in (
|
||||||
context.get('name'),
|
context.get('name'),
|
||||||
|
@@ -190,12 +186,21 @@ def _extract_firefox_cookies(profile, container, logger):
         cursor.connection.close()
 
 
-def _firefox_browser_dir():
+def _firefox_browser_dirs():
     if sys.platform in ('cygwin', 'win32'):
-        return os.path.expandvars(R'%APPDATA%\Mozilla\Firefox\Profiles')
+        yield os.path.expandvars(R'%APPDATA%\Mozilla\Firefox\Profiles')
+
     elif sys.platform == 'darwin':
-        return os.path.expanduser('~/Library/Application Support/Firefox')
-    return os.path.expanduser('~/.mozilla/firefox')
+        yield os.path.expanduser('~/Library/Application Support/Firefox/Profiles')
+
+    else:
+        yield from map(os.path.expanduser, ('~/.mozilla/firefox', '~/snap/firefox/common/.mozilla/firefox'))
+
+
+def _firefox_cookie_dbs(roots):
+    for root in map(os.path.abspath, roots):
+        for pattern in ('', '*/', 'Profiles/*/'):
+            yield from glob.iglob(os.path.join(root, pattern, 'cookies.sqlite'))
 
 
 def _get_chromium_based_browser_settings(browser_name):
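A standalone illustration of the new glob-based lookup; the profile layout here is assumed, not guaranteed:

import glob
import os

roots = [os.path.expanduser('~/.mozilla/firefox')]  # one of the roots yielded on Linux

for root in map(os.path.abspath, roots):
    # '' if the root itself is a profile, '*/' for flat profile dirs,
    # 'Profiles/*/' for the Windows/macOS layout
    for pattern in ('', '*/', 'Profiles/*/'):
        for db in glob.iglob(os.path.join(root, pattern, 'cookies.sqlite')):
            print(db)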
@@ -276,7 +281,7 @@ def _extract_chrome_cookies(browser_name, profile, keyring, logger):
                 logger.error(f'{browser_name} does not support profiles')
                 search_root = config['browser_dir']
 
-    cookie_database_path = _find_most_recently_used_file(search_root, 'Cookies', logger)
+    cookie_database_path = _newest(_find_files(search_root, 'Cookies', logger))
     if cookie_database_path is None:
         raise FileNotFoundError(f'could not find {browser_name} cookies database in "{search_root}"')
     logger.debug(f'Extracting cookies from: "{cookie_database_path}"')
@@ -315,6 +320,12 @@ def _extract_chrome_cookies(browser_name, profile, keyring, logger):
             counts['unencrypted'] = unencrypted_cookies
             logger.debug(f'cookie version breakdown: {counts}')
             return jar
+    except PermissionError as error:
+        if compat_os_name == 'nt' and error.errno == 13:
+            message = 'Could not copy Chrome cookie database. See https://github.com/yt-dlp/yt-dlp/issues/7271 for more info'
+            logger.error(message)
+            raise DownloadError(message)  # force exit
+        raise
     finally:
         if cursor is not None:
             cursor.connection.close()
@@ -346,7 +357,9 @@ class ChromeCookieDecryptor:
     Linux:
     - cookies are either v10 or v11
    - v10: AES-CBC encrypted with a fixed key
+        - also attempts empty password if decryption fails
    - v11: AES-CBC encrypted with an OS protected key (keyring)
+        - also attempts empty password if decryption fails
    - v11 keys can be stored in various places depending on the activate desktop environment [2]
 
     Mac:
@@ -361,7 +374,7 @@ class ChromeCookieDecryptor:
 
     Sources:
     - [1] https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/
-    - [2] https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/key_storage_linux.cc
+    - [2] https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/key_storage_linux.cc
     - KeyStorageLinux::CreateService
     """
@@ -383,32 +396,49 @@ class LinuxChromeCookieDecryptor(ChromeCookieDecryptor):
     def __init__(self, browser_keyring_name, logger, *, keyring=None):
         self._logger = logger
         self._v10_key = self.derive_key(b'peanuts')
-        password = _get_linux_keyring_password(browser_keyring_name, keyring, logger)
-        self._v11_key = None if password is None else self.derive_key(password)
+        self._empty_key = self.derive_key(b'')
         self._cookie_counts = {'v10': 0, 'v11': 0, 'other': 0}
+        self._browser_keyring_name = browser_keyring_name
+        self._keyring = keyring
+
+    @functools.cached_property
+    def _v11_key(self):
+        password = _get_linux_keyring_password(self._browser_keyring_name, self._keyring, self._logger)
+        return None if password is None else self.derive_key(password)
 
     @staticmethod
     def derive_key(password):
         # values from
-        # https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/os_crypt_linux.cc
+        # https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/os_crypt_linux.cc
         return pbkdf2_sha1(password, salt=b'saltysalt', iterations=1, key_length=16)
 
     def decrypt(self, encrypted_value):
+        """
+
+        following the same approach as the fix in [1]: if cookies fail to decrypt then attempt to decrypt
+        with an empty password. The failure detection is not the same as what chromium uses so the
+        results won't be perfect
+
+        References:
+            - [1] https://chromium.googlesource.com/chromium/src/+/bbd54702284caca1f92d656fdcadf2ccca6f4165%5E%21/
+                - a bugfix to try an empty password as a fallback
+        """
         version = encrypted_value[:3]
         ciphertext = encrypted_value[3:]
 
         if version == b'v10':
             self._cookie_counts['v10'] += 1
-            return _decrypt_aes_cbc(ciphertext, self._v10_key, self._logger)
+            return _decrypt_aes_cbc_multi(ciphertext, (self._v10_key, self._empty_key), self._logger)
 
         elif version == b'v11':
             self._cookie_counts['v11'] += 1
             if self._v11_key is None:
                 self._logger.warning('cannot decrypt v11 cookies: no key found', only_once=True)
                 return None
-            return _decrypt_aes_cbc(ciphertext, self._v11_key, self._logger)
+            return _decrypt_aes_cbc_multi(ciphertext, (self._v11_key, self._empty_key), self._logger)
 
         else:
+            self._logger.warning(f'unknown cookie version: "{version}"', only_once=True)
             self._cookie_counts['other'] += 1
             return None
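The key material involved is easy to reproduce with the stdlib alone; a sketch using the constants referenced above (actual cookie decryption additionally needs an AES-CBC implementation, e.g. pycryptodome):

import hashlib

# Chromium's fixed Linux parameters: PBKDF2-SHA1, salt b'saltysalt', 1 iteration, 16-byte key
v10_key = hashlib.pbkdf2_hmac('sha1', b'peanuts', b'saltysalt', 1, 16)
empty_key = hashlib.pbkdf2_hmac('sha1', b'', b'saltysalt', 1, 16)  # the new fallback key

print(v10_key.hex(), empty_key.hex())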
@@ -423,7 +453,7 @@ def __init__(self, browser_keyring_name, logger):
     @staticmethod
     def derive_key(password):
         # values from
-        # https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/os_crypt_mac.mm
+        # https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/os_crypt_mac.mm
         return pbkdf2_sha1(password, salt=b'saltysalt', iterations=1003, key_length=16)
 
     def decrypt(self, encrypted_value):
@@ -436,12 +466,12 @@ def decrypt(self, encrypted_value):
                 self._logger.warning('cannot decrypt v10 cookies: no key found', only_once=True)
                 return None
 
-            return _decrypt_aes_cbc(ciphertext, self._v10_key, self._logger)
+            return _decrypt_aes_cbc_multi(ciphertext, (self._v10_key,), self._logger)
 
         else:
             self._cookie_counts['other'] += 1
             # other prefixes are considered 'old data' which were stored as plaintext
-            # https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/os_crypt_mac.mm
+            # https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/os_crypt_mac.mm
             return encrypted_value
@@ -461,7 +491,7 @@ def decrypt(self, encrypted_value):
                 self._logger.warning('cannot decrypt v10 cookies: no key found', only_once=True)
                 return None
 
-            # https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/os_crypt_win.cc
+            # https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/os_crypt_win.cc
             #   kNonceLength
             nonce_length = 96 // 8
             # boringssl
@@ -478,23 +508,27 @@ def decrypt(self, encrypted_value):
         else:
             self._cookie_counts['other'] += 1
             # any other prefix means the data is DPAPI encrypted
-            # https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/os_crypt_win.cc
+            # https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/os_crypt_win.cc
             return _decrypt_windows_dpapi(encrypted_value, self._logger).decode()
 
 
 def _extract_safari_cookies(profile, logger):
-    if profile is not None:
-        logger.error('safari does not support profiles')
     if sys.platform != 'darwin':
         raise ValueError(f'unsupported platform: {sys.platform}')
 
-    cookies_path = os.path.expanduser('~/Library/Cookies/Cookies.binarycookies')
-
-    if not os.path.isfile(cookies_path):
-        logger.debug('Trying secondary cookie location')
-        cookies_path = os.path.expanduser('~/Library/Containers/com.apple.Safari/Data/Library/Cookies/Cookies.binarycookies')
-        if not os.path.isfile(cookies_path):
-            raise FileNotFoundError('could not find safari cookies database')
+    if profile:
+        cookies_path = os.path.expanduser(profile)
+        if not os.path.isfile(cookies_path):
+            raise FileNotFoundError('custom safari cookies database not found')
+
+    else:
+        cookies_path = os.path.expanduser('~/Library/Cookies/Cookies.binarycookies')
+
+        if not os.path.isfile(cookies_path):
+            logger.debug('Trying secondary cookie location')
+            cookies_path = os.path.expanduser('~/Library/Containers/com.apple.Safari/Data/Library/Cookies/Cookies.binarycookies')
+            if not os.path.isfile(cookies_path):
+                raise FileNotFoundError('could not find safari cookies database')
 
     with open(cookies_path, 'rb') as f:
         cookies_data = f.read()
@@ -657,19 +691,27 @@ class _LinuxDesktopEnvironment(Enum):
     """
     OTHER = auto()
     CINNAMON = auto()
+    DEEPIN = auto()
     GNOME = auto()
-    KDE = auto()
+    KDE3 = auto()
+    KDE4 = auto()
+    KDE5 = auto()
+    KDE6 = auto()
     PANTHEON = auto()
+    UKUI = auto()
     UNITY = auto()
     XFCE = auto()
+    LXQT = auto()
 
 
 class _LinuxKeyring(Enum):
     """
-    https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/key_storage_util_linux.h
+    https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/key_storage_util_linux.h
     SelectedLinuxBackend
     """
-    KWALLET = auto()
+    KWALLET = auto()  # KDE4
+    KWALLET5 = auto()
+    KWALLET6 = auto()
     GNOMEKEYRING = auto()
     BASICTEXT = auto()
@@ -677,7 +719,7 @@ class _LinuxKeyring(Enum):
 SUPPORTED_KEYRINGS = _LinuxKeyring.__members__.keys()
 
 
-def _get_linux_desktop_environment(env):
+def _get_linux_desktop_environment(env, logger):
     """
     https://chromium.googlesource.com/chromium/src/+/refs/heads/main/base/nix/xdg_util.cc
     GetDesktopEnvironment
@@ -692,51 +734,97 @@ def _get_linux_desktop_environment(env, logger):
                 return _LinuxDesktopEnvironment.GNOME
             else:
                 return _LinuxDesktopEnvironment.UNITY
+        elif xdg_current_desktop == 'Deepin':
+            return _LinuxDesktopEnvironment.DEEPIN
         elif xdg_current_desktop == 'GNOME':
             return _LinuxDesktopEnvironment.GNOME
         elif xdg_current_desktop == 'X-Cinnamon':
             return _LinuxDesktopEnvironment.CINNAMON
         elif xdg_current_desktop == 'KDE':
-            return _LinuxDesktopEnvironment.KDE
+            kde_version = env.get('KDE_SESSION_VERSION', None)
+            if kde_version == '5':
+                return _LinuxDesktopEnvironment.KDE5
+            elif kde_version == '6':
+                return _LinuxDesktopEnvironment.KDE6
+            elif kde_version == '4':
+                return _LinuxDesktopEnvironment.KDE4
+            else:
+                logger.info(f'unknown KDE version: "{kde_version}". Assuming KDE4')
+                return _LinuxDesktopEnvironment.KDE4
         elif xdg_current_desktop == 'Pantheon':
             return _LinuxDesktopEnvironment.PANTHEON
         elif xdg_current_desktop == 'XFCE':
             return _LinuxDesktopEnvironment.XFCE
+        elif xdg_current_desktop == 'UKUI':
+            return _LinuxDesktopEnvironment.UKUI
+        elif xdg_current_desktop == 'LXQt':
+            return _LinuxDesktopEnvironment.LXQT
+        else:
+            logger.info(f'XDG_CURRENT_DESKTOP is set to an unknown value: "{xdg_current_desktop}"')
+
     elif desktop_session is not None:
-        if desktop_session in ('mate', 'gnome'):
+        if desktop_session == 'deepin':
+            return _LinuxDesktopEnvironment.DEEPIN
+        elif desktop_session in ('mate', 'gnome'):
             return _LinuxDesktopEnvironment.GNOME
-        elif 'kde' in desktop_session:
-            return _LinuxDesktopEnvironment.KDE
-        elif 'xfce' in desktop_session:
+        elif desktop_session in ('kde4', 'kde-plasma'):
+            return _LinuxDesktopEnvironment.KDE4
+        elif desktop_session == 'kde':
+            if 'KDE_SESSION_VERSION' in env:
+                return _LinuxDesktopEnvironment.KDE4
+            else:
+                return _LinuxDesktopEnvironment.KDE3
+        elif 'xfce' in desktop_session or desktop_session == 'xubuntu':
             return _LinuxDesktopEnvironment.XFCE
+        elif desktop_session == 'ukui':
+            return _LinuxDesktopEnvironment.UKUI
+        else:
+            logger.info(f'DESKTOP_SESSION is set to an unknown value: "{desktop_session}"')
+
     else:
         if 'GNOME_DESKTOP_SESSION_ID' in env:
             return _LinuxDesktopEnvironment.GNOME
         elif 'KDE_FULL_SESSION' in env:
-            return _LinuxDesktopEnvironment.KDE
+            if 'KDE_SESSION_VERSION' in env:
+                return _LinuxDesktopEnvironment.KDE4
+            else:
+                return _LinuxDesktopEnvironment.KDE3
     return _LinuxDesktopEnvironment.OTHER
 
 
 def _choose_linux_keyring(logger):
     """
-    https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/key_storage_util_linux.cc
-    SelectBackend
+    SelectBackend in [1]
+
+    There is currently support for forcing chromium to use BASIC_TEXT by creating a file called
+    `Disable Local Encryption` [1] in the user data dir. The function to write this file (`WriteBackendUse()` [1])
+    does not appear to be called anywhere other than in tests, so the user would have to create this file manually
+    and so would be aware enough to tell yt-dlp to use the BASIC_TEXT keyring.
+
+    References:
+        - [1] https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/key_storage_util_linux.cc
     """
-    desktop_environment = _get_linux_desktop_environment(os.environ)
+    desktop_environment = _get_linux_desktop_environment(os.environ, logger)
     logger.debug(f'detected desktop environment: {desktop_environment.name}')
-    if desktop_environment == _LinuxDesktopEnvironment.KDE:
+    if desktop_environment == _LinuxDesktopEnvironment.KDE4:
         linux_keyring = _LinuxKeyring.KWALLET
-    elif desktop_environment == _LinuxDesktopEnvironment.OTHER:
+    elif desktop_environment == _LinuxDesktopEnvironment.KDE5:
+        linux_keyring = _LinuxKeyring.KWALLET5
+    elif desktop_environment == _LinuxDesktopEnvironment.KDE6:
+        linux_keyring = _LinuxKeyring.KWALLET6
+    elif desktop_environment in (
+        _LinuxDesktopEnvironment.KDE3, _LinuxDesktopEnvironment.LXQT, _LinuxDesktopEnvironment.OTHER
+    ):
        linux_keyring = _LinuxKeyring.BASICTEXT
    else:
         linux_keyring = _LinuxKeyring.GNOMEKEYRING
     return linux_keyring
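A condensed sketch of what the new KDE handling amounts to; the helper and env values below are invented:

def choose_kwalletd_service(env):
    # KDE_SESSION_VERSION drives both the detected DE and the kwalletd service name
    return {
        '4': 'org.kde.kwalletd',
        '5': 'org.kde.kwalletd5',
        '6': 'org.kde.kwalletd6',
    }.get(env.get('KDE_SESSION_VERSION'), 'org.kde.kwalletd')  # unknown -> assume KDE4


print(choose_kwalletd_service({'XDG_CURRENT_DESKTOP': 'KDE', 'KDE_SESSION_VERSION': '5'}))
# -> org.kde.kwalletd5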
-def _get_kwallet_network_wallet(logger):
+def _get_kwallet_network_wallet(keyring, logger):
     """ The name of the wallet used to store network passwords.
 
-    https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/kwallet_dbus.cc
+    https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/kwallet_dbus.cc
     KWalletDBus::NetworkWallet
     which does a dbus call to the following function:
     https://api.kde.org/frameworks/kwallet/html/classKWallet_1_1Wallet.html
@@ -744,10 +832,22 @@ def _get_kwallet_network_wallet(keyring, logger):
     """
     default_wallet = 'kdewallet'
     try:
+        if keyring == _LinuxKeyring.KWALLET:
+            service_name = 'org.kde.kwalletd'
+            wallet_path = '/modules/kwalletd'
+        elif keyring == _LinuxKeyring.KWALLET5:
+            service_name = 'org.kde.kwalletd5'
+            wallet_path = '/modules/kwalletd5'
+        elif keyring == _LinuxKeyring.KWALLET6:
+            service_name = 'org.kde.kwalletd6'
+            wallet_path = '/modules/kwalletd6'
+        else:
+            raise ValueError(keyring)
+
         stdout, _, returncode = Popen.run([
             'dbus-send', '--session', '--print-reply=literal',
-            '--dest=org.kde.kwalletd5',
-            '/modules/kwalletd5',
+            f'--dest={service_name}',
+            wallet_path,
             'org.kde.KWallet.networkWallet'
         ], text=True, stdout=subprocess.PIPE, stderr=subprocess.DEVNULL)
@@ -762,8 +862,8 @@ def _get_kwallet_network_wallet(keyring, logger):
         return default_wallet
 
 
-def _get_kwallet_password(browser_keyring_name, logger):
-    logger.debug('using kwallet-query to obtain password from kwallet')
+def _get_kwallet_password(browser_keyring_name, keyring, logger):
+    logger.debug(f'using kwallet-query to obtain password from {keyring.name}')
 
     if shutil.which('kwallet-query') is None:
         logger.error('kwallet-query command not found. KWallet and kwallet-query '
@@ -771,7 +871,7 @@ def _get_kwallet_password(browser_keyring_name, keyring, logger):
                      'included in the kwallet package for your distribution')
         return b''
 
-    network_wallet = _get_kwallet_network_wallet(logger)
+    network_wallet = _get_kwallet_network_wallet(keyring, logger)
 
     try:
         stdout, _, returncode = Popen.run([
@@ -793,8 +893,9 @@ def _get_kwallet_password(browser_keyring_name, keyring, logger):
             # checks hasEntry. To verify this:
             # dbus-monitor "interface='org.kde.KWallet'" "type=method_return"
             # while starting chrome.
-            # this may be a bug as the intended behaviour is to generate a random password and store
-            # it, but that doesn't matter here.
+            # this was identified as a bug later and fixed in
+            # https://chromium.googlesource.com/chromium/src/+/bbd54702284caca1f92d656fdcadf2ccca6f4165%5E%21/#F0
+            # https://chromium.googlesource.com/chromium/src/+/5463af3c39d7f5b6d11db7fbd51e38cc1974d764
             return b''
         else:
             logger.debug('password found')
@@ -832,8 +933,8 @@ def _get_linux_keyring_password(browser_keyring_name, keyring, logger):
     keyring = _LinuxKeyring[keyring] if keyring else _choose_linux_keyring(logger)
     logger.debug(f'Chosen keyring: {keyring.name}')
 
-    if keyring == _LinuxKeyring.KWALLET:
-        return _get_kwallet_password(browser_keyring_name, logger)
+    if keyring in (_LinuxKeyring.KWALLET, _LinuxKeyring.KWALLET5, _LinuxKeyring.KWALLET6):
+        return _get_kwallet_password(browser_keyring_name, keyring, logger)
     elif keyring == _LinuxKeyring.GNOMEKEYRING:
         return _get_gnome_keyring_password(browser_keyring_name, logger)
     elif keyring == _LinuxKeyring.BASICTEXT:
@@ -861,7 +962,11 @@ def _get_mac_keyring_password(browser_keyring_name, logger):
 
 
 def _get_windows_v10_key(browser_root, logger):
-    path = _find_most_recently_used_file(browser_root, 'Local State', logger)
+    """
+    References:
+        - [1] https://chromium.googlesource.com/chromium/src/+/refs/heads/main/components/os_crypt/sync/os_crypt_win.cc
+    """
+    path = _newest(_find_files(browser_root, 'Local State', logger))
     if path is None:
         logger.error('could not find local state file')
         return None
@@ -869,11 +974,13 @@ def _get_windows_v10_key(browser_root, logger):
     with open(path, encoding='utf8') as f:
         data = json.load(f)
     try:
+        # kOsCryptEncryptedKeyPrefName in [1]
         base64_key = data['os_crypt']['encrypted_key']
     except KeyError:
         logger.error('no encrypted key in Local State')
         return None
     encrypted_key = base64.b64decode(base64_key)
+    # kDPAPIKeyPrefix in [1]
     prefix = b'DPAPI'
     if not encrypted_key.startswith(prefix):
         logger.error('invalid key')
@@ -885,13 +992,15 @@ def pbkdf2_sha1(password, salt, iterations, key_length):
     return pbkdf2_hmac('sha1', password, salt, iterations, key_length)
 
 
-def _decrypt_aes_cbc(ciphertext, key, logger, initialization_vector=b' ' * 16):
-    plaintext = unpad_pkcs7(aes_cbc_decrypt_bytes(ciphertext, key, initialization_vector))
-    try:
-        return plaintext.decode()
-    except UnicodeDecodeError:
-        logger.warning('failed to decrypt cookie (AES-CBC) because UTF-8 decoding failed. Possibly the key is wrong?', only_once=True)
-        return None
+def _decrypt_aes_cbc_multi(ciphertext, keys, logger, initialization_vector=b' ' * 16):
+    for key in keys:
+        plaintext = unpad_pkcs7(aes_cbc_decrypt_bytes(ciphertext, key, initialization_vector))
+        try:
+            return plaintext.decode()
+        except UnicodeDecodeError:
+            pass
+    logger.warning('failed to decrypt cookie (AES-CBC) because UTF-8 decoding failed. Possibly the key is wrong?', only_once=True)
+    return None
 
 
 def _decrypt_aes_gcm(ciphertext, key, nonce, authentication_tag, logger):
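The multi-key fallback in isolation: a sketch built on pycryptodome rather than yt-dlp's internal AES helpers, assuming well-formed PKCS#7 padding:

from Cryptodome.Cipher import AES  # pycryptodome, assumed installed

def decrypt_multi(ciphertext, keys, iv=b' ' * 16):
    for key in keys:
        padded = AES.new(key, AES.MODE_CBC, iv=iv).decrypt(ciphertext)
        plaintext = padded[:-padded[-1]]  # strip PKCS#7 padding
        try:
            return plaintext.decode()  # a wrong key almost always breaks UTF-8 decoding
        except UnicodeDecodeError:
            pass
    return None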
@@ -959,17 +1068,20 @@ def _get_column_names(cursor, table_name):
     return [row[1].decode() for row in table_info]
 
 
-def _find_most_recently_used_file(root, filename, logger):
+def _newest(files):
+    return max(files, key=lambda path: os.lstat(path).st_mtime, default=None)
+
+
+def _find_files(root, filename, logger):
     # if there are multiple browser profiles, take the most recently used one
-    i, paths = 0, []
+    i = 0
     with _create_progress_bar(logger) as progress_bar:
-        for curr_root, dirs, files in os.walk(root):
+        for curr_root, _, files in os.walk(root):
             for file in files:
                 i += 1
                 progress_bar.print(f'Searching for "{filename}": {i: 6d} files searched')
                 if file == filename:
-                    paths.append(os.path.join(curr_root, file))
-    return None if not paths else max(paths, key=lambda path: os.lstat(path).st_mtime)
+                    yield os.path.join(curr_root, file)
 
 
 def _merge_cookie_jars(jars):
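Equivalent standalone behaviour of the two split helpers; the search root below is chosen arbitrarily:

import os

def find_files(root, filename):
    # Lazily yield every match under root, like the new _find_files()
    for curr_root, _, files in os.walk(root):
        for file in files:
            if file == filename:
                yield os.path.join(curr_root, file)

def newest(files):
    # Like _newest(): the most recently modified path, or None when there are no matches
    return max(files, key=lambda path: os.lstat(path).st_mtime, default=None)

print(newest(find_files(os.path.expanduser('~/.config'), 'Cookies')))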
@@ -983,7 +1095,7 @@ def _merge_cookie_jars(jars):
 
 
 def _is_path(value):
-    return os.path.sep in value
+    return any(sep in value for sep in (os.path.sep, os.path.altsep) if sep)
 
 
 def _parse_browser_specification(browser_name, profile=None, keyring=None, container=None):
@@ -1085,3 +1197,150 @@ def load(self, data):
 
         else:
             morsel = None
+
+
+class YoutubeDLCookieJar(http.cookiejar.MozillaCookieJar):
+    """
+    See [1] for cookie file format.
+
+    1. https://curl.haxx.se/docs/http-cookies.html
+    """
+    _HTTPONLY_PREFIX = '#HttpOnly_'
+    _ENTRY_LEN = 7
+    _HEADER = '''# Netscape HTTP Cookie File
+# This file is generated by yt-dlp. Do not edit.
+
+'''
+    _CookieFileEntry = collections.namedtuple(
+        'CookieFileEntry',
+        ('domain_name', 'include_subdomains', 'path', 'https_only', 'expires_at', 'name', 'value'))
+
+    def __init__(self, filename=None, *args, **kwargs):
+        super().__init__(None, *args, **kwargs)
+        if is_path_like(filename):
+            filename = os.fspath(filename)
+        self.filename = filename
+
+    @staticmethod
+    def _true_or_false(cndn):
+        return 'TRUE' if cndn else 'FALSE'
+
+    @contextlib.contextmanager
+    def open(self, file, *, write=False):
+        if is_path_like(file):
+            with open(file, 'w' if write else 'r', encoding='utf-8') as f:
+                yield f
+        else:
+            if write:
+                file.truncate(0)
+            yield file
+
+    def _really_save(self, f, ignore_discard, ignore_expires):
+        now = time.time()
+        for cookie in self:
+            if (not ignore_discard and cookie.discard
+                    or not ignore_expires and cookie.is_expired(now)):
+                continue
+            name, value = cookie.name, cookie.value
+            if value is None:
+                # cookies.txt regards 'Set-Cookie: foo' as a cookie
+                # with no name, whereas http.cookiejar regards it as a
+                # cookie with no value.
+                name, value = '', name
+            f.write('%s\n' % '\t'.join((
+                cookie.domain,
+                self._true_or_false(cookie.domain.startswith('.')),
+                cookie.path,
+                self._true_or_false(cookie.secure),
+                str_or_none(cookie.expires, default=''),
+                name, value
+            )))
+
+    def save(self, filename=None, ignore_discard=True, ignore_expires=True):
+        """
+        Save cookies to a file.
+        Code is taken from CPython 3.6
+        https://github.com/python/cpython/blob/8d999cbf4adea053be6dbb612b9844635c4dfb8e/Lib/http/cookiejar.py#L2091-L2117 """
+
+        if filename is None:
+            if self.filename is not None:
+                filename = self.filename
+            else:
+                raise ValueError(http.cookiejar.MISSING_FILENAME_TEXT)
+
+        # Store session cookies with `expires` set to 0 instead of an empty string
+        for cookie in self:
+            if cookie.expires is None:
+                cookie.expires = 0
+
+        with self.open(filename, write=True) as f:
+            f.write(self._HEADER)
+            self._really_save(f, ignore_discard, ignore_expires)
+
+    def load(self, filename=None, ignore_discard=True, ignore_expires=True):
+        """Load cookies from a file."""
+        if filename is None:
+            if self.filename is not None:
+                filename = self.filename
+            else:
+                raise ValueError(http.cookiejar.MISSING_FILENAME_TEXT)
+
+        def prepare_line(line):
+            if line.startswith(self._HTTPONLY_PREFIX):
+                line = line[len(self._HTTPONLY_PREFIX):]
+            # comments and empty lines are fine
+            if line.startswith('#') or not line.strip():
+                return line
+            cookie_list = line.split('\t')
+            if len(cookie_list) != self._ENTRY_LEN:
+                raise http.cookiejar.LoadError('invalid length %d' % len(cookie_list))
+            cookie = self._CookieFileEntry(*cookie_list)
+            if cookie.expires_at and not cookie.expires_at.isdigit():
+                raise http.cookiejar.LoadError('invalid expires at %s' % cookie.expires_at)
+            return line
+
+        cf = io.StringIO()
+        with self.open(filename) as f:
+            for line in f:
+                try:
+                    cf.write(prepare_line(line))
+                except http.cookiejar.LoadError as e:
+                    if f'{line.strip()} '[0] in '[{"':
+                        raise http.cookiejar.LoadError(
+                            'Cookies file must be Netscape formatted, not JSON. See '
+                            'https://github.com/yt-dlp/yt-dlp/wiki/FAQ#how-do-i-pass-cookies-to-yt-dlp')
+                    write_string(f'WARNING: skipping cookie file entry due to {e}: {line!r}\n')
+                    continue
+        cf.seek(0)
+        self._really_load(cf, filename, ignore_discard, ignore_expires)
+        # Session cookies are denoted by either `expires` field set to
+        # an empty string or 0. MozillaCookieJar only recognizes the former
+        # (see [1]). So we need force the latter to be recognized as session
+        # cookies on our own.
+        # Session cookies may be important for cookies-based authentication,
+        # e.g. usually, when user does not check 'Remember me' check box while
+        # logging in on a site, some important cookies are stored as session
+        # cookies so that not recognizing them will result in failed login.
+        # 1. https://bugs.python.org/issue17164
+        for cookie in self:
+            # Treat `expires=0` cookies as session cookies
+            if cookie.expires == 0:
+                cookie.expires = None
+                cookie.discard = True
+
+    def get_cookie_header(self, url):
+        """Generate a Cookie HTTP header for a given url"""
+        cookie_req = urllib.request.Request(normalize_url(sanitize_url(url)))
+        self.add_cookie_header(cookie_req)
+        return cookie_req.get_header('Cookie')
+
+    def get_cookies_for_url(self, url):
+        """Generate a list of Cookie objects for a given url"""
+        # Policy `_now` attribute must be set before calling `_cookies_for_request`
+        # Ref: https://github.com/python/cpython/blob/3.7/Lib/http/cookiejar.py#L1360
+        self._policy._now = self._now = int(time.time())
+        return self._cookies_for_request(urllib.request.Request(normalize_url(sanitize_url(url))))
+
+    def clear(self, *args, **kwargs):
+        with contextlib.suppress(KeyError):
+            return super().clear(*args, **kwargs)
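A usage sketch of the relocated jar; the file name and URL are invented, and a Netscape-format cookies.txt is assumed to exist:

from yt_dlp.cookies import YoutubeDLCookieJar

jar = YoutubeDLCookieJar('cookies.txt')
jar.load()  # session cookies with expires=0 are recognized, unlike plain MozillaCookieJar

# Build the Cookie header that would be sent to a given URL
print(jar.get_cookie_header('https://example.com/'))  # e.g. 'key=value', or None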
yt_dlp/dependencies/Cryptodome.py (new file, 38 lines)
@@ -0,0 +1,38 @@
+from ..compat.compat_utils import passthrough_module
+
+try:
+    import Cryptodome as _parent
+except ImportError:
+    try:
+        import Crypto as _parent
+    except (ImportError, SyntaxError):  # Old Crypto gives SyntaxError in newer Python
+        _parent = passthrough_module(__name__, 'no_Cryptodome')
+        __bool__ = lambda: False
+
+del passthrough_module
+
+__version__ = ''
+AES = PKCS1_v1_5 = Blowfish = PKCS1_OAEP = SHA1 = CMAC = RSA = None
+try:
+    if _parent.__name__ == 'Cryptodome':
+        from Cryptodome import __version__
+        from Cryptodome.Cipher import AES, PKCS1_OAEP, Blowfish, PKCS1_v1_5
+        from Cryptodome.Hash import CMAC, SHA1
+        from Cryptodome.PublicKey import RSA
+    elif _parent.__name__ == 'Crypto':
+        from Crypto import __version__
+        from Crypto.Cipher import AES, PKCS1_OAEP, Blowfish, PKCS1_v1_5  # noqa: F401
+        from Crypto.Hash import CMAC, SHA1  # noqa: F401
+        from Crypto.PublicKey import RSA  # noqa: F401
+except ImportError:
+    __version__ = f'broken {__version__}'.strip()
+
+
+_yt_dlp__identifier = _parent.__name__
+if AES and _yt_dlp__identifier == 'Crypto':
+    try:
+        # In pycrypto, mode defaults to ECB. See:
+        # https://www.pycryptodome.org/en/latest/src/vs_pycrypto.html#:~:text=not%20have%20ECB%20as%20default%20mode
+        AES.new(b'abcdefghijklmnop')
+    except TypeError:
+        _yt_dlp__identifier = 'pycrypto'
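How the shim is meant to be consumed elsewhere: a sketch, where the truthiness check leans on the `__bool__` fallback defined above when neither Crypto package is installed:

from yt_dlp.dependencies import Cryptodome

if Cryptodome.AES:
    # pycryptodome(x) is available; use its primitives
    cipher = Cryptodome.AES.new(b'0' * 16, Cryptodome.AES.MODE_GCM)
else:
    raise RuntimeError('pycryptodome is required for this code path')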
yt_dlp/dependencies/__init__.py
@@ -23,24 +23,6 @@
     certifi = None
 
 
-try:
-    from Cryptodome.Cipher import AES as Cryptodome_AES
-except ImportError:
-    try:
-        from Crypto.Cipher import AES as Cryptodome_AES
-    except (ImportError, SyntaxError):  # Old Crypto gives SyntaxError in newer Python
-        Cryptodome_AES = None
-    else:
-        try:
-            # In pycrypto, mode defaults to ECB. See:
-            # https://www.pycryptodome.org/en/latest/src/vs_pycrypto.html#:~:text=not%20have%20ECB%20as%20default%20mode
-            Cryptodome_AES.new(b'abcdefghijklmnop')
-        except TypeError:
-            pass
-        else:
-            Cryptodome_AES._yt_dlp__identifier = 'pycrypto'
-
-
 try:
     import mutagen
 except ImportError:
 
@@ -61,6 +43,8 @@
 try:
     import sqlite3
+    # We need to get the underlying `sqlite` version, see https://github.com/yt-dlp/yt-dlp/issues/8152
+    sqlite3._yt_dlp__version = sqlite3.sqlite_version
 except ImportError:
     # although sqlite3 is part of the standard library, it is possible to compile python without
     # sqlite support. See: https://github.com/yt-dlp/yt-dlp/issues/544
 
@@ -74,6 +58,15 @@
     # See https://github.com/yt-dlp/yt-dlp/issues/2633
     websockets = None
 
+try:
+    import urllib3
+except ImportError:
+    urllib3 = None
+
+try:
+    import requests
+except ImportError:
+    requests = None
 
 try:
     import xattr  # xattr or pyxattr
 
@@ -84,12 +77,16 @@
         xattr._yt_dlp__identifier = 'pyxattr'
 
 
+from . import Cryptodome
+
 all_dependencies = {k: v for k, v in globals().items() if not k.startswith('_')}
 
 
 available_dependencies = {k: v for k, v in all_dependencies.items() if v}
 
 
+# Deprecated
+Cryptodome_AES = Cryptodome.AES
+
+
 __all__ = [
     'all_dependencies',
     'available_dependencies',
yt_dlp/downloader/__init__.py
@@ -30,7 +30,7 @@ def get_suitable_downloader(info_dict, params={}, default=NO_DEFAULT, protocol=N
 from .http import HttpFD
 from .ism import IsmFD
 from .mhtml import MhtmlFD
-from .niconico import NiconicoDmcFD
+from .niconico import NiconicoDmcFD, NiconicoLiveFD
 from .rtmp import RtmpFD
 from .rtsp import RtspFD
 from .websocket import WebSocketFragmentFD
 
@@ -50,6 +50,7 @@ def get_suitable_downloader(info_dict, params={}, default=NO_DEFAULT, protocol=N
     'ism': IsmFD,
     'mhtml': MhtmlFD,
     'niconico_dmc': NiconicoDmcFD,
+    'niconico_live': NiconicoLiveFD,
     'fc2_live': FC2LiveFD,
     'websocket_frag': WebSocketFragmentFD,
     'youtube_live_chat': YoutubeLiveChatFD,
yt_dlp/downloader/common.py
@@ -49,10 +49,10 @@ class FileDownloader:
     verbose:            Print additional info to stdout.
     quiet:              Do not print messages to stdout.
     ratelimit:          Download speed limit, in bytes/sec.
-    continuedl:         Attempt to continue downloads if possible
     throttledratelimit: Assume the download is being throttled below this speed (bytes/sec)
-    retries:            Number of times to retry for HTTP error 5xx
-    file_access_retries:   Number of times to retry on file access error
+    retries:            Number of times to retry for expected network errors.
+                        Default is 0 for API, but 10 for CLI
+    file_access_retries:   Number of times to retry on file access error (default: 3)
     buffersize:         Size of download buffer in bytes.
     noresizebuffer:     Do not automatically resize the download buffer.
     continuedl:         Try to continue downloads if possible.
 
@@ -138,17 +138,21 @@ def calc_percent(byte_counter, data_len):
     def format_percent(percent):
         return '  N/A%' if percent is None else f'{percent:>5.1f}%'
 
-    @staticmethod
-    def calc_eta(start, now, total, current):
+    @classmethod
+    def calc_eta(cls, start_or_rate, now_or_remaining, total=NO_DEFAULT, current=NO_DEFAULT):
+        if total is NO_DEFAULT:
+            rate, remaining = start_or_rate, now_or_remaining
+            if None in (rate, remaining):
+                return None
+            return int(float(remaining) / rate)
+
+        start, now = start_or_rate, now_or_remaining
         if total is None:
             return None
         if now is None:
             now = time.time()
-        dif = now - start
-        if current == 0 or dif < 0.001:  # One millisecond
-            return None
-        rate = float(current) / dif
-        return int((float(total) - float(current)) / rate)
+        rate = cls.calc_speed(start, now, current)
+        return rate and int((float(total) - float(current)) / rate)
 
     @staticmethod
     def calc_speed(start, now, bytes):
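The reworked `calc_eta` now supports two calling conventions; the arithmetic for both, with invented numbers:

# Two-argument form: (rate, remaining) -> ETA in seconds
rate, remaining = 1_000_000, 25_000_000       # 1 MB/s, 25 MB left
print(int(float(remaining) / rate))           # -> 25

# Four-argument form derives the rate from elapsed progress first
start, now, total, current = 0.0, 10.0, 50_000_000, 10_000_000
speed = current / (now - start)               # 1 MB/s over the elapsed 10 s
print(int((total - current) / speed))         # -> 40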
@@ -165,6 +169,12 @@ def format_speed(speed):
     def format_retries(retries):
         return 'inf' if retries == float('inf') else int(retries)
 
+    @staticmethod
+    def filesize_or_none(unencoded_filename):
+        if os.path.isfile(unencoded_filename):
+            return os.path.getsize(unencoded_filename)
+        return 0
+
     @staticmethod
     def best_block_size(elapsed_time, bytes):
         new_min = max(bytes / 2.0, 1.0)
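A standalone equivalent of the new helper; the temporary file is used purely for demonstration:

import os
import tempfile

def filesize_or_none(path):
    # Mirrors FileDownloader.filesize_or_none: size for regular files, else 0
    return os.path.getsize(path) if os.path.isfile(path) else 0

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b'hello')
print(filesize_or_none(f.name))        # -> 5
print(filesize_or_none(f.name + 'x'))  # -> 0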
@ -225,7 +235,7 @@ def error_callback(err, count, retries, *, fd):
|
||||||
sleep_func=fd.params.get('retry_sleep_functions', {}).get('file_access'))
|
sleep_func=fd.params.get('retry_sleep_functions', {}).get('file_access'))
|
||||||
|
|
||||||
def wrapper(self, func, *args, **kwargs):
|
def wrapper(self, func, *args, **kwargs):
|
||||||
for retry in RetryManager(self.params.get('file_access_retries'), error_callback, fd=self):
|
for retry in RetryManager(self.params.get('file_access_retries', 3), error_callback, fd=self):
|
||||||
try:
|
try:
|
||||||
return func(self, *args, **kwargs)
|
return func(self, *args, **kwargs)
|
||||||
except OSError as err:
|
except OSError as err:
|
||||||
|
@ -245,7 +255,8 @@ def sanitize_open(self, filename, open_mode):
|
||||||
|
|
||||||
@wrap_file_access('remove')
|
@wrap_file_access('remove')
|
||||||
def try_remove(self, filename):
|
def try_remove(self, filename):
|
||||||
os.remove(filename)
|
if os.path.isfile(filename):
|
||||||
|
os.remove(filename)
|
||||||
|
|
||||||
@wrap_file_access('rename')
|
@wrap_file_access('rename')
|
||||||
def try_rename(self, old_filename, new_filename):
|
def try_rename(self, old_filename, new_filename):
|
||||||
|
@@ -285,7 +296,8 @@ def _prepare_multiline_status(self, lines=1):
             self._multiline = BreaklineStatusPrinter(self.ydl._out_files.out, lines)
         else:
             self._multiline = MultilinePrinter(self.ydl._out_files.out, lines, not self.params.get('quiet'))
-        self._multiline.allow_colors = self._multiline._HAVE_FULLCAP and not self.params.get('no_color')
+        self._multiline.allow_colors = self.ydl._allow_colors.out and self.ydl._allow_colors.out != 'no_color'
+        self._multiline._HAVE_FULLCAP = self.ydl._allow_colors.out
 
     def _finish_multiline_status(self):
         self._multiline.end()
@@ -407,7 +419,6 @@ def download(self, filename, info_dict, subtitle=False):
         """Download to a filename using the info from info_dict
         Return True on success and False otherwise
         """
-
         nooverwrites_and_exists = (
             not self.params.get('overwrites', True)
             and os.path.exists(encodeFilename(filename))

yt_dlp/downloader/dash.py
@@ -15,12 +15,15 @@ class DashSegmentsFD(FragmentFD):
     FD_NAME = 'dashsegments'
 
     def real_download(self, filename, info_dict):
-        if info_dict.get('is_live') and set(info_dict['protocol'].split('+')) != {'http_dash_segments_generator'}:
-            self.report_error('Live DASH videos are not supported')
+        if 'http_dash_segments_generator' in info_dict['protocol'].split('+'):
+            real_downloader = None  # No external FD can support --live-from-start
+        else:
+            if info_dict.get('is_live'):
+                self.report_error('Live DASH videos are not supported')
+            real_downloader = get_suitable_downloader(
+                info_dict, self.params, None, protocol='dash_frag_urls', to_stdout=(filename == '-'))
 
         real_start = time.time()
-        real_downloader = get_suitable_downloader(
-            info_dict, self.params, None, protocol='dash_frag_urls', to_stdout=(filename == '-'))
 
         requested_formats = [{**info_dict, **fmt} for fmt in info_dict.get('requested_formats', [])]
         args = []
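
The rewritten branch keys off the protocol string rather than is_live, since --live-from-start downloads use the http_dash_segments_generator protocol, which no external downloader supports. A small illustration of the check (the protocol value is an example):

    protocol = 'http_dash_segments+http_dash_segments_generator'
    use_native_fd = 'http_dash_segments_generator' in protocol.split('+')
    assert use_native_fd  # generator protocol forces the built-in downloader
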
yt_dlp/downloader/external.py
@@ -1,14 +1,16 @@
 import enum
 import json
-import os.path
+import os
 import re
 import subprocess
 import sys
+import tempfile
 import time
 import uuid
 
 from .fragment import FragmentFD
 from ..compat import functools
+from ..networking import Request
 from ..postprocessor.ffmpeg import EXT_TO_OUT_FORMATS, FFmpegPostProcessor
 from ..utils import (
     Popen,
@@ -23,9 +25,7 @@
     encodeArgument,
     encodeFilename,
     find_available_port,
-    handle_youtubedl_headers,
     remove_end,
-    sanitized_Request,
     traverse_obj,
 )
 
@@ -43,6 +43,7 @@ class ExternalFD(FragmentFD):
     def real_download(self, filename, info_dict):
         self.report_destination(filename)
         tmpfilename = self.temp_name(filename)
+        self._cookies_tempfile = None
 
         try:
             started = time.time()
@@ -55,6 +56,9 @@ def real_download(self, filename, info_dict):
             # should take place
             retval = 0
             self.to_screen('[%s] Interrupted by user' % self.get_basename())
+        finally:
+            if self._cookies_tempfile:
+                self.try_remove(self._cookies_tempfile)
 
         if retval == 0:
             status = {
@@ -104,6 +108,7 @@ def supports(cls, info_dict):
         return all((
             not info_dict.get('to_stdout') or Features.TO_STDOUT in cls.SUPPORTED_FEATURES,
             '+' not in info_dict['protocol'] or Features.MULTIPLE_FORMATS in cls.SUPPORTED_FEATURES,
+            not traverse_obj(info_dict, ('hls_aes', ...), 'extra_param_to_segment_url'),
             all(proto in cls.SUPPORTED_PROTOCOLS for proto in info_dict['protocol'].split('+')),
         ))
 
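
A rough stand-in for the added guard, under the simplifying assumption that any truthy hls_aes or extra_param_to_segment_url value disqualifies external FDs (the real code walks these fields with traverse_obj):

    def external_fd_allowed(info_dict):
        # external downloaders cannot decrypt HLS-AES streams or append
        # extra query parameters to segment URLs, so rule them out here
        return not (info_dict.get('hls_aes') or info_dict.get('extra_param_to_segment_url'))

    assert external_fd_allowed({'protocol': 'https'})
    assert not external_fd_allowed({'hls_aes': {'uri': 'https://example.com/key'}})
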
@@ -125,6 +130,16 @@ def _configuration_args(self, keys=None, *args, **kwargs):
             self.get_basename(), self.params.get('external_downloader_args'), self.EXE_NAME,
             keys, *args, **kwargs)
 
+    def _write_cookies(self):
+        if not self.ydl.cookiejar.filename:
+            tmp_cookies = tempfile.NamedTemporaryFile(suffix='.cookies', delete=False)
+            tmp_cookies.close()
+            self._cookies_tempfile = tmp_cookies.name
+            self.to_screen(f'[download] Writing temporary cookies file to "{self._cookies_tempfile}"')
+        # real_download resets _cookies_tempfile; if it's None then save() will write to cookiejar.filename
+        self.ydl.cookiejar.save(self._cookies_tempfile)
+        return self.ydl.cookiejar.filename or self._cookies_tempfile
+
     def _call_downloader(self, tmpfilename, info_dict):
         """ Either overwrite this or implement _make_cmd """
         cmd = [encodeArgument(a) for a in self._make_cmd(tmpfilename, info_dict)]
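
_write_cookies materializes the in-memory jar to a Netscape-format file that wget/aria2c can read via --load-cookies. A standalone sketch of the same idea using only the stdlib (the cookie values are made up):

    import tempfile
    from http.cookiejar import Cookie, MozillaCookieJar

    # persist in-memory cookies to a Netscape-format file for external tools
    jar = MozillaCookieJar()
    jar.set_cookie(Cookie(
        0, 'session', 'abc123', None, False, 'example.com', True, False,
        '/', True, False, None, True, None, None, {}))
    tmp = tempfile.NamedTemporaryFile(suffix='.cookies', delete=False)
    tmp.close()
    jar.save(tmp.name, ignore_discard=True, ignore_expires=True)
    print(open(tmp.name).read())  # Netscape header plus one line per cookie
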
@@ -175,7 +190,7 @@ def _call_downloader(self, tmpfilename, info_dict):
         return 0
 
     def _call_process(self, cmd, info_dict):
-        return Popen.run(cmd, text=True, stderr=subprocess.PIPE)
+        return Popen.run(cmd, text=True, stderr=subprocess.PIPE if self._CAPTURE_STDERR else None)
 
 
 class CurlFD(ExternalFD):
@@ -184,6 +199,9 @@ class CurlFD(ExternalFD):
 
     def _make_cmd(self, tmpfilename, info_dict):
         cmd = [self.exe, '--location', '-o', tmpfilename, '--compressed']
+        cookie_header = self.ydl.cookiejar.get_cookie_header(info_dict['url'])
+        if cookie_header:
+            cmd += ['--cookie', cookie_header]
         if info_dict.get('http_headers') is not None:
             for key, val in info_dict['http_headers'].items():
                 cmd += ['--header', f'{key}: {val}']
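
get_cookie_header asks the jar which cookies apply to a URL and returns them as a single header value, which curl takes directly via --cookie. A stdlib sketch of the equivalent operation:

    from http.cookiejar import CookieJar
    from urllib.request import Request

    def cookie_header_for(jar, url):
        # let the jar fill in the Cookie header for a dummy request, then read it back
        req = Request(url)
        jar.add_cookie_header(req)
        return req.get_header('Cookie')  # None if no cookie matches

    print(cookie_header_for(CookieJar(), 'https://example.com/'))  # None for an empty jar
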
@@ -214,6 +232,9 @@ def _make_cmd(self, tmpfilename, info_dict):
         if info_dict.get('http_headers') is not None:
             for key, val in info_dict['http_headers'].items():
                 cmd += ['-H', f'{key}: {val}']
+        cookie_header = self.ydl.cookiejar.get_cookie_header(info_dict['url'])
+        if cookie_header:
+            cmd += ['-H', f'Cookie: {cookie_header}', '--max-redirect=0']
         cmd += self._configuration_args()
         cmd += ['--', info_dict['url']]
         return cmd
@@ -223,7 +244,9 @@ class WgetFD(ExternalFD):
     AVAILABLE_OPT = '--version'
 
     def _make_cmd(self, tmpfilename, info_dict):
-        cmd = [self.exe, '-O', tmpfilename, '-nv', '--no-cookies', '--compression=auto']
+        cmd = [self.exe, '-O', tmpfilename, '-nv', '--compression=auto']
+        if self.ydl.cookiejar.get_cookie_header(info_dict['url']):
+            cmd += ['--load-cookies', self._write_cookies()]
         if info_dict.get('http_headers') is not None:
             for key, val in info_dict['http_headers'].items():
                 cmd += ['--header', f'{key}: {val}']
@@ -271,7 +294,7 @@ def _call_downloader(self, tmpfilename, info_dict):
             return super()._call_downloader(tmpfilename, info_dict)
 
     def _make_cmd(self, tmpfilename, info_dict):
-        cmd = [self.exe, '-c',
+        cmd = [self.exe, '-c', '--no-conf',
                '--console-log-level=warn', '--summary-interval=0', '--download-result=hide',
                '--http-accept-gzip=true', '--file-allocation=none', '-x16', '-j16', '-s16']
         if 'fragments' in info_dict:
@@ -279,6 +302,8 @@ def _make_cmd(self, tmpfilename, info_dict):
         else:
             cmd += ['--min-split-size', '1M']
 
+        if self.ydl.cookiejar.get_cookie_header(info_dict['url']):
+            cmd += [f'--load-cookies={self._write_cookies()}']
         if info_dict.get('http_headers') is not None:
             for key, val in info_dict['http_headers'].items():
                 cmd += ['--header', f'{key}: {val}']
@@ -310,7 +335,7 @@ def _make_cmd(self, tmpfilename, info_dict):
             cmd += ['--auto-file-renaming=false']
 
         if 'fragments' in info_dict:
-            cmd += ['--file-allocation=none', '--uri-selector=inorder']
+            cmd += ['--uri-selector=inorder']
             url_list_file = '%s.frag.urls' % tmpfilename
             url_list = []
             for frag_index, fragment in enumerate(info_dict['fragments']):
@@ -333,13 +358,12 @@ def aria2c_rpc(self, rpc_port, rpc_secret, method, params=()):
             'method': method,
             'params': [f'token:{rpc_secret}', *params],
         }).encode('utf-8')
-        request = sanitized_Request(
+        request = Request(
             f'http://localhost:{rpc_port}/jsonrpc',
             data=d, headers={
                 'Content-Type': 'application/json',
                 'Content-Length': f'{len(d)}',
-                'Ytdl-request-proxy': '__noproxy__',
-            })
+            }, proxies={'all': None})
         with self.ydl.urlopen(request) as r:
             resp = json.load(r)
         assert resp.get('id') == sanitycheck, 'Something went wrong with RPC server'
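
The RPC helper now builds a networking.Request with proxies={'all': None} instead of the old Ytdl-request-proxy pseudo-header. For reference, a sketch of the same JSON-RPC exchange using only the stdlib; it assumes an aria2c instance already listening, and rpc_port/rpc_secret are placeholders for the values yt-dlp generates:

    import json
    from urllib.request import Request, urlopen

    rpc_port, rpc_secret = 6800, 'secret'  # placeholders
    d = json.dumps({
        'jsonrpc': '2.0',
        'id': 'sanitycheck',
        'method': 'aria2.tellActive',
        'params': [f'token:{rpc_secret}'],
    }).encode('utf-8')
    req = Request(f'http://localhost:{rpc_port}/jsonrpc', data=d,
                  headers={'Content-Type': 'application/json'})
    with urlopen(req) as r:  # requires a running aria2c RPC server
        resp = json.load(r)
    assert resp.get('id') == 'sanitycheck'
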
@@ -417,6 +441,14 @@ def _make_cmd(self, tmpfilename, info_dict):
         if info_dict.get('http_headers') is not None:
             for key, val in info_dict['http_headers'].items():
                 cmd += [f'{key}:{val}']
+
+        # httpie 3.1.0+ removes the Cookie header on redirect, so this should be safe for now. [1]
+        # If we ever need cookie handling for redirects, we can export the cookiejar into a session. [2]
+        # 1: https://github.com/httpie/httpie/security/advisories/GHSA-9w4w-cpc8-h2fq
+        # 2: https://httpie.io/docs/cli/sessions
+        cookie_header = self.ydl.cookiejar.get_cookie_header(info_dict['url'])
+        if cookie_header:
+            cmd += [f'Cookie:{cookie_header}']
         return cmd
 
 
@@ -527,11 +559,16 @@ def _call_downloader(self, tmpfilename, info_dict):
 
         selected_formats = info_dict.get('requested_formats') or [info_dict]
         for i, fmt in enumerate(selected_formats):
-            if fmt.get('http_headers') and re.match(r'^https?://', fmt['url']):
-                headers_dict = handle_youtubedl_headers(fmt['http_headers'])
+            is_http = re.match(r'^https?://', fmt['url'])
+            cookies = self.ydl.cookiejar.get_cookies_for_url(fmt['url']) if is_http else []
+            if cookies:
+                args.extend(['-cookies', ''.join(
+                    f'{cookie.name}={cookie.value}; path={cookie.path}; domain={cookie.domain};\r\n'
+                    for cookie in cookies)])
+            if fmt.get('http_headers') and is_http:
                 # Trailing \r\n after each HTTP header is important to prevent warning from ffmpeg/avconv:
                 # [http @ 00000000003d2fa0] No trailing CRLF found in HTTP header.
-                args.extend(['-headers', ''.join(f'{key}: {val}\r\n' for key, val in headers_dict.items())])
+                args.extend(['-headers', ''.join(f'{key}: {val}\r\n' for key, val in fmt['http_headers'].items())])
 
         if start_time:
             args += ['-ss', str(start_time)]
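
FFmpegFD now passes cookies explicitly because ffmpeg never sees yt-dlp's cookie jar; the -cookies option expects one Set-Cookie-style line per cookie, CRLF-terminated. A sketch of the formatting with a stand-in cookie type (http.cookiejar.Cookie objects expose the same attributes):

    from collections import namedtuple

    FakeCookie = namedtuple('FakeCookie', 'name value path domain')
    cookies = [FakeCookie('session', 'abc123', '/', '.example.com')]

    cookies_arg = ''.join(
        f'{c.name}={c.value}; path={c.path}; domain={c.domain};\r\n'
        for c in cookies)
    # would be passed as: ffmpeg -cookies <cookies_arg> -i <url> ...
    print(repr(cookies_arg))
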
Some files were not shown because too many files have changed in this diff.